EU Targets TikTok’s Addictive Algorithm, Warns of Billions in Fines Under Digital Services Act
The European Union has launched one of its strongest regulatory assaults yet on TikTok, accusing the social media giant of deliberately engineering addictive user behavior and pursuing legal action against the company under the EU's Digital Services Act (DSA).
In its provisional assessment, the European Commission claims that TikTok’s core design features — including infinite scrolling feeds, automatic video playback, persistent notifications, and highly personalized content recommendations — create compulsive usage patterns that undermine user autonomy and pose risks to mental health, especially among children and teenagers.
According to the Commission, TikTok has not adequately assessed or mitigated the risks stemming from its engagement-driven algorithm. Regulators argue that the platform’s continuous stream of short-form videos encourages prolonged, often unconscious use, making it difficult for users to disengage. Officials warned that such patterns can disrupt sleep, impair concentration, and negatively affect overall wellbeing.
While TikTok offers tools such as screen-time reminders and parental controls, EU authorities found these safeguards insufficient. The Commission noted that not only are time limits easily bypassed, but parental controls also rely heavily on constant supervision and technical know-how, limiting their effectiveness in real-world use.
Henna Virkkunen, the EU’s Executive Vice-President for Tech Sovereignty, Security, and Democracy, said the case highlights why Europe’s digital rules exist in the first place. Social media overuse, she warned, can have lasting consequences for developing minds, and platforms must be held accountable for the effects of their design choices.
Brussels Demands Algorithm Changes and Forced “Breaks”
Rather than targeting content itself, regulators are zeroing in on TikTok’s algorithm and engagement mechanics. The Commission has called on the company to fundamentally rethink how its platform operates, urging changes that would reduce continuous consumption and give users greater control.
Among the demanded reforms are limits on endlessly loading video feeds, the introduction of meaningful breaks during extended sessions — including at night — and adjustments to recommendation systems that currently funnel users into prolonged viewing loops. The EU argues that without structural changes to how content is delivered, surface-level safety tools will continue to fall short.
TikTok has been invited to respond to the findings, but if it fails to convince regulators that it can bring the platform into compliance, enforcement action could follow. Under the DSA, penalties can reach up to 6% of a company’s global annual turnover, making this one of the most financially serious threats TikTok has faced in Europe.
The case comes as TikTok faces intensifying pressure worldwide. Within the EU, Ireland fined the company €530 million last year over unlawful data transfers to China. In the United States, scrutiny of TikTok’s ownership structure and national security risks has pushed ByteDance into negotiations over American control of the app.
At the same time, governments are moving to restrict youth access to social media altogether. Spain has proposed banning under-16s from social platforms, the UK is weighing similar measures, and Australia introduced nationwide age-based restrictions in late 2025. France, Denmark, and Greece are also reviewing minimum-age requirements.
A Defining Test for the Digital Services Act
For Brussels, the TikTok probe is more than a single enforcement action — it is a test case for the EU’s sweeping new tech rulebook. The Digital Services Act, which became fully applicable in 2024, requires large platforms to proactively assess and reduce systemic risks, including addiction, algorithmic harm, and threats to minors.
If the Commission proceeds with penalties, TikTok could become the first major tech platform to feel the full force of the DSA. The message from Europe is clear: engagement at all costs is no longer acceptable, and algorithms that maximize attention may soon carry a regulatory price tag.