Brussels has delivered a stark message to TikTok: the platform’s user experience is engineered in ways that promote addictive behaviour and may breach the European Union’s Digital Services Act. After a two‑year probe, the European Commission issued a preliminary finding in early February that features such as autoplay and opaque recommendation systems, together with a gamified rewards scheme in TikTok Lite, create heightened risks for minors and have not been subject to adequate risk assessment or mitigation.
The Commission singled out several concrete remedies it expects TikTok to adopt if it is to avoid severe penalties. Regulators proposed a night‑time "screen‑use rest" mechanism, stronger and more usable parental controls, limits on infinite scrolling and adjustments to recommendation algorithms to reduce the platform’s capacity to keep users continuously engaged. Under the Digital Services Act, the EU can levy fines up to 6% of a company’s global annual turnover — a sum that would amount to billions of dollars for a major social media business.
EU officials also revisited a separate inquiry opened in April 2024 into TikTok Lite, deployed in France and Spain, where a "tasks and rewards" programme that pays points for behaviours such as watching videos, liking content, following creators or inviting friends raised particular alarms. Regulators argued the scheme had not been properly risk‑assessed for its potential to foster compulsive use, especially among children, and lacked sufficient safeguards.
TikTok responded quickly and forcefully, calling the Commission’s description "completely wrong" and signalling plans to contest the findings. The company has routinely defended its design choices as consistent with user preferences and technical norms across the sector, and it is likely to mount a legal and public relations campaign to blunt Brussels’s leverage.
The EU action is part of a wider global trend: governments in Europe and beyond have moved to tighten rules on minors’ access to social platforms. Australia has enacted a ban on social‑media use for under‑16s, France and Denmark have advanced age limits of 15 or 16 for new accounts, and the UK is debating similar measures. These policy shifts reflect rising political pressure to put "guardrails" around children’s online exposure, but they also expose practical problems such as reliable age verification, privacy trade‑offs and enforcement across borderless apps.
Beyond the immediate clash between Brussels and a single app, the case matters because it tests the EU’s capacity to regulate the habits mobile platforms cultivate through interface design and algorithmic curation. Forcing changes to core engagement mechanics — autoplay, infinite scroll and recommendation loops — would go beyond conventional content moderation and strike at the business model that underpins social media advertising and retention strategies. The outcome will shape not only TikTok’s operations in Europe but also the template regulators elsewhere use when balancing child protection, privacy and commercial freedoms.
