
France Votes to Bar Children Under 15 From Social Media

Photo: Ludovic Marin/AFP/Getty Images

“He fell in love with an insubstantial hope, mistaking a mere shadow for a real body.” (Ovid, Metamorphoses 3.417-418, trans. Melville, 1986)

France’s National Assembly passed far-reaching legislation early on January 27, 2026, that would prohibit minors under fifteen from accessing online social networking services, marking one of Europe’s most ambitious attempts to shield children from what lawmakers describe as the seductive and harmful design of digital platforms. The bill, introduced in the 17th legislature as the “Proposition de loi visant à protéger les mineurs des risques auxquels les expose l’utilisation des réseaux sociaux” (a proposal to protect minors from the risks to which the use of social networks exposes them; the full text is available in French under that title), was adopted by a vote of 130 to 21 after an overnight session and now advances to the Senate, where it faces potential amendments before final passage. If enacted without major changes, the core restriction would take effect on September 1, 2026, with existing accounts granted a four-month grace period.

The Architecture of the Ban


The adopted text introduces a categorical prohibition on under-fifteen access to social network services, while carving out explicit exemptions for online encyclopedias, educational and scientific directories, and platforms dedicated to open-source software development and sharing. Enforcement responsibility falls to ARCOM, France’s audiovisual and digital communications regulator, which must coordinate with the European Union’s Digital Services Act framework, including cross-border cooperation when platforms operate from other member states. This enforcement architecture signals France’s intent to work within existing EU regulatory structures rather than creating an isolated national system.

Age Gates: Redesigning the Digital Environment

“Unwittingly he desired himself, and was himself the object of his own approval, at once seeking and sought, kindling the flame with which he burned.” (Ovid, Metamorphoses 3.425-426, trans. Melville, 1986)

The legislation extends well beyond a simple age gate. Two consumer protection provisions require platforms to ensure minors are not exposed to excessive commercial pressure and prohibit the promotion of products or services likely to harm minors’ physical or mental health on interfaces specifically designed for young users. Perhaps most significantly, the bill creates a novel liability framework for algorithmic recommendation systems. When a platform uses profiling to recommend or rank content for an account identified as belonging to a minor, it may be treated as exercising editorial activity for that promoted content, opening the platform to publisher-level liability. The text defines promotion broadly to include personalized feeds, trending sections, autoplay functions, push notifications, and equivalent features. This provision represents a fundamental shift in how French law treats platform design. Rather than viewing recommendation algorithms as neutral tools, the legislation recognizes them as active architects of the digital reflecting pool.

Like Narcissus captivated by his own image, young users encounter curated versions of themselves through likes, comments, shares, and algorithmic amplification of content that mirrors their identities, anxieties, and aspirations back to them. The platforms construct and maintain this reflection, and France’s bill attempts to hold them accountable for the consequences when the viewer cannot look away.

Restricting the Marketplace of Attention

The adopted provisions also target the commercial ecosystem that recruits young users. Direct and indirect advertising for social network services specifically aimed at minors is banned, including promotional content distributed through influencers or brand partnerships. In a notable enforcement mechanism, the bill requires influencers who promote services such as social networks to display a clear warning label throughout the promotion stating “produits dangereux pour les moins de quinze ans” (dangerous products for those under fifteen). This labeling requirement treats social media platforms as potentially hazardous products, comparable to age-restricted substances, and places compliance obligations on content creators who monetize youth attention.

From Elementary Schools to High Schools: Physical Devices, Physical Spaces

The legislation also extends France’s existing mobile phone restrictions upward into high schools. While primary and middle schools have operated under a phone prohibition since 2018, high schools have had greater flexibility (Education Code Article L511-5, 2018).

Article L511-5 (version effective since August 6, 2018; amended by Law No. 2018-698 of August 3, 2018, art. 1):

The use of a mobile phone or any other electronic communication device by a student is prohibited in preschools, elementary schools, and middle schools, and during any educational activity taking place outside the school premises, except in the circumstances, particularly for educational purposes, and in the locations where the school’s internal regulations expressly permit it.

In high schools, the internal regulations may prohibit a student’s use of the devices mentioned in the first paragraph in all or part of the school premises, as well as during activities taking place outside the school.

This article does not apply to equipment that students with a disability or a debilitating health condition are authorized to use under the conditions set forth in Chapter I of Title V of Book III of this part.

Failure to comply with the rules established pursuant to this article may result in the confiscation of the device by administrative, teaching, educational, or supervisory staff. The internal regulations specify the procedures for its confiscation and return.

Under the new framework, each high school must specify in its rulebook the places and conditions where phone use is permitted. If a school fails to establish specific provisions, a statutory default applies: phones are forbidden during lessons unless a teacher explicitly requests their use, forbidden in corridors, and allowed only in a defined courtyard zone. These rules are scheduled to take effect at the start of the 2026-2027 school year, creating a structured environment that reduces exposure to digital stimuli during designated periods.

No Digital Curfew, No Parental “Digital Negligence” Offense

The Assembly-adopted text bears multiple articles marked “Supprimés” (deleted), indicating that portions of the original proposal were stripped during debate. Earlier versions reviewed by the Conseil d’État, France’s highest administrative court and legal adviser, had included a broader package of measures. Among the provisions that did not advance were a proposed “digital curfew” for 15- to 18-year-olds and a “digital negligence” offense that would have held parents criminally liable for their children’s excessive screen time. The Conseil d’État had warned that such measures raised concerns about proportionality and potential conflicts with European Union law. The narrowing of scope suggests lawmakers prioritized platform accountability over family surveillance.

An Unenforced Precedent Already on the Books

France is not starting from a blank statutory ground. A 2023 law established a “majorité numérique” (digital majority) at age 15, generally requiring parental authorization for younger users to sign up for social media platforms (Law No. 2023-566, 2023).

The introduction of a digital age of majority at 15 aims to protect children from social networks by ensuring that platforms implement a technical solution when they register, as well as to better prevent and prosecute online crimes, such as cyberbullying.

Note that online encyclopedias, such as Wikipedia, and non-profit educational and scientific directories are not affected by the new measures on the digital majority.

However, as Vie-publique, the French government’s public information service, notes, this requirement has not been applied in practice, citing missing implementing regulations and constraints related to European Union frameworks. The gap between statutory intent and operational reality underscores a recurring challenge: age verification and enforcement mechanisms are complex, resource-intensive, and raise privacy concerns. The new bill’s assignment of enforcement to ARCOM and its integration with DSA structures may represent an attempt to close this implementation gap, though critics remain skeptical.

The Scientific Warning That Shaped the Debate

On January 13, 2026, two weeks before the Assembly vote, ANSES (the French Agency for Food, Environmental and Occupational Health and Safety) published an expert assessment documenting the risks associated with adolescent social media use. The report emphasized that harms cannot be reduced to “time spent” alone because platform design features and emotional engagement mechanisms independently drive negative outcomes. ANSES identified cyberbullying, exposure to harmful content promoting self-harm or eating disorders, and particular vulnerability patterns among adolescent girls as documented risks.

Experts have studied the mechanisms used by social networks to capture the attention of teenagers. “To assess the effects of social networks on health, it was important to go beyond the time spent on the networks and to consider what teenagers actually do on social networks, their motivations, and their emotional engagement,” explains Olivia Roth-Delgado, coordinator of the expert assessment.

The agency’s report, “Securing social media use to protect the health of teenagers,” explains that the business model of social networks aims to maximize user time for commercial purposes. The goal is to sell both advertising space and data on user preferences and habits. Companies developing social networks therefore implement attention-grabbing strategies designed to maintain user engagement for as long as possible. These strategies rely on powerful incentives such as manipulative interfaces (“dark patterns”) and algorithms that deliver highly personalized content. These algorithms can generate a “spiral effect” in which users become trapped in increasingly targeted, sometimes extreme, content.

Social networks as they are designed today exploit the specific needs of adolescence for social interaction and comparison, sensation and risk-taking, and recognition from peers. “Adolescence is a sensitive period in the development and construction of individual and social identity. Adolescents have less capacity for emotional and behavioral regulation than adults, which makes them particularly vulnerable to the harmful effects of social networks,” the experts explain. The timing and institutional weight of the ANSES report provided scientific legitimacy to legislative advocates who argued that digital platforms function as environmental hazards requiring regulatory intervention rather than voluntary compliance.

What Comes Next: Senate Scrutiny and Implementation Challenges

The bill now moves to the Senate, where amendments are likely. If the Senate alters the text, it will return to the National Assembly for further readings in a process that could extend for months under French legislative procedure. The most contested provisions are likely to involve the algorithmic liability framework, which treats platforms as publishers when they use profiling to recommend content to minors. This represents a significant departure from the intermediary liability protections that platforms have historically enjoyed under both French law and EU directives. How this provision interacts with the Digital Services Act’s existing liability architecture will require careful legal interpretation and may face challenges from platforms arguing it creates conflicts with EU-level rules.

The September 1, 2026, target implementation date creates a compressed timeline for developing age-verification technologies, establishing ARCOM enforcement protocols, and coordinating with other EU regulators where platforms operate across borders. Privacy advocates have raised concerns that robust age verification may require forms of identity documentation or biometric screening that create surveillance risks and data security vulnerabilities, particularly for young people. Platforms may also face practical difficulties distinguishing between accounts held by minors and those held by adults, especially in households where devices and accounts are shared.

Symbolic Gesture or Enforceable Model?

“Ameinias was a very determined but fragile youth. When he was cruelly spurned by Narkissos (Narcissus), he took his sword and killed himself by the door, calling on the goddess Nemesis to avenge him. As a result, when Narkissos saw the beauty of his form reflected in a stream he fell deeply in love with himself. In despair, and believing that he had rightly earned this curse for the humiliation of Ameinias, he slew himself. From his blood sprang the flower.” (Conon, Narrations 24, trans. Atsma; Greek mythographer, C1st B.C. to C1st A.D.)

The legislative center of gravity is shifting from simple prohibition toward platform design accountability, particularly through provisions that alter liability when recommendation systems target minors. Whether these measures survive Senate review, and how they are operationalized under DSA frameworks, will determine whether France has created a symbolic gesture or a meaningfully enforceable model. France is not restraining users but attempting to regulate the platforms that construct the reflecting pool, the algorithmic systems that amplify and curate self-presentation, and the commercial incentives that profit from sustained adolescent attention. Not far behind is Austria, whose government has unveiled plans to ban social media platforms for individuals under fourteen, with the new regulation expected to take effect in the autumn. Backed by the coalition parties, the proposal is intended to mitigate perceived risks associated with young people’s engagement with social media, including compulsive use patterns, contact with harmful or inappropriate content, and vulnerability to online radicalizing influences.

Narcissus could not break free because he mistook his reflection for substance. France’s gamble is that by holding platforms accountable for the images they amplify and the feedback loops they engineer, young people might encounter digital environments less designed to trap them in cycles of self-regard and social comparison. Whether the platform accountability provisions prove robust enough to change design incentives, or whether enforcement gaps and workarounds render the law merely symbolic, remains to be seen.

Dr. Jasmin (Bey) Cowin, a columnist for Stankevicius, employs the ethical framework of Nicomachean Ethics to examine how AI and emerging technologies shape human potential. Her analysis explores the risks and opportunities that arise from tech trends, offering personal perspectives on the interplay between innovation and ethical values. Connect with her on LinkedIn.

