Across the globe, governments are debating and enacting rules to limit under-16 access to social media. From the UK to Turkey to Australia, this overview covers what's changing, why it matters for teens, how enforcement is playing out, and what it could mean for mental health and digital literacy. Below are the key questions people are asking right now, with concise answers to help you stay informed fast.
Multiple countries are pushing for age-based limits, often targeting under-16s or under-15s. The UK is consulting on age restrictions for under-16s, following similar moves abroad. Australia has already begun enforcing a ban for under-16s, while Turkey has enacted laws restricting under-15s from having social media accounts. These measures vary by country and are often paired with parental controls and platform-design changes.
Proponents say restrictions protect mental health, reduce exposure to harmful content, and give parents clearer boundaries. Critics argue bans can infringe on privacy, limit freedom and information access, and may push teens toward unregulated or shadow apps. Debates also focus on how to balance safety with rights, and whether design changes (instead of outright bans) could be more effective.
Austria, Brazil, France, Greece, Indonesia, Malaysia, Norway, Poland, Portugal, Slovenia, Spain, Turkey, and the UK are among those introducing or considering age-based limits. Enforcement challenges include platform compliance, cross-border access, privacy concerns, and ensuring updates keep pace with new apps. Australia’s eSafety Commissioner has already investigated major platforms for compliance gaps, illustrating real-world hurdles.
Policy efforts aim to curb negative mental health impacts, such as anxiety and sleep disruption from constant social media use. In the long term, age-based limits could shift how teens develop digital literacy and critical thinking about online content. The net effect depends on enforcement, availability of safe alternative activities, parental guidance, and whether policies drive platforms to redesign features that reduce harm.
A shared concern over children’s safety, mental health, and exposure to harmful content is pushing governments to act. High-profile events and research highlighting risks online have accelerated action. Countries are examining how to protect children while balancing privacy and parental rights, often prompting ongoing consultations and possible revisions as enforcement data emerges.
Several US initiatives are gaining traction, with proposals to ban or redesign platforms for under-16s and to remove addictive features. California’s AB-1709 is an example aiming to restrict access or require redesigns. While not uniform nationwide yet, the momentum signals a broader debate about protecting youth online and how platforms should adapt.
Norway said on Friday it would present a bill in parliament by year-end to ban children from using social media until they turn 16, making technology companies responsible for age verification.