Roblox faces scrutiny over child safety amid reports of grooming and graphic content, prompting government meetings and safety checks.
Recent stories include a family murder-suicide in California, a missing teenager in the UK, a police widow's grief, and a child's online disappearance. These recent events underscore ongoing struggles with mental health, safety, and resilience.
On January 26, 2026, France's National Assembly approved a bill banning social media use for under-15s and mobile phones in high schools, aiming to protect children from harmful content and excessive screen time. Championed by President Emmanuel Macron, the law follows Australia's under-16 ban and now moves to the Senate for final approval.
Australian authorities and regulators in other countries are investigating Roblox amid reports of child grooming, exposure to harmful content, and self-harm material. The Australian government has demanded safety measures and an urgent meeting with the platform, which faces potential fines and regulatory action following ongoing allegations of exploitation.
As of March 6, 2026, Indonesia has enacted a regulation banning children under 16 from having accounts on high-risk social media platforms including YouTube, TikTok, Facebook, Instagram, Threads, X, Roblox, and Bigo Live. The ban will be implemented gradually starting March 28, with penalties for non-compliant platforms. This makes Indonesia the first Southeast Asian country to impose such restrictions, following Australia's December 2025 ban.
Britain and Australia are advancing measures to restrict children's access to social media and harmful content. UK regulators are demanding stronger age verification and safety protections from platforms like TikTok and Meta, while Australia is enforcing a nationwide ban on social media for under-16s along with new laws to prevent minors from accessing age-inappropriate content. These efforts aim to address concerns over online harms, addiction, and exposure to harmful material, amid ongoing debates about their effectiveness and privacy risks.
A mother in the US received a $5,185 phone bill after her daughter unknowingly made international calls through Roblox. Reddit users suggested solutions such as blocking international calls or negotiating with the carrier. The case highlights the online safety and billing risks that in-app communication features can pose for parents.
Roblox is launching new age-based accounts in June to improve safety for children, following legal actions and government concerns over harmful content and grooming. The platform aims to restrict access and enhance parental controls, but faces ongoing lawsuits alleging negligence and harmful effects on youth mental health.
Roblox has agreed to a settlement with Nevada's attorney general, which includes a $10 million fund for youth programs and new safety protections for minors. The platform will now require age verification, restrict chat for users under 16, and expand parental oversight, aiming to create a safer online environment for children.
Australia has been enforcing its social media age restrictions since December, targeting platforms like Facebook, Instagram, TikTok, and YouTube. Regulators are investigating compliance issues, as platforms have failed to apply age verification consistently. Although suspected under-age accounts have been removed, gaps remain, and enforcement is intensifying.