- Hackers compromised a third-party vendor managing Discord's age verification data, exposing government ID photos of about 70,000 users globally. The exposed data also includes contact details and messages, and extortion attempts have been reported. Discord has revoked the vendor's access, launched an investigation, and contacted law enforcement. The incident raises concerns over data privacy and security in age verification processes.
- Australia has introduced a new law banning social media accounts for users under 16, effective December 10. Platforms face fines of up to A$49.5 million if they fail to comply. The law aims to protect children from online risks, amid ongoing debate about enforcement and privacy.
- New York City has filed a 327-page lawsuit against Meta, Google, Snap, and ByteDance, accusing them of fueling a youth mental health crisis and dangerous behaviors such as subway surfing. The city claims the platforms exploit youth psychology for profit, driving up its health and safety costs. The case joins a wave of similar litigation.
- As of October 2025, Instagram enforces PG-13 content settings by default for users under 18, restricting exposure to mature themes unless parents approve changes. The update includes stricter filters on sensitive topics, limits on interactions with inappropriate accounts, and enhanced parental controls. The rollout begins in the US, UK, Canada, and Australia, expanding globally next year amid ongoing concerns about teen safety online.
- On Monday, 20 October 2025, Amazon Web Services (AWS) experienced a significant outage in its US-EAST-1 region, disrupting numerous popular websites and apps including Perplexity AI, Robinhood, Snapchat, Fortnite, and Amazon's own services. AWS engineers worked through the day to resolve the issue and restore normal operations.
- Amazon Web Services experienced a significant outage originating from its Virginia data center region, affecting hundreds of online services including social media, gaming, and financial platforms. The outage lasted over 15 hours, highlighting how heavily online services are concentrated on a few cloud providers and how widely a single failure can ripple.
- Denmark plans to restrict social media for children under 15, with parents able to grant exemptions from age 13. The move aims to address concerns over youth mental health and online safety, following similar measures in Australia. Legislation is expected to pass after months of debate, with enforcement relying on digital ID and age verification systems.
- Meta has announced it will restrict access for Australian users aged 13 to 15 starting December 4, ahead of a new law requiring social media platforms to exclude under-16s. The law, effective December 10, aims to protect minors but raises concerns over privacy and effectiveness.
- Australia will enforce a law from December 10 that bans social media platforms including Facebook, Instagram, TikTok, and Twitch from allowing users under 16. The law aims to protect minors from online harm, with penalties up to A$49.5 million for non-compliance. Twitch plans to deactivate underage accounts from January 9.
- Australia plans to enforce a ban on social media accounts for users under 16 starting December 10, aiming to protect children from online harms. The government is reviewing mechanisms used in other countries, including Malaysia, to implement age restrictions and ensure platform compliance.
- Malaysia's government is reviewing measures to restrict social media access for under-16s, inspired by Australia's upcoming ban. The move aims to protect young people from online harms such as cyberbullying and scams, with plans to implement electronic age verification methods next year.
- On December 10, 2025, Australia enforced a pioneering law banning users under 16 from major social media platforms including TikTok, Instagram, and YouTube. Platforms must block new and existing underage accounts or face fines up to A$49.5 million. The law aims to protect youth mental health amid concerns over cyberbullying and addictive design, though it faces legal challenges and practical enforcement issues.
- The European Parliament has approved a non-binding resolution calling for an EU-wide ban on social media access for children under 16, with stricter limits for those under 13. The move aims to address concerns over mental health and addictive design features, amid ongoing debates and Australia's upcoming social media ban for under-16s.
- Russian authorities have expanded restrictions on messaging apps, citing their use for terrorism and fraud. Snapchat, FaceTime, WhatsApp, and Roblox face bans or limitations amid broader internet controls aimed at curbing Ukrainian drone attacks and expanding surveillance. The disruptions affect daily life and erode privacy.
- Russian authorities have announced bans on Snapchat and Apple’s FaceTime, accusing both of facilitating terrorist activities, recruitment, and crimes. The moves follow a series of restrictions on social media and messaging platforms since Russia’s invasion of Ukraine in 2022, intensifying government control over online communication.
- Recent reports highlight the rising use of AI chatbots for mental health support among UK teenagers, especially those affected by violence. Experts warn of emotional dependency and potential harm, and call for regulation amid broader safety concerns.