UK regulators are tightening online safety rules for tech platforms. This page breaks down what Ofcom demanded from X, how blocking access to accounts tied to banned groups will work, the mandated 24- to 48-hour review windows, quarterly data publishing, and how these moves stack up against past online safety efforts. Below you'll find practical answers to the questions people most often search for when these headlines pop up.
Ofcom secured commitments from X that include blocking UK access to accounts tied to banned groups, mandating rapid reviews of flagged content within a 24- to 48-hour window, and providing quarterly data for a 12-month period. This is part of a broader push to reduce terrorist and hate material on major platforms.
The agreement requires X to block or disable access to accounts associated with groups banned in the UK. In practice, this means faster identification and removal or throttling of such accounts for UK users, with ongoing monitoring and enforcement to prevent workarounds that could allow the harm to continue.
Content flagged as illegal must be reviewed within 24 to 48 hours. Over a 12-month period, X will publish quarterly data detailing moderation actions, compliance rates, and takedown volumes to demonstrate progress and accountability.
This effort is more targeted and time-bound than some prior steps, emphasizing concrete review timelines and public quarterly reporting for a defined period. It reflects a broader trend of tying platform commitments to measurable, near-term performance metrics in the fight against terrorist and hate content.
Reports from AP News, The Guardian, The Times of Israel, Reuters, and Politico cover the latest conditions on X, including the 24-hour review window (with 85% of flagged content to be reviewed within 48 hours) and the broader Grok-related investigations. These sources also contextualize responses from anti-racism groups and moderation experts.
While this page summarizes the current commitments, regulators typically retain enforcement mechanisms and follow-up actions for non-compliance. These can include escalated review, additional deadlines, penalties, or further regulatory pressure to ensure adherence.
Media regulator announces commitments by Elon Musk’s platform to crack down on terrorist and hate content