The Online Safety Act received Royal Assent in October 2023, but its duties have come into force in stages across 2024 and 2025 as Ofcom, the UK's communications regulator, published codes of practice and implementation deadlines for platforms of different sizes. By early 2026, the Act's major provisions are operational, and their impact on your experience of social media, search engines, and messaging platforms is beginning to be felt.
What the Act Requires of Platforms
The Online Safety Act establishes a duty of care framework: large platforms (defined as those with significant UK user bases, roughly above 7 million monthly users) must take "reasonably practicable" steps to protect users from a defined range of harmful content. The categories fall into two tiers: illegal content (terrorism, child sexual abuse material, fraud, illegal weapons and drug sales) for all in-scope services, and, for services likely to be accessed by children, a broader range of content that is legal for adults but harmful to children, including bullying, self-harm promotion, and eating disorder content.
The Act gives Ofcom the power to issue fines of up to £18 million or 10% of global annual turnover — whichever is higher — for non-compliance. For the largest platforms, this means potential fines measured in billions of pounds. Ofcom also has powers to require platforms to implement specific technical measures, including age verification and content filtering systems.
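To see what "whichever is higher" means at different scales, here is a minimal worked example in Python. The turnover figures are illustrative only, not real company data:

```python
def max_penalty(global_annual_turnover_gbp: float) -> float:
    """Maximum fine under the Act: £18 million or 10% of global
    annual turnover, whichever is higher."""
    return max(18_000_000, 0.10 * global_annual_turnover_gbp)

# Illustrative turnover figures only, not real company data.
for turnover in (50_000_000, 2_000_000_000, 100_000_000_000):
    print(f"Turnover £{turnover:,} -> max fine £{max_penalty(turnover):,.0f}")
```

For a smaller service turning over £50 million, the £18 million floor applies; for a platform turning over £100 billion, the 10% rule produces the billions-of-pounds exposure described above.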
Age Verification: What's Actually Changing
The most visible change for many users has been the progressive rollout of age checks for services that host content harmful to children. Pornography websites have been required to implement "highly effective" age assurance since July 2025. Social media platforms have been required to implement age assurance mechanisms for new accounts since October 2025, with Ofcom's codes of practice setting out what counts as adequate assurance.
In practice, the impact depends largely on the age of your accounts. Adults who have been using established accounts for years will generally notice little change. New account registrations on major platforms now involve some form of age check, typically a credit card check, a digital ID document scan, facial age estimation, or vouching through an existing verified account.
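As a rough illustration of how that gating tends to work, here is a hypothetical sketch. The `Registration` type, the method names, and the exact cut-over logic are all assumptions for the example, not any platform's actual implementation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Registration:
    created: date
    verified_method: str | None  # e.g. "credit_card", "id_scan", or None

# The October 2025 date comes from the rollout described above; how any
# given platform applies it to real accounts will vary.
CUTOVER = date(2025, 10, 1)
ACCEPTED_METHODS = {"credit_card", "id_scan", "facial_estimation", "vouching"}

def needs_age_check(reg: Registration) -> bool:
    # Established accounts predating the cut-over are typically untouched;
    # new registrations must present an accepted assurance method.
    if reg.created < CUTOVER:
        return False
    return reg.verified_method not in ACCEPTED_METHODS

print(needs_age_check(Registration(date(2022, 3, 1), None)))        # False
print(needs_age_check(Registration(date(2025, 11, 5), None)))       # True
print(needs_age_check(Registration(date(2025, 11, 5), "id_scan")))  # False
```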
Content Moderation and What You'll See
The more significant changes for most users may be in what does and does not appear in their feeds. Platforms are required to have systems in place to identify and remove priority illegal content rapidly. Ofcom's codes specify response time targets, typically within 24 to 72 hours for user reports of illegal content, and faster for content flagged by Ofcom or law enforcement directly.
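One way to picture those targets is as per-source deadlines attached to each report. The sketch below is hypothetical; the `regulator_flag` window in particular is an assumed value, since the codes only require such flags to be handled faster:

```python
from datetime import datetime, timedelta

# Illustrative response-time targets, loosely modelled on the 24-72 hour
# window described above. The real figures are set in Ofcom's codes.
TARGETS = {
    "user_report": timedelta(hours=72),
    "user_report_priority": timedelta(hours=24),
    "regulator_flag": timedelta(hours=6),  # assumed value for illustration
}

def review_deadline(flagged_at: datetime, source: str) -> datetime:
    """Latest time by which a flagged item should be reviewed."""
    return flagged_at + TARGETS[source]

flagged = datetime(2026, 1, 12, 9, 30)
print(review_deadline(flagged, "user_report"))     # 2026-01-15 09:30:00
print(review_deadline(flagged, "regulator_flag"))  # 2026-01-12 15:30:00
```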
Platforms are also required to offer users more control over their content experience, including the ability to opt out of recommendation algorithms and to see content in chronological order. Whether individual platforms implement these options prominently or bury them in settings menus varies — and Ofcom has indicated it will be monitoring implementation quality as well as compliance on paper.
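Under the hood, the chronological opt-out amounts to changing the feed's sort key. Here is a toy model, with `engagement_score` standing in for whatever a real recommender outputs; actual ranking systems are far more complex:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    posted_at: datetime
    engagement_score: float  # stand-in for a recommender model's output

def order_feed(posts: list[Post], chronological: bool) -> list[Post]:
    # The opt-out swaps the sort key: recency instead of the
    # recommender's predicted-engagement score.
    if chronological:
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

posts = [
    Post("a", datetime(2026, 1, 10, 8, 0), engagement_score=0.9),
    Post("b", datetime(2026, 1, 10, 9, 0), engagement_score=0.2),
]
print([p.author for p in order_feed(posts, chronological=True)])   # ['b', 'a']
print([p.author for p in order_feed(posts, chronological=False)])  # ['a', 'b']
```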
Encrypted Messaging: The Controversial Provision
The Act's most controversial provision relates to encrypted messaging services like WhatsApp, Signal, and iMessage. The Act allows Ofcom to require these services to use "accredited technology" to identify child sexual abuse material in private messages. The mechanism most commonly proposed is client-side scanning, which analyses content on the device before it is encrypted, but critics including Signal and WhatsApp argue that any such mechanism amounts to a backdoor that fundamentally undermines the security of end-to-end encryption for all users.
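To make the debate concrete, here is a deliberately simplified sketch of the client-side scanning idea: hash the outgoing content and compare it against a blocklist before encryption. Real proposals use perceptual hashes of images rather than SHA-256, and no deployed messenger works this way; the `BLOCKLIST`, `encrypt`, and `send` stand-ins are purely illustrative:

```python
import hashlib

# Toy blocklist: in a real proposal this would be perceptual hashes of
# known prohibited images, distributed to every client device.
BLOCKLIST = {hashlib.sha256(b"known-prohibited-bytes").hexdigest()}

def scan_then_send(plaintext: bytes, encrypt, send) -> bool:
    """Scan on-device, then encrypt and send only if the content is clean."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        return False  # flagged on-device; the message never leaves the client
    send(encrypt(plaintext))
    return True

# Stand-in encrypt/send functions for the sketch.
sent = []
ok = scan_then_send(b"hello", encrypt=lambda b: b[::-1], send=sent.append)
print(ok, sent)  # True [b'olleh']
```

The critics' objection is visible even in this toy: whoever controls the blocklist controls what gets flagged on your device, and the encryption never protects a message the scanner intercepts first.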
As of early 2026, Ofcom has not yet formally required any messaging platform to implement such technology. The provision remains operational but unexercised, and the legal and technical debates around it continue. Signal has reiterated its position that it would withdraw from the UK market rather than implement client-side scanning.
What You Can Do
The Act also creates new user rights. If a platform removes your content or suspends your account, you now have a right to appeal that decision and to receive a reasoned explanation. Platforms are required to implement complaints mechanisms that respond within specified timeframes. If you are dissatisfied with a platform's response, you can escalate to Ofcom, which maintains a complaints reporting mechanism. For full details of user rights under the Act, Ofcom maintains a consumer guidance page at ofcom.org.uk.
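One way to think about the escalation path is as a deadline attached to your appeal. A hypothetical sketch follows; the 14-day window is an assumed figure, since the actual timeframes come from each platform's published complaints policy:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Assumed response window for illustration only; real timeframes are set
# by each platform's complaints policy under Ofcom's codes.
PLATFORM_RESPONSE_WINDOW = timedelta(days=14)

@dataclass
class Appeal:
    filed: date
    platform_reply: str | None = None  # the "reasoned explanation", if any

def escalation_open(appeal: Appeal, today: date, dissatisfied: bool) -> bool:
    # You can go to Ofcom if the platform missed its response window, or
    # if it replied and the reasoned explanation did not satisfy you.
    overdue = (
        appeal.platform_reply is None
        and today > appeal.filed + PLATFORM_RESPONSE_WINDOW
    )
    return overdue or (appeal.platform_reply is not None and dissatisfied)

appeal = Appeal(filed=date(2026, 1, 2))
print(escalation_open(appeal, date(2026, 1, 20), dissatisfied=False))  # True
```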