New online safety rules will force tech firms to change, Ofcom insists
The regulator has set out how websites and apps must protect children from harmful content.

Ofcom’s new online safety rules to protect children will be “transformational” and will force tech firms to “do things in a different way”, the regulator has said.
Ofcom published its final children’s codes of practice on Thursday, setting out rules for how websites and apps must protect children from harmful content, including through age assurance tools and reconfigured algorithms that prevent young people from accessing illegal and harmful material.
However, some online safety campaigners have argued the rules do not go far enough, giving tech firms too much control over how they approach the problem and over defining what content is harmful, rather than forcing them to block it.
Andy Burrows, chief executive of the Molly Rose Foundation – set up in honour of Molly Russell, who took her own life aged 14 after viewing harmful content on social media – said Ofcom’s proposals were a “whole series of missed opportunities”, accusing the regulator of “giving far too much weight to industry – rather than focusing on how it builds measures or how it sets objectives that can actually tackle the problem”.
But Almudena Lara, Ofcom’s child protection policy director, said she disagrees with that assessment, insisting the new rules will completely change the landscape of social media.
She told the PA news agency: “We need to start from the position of absolute respect for all these campaigners and the lived, horrific experiences that many of them have gone through, and I would, if I were in their shoes, always want to go further and faster. I take that as a given, but I disagree that these are not an ambitious set of rules.
“These are completely transformational. When implemented, all companies need to do things in a different way to where they are now – no company is at present meeting the requirements that we are putting out there when it comes to protecting children.
“Of course, we all want to go further and faster, and we will continue to work on this, and we have already announced that we are going to put out further rules.
“But that doesn’t detract from the fact that this is a very important moment, and this will be transformational.”
Under the codes, any site that hosts pornography, or content encouraging self-harm, suicide or eating disorders, must have robust age verification tools in place to stop children accessing that content.
Those tools could include facial age estimation technology, photo ID matching, or credit card checks to verify age more reliably.
In addition, platforms will be required to configure their algorithms to filter harmful material out of children’s feeds and recommendations, ensuring young people are not sent down a rabbit hole of damaging content, and to give children more control over their online experience through tools to block and filter content and connection requests.
In total, the codes set out 40 practical measures firms must meet by July to fulfil their duties under the Online Safety Act.
As well as imposing fines of up to £18 million or 10% of qualifying global revenue – a sum that could reach billions of pounds for the largest firms – Ofcom will have the power, in the most extreme cases, to seek a court order banning access to a site in the UK.
The NSPCC said the rules mark a “major step forward” for online safety, but added it wants to see Ofcom and the Government go further.
Rani Govender, policy manager for child safety online at the children’s charity, said: “This is a pivotal moment for children’s safety online. After seven years of campaigning, today marks a major step forward towards holding tech companies accountable for protecting children from harm on their platforms.
“However, unless Ofcom goes further to deliver the strong protections children need and deserve, they will continue to face preventable harm online. Private messaging platforms remain especially concerning as unmoderated harmful content can spread like wildfire.
“While Ofcom have looked to add some protections, end-to-end encrypted services will continue to pose an unacceptable, major risk to children under the current plans.
“We look forward to reviewing the codes further, but it’s crucial to remember these measures are an important stepping stone rather than the end solution.
“Both Government and Ofcom must act with urgency to build on these codes to ensure children are successfully protected from harm online.”
Campaigners have also previously raised concerns that tech giants, predominantly based in the US, could pressure President Donald Trump to demand carve-outs for big platforms as part of any trade deal with the UK.
Speaking to Sky News, Mr Burrows said: “I don’t think any parents watching this morning would expect that our children’s online safety be at the whims of geopolitics.
“It shouldn’t be determined by Elon Musk and (US vice-president) JD Vance in the White House, rather than the UK’s independent regulators and politicians.
“The reality is, children’s lives and children’s wellbeing just should not be jeopardised for the sake of trade deals or a particularly distorted view of economic growth.”
Peter Kyle, the Secretary of State for Science, Innovation and Technology, has said US tech firms “must adhere to British laws” if they are to operate in the UK.
Speaking to Nicky Campbell on BBC Radio 5 Live, he said Silicon Valley bosses such as Mr Musk and Mark Zuckerberg must “adapt to the different territories they have access to”.
He added: “I’ve had the pleasure of visiting these companies, both in Silicon Valley and at their offices here in the UK.
“I do explain to them how brilliant Britain is – but actually being active in our society is a privilege, not a right.
“If you have that access, you must obey and adhere to British laws, and you must pay heed to keeping people safe.”