The protection of children online has been a major focus in the media recently, from bereaved parents travelling to the United States to call for stronger online safeguards for children, to the Duke and Duchess of Sussex calling for more to be done to keep children safe online.
Last Thursday (24 April 2025), Ofcom published a major policy statement comprising six volumes and eight pieces of finalised guidance as part of its Phase 2 implementation of the Online Safety Act (OSA), focused on protecting children from harm online. Given the media coverage and Ofcom’s policy statement, the protection of children online is under even greater scrutiny than before.
As part of its Phase 1 implementation of the OSA, Ofcom has shown it is serious about enforcing platforms’ OSA duties. More information can be found in our previous article here.
In this article, part of a wider series about the Online Safety Act, we explore the additional measures that will be implemented to keep children safe online.
Access by children - the Children’s Access Assessment (deadline 16 April 2025)
From 16 April 2025, all platforms regulated under the OSA (i.e. platforms anywhere in the world with links to the UK) are expected to have completed their Children’s Access Assessment. Platforms need to assess (at a minimum every 12 months) whether their platform is likely to be accessed by children, by considering:
- Question 1: Is it possible for children to normally access the service? AND
- Question 2: Either:
  - Are there a significant number of children who are users of the service? OR
  - Is the service of a kind likely to attract a significant number of children?
Satisfying both Questions 1 and 2 results in a conclusion that children are likely to access the assessor’s platform. Whilst the assessment is relatively easy to complete, it demonstrates the OSA’s and Ofcom’s expectation that all providers consider the impact and risk of children accessing their platform, even where they provide adult-only content (such as pornography). Ofcom has made this clear by citing evidence that children may be attracted to dating and pornography services. Unless a platform has implemented ‘highly effective age assurance’, Ofcom expects the access assessment to conclude that children are likely to access the platform.
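For illustration only, the two-stage test above can be expressed as simple decision logic. The sketch below is our own shorthand (the class, field and function names are not Ofcom terminology), and it reflects Ofcom’s position, noted above, that children should be treated as able to access a service unless ‘highly effective age assurance’ is in place.

```python
# Illustrative sketch only of the two-stage access test described above.
# Field and function names are our own shorthand, not Ofcom terminology.
from dataclasses import dataclass


@dataclass
class AccessAssessmentInputs:
    # Question 1: per the guidance discussed above, children can normally
    # access the service unless highly effective age assurance is in place.
    has_highly_effective_age_assurance: bool
    # Question 2: either limb being true is sufficient.
    significant_number_of_child_users: bool
    likely_to_attract_children: bool


def likely_to_be_accessed_by_children(inputs: AccessAssessmentInputs) -> bool:
    """True where the service should be treated as likely to be accessed by
    children (Question 1 AND either limb of Question 2)."""
    question_1 = not inputs.has_highly_effective_age_assurance
    question_2 = (
        inputs.significant_number_of_child_users
        or inputs.likely_to_attract_children
    )
    return question_1 and question_2


# Example: an adult-only service without highly effective age assurance that,
# on Ofcom's evidence, is of a kind likely to attract children.
print(likely_to_be_accessed_by_children(AccessAssessmentInputs(
    has_highly_effective_age_assurance=False,
    significant_number_of_child_users=False,
    likely_to_attract_children=True,
)))  # True -> the platform must go on to complete a Children's Risk Assessment
```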
Be sure to look out for our upcoming article in June on ‘highly effective age assurance’, which adult content providers are required to have in place by July 2025.
Protection of children - the Children’s Risk Assessment (deadline 24 July 2025)
The Children’s Risk Assessment must be conducted every 12 months by all platforms likely to be accessed by children. It categorises content that is harmful to children into three categories:
- Primary Priority Content (PPC): i) pornography; ii) content that encourages, promotes, or provides instructions for suicide; iii) equivalent content relating to deliberate self-injury; and iv) equivalent content relating to behaviours associated with an eating disorder (four types of PPC in total).
- Priority Content (PC): eight types of content outlined in Ofcom’s guidance, similar to the 17 priority illegal harms, covering areas such as abuse, hate, bullying and violence.
- Non-Designated Content (NDC): content which presents a material risk of significant harm to children in the UK, such as body-shaming or body-stigmatising content, or content promoting depression, hopelessness and despair (Ofcom has identified at least two types of NDC).
Platforms must then conduct a risk assessment of i) the likelihood of a child encountering the harm; and ii) the impact on children of the kind of content, for each of the four PPCs, eight PCs and at least the two NDCs identified by Ofcom, as well as any additional NDCs identified by the platform.
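Purely as an illustrative sketch, and not a format prescribed by Ofcom, the assessment can be thought of as recording a likelihood rating and an impact rating, together with supporting evidence, for each kind of content in scope (all names, labels and ratings below are our own):

```python
# Illustrative sketch only: one way a platform might record likelihood and
# impact per kind of content in scope of the Children's Risk Assessment.
# Category labels, ratings and field names are our own, not Ofcom's.
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    PPC = "primary priority content"   # 4 types, e.g. pornography, suicide
    PC = "priority content"            # 8 types, e.g. abuse, hate, bullying
    NDC = "non-designated content"     # Ofcom examples plus platform-identified


class Rating(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class RiskEntry:
    kind_of_content: str      # e.g. "pornography", "bullying content"
    category: Category
    likelihood: Rating        # likelihood of a child encountering the harm
    impact: Rating            # impact on children of this kind of content
    evidence: list            # core/enhanced inputs relied on


children_risk_register = [
    RiskEntry("pornography", Category.PPC, Rating.HIGH, Rating.HIGH,
              ["user age data", "incident reviews"]),
    RiskEntry("bullying content", Category.PC, Rating.MEDIUM, Rating.HIGH,
              ["complaints data"]),
    # Any NDC identified by the platform must also be assessed (and, as noted
    # below, reported to Ofcom).
    RiskEntry("body-stigmatising content", Category.NDC, Rating.MEDIUM,
              Rating.MEDIUM, ["product testing data"]),
]
```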
Non-Designated Content
Identifying and assessing NDC may be challenging because it is non-specific, and Ofcom expects platforms to review their services in depth in order to identify it. This means platforms cannot rely on Ofcom to outline the risks they need to consider, and must identify and assess additional risks unique to their platforms. Platforms are likely to require expert help in assessing these risks.
Platforms are also under an obligation to report identified NDC to Ofcom at nondesignatedcontent@ofcom.org.uk. Whilst there is no specified timeframe for reporting, newly identified NDC will most likely have arisen during the most recent Children’s Risk Assessment and should therefore be reported upon its conclusion.
The Children’s Risk Assessment and the Illegal Harms Risk Assessment
The Children’s Risk Assessment employs the same methodology as the Illegal Harms Risk Assessment (which all platforms should have had in place from 16 March 2025). Platforms are expected to base their risk assessment on evidence, including core inputs such as user data and incident reviews, as well as enhanced inputs such as product testing data and consultations.
The same record-keeping requirements also apply, so the information captured in the Children’s Risk Assessment should largely mirror that captured in the Illegal Harms Risk Assessment.
There is an inevitable overlap between the risks considered in the Illegal Harms Risk Assessment and those considered in the Children’s Risk Assessment. Ofcom still expects a separate risk assessment addressing those risks and harms specifically in the context of protecting children online; however, the two sets of risk assessments should work alongside each other to outline the risks specific to illegal harms and/or the protection of children.
The Protection of Children Code
As with the recommended measures under the illegal content code of practice, Ofcom has published around 70 recommended measures for user-to-user services and search services to implement following completion of the Children’s Risk Assessment. The recommended measures under the Protection of Children Code are broadly similar to those under the illegal content code, covering areas such as governance and accountability, content moderation, and reporting and complaints.
However, there are additional measures, such as age assurance processes and default settings for children. It is expected that the same ‘comply or explain’ approach and ‘forbearance period’ of up to six months will apply, meaning platforms should have these measures implemented by February 2026. Platforms in default could face enforcement penalties of fines of up to £18m or 10% of global turnover, whichever is higher.
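As a simple worked illustration of the ‘whichever is higher’ cap (the turnover figure below is hypothetical):

```python
# Illustrative only: the maximum fine is the greater of £18m and 10% of
# global turnover. The turnover figure used here is hypothetical.
def maximum_fine(global_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_turnover_gbp)


print(f"£{maximum_fine(500_000_000):,.0f}")  # £50,000,000 on £500m turnover
```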
Next steps
Our advice is that platforms should use their existing Illegal Harms Risk Assessment to aid the construction of their Children’s Risk Assessment. Adult content providers who are yet to implement ‘highly effective age assurance’ will have to update their Children’s Access Assessment once it is in place (required by July 2025); as a result, it is unlikely they will then need to conduct a Children’s Risk Assessment.
Ofcom is consulting on extending the requirement to provide user controls for blocking and muting, and for disabling comments, to services with between 700,000 and 7 million monthly UK users, rather than only services with 7 million or more monthly UK users. The consultation closes on 22 July 2025 and is available here. There is also an existing consultation on the draft guidance on how to protect women and girls online, which closes on 23 May 2025 and is available here.
Phase 2 implementation of the OSA is well and truly underway: two additional risk assessments and over 70 recommended measures will add significant obligations for platform providers on top of their duties on illegal harms. The overlap of risks should make things easier for platforms, but care should be taken to ensure the overlapping risks and assessments complement each other and are consistent.
--
The Social Media Group at Katten can advise on your Online Safety Act requirements, assist you with conducting the risk assessments, and help you implement the recommended measures to ensure you are compliant with the Online Safety Act. If you would like more information, please contact Terry Green (terry.green@katten.co.uk) and Larry Wong (larry.wong@katten.co.uk).
This article is published as part of a series about the Online Safety Act under the Global Relay Intelligence & Practice (GRIP) digital information service and is available on the Katten website.