
You Must Be This Tall to Click: The Online Safety Act and Age-Appropriate Access

As the 25 July 2025 deadline looms, the UK’s Online Safety Act (“OSA”) enters a critical enforcement phase. One of the OSA’s most contentious and consequential elements is the requirement for “highly effective” age assurance by adult content providers. With over 40 new rules in the ‘protection of children’ phase of the OSA (which we discussed in our previous article) and a growing list of enforcement programmes, most recently one covering the children’s risk assessment duties that must be met from 24 July 2025, Ofcom is making it clear: online platforms must act or face serious consequences.

Failure to comply can lead to fines of up to £18 million or 10% of global turnover, and/or court orders for business disruption measures, such as requiring internet service providers to withdraw services from, or block access to, a provider in the UK.

A Pivotal Moment for Online Child Safety

Research from the Children’s Commissioner for England shows that the average age at which UK children are first exposed to explicit material online is 13 – with 1 in 10 encountering it as early as age 9. Recognising this alarming trend, Ofcom’s new age assurance guidance aims to stop underage access to adult content in its tracks.

From 25 July 2025, online platforms with adult content must implement “highly effective” age checks. This applies to all services accessible in the UK or based in the UK. It marks a pivotal moment for children’s safety online and a seismic change for online platforms.

What is “Highly Effective Age Assurance”?

Ofcom takes a principles-based approach to age assurance and has provided guidance on what is compliant. In short, these methods must be:

  • Technically accurate: there is currently no prescribed quantitative level of accuracy, although Ofcom may introduce one in the future. Where a system has a margin of error, the challenge age should be raised to prevent access by children. 
  • Robust:  effective in real-world conditions and resistant to common by-pass methods.
  • Reliable:  delivering consistent, accurate results.
  • Fair: based on diverse datasets to avoid bias and ensure equal treatment.

Ofcom sets out a non-exhaustive list of methods that can constitute “highly effective” age assurance; this includes email-based cross-referencing, mobile network operator authentication, and facial age estimation (where no data is stored). Ofcom will not accept commonly used methods such as self-declaration of age, or payment systems that do not confirm users are 18+.

Critically, adult content must not be visible until verification is complete, and online platforms must not host or permit content that promotes bypassing age-checks. These measures must be implemented by all online platforms with adult content by 25 July 2025, regardless of where they are based. 

Global Outlook

The UK has led the way in regulating access to adult content. Its approach is broadly aligned with Europe’s, but it is ahead on implementation, scale and enforcement. The fragmented global landscape underscores the need for international collaboration and for the UK to serve as a model in age assurance.

The global approach has been varied, for example:

  • France: France has opted for strict and prescriptive age verification methods (government-issued IDs or credit cards) for adult content websites as well as requiring controls to prevent ‘double usage’ of the same verified identity. Double anonymity requirements also effectively require a separate third party age verification provider to the online platform, so the systems cannot be operated in-house. Non-compliance results in fines of up to €150,000 (or 2% of annual global turnover) and/or access restrictions. 
  • EU: The draft Digital Services Act guidelines follow the UK’s approach. Platforms with adult content must perform risk assessments, implement age assurance measures and use age verification. 
  • US: Age verification (often via facial scan or official ID) is required for providers of adult content in 19 US states, although implementation varies state by state and is not uniform. 

This shows the need for international collaboration and standardisation in age verification, as the variability has led some adult content providers to restrict access altogether. A 2024 NYU study found traffic to compliant sites dropped significantly, but searches for VPNs and non-compliant platforms surged. These unintended consequences reveal the fragile balance between safety, privacy, and enforceability in internet regulation.

Balancing Protection with Data Privacy

Ofcom’s tech-neutral and principles-based approach to online safety will encourage innovation in the industry. However, privacy requirements create complex compliance challenges, especially for ID- or biometric-based verification methods. 

Past initiatives to introduce age verification have failed: the Digital Economy Act 2017’s measures were delayed and ultimately abandoned. This underscores how privacy concerns and technical impracticalities can derail regulation. Industry leaders and civil liberties groups have also criticised current proposals as invasive and easy to bypass.

Online platforms and service providers need secure and scalable infrastructure as well as standardised systems; otherwise there is a heightened risk of data breaches, ransom attacks and identity theft. One less intrusive, scalable alternative is device-level filtering, where trusted operating systems like iOS or Android manage age verification; another is tokenisation, where users are age-verified once and receive an associated ‘token’ to access other sites. This shifts the privacy burden away from individual sites to the device or a trusted third party, and could improve security, privacy and user trust. 

The industry needs clear standards, otherwise providers risk being forced into fragmented, insecure and overly burdensome systems. Effective frameworks must meet overarching legal obligations (such as the UK GDPR, the Children’s Code and human rights law) to maintain a balanced approach. 

Providers should prioritise low- or zero-data methods to reduce liability exposure. Robust security and legal safeguards ought to be in place and providers should track developments in OS-level filtering and cross-industry standards to support compliance and security. Much remains to be developed as the industry moves further towards age verification.

Next Steps for Online Platforms

For online platforms with adult content, the roadmap is clear:

  1. Conduct a Children’s Access Assessment
  2. Implement a Highly Effective Age Assurance System
  3. Update Privacy and Terms of Service Policies to reflect new age assurance requirements 
  4. Monitor Compliance and Effectiveness of Age Assurance Measures; Adapt (to technological changes); and Report (promptly if necessary)

If a platform adopts highly effective age assurance across its entire service, it does not need to conduct a broader Children’s Risk Assessment. However, partial implementation of highly effective age assurance will still require risk assessments for any sections still accessible to children.

The Road Ahead

Despite the privacy concerns and technological challenges, one thing is clear: compliance with the OSA is not optional. Platforms must be ready by 25 July 2025 or risk reputational and financial damage. Ofcom has shown that it is willing to investigate and act against non-compliant services – and this new age of regulation is unlikely to be reversed.

For now, the message is simple: if your service is accessible in the UK, you must ensure users are old enough to click.

--

The Social Media Group at Katten will be able to advise on your Online Safety Act requirements, assist you with conducting the risk assessments, and advise you on implementing the recommended measures to ensure you are compliant with the Online Safety Act. If you would like more information, please contact Terry Green (terry.green@katten.co.uk) and Larry Wong (larry.wong@katten.co.uk).

This article is published as part of a series about the Online Safety Act under the Global Relay Intelligence & Practice (GRIP) digital information service and is available on the Katten website.