In our previous articles on Ofcom’s enforcement under the Online Safety Act (OSA), we examined age-appropriate access and the OSA’s international reach. This article shifts its focus to another critical element under the OSA: illegal content that encourages or assists suicide.
Warning: This article may contain distressing content relating to suicide and self-harm.
A duty of care in action
Four percent of UK internet users encountered content promoting suicide within the past month, and children are more likely than adults to encounter such content. News publications have been awash with articles about online suicide and with calls from bereaved parents to 'do more' to protect children. The issue is urgent, particularly in light of the wrongful death suit filed against OpenAI.
The OSA prohibits platforms from hosting content that 'encourages or assists suicide', categorising such content as illegal. Ofcom has been given a wide range of enforcement tools to combat it, including provisional notices of contravention, take-down orders, blocking orders, and fines of up to £18 million or 10% of global turnover, whichever is greater.
By 16 March 2025, in-scope services were required to complete illegal content risk assessments which, amongst other things, must identify the risks of content that may encourage or assist suicide. Additionally, platforms must:
- swiftly remove illegal content;
- implement effective moderation and reporting systems;
- test and adapt algorithms to prevent the promotion of harmful material; and
- provide clear user protections in accessible terms.
Search services must also deprioritise or block harmful results and display crisis information.
Suicide forum under scrutiny
On 9 April 2025, Ofcom launched its first formal investigation under the OSA. The subject was an unnamed suicide discussion forum accused of hosting content that encourages suicide and directs users to means and methods. The investigation is assessing:
- Failure to complete a sufficient illegal content risk assessment;
- Failure to put in place proportionate safety or moderation measures; and
- Failure to adequately respond to numerous statutory information requests.
Ofcom is now gathering evidence to determine whether a provisional notice of contravention should be issued, which may lead to regulatory action such as fines, take-down orders, or blocking orders.
The stakes – beyond a single platform
This investigation signals Ofcom's preparedness for robust enforcement, even against anonymous, overseas platforms, demonstrating the OSA's extraterritorial reach.
The increase in investigations and their scope reinforces Ofcom's determination to dismantle digital spaces where harm proliferates unchecked. Recent examples include the newly launched investigation into five companies – Cyberitic, LLC, Web Prime Inc, Youngtek Solutions Ltd, ZD Media s.r.o, and the provider of xgroovy – under the new age check requirements, and the expansion of the investigations into 8579 LLC and Itai Tech for failing to respond to statutory information notices. In total, over 50 sites are currently under investigation by Ofcom.
What platforms must do
In light of the ongoing investigation, Ofcom has updated its Guidance on Protecting People from Online Suicide and Self-Harm, reiterating its codes of practice for illegal content and the protection of children. In particular, platforms should have:
- Content moderation that is effective and well-resourced;
- Recommender systems that exclude illegal content from children’s feeds;
- Easy-to-find, accessible, and user-friendly complaints processes;
- Clear terms and conditions laying out how users are protected from illegal content; and
- Signposts to appropriate support for children who report, post, share, or search for suicide or self-harm material.
Ofcom is also considering additional safety measures, including the proactive identification of harmful material and real-time reporting of livestreams.
Looking ahead
Ofcom's response is being closely watched by the industry. Enforcement against an anonymous, US-based platform could become a landmark case for the cross-border enforcement of digital platform regulation, as we discussed in relation to 4chan's US lawsuit against Ofcom (A (Byrne &) Storm is Brewing – Do Not Ignore the Online Safety Act's International Reach).
For platforms, the message is clear: if content encouraging or assisting suicide is hosted on your platform – or if there is any risk that it could be – now is the time to act, not wait.
--
Katten can advise on your Online Safety Act requirements, assist you in conducting risk assessments, and support the implementation of measures to ensure you are OSA-compliant. If you would like more information, please contact Terry Green (terry.green@katten.co.uk) and Larry Wong (larry.wong@katten.co.uk).