On 16 December 2024, Ofcom published its first codes of practice and guidance under the Online Safety Act (OSA), relating to illegal harms - this means that these requirements are now in force. From today, in-scope service providers have three months, until 16 March 2025, to complete their illegal harms risk assessments, after which they will need to have the required safety measures in place. Tech firms now have a clear roadmap for meeting the requirements on illegal harms, with Ofcom declaring that it is “time for tech firms to act”.
Ofcom first published its draft proposals in October 2023, setting out the steps that in-scope online service providers (which include social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites, among others) should take to address illegal harms. Ofcom then began a consultation process with relevant stakeholders, which generated over 200 responses from the legal sector, industry and charities. With this statement, Ofcom is confirming that it has taken stock of those submissions and that the safety duties it has outlined in relation to illegal harms now apply.
Subject to the codes completing the Parliamentary process, online platforms will need to implement the safety measures set out in them from 17 March 2025. Ofcom has recommended that those looking to understand their obligations should consider reading: (1) a summary of the decisions Ofcom has taken to date, and the user-to-user and search services to which they apply; and (2) a summary of each chapter on illegal harms, setting out what each chapter covers, the stakeholder feedback received, and the decisions Ofcom has taken so far.
While the illegal harms codes of practice and guidance are lengthy, Ofcom has highlighted certain requirements that it sees as key enforcement flags. Firstly, organisations will have to demonstrate that they take online safety sufficiently seriously by naming a senior person accountable to their highest governance body for compliance with their illegal content, reporting and complaints duties. They will also have to ensure that their content moderation teams have sufficient resources and training to remove illegal material (e.g., illegal suicide content) quickly once they become aware of it, and to meet “robust” safety performance targets in this regard. In addition, providers will need to establish reporting and complaints functions so that users can easily report illegal content. At a technical level, firms will need to adjust their algorithms to make illegal content harder to disseminate.
In terms of next steps, businesses will need to comply with the relevant requirements by March 2025 or face Ofcom’s enforcement powers, including fines of up to £18m or 10% of qualifying worldwide revenue (whichever is greater) – please see our previous post for more information.
In addition, Ofcom is continuing to work on other consultations on further codes which will be published in 2025, including on topics such as:
- protecting users against child sexual abuse material (CSAM), revenge porn and terrorist content;
- children’s access assessments;
- additional protections for children from harmful content promoting, e.g., suicide, self-harm, eating disorders and cyberbullying;
- final age assurance guidance for publishers of pornographic material;
- crisis response protocols for emergency events (such as last summer’s riots - see our previous post on this here); and
- technology notices which enable Ofcom to require providers to use or develop specific technologies that tackle CSAM or terrorism content - please see the consultation which was also launched today. The deadline for responses is 10 March 2025.