
Global online safety: Ireland’s media regulator adopts online safety code for video sharing platforms

With the UK’s Online Safety Act due to begin its gradual entry into force in December, online safety is taking centre stage in other jurisdictions too. Most recently, Ireland’s media regulator, Coimisiún na Meán (the Media Regulator), has published an online safety code for ‘video-sharing platforms’ with their EU headquarters in Ireland. 

The code forms part of the Media Regulator’s ‘Online Safety Framework’, which comprises three pieces of legislation: (1) the Online Safety and Media Regulation Act 2022, (2) the EU Digital Services Act, and (3) the EU Terrorist Content Online Regulation. 

The code is a binding set of rules aimed at protecting users, and in particular children, from harmful content such as incitement to hatred or violence, child sexual abuse material (CSAM) and terrorist content, and content that may impair the physical, mental or moral development of children. It will require video-sharing platforms to prohibit certain categories of video and associated content, and to provide means for users to report content that breaches those rules. The code also requires these platforms to have appropriate age verification measures and parental controls in place. 

Video-sharing platforms with EU headquarters in Ireland (many of which are household names) will be expected to comply with the code from next month, with a transition period for some of its more prescriptive provisions, to give providers time to implement the necessary measures. 

The Media Regulator was established in March 2023 and is also Ireland’s designated ‘Digital Services Coordinator’ under the EU Digital Services Act. The swift adoption and prioritisation of this online safety code reflect the Media Regulator’s commitment to holding online platforms accountable for keeping their users safe. This aligns with the recent focus on online safety and the protection of children in the UK, as the Online Safety Act enters into force. 

The Framework gives us the tools to address the root causes of harm online, including the availability of illegal content, the harmful impacts of recommender systems, and inadequate protections for children on social media services. Social media companies can and should do more to make their platforms safer, moving to a safety-by-design approach.


Tags

online safety act, data protection and privacy, online safety, commentary