
Minor Discrepancies: Comparing the European Commission’s DSA minor safety guidelines with the OSA

As explained in our OSA vs DSA comparison article, safeguarding minors (users who are under 18) is a shared goal of both the UK Online Safety Act (OSA) and the EU Digital Services Act (DSA). The OSA achieves this by requiring that in-scope service providers comply with numerous child safety obligations set out in sections 11-13 and 28-30. To support in-scope service providers with their compliance, Ofcom has published detailed guidance and codes of practice which, cumulatively, run to hundreds of pages and can often be difficult to break down. The DSA is not nearly as prescriptive as the OSA and contains one broad obligation (set out in Article 28), which requires that online platforms accessible to minors put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service. To date, there has been little clarity on exactly which measures should be put in place; however, the European Commission has now published draft guidelines (the Guidelines) which clarify its expectations for Article 28 compliance.

In this article we take a quick look at what the Commission is proposing and how this compares with the measures set out in Ofcom’s minor safety guidance and codes of practice.

Risk assessments & general governance requirements

The Guidelines recommend that service providers carry out an assessment to determine: (1) how likely it is that minors will access their service, and (2) the risks that the service may give rise to, in particular based on the specific service features. 

This is similar to the requirements under the OSA to carry out a Children’s Access Assessment and a Children’s Risk Assessment; however, Ofcom is more prescriptive as to what each assessment should entail and has published specific guidance on each (see here and here). By way of example, the Children’s Risk Assessment guidance requires that a four-step methodology be adopted and that the assessments are based on ‘evidence inputs’ obtained from sources including user reports, complaints, industry data, and commissioned research.

Broadly the same service features should be taken into account when carrying out the relevant OSA and DSA assessments, e.g. the service’s design, its user interface, whether it is possible to message unapproved contacts, and how attractive the service is to minors.
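By way of illustration only, the overlapping inputs to the two assessments could be organised along the lines of the minimal sketch below. The class, field names and example values are our own and are not prescribed by Ofcom or the Commission.

```python
from dataclasses import dataclass, field

# Hypothetical structure for recording the inputs that feed both a Children's
# Risk Assessment (OSA) and an Article 28 assessment (DSA). The field names
# and values are illustrative only; neither regime prescribes a schema.
@dataclass
class MinorSafetyAssessmentInput:
    service_name: str
    likely_accessed_by_minors: bool                            # access assessment outcome
    features: dict[str, bool] = field(default_factory=dict)    # e.g. {"messaging_unapproved_contacts": True}
    evidence_sources: list[str] = field(default_factory=list)  # e.g. ["user_reports", "complaints"]

def features_needing_mitigation(assessment: MinorSafetyAssessmentInput) -> list[str]:
    """Return the service features flagged as present, i.e. candidates for risk mitigation."""
    if not assessment.likely_accessed_by_minors:
        return []
    return [name for name, present in assessment.features.items() if present]

example = MinorSafetyAssessmentInput(
    service_name="ExampleApp",
    likely_accessed_by_minors=True,
    features={"messaging_unapproved_contacts": True, "public_profiles_by_default": False},
    evidence_sources=["user_reports", "commissioned_research"],
)
print(features_needing_mitigation(example))  # ['messaging_unapproved_contacts']
```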

Age assurance

The Guidelines explain that an assessment should be carried out to determine whether age assurance is required and proportionate and, if so, which age assurance method (verification, estimation, or self-declaration) is the most effective. This assessment should be based on five principles, namely: (1) accuracy, (2) robustness, (3) reliability, (4) non-discrimination, and (5) non-intrusiveness. These principles broadly overlap with those set out in Ofcom’s guidance on highly effective age assurance (which we reviewed here and here) and the EDPB statement on age assurance published earlier this year.

The Guidelines explain that self-declaration (i.e. an individual voluntarily giving their age) won’t ensure a high standard of safety or privacy for minors. Age estimation (using technology to determine someone’s approximate age or age range) is appropriate where the service presents a medium risk and where website terms and conditions require the user to be over 18. Finally, age verification based on the use of hard identifiers (e.g. credit cards or government IDs) or verified sources of identity is needed for high-risk services, namely the purchase of alcohol and access to pornographic or gambling websites. The Guidelines helpfully clarify that the EU Digital Wallet (once available towards the end of 2026) will constitute an effective age verification measure. Before then, online services can use the EU age verification solution or an alternative but equally effective measure (though no examples are provided).
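To make that tiering concrete, the selection logic might look something like the sketch below. The risk labels, function name and return values are hypothetical; neither the Guidelines nor the OSA prescribe an implementation.

```python
from enum import Enum

class Risk(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

def select_age_assurance(risk: Risk, adult_only_terms: bool) -> str:
    """Hypothetical mapping of service risk to an age assurance method,
    loosely following the tiering described in the draft Guidelines."""
    if risk is Risk.HIGH:
        # e.g. alcohol sales, pornography, gambling: verification against
        # hard identifiers or verified sources of identity
        return "age_verification"
    if risk is Risk.MEDIUM or adult_only_terms:
        # medium-risk services, or services whose terms require users to be 18+
        return "age_estimation"
    # self-declaration alone is not considered to ensure a high standard of
    # safety or privacy, so it is only a fallback for low-risk cases
    return "self_declaration"

print(select_age_assurance(Risk.MEDIUM, adult_only_terms=False))  # age_estimation
```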

This approach broadly aligns with the recommendations in Ofcom’s guidance on highly effective age assurance. The following subtle differences are worth noting:

  • While digital identity services (including digital wallets) are considered by Ofcom to be a form of highly effective age assurance, the EU Digital Wallet may not be available to certain UK users and online services may therefore need to rely on hard identifiers to verify age. As explained by us here, highly effective age verification measures approved by Ofcom include credit card and mobile network operator checks. 
  • Under the OSA, there is an entire section dedicated to the regulation of providers of pornographic content (described as Part 5 services), which includes an express requirement for such providers to implement ‘highly effective age assurance’. However, unlike the Guidelines, highly effective age assurance for the purposes of Part 5 services does not strictly require age verification; in the illustrative example provided by Ofcom, age estimation can be acceptable provided that Ofcom’s other criteria are satisfied and a ‘challenge age’ approach is adopted (see specific guidance on age assurance for Part 5 services). Where age estimation is appropriate, the Guidelines do not specify which forms of age estimation should be used. By contrast, Ofcom’s guidance on highly effective age assurance suggests that email-based age estimation and facial age estimation can be considered forms of highly effective age assurance. 

Moderation

Content moderation has been a longstanding feature of Trust & Safety compliance, with many platforms already having systems in place to identify and review content which is potentially harmful to children; however, the scope and maturity of these systems vary significantly between services. Determining which measures to use (hash-matching technology, URL detection, supportive prompts, or manual moderation) is further complicated by the requirement to ensure that moderation does not unduly inhibit freedom of expression. 

The Guidelines provide only very limited clarity on the Commission’s expectations, which include: (1) policies and procedures which prioritise the moderation of content likely to be harmful to minors, (2) terms and conditions which clearly define what type of content is considered illegal or harmful, and (3) technologies which prevent AI systems from generating or sharing content that is harmful to minors. 
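As a rough illustration of point (1), prioritising content likely to be harmful to minors in a review queue could look something like the following sketch. The labels, scoring and class names are entirely hypothetical; real systems would derive such signals from classifiers, hash-matching, URL detection or user reports.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical priority labels: lower number = reviewed sooner.
PRIORITY = {"likely_harmful_to_minors": 0, "policy_violation": 1, "other_report": 2}

@dataclass(order=True)
class QueuedItem:
    priority: int
    content_id: str = field(compare=False)

def build_review_queue(reports: list[tuple[str, str]]) -> list[QueuedItem]:
    """Order reported content so items flagged as likely harmful to minors are reviewed first."""
    heap: list[QueuedItem] = []
    for content_id, label in reports:
        heapq.heappush(heap, QueuedItem(PRIORITY.get(label, 3), content_id))
    return [heapq.heappop(heap) for _ in range(len(heap))]

queue = build_review_queue([("c1", "other_report"), ("c2", "likely_harmful_to_minors")])
print([item.content_id for item in queue])  # ['c2', 'c1']
```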

The OSA imposes similar obligations with notable additional requirements to set and record performance targets, and to provide individuals working in content moderation with training and materials. Service providers also have more clarity on the moderation techniques Ofcom considers effective based on its 2023 report on this subject.

Default settings

To ensure that default settings provide a high level of privacy for minors, the Guidelines recommend a list of measures, including that the visibility of individual pieces of user content can be adjusted, that only the minor can take screenshots of content on their profile, and that tracking features and push notifications are disabled. More broadly, the Guidelines recommend that user control and empowerment are augmented through more choice (including periodic prompts to re-affirm user settings). 
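Purely as an illustration, those defaults might be captured in configuration along these lines. The setting names are ours, not the Guidelines’, which describe outcomes rather than a schema.

```python
# Hypothetical default settings for an account identified as belonging to a minor.
MINOR_ACCOUNT_DEFAULTS = {
    "profile_visibility": "contacts_only",        # visibility adjustable per piece of content
    "screenshots_of_profile_content": "owner_only",
    "tracking_features": False,                   # disabled by default
    "push_notifications": False,                  # disabled by default
}

def apply_minor_defaults(settings: dict) -> dict:
    """Overlay minor-safe defaults onto whatever settings a new account starts with,
    leaving the user free to change them later (with periodic prompts to re-affirm)."""
    return {**settings, **MINOR_ACCOUNT_DEFAULTS}

print(apply_minor_defaults({"theme": "dark", "push_notifications": True}))
```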

Again, these measures are very similar to Ofcom’s recommended risk mitigation measures, with the OSA placing a particular emphasis on signposting. 

Support tools & complaints processes

The Guidelines recommend that both users and guardians have access to a suite of inbuilt tools to help them identify and report illegal or harmful content. More specifically, these include the display of warning messages where users attempt to upload or share illegal or harmful content, messages which warn minors when they are interacting with a chatbot, and an ability for guardians (where appropriate) to limit a minor’s screen time and manage their account settings. 
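A minimal sketch of how such tools might be wired up is set out below; the function names, messages and parameters are hypothetical, and the Guidelines do not mandate any particular design.

```python
from datetime import timedelta

def upload_warning(content_flagged: bool) -> str | None:
    """Warn a user attempting to upload or share content flagged as potentially illegal or harmful."""
    if content_flagged:
        return "This content may be illegal or harmful. Do you want to continue?"
    return None

def chatbot_disclosure(is_minor: bool, counterpart_is_bot: bool) -> str | None:
    """Warn minors when they are interacting with a chatbot rather than a person."""
    if is_minor and counterpart_is_bot:
        return "You are chatting with an automated assistant, not a person."
    return None

def within_guardian_limit(screen_time_today: timedelta, daily_limit: timedelta) -> bool:
    """Apply a guardian-set screen time limit, where appropriate."""
    return screen_time_today <= daily_limit

print(chatbot_disclosure(is_minor=True, counterpart_is_bot=True))
```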

In addition to having easy-to-use complaints processes, the Guidelines also recommend that minors are able to provide more general feedback about content, accounts or groups that make them feel uncomfortable. 

The OSA requires that similar tools are made available to minors and guardians, with a number of additional recommendations in the codes, including in particular that providers monitor the performance of these tools against set targets. 

Recommender systems & advertising

The Guidelines amplify the DSA recommender system rules which, as noted by us here, adopt a more structured format than those set out in the OSA guidance and codes. The Guidelines discourage the use of behavioural data (e.g. watch time or click-through rates) to determine what content is presented to minors and also provide that minors should be able to choose a version of the recommender system that is not based on profiling. The latter requirement appears to expand the scope of Article 38 DSA (which applies only to VLOPs and VLOSEs) to all providers of online platforms. 
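By way of a rough sketch, offering minors a recommender option that is not based on profiling could look something like the following: a simple chronological fallback that ignores behavioural signals. The field names and selection logic are hypothetical.

```python
from datetime import datetime

def rank_feed(items: list[dict], is_minor: bool, profiling_opt_out: bool) -> list[dict]:
    """Hypothetical feed selection: when the viewer is a minor who has chosen the
    non-profiling option, ignore behavioural signals such as watch time or
    click-through rate and order by recency only."""
    if is_minor and profiling_opt_out:
        return sorted(items, key=lambda i: i["published_at"], reverse=True)
    # otherwise, apply whatever engagement-based ranking the platform normally uses
    return sorted(items, key=lambda i: i.get("engagement_score", 0.0), reverse=True)

items = [
    {"id": "a", "published_at": datetime(2025, 5, 1), "engagement_score": 0.9},
    {"id": "b", "published_at": datetime(2025, 5, 2), "engagement_score": 0.1},
]
print([i["id"] for i in rank_feed(items, is_minor=True, profiling_opt_out=True)])  # ['b', 'a']
```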

The Guidelines also contain a dedicated section on commercial practices which states that service providers should ensure that minors are not exposed to marketing and communication of products or services that can have an adverse impact on their privacy, safety and security. The Guidelines recommend that this be done through policies, the removal of hidden or disguised advertising, and the avoidance of intermediate or virtual currencies. The OSA does not contain such specific recommendations but does require certain categories of services to implement measures to tackle fraudulent advertising. 

Final thoughts

There is clearly a large degree of overlap between the measures proposed by the Guidelines and those contained in the OSA codes and guidance. This means that service providers within the scope of the OSA may be able to leverage existing processes and assessments to assist with demonstrating Article 28 compliance. Understandably, service providers may feel a sense of frustration with the additional paper exercise required and, in particular, having to identify and reconcile any subtle differences between the two regimes (yet another post-Brexit headache). 

Similar to the OSA, the measures listed in the Guidelines are not mandatory and service providers can choose to comply with Article 28 using alternative (but equally effective) measures. However, unlike the OSA codes, following the Guidelines does not operate as a ‘safe harbour’ that guarantees compliance with Article 28. This is likely because Ofcom’s codes, at hundreds of pages, are much more detailed. With the less prescriptive Guidelines, service providers may have a degree of latitude in how they choose to comply, and their chosen measures may not always meet the Commission’s expectations.

Overall, efforts to comply with the Guidelines are likely to be a strong line of defence for service providers in the event of any Article 28 complaints or requests for information. We may also see Member States choosing to supplement the Guidelines by introducing their own national laws or codes (which some Member States have done already).

The Guidelines are open to consultation until 10 June 2025. 

For more on protecting children online, check out our explainer article, which you can find on our dedicated OSA hub, The Safety Net.


