
Not Child’s Play: AI and online safety for Roblox

Roblox, one of the world’s largest online gaming platforms, recently hit the headlines twice: first for launching a new AI-powered 3D modelling tool for developers and users, and then for comments from its co-founder and chief executive, David Baszucki, that parents who are not comfortable with the platform should simply not let their children use it. Here we take a closer look at the new modelling tool and consider Mr Baszucki’s comments in light of the Online Safety Act.

Roblox Cube

Roblox launched Roblox Cube last month[1], its new AI-based 3D modelling tool that aims to improve the process both for developers creating games (called “experiences” on Roblox) and for users playing them. As a text-to-3D tool, it generates objects based on text prompts. Not only does this save developers the time of designing and coding objects themselves, it also has the potential to allow a greater variety of models and overall game design, as users are not tied to existing designs.

The tool is still at an early stage: for example, it only works on “simple” objects, such as cars, animals and furniture, and users cannot interact with the models they generate. However, its potential is significant if it can be scaled to allow whole experiences to be generated.

As with all AI tools, there are questions around the data used to train the model, as well as what rights users will have in the outputs. While developers own the copyright in the experiences they create, Roblox’s terms and conditions let Roblox use experiences for its own purposes, including to train AI models. This is consistent with Roblox’s press release, which says that the underlying model was trained using native data.

Additionally, it would seem that Roblox has not implemented any specific safety or other controls around the use of prompts to generate infringing, inappropriate or illegal content. For example, will developers be able to generate 3D models of specific car types, or clothing that mimics existing brands? The Roblox press release shows a sword being generated, but will Roblox Cube let users create any and all weapons? Given the large number of children playing Roblox, this will be a key concern in ensuring the safety of its users.

Online Safety

Roblox has previously faced allegations of bullying and grooming on its platform, as well as claims of some children being exposed to harmful or explicit content in its games. The addition of Roblox Cube presents an opportunity to offer users a more engaging and enriching experience; however, it may also introduce an additional means by which children can encounter potentially harmful content. 

According to news reports, Mr Baszucki has acknowledged that there is a fine balance for Roblox to strike between encouraging friendships between children and protecting them from harm, but considers that Roblox is able to manage this.[2] In November, Roblox announced that it had made over 30 safety updates, including preventing under-13s from sending direct messages to others on the platform outside of games or experiences, and stopping under-9s from accessing content labelled “moderate” by default. Addressing parents who do not want their children using Roblox, Mr Baszucki stated that parents should not let their children be on the platform if they are not comfortable with it, noting that he trusted parents to make their own decisions.

As the primary gaming platform for users aged 8-12 in the UK, Roblox falls squarely within the scope of the Online Safety Act (OSA). The OSA imposes specific duties on in-scope online services, requiring them to offer a higher level of protection to children than to adults. As part of this, providers have until 16 April 2025 to perform a Children’s Access Assessment to determine whether their service is likely to be accessed by children. After this, providers will need to carry out a Children’s Risk Assessment by July 2025 to assess the risks of children being exposed to harmful content on their service. Providers must then implement safety measures to mitigate any identified risks.

While parental involvement is a key part of keeping children safe online, the OSA places legal responsibilities on platforms themselves to assess and mitigate risks to children. Roblox and other in-scope gaming providers will therefore need to proactively build safety measures into the design and operation of their platforms to prevent children from encountering harmful content and behaviour, and to ensure that the platform is fundamentally safe for children.

You can read more about the Online Safety Act in Bristows’ online safety resource hub here, including our article about the protection of children under the OSA. 

Footnotes

[1]  https://corp.roblox.com/newsroom/2025/03/introducing-roblox-cube

[2]  https://www.bbc.co.uk/news/articles/c5yrjkl7dd6o 
