A whitepaper on video games in relation to the Online Safety Act and Ofcom (the independent regulator of the Online Safety Act, including age assurance and children’s access). Written by Andrew Wailes, CEO, PlaySafe ID. Published: 18/03/2025

TLDR (short version)

If your game has mechanics that enable users to interact with each other (such as messaging), and the United Kingdom is a target market for your game, or children within the United Kingdom are able and/or likely to access it, then your game is considered a “regulated user-to-user internet service” and you need to comply with the Online Safety Act.

As part of this, you will have to implement “highly effective age assurance”, which means you have to add technology to your game that either:

- verifies a user’s age against a hard identifier, such as photo ID or a credit card (age verification), or
- estimates a user’s age, for example through facial age estimation (age estimation).

This is non-negotiable: the regulator (Ofcom) states that you can only claim that no children access your service if you have these checks in place. So you must implement highly effective age assurance both to know whether (and how many) children are accessing your service, and to ensure a safe, age-appropriate experience is delivered.
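To make the requirement concrete, here is a minimal sketch of how a game backend might gate user-to-user features on an age-assurance result. All names here are hypothetical illustrations, not part of the Act or Ofcom’s guidance; a real integration would call an age-assurance provider and treat its result as the source of truth. The key point the sketch encodes is that an unchecked user cannot be assumed to be an adult.

```python
from dataclasses import dataclass, field
from enum import Enum


class AgeAssuranceResult(Enum):
    """Outcome of a highly effective age assurance check (hypothetical model)."""
    ADULT = "adult"
    CHILD = "child"
    UNKNOWN = "unknown"  # no check completed yet -- must be treated as potentially a child


@dataclass
class Player:
    player_id: str
    age_assurance: AgeAssuranceResult = field(default=AgeAssuranceResult.UNKNOWN)


def can_use_open_chat(player: Player) -> bool:
    """Only players with a completed check confirming adulthood get unrestricted
    user-to-user features. UNKNOWN is deliberately not enough: without a check
    in place, you cannot claim children are not accessing the service."""
    return player.age_assurance is AgeAssuranceResult.ADULT


def chat_mode(player: Player) -> str:
    # Children and unchecked players fall back to a restricted,
    # age-appropriate mode (e.g. preset phrases, no free text).
    return "open" if can_use_open_chat(player) else "restricted"
```

For example, a newly created player with no completed check would be placed in the restricted mode, while a player whose check returned an adult result would get open chat. The design choice to make `UNKNOWN` the default mirrors the regulatory logic described above: safety restrictions apply until age is assured, not the other way around.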

Additionally, as of Spring-Summer 2025, further Codes of Practice measures will be introduced to extend the role of highly effective age assurance to protecting children from grooming. This matters because the same highly effective age assurance process you have to add will also be required at that stage of the legislation.

Closing comments on the TLDR

That was the TLDR, without context or explanation. In the following paper I cover everything in detail, including the legislation and Ofcom’s statements. Please take the time to read the whole paper: it will save you many headaches and fines over the coming months and years, and it will ultimately help you keep children safer.

Importantly for you, this type of legislation is also coming into law in Australia at the end of 2025, and in EU member states over the coming months and years. Implementing a solution now might be your first step in complying with this type of legislation, but it is not unique to the United Kingdom, and it will become increasingly commonplace around the world.

This is the start of a change in how we as a culture design and build systems, and how users interact with online services, driven by legislation.

Intro to the paper

The Online Safety Act is designed to make the internet safer for users within the United Kingdom, in particular for children. It is enforced by Ofcom, the UK’s independent communications regulator, on behalf of Parliament.

The games industry has been lulled into a false sense of security regarding the Online Safety Act: little has visibly happened since it passed in 2023, and the initial focus has been on the largest service providers and the most serious illegal and harmful content. So if you work within the games industry, you probably think this doesn’t have much to do with you.

If you haven’t read all the legislation from the UK Government and all the supporting information from the independent regulator Ofcom, you’re in for a compliance shock in the coming months. That is why I have written this whitepaper: to provide you with the details you need to know if you operate video games that have users within the United Kingdom.

I have read the Online Safety Act 2023, the statements from Ofcom, and even delved into the Communications Act 2003, in order to understand the topic and help you prepare for the changes you have to make.

What is the Online Safety Act?

It is a new regulatory framework, now law, intended to make the internet safer for users within the United Kingdom. It focuses on two key areas: illegal content and activity, and content and activity that is harmful to children.

It impacts service providers globally, regardless of their location, as long as they have a significant number of users within the United Kingdom, the United Kingdom is a target market for their service, and/or children in the United Kingdom are likely to access the service.

Enforcement of the Online Safety Act, through the independent regulator Ofcom, is only now beginning. Ofcom has taken a staggered approach to enforcement, which is why little has happened over the last few years; but some big dates have just passed, and more major dates are coming up.

Who are Ofcom?

Ofcom are tasked with regulating and enforcing the Online Safety Act. They are the regulator for communications services within the United Kingdom.

Their duties come from Parliament, with a priority of looking after users within the United Kingdom. They are an independent regulator, with the power to impose fines and bring criminal prosecutions.

They cover everything from “making online services safer for the people who use them, by making sure companies have effective systems in place to protect users from harm”, to ensuring compliance and good practice across broadband, phone services, postal services, and the airwaves, and helping people avoid scams and bad practice.

Important definitions to help you understand what’s happening and how the act impacts games

It’s important for you to understand a few definitions from the legislation, so that you understand how your games are in scope.