The information below is relevant and accurate as of August 2024.
UK Online Safety Act
The UK Online Safety Act was passed into law in October 2023. The Act places a duty of care on user-to-user and search services with users in the UK, requiring them to protect those users from harm on their service. The online safety rules focus on ensuring that regulated services have systems and processes in place to keep people safe online.
While the precise duties vary from service to service, most regulated services will need to:
- assess the risk of harm from illegal content;
- assess the particular risk of harm to children from harmful content (if children are likely to use the service);
- take effective steps to manage and mitigate the risks identified in these assessments – Ofcom will publish guidance that services can follow to do this;
- provide clear terms of service, explaining plans to protect users;
- make it easy for users to report illegal content, and content harmful to children;
- make it easy for users to complain, including when they think their post has been unfairly removed or their account unfairly blocked; and
- consider the importance of protecting freedom of expression and the right to privacy when implementing safety measures.
A few of the largest and riskiest regulated services will have additional duties to meet, so that:
- people have more choice and control over what they see online; and
- companies are more transparent and can be held to account for their activities.
Ofcom
Ofcom is the UK’s independent communications regulator, with over 20 years of experience regulating communications sectors including telecoms, media and broadcast, spectrum and post.
Since the passage of the UK Online Safety Act, Ofcom has taken on a new role as the regulator for online safety in the UK. Their job is to make sure online services, like sites and apps, meet their duties to protect their users. As the regulator, Ofcom’s role in implementing the new online safety rules includes providing guidance to regulated services and supervising them to ensure they know how to meet their duties and users are protected.
Tackling child sexual abuse under the online safety regime
The UK Online Safety Act sets out safety duties so that, among other things, online services must carry out risk assessments to understand the likelihood and impact of child sexual exploitation and abuse (CSEA) appearing on their service. They must also take steps to mitigate the risks identified in their risk assessment and to identify and remove illegal content where it appears. The higher the risk on a service, the more measures and safeguards it will need to put in place to keep its users safe from harm and to prevent the service from being used as a platform to groom and exploit children.
Unlike regulators in some other jurisdictions, Ofcom does not receive individual complaints about content, including CSEA content. The UK Online Safety Act is not a content moderation regime, so Ofcom will not review individual complaints and will not direct services to remove content. However, a number of reporting channels already exist in the UK to support UK users in reporting content.
In Ofcom’s draft regulatory guidance published in October 2023, the regulator suggested safety measures that services can adopt and that would make a meaningful difference in protecting children from CSEA. These safety measures include:
- Hash-matching technology, which automatically detects known CSAM images shared by users in their public content (see the first sketch after this list).
- URL detection technology, which scans public posts to remove illegal URLs that lead to material depicting the abuse of children (also covered in the first sketch below).
- Preventing CSAM URLs from appearing in search results, and applying warning messages on search services when users search for content that explicitly relates to CSAM.
- Measures to tackle the online grooming of children, including safer default settings that make it harder for strangers to find and interact with children online (see the second sketch after this list).
- Supportive prompts and messages for child users during their online journey, to empower them to make safe choices online, such as when they turn off default settings or receive a message from a user for the first time.
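Neither the Act nor Ofcom’s draft guidance prescribes a particular implementation, but the first two measures above are, at their core, blocklist matching. The Python sketch below is a minimal, hypothetical illustration of that idea: the hash list and URL list names are invented for this example (in the UK such lists are typically maintained by bodies like the Internet Watch Foundation), and a cryptographic hash is used purely for brevity; production systems rely on perceptual hashing (for example PhotoDNA) so that re-encoded or slightly altered copies of a known image still match.

```python
import hashlib
import re

# Hypothetical blocklists for illustration only; in practice these would be
# supplied and continuously updated by an external body such as the IWF.
KNOWN_IMAGE_HASHES: set[str] = set()   # hashes of known CSAM images
BLOCKED_URLS: set[str] = set()         # URLs known to lead to abuse material

URL_RE = re.compile(r"https?://[^\s<>\"']+", re.IGNORECASE)

def is_known_image(image_bytes: bytes) -> bool:
    """Hash an uploaded image and check it against the known-content list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

def blocked_urls_in(post_text: str) -> list[str]:
    """Return any URLs in a public post that appear on the URL blocklist."""
    candidates = URL_RE.findall(post_text)
    return [url for url in candidates if url.rstrip(".,;)").lower() in BLOCKED_URLS]
```

A search service could apply the same URL list differently, de-indexing matching pages rather than filtering posts, which corresponds to the third measure above.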
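The grooming-related measures (safer defaults and supportive prompts) are product-design choices rather than a prescribed API. As a purely hypothetical sketch of how a service might represent them:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChildAccountDefaults:
    """Hypothetical safer defaults for accounts believed to belong to children."""
    discoverable_in_search: bool = False      # strangers cannot look the account up
    messages_from: str = "connections_only"   # no unsolicited direct messages
    location_sharing: bool = False

def prompt_on_weakened_setting(setting: str, new_value: object) -> Optional[str]:
    """Return a supportive prompt when a child user relaxes a protective default."""
    if getattr(ChildAccountDefaults(), setting) != new_value:
        return (f"Changing '{setting}' can make it easier for people you don't "
                "know to find or contact you. Do you want to continue?")
    return None
```

A similar check could drive a first-contact prompt when a message arrives from someone the child has never interacted with before.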
Ofcom ran a three-month consultation on the draft regulatory guidance referenced above, which included engagement with children on the proposed measures. Following the consultation, Ofcom is now preparing a regulatory statement, due to be published at the end of 2024, which will put in place the first set of requirements for regulated services under the UK Online Safety Act; services will then be required to produce their first risk assessments.
Ofcom is always looking to develop the research and evidence base to strengthen and add to the current safety measures, and they plan to iterate the codes of practice (i.e. the safety measures listed above) over time to strengthen protections against CSEA online. This will include tackling ‘first-generation’ or ‘novel’ CSAM (CSAM images that have not previously been identified and hashed), additional interventions to disrupt the commission of CSEA offences online, and measures to strengthen anti-grooming proposals.
For regulated services: Ofcom’s programme of research and engagement with regulated services, especially small and medium-sized ones, helps to design new resources and tools that support services in protecting UK users and complying with the new rules. You can take part, submit enquiries and sign up for email updates on Ofcom’s website.
Sources:
Tackling child sexual abuse under the online safety regime - Ofcom
Online safety rules: what you need to know - Ofcom
Harmful online content: how to report it and where to get help - Ofcom