Ofcom has published, four months ahead of the statutory deadline, its first-edition codes of practice and guidance on tackling illegal harms – such as terror, hate, fraud, child sexual abuse and assisting or encouraging suicide – under the UK’s Online Safety Act 2023.

The Act places new safety duties on social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites. Before Ofcom can enforce these duties, it is required to produce codes of practice and industry guidance to help firms comply, following a period of public consultation.

Subject to the codes completing the Parliamentary process, sites and apps will need to start implementing safety measures to mitigate the risks of illegal harm from 17 March 2025, and the codes set out measures they can take. Some of these measures apply to all sites and apps; others apply only to larger or riskier platforms.

While developing the codes and guidance, Ofcom heard from thousands of children and parents about their online experiences, as well as from professionals who work with them. New research also highlights children’s experiences of sexualised messages online, and teenagers’ views on the proposed safety measures aimed at preventing adult predators from grooming and sexually abusing children. Many of the young people Ofcom spoke to felt that interactions with strangers, including adults or users perceived to be adults, are currently an inevitable part of being online, and they described becoming ‘desensitised’ to receiving sexualised messages.