From 25 July, websites and apps are required to protect children by filtering out harmful content and verifying users' ages. This is a significant test for the Online Safety Act, a landmark piece of legislation that covers the likes of Facebook, Instagram, TikTok, YouTube and Google.
Social media platforms and large search engines must also prevent children from accessing pornography and material that promotes or encourages suicide, self-harm and eating disorders; such material has to be kept off children's feeds entirely. Hundreds of companies are affected by the rules.
Platforms will also have to suppress the spread of other forms of material potentially harmful to children, including content that promotes dangerous stunts, encourages the use of harmful substances or enables bullying.
Age assurance measures supported by Ofcom for pornography providers include: facial age estimation, which assesses a person's likely age from a live photo or video; checking a person's age via their credit card provider, bank or mobile phone network operator; photo ID matching, where a passport or similar ID is checked against a selfie; or a "digital identity wallet" that contains proof of age.
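To make the four approaches concrete, the following is a minimal, purely illustrative Python sketch of how a site might dispatch an age check to one of them. Everything here, the method names, the evidence fields and the 18+ threshold, is an assumption made for illustration; neither the Act nor Ofcom's guidance prescribes a particular API.

```python
from enum import Enum, auto

class AgeAssuranceMethod(Enum):
    # The four Ofcom-supported approaches described above (names are hypothetical)
    FACIAL_AGE_ESTIMATION = auto()    # likely age inferred from a live photo or video
    BANK_CARD_OR_MOBILE = auto()      # confirmation from a bank, card issuer or mobile operator
    PHOTO_ID_MATCHING = auto()        # passport or similar ID matched against a selfie
    DIGITAL_IDENTITY_WALLET = auto()  # reusable proof-of-age credential

MINIMUM_AGE = 18  # assumption: pornography providers gate access at 18

def is_adult(method: AgeAssuranceMethod, evidence: dict) -> bool:
    """Hypothetical dispatcher: in practice each branch would call a
    third-party age-assurance provider; here each check is stubbed out."""
    if method is AgeAssuranceMethod.FACIAL_AGE_ESTIMATION:
        # an estimator would return a likely age for the live image
        return evidence.get("estimated_age", 0) >= MINIMUM_AGE
    if method is AgeAssuranceMethod.BANK_CARD_OR_MOBILE:
        # the provider returns a simple over-18 confirmation flag
        return bool(evidence.get("provider_confirms_over_18"))
    if method is AgeAssuranceMethod.PHOTO_ID_MATCHING:
        # age from the ID document, plus a selfie-to-ID match result
        return (evidence.get("id_age", 0) >= MINIMUM_AGE
                and evidence.get("selfie_matches_id", False))
    if method is AgeAssuranceMethod.DIGITAL_IDENTITY_WALLET:
        # the wallet presents a verified proof-of-age attribute
        return bool(evidence.get("wallet_proof_of_age"))
    return False

# Example: a user verified by their mobile network operator
print(is_adult(AgeAssuranceMethod.BANK_CARD_OR_MOBILE,
               {"provider_confirms_over_18": True}))  # True
```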