International, Networking, News

Ofcom demands social media giants focus on child protection

UK regulator Ofcom recently called on the most popular social media platforms to enforce minimum age rules for users of their services, arguing that the companies should be doing more to keep children safe online.

Ofcom said that it and data watchdog the Information Commissioner’s Office had written to Facebook, Instagram, Snapchat, Roblox, TikTok and YouTube, requiring them to demonstrate to parents “a genuine commitment to protecting children online”.

Since the UK’s online safety laws came into effect in 2025, Ofcom explained, it had investigated nearly 100 services. This resulted in progress on curbing the sharing of child sexual abuse material, age checks now being required for visits to pornography sites, and restrictions on platforms such as Telegram and Reddit to prevent children being exposed to harmful content.

However, it added the industry had “not done enough”, and it was setting out clear demands for further action to ensure technology companies are held publicly accountable for delivering the safest possible online environment for UK children.

These demands include: effective minimum age policies to prevent children under 13 from accessing sites and apps; failsafe grooming protections; safer feeds for children; and an end to product testing on children.

Ofcom CEO Melanie Dawes said the online services in question are household names, but they are failing to put children’s safety at the heart of their products. “There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.”

The platforms in question have a deadline of 30 April 2026 to inform Ofcom of the actions they will take.

Source: Mobile World Live

Image Credit: Stock Image

