Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X urged to strengthen child age checks

British media and data protection authorities have called on major tech companies to strengthen age checks for users under 13 on social media, the BBC reports.

Regulators Ofcom and the Information Commissioner’s Office (ICO) have approached platforms such as Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X, demanding stronger measures to prevent young children from creating accounts on these services, CE Report quotes ATA.

According to Ofcom chief Dame Melanie Dawes, many platforms currently do not prioritize child safety in their products. Most services still rely on users self-declaring their age, a method that regulators say children can easily bypass.

Ofcom studies show that around 86% of children aged 10–12 have a social media profile, even though most platforms set a minimum age of 13. Regulators are calling on companies to implement more effective age verification, similar to systems used for adult-content services.

Tech companies have defended their existing measures. Meta, which owns Facebook and Instagram, said it uses artificial intelligence and facial age-estimation technology to assess users' ages, while TikTok said it removed more than 90 million accounts suspected of belonging to children under 13 in a single year.

Digital health experts say age checks are only a first step. They argue the bigger challenge is how social media algorithms and recommendation systems influence children, calling for stronger regulation and safer platform design.
