Overview
- Ofcom and the ICO wrote to Meta Platforms, TikTok, Snapchat, YouTube and Roblox demanding plans by April 30 to strengthen age checks, block grooming, make feeds safer and stop testing new products on minors, with the ICO urging modern age‑assurance tools to keep under‑13s off services not designed for them.
- Ofcom said it will publish a public report on the companies’ responses in May and is prepared to take enforcement action, with potential fines of up to 10% of global revenue under Ofcom’s powers and up to 4% under the ICO’s.
- The UK House of Commons rejected an Australia‑style blanket ban for under‑16s by 307 votes to 173, instead backing broader ministerial powers subject to consultation; Open Rights Group raised civil‑liberties warnings about the powers’ scope, privacy risks and unregulated age‑assurance providers.
- Platforms said they already deploy safeguards: Meta cited AI‑based age detection and teen accounts, YouTube defended its risk‑based approach for youth safety, and Roblox pointed to more than 140 recent safety features including mandatory age checks for chat.
- In India, a central IT ministry official said Delhi will not introduce a standalone child‑ban law and will instead rely on intermediary rules and DPDP consent requirements, even as states such as Karnataka and Andhra Pradesh pursue their own age limits and experts flag enforcement hurdles around shared SIMs, Wi‑Fi access and privacy.
- Mexico is consulting on proposals that could introduce age restrictions by June.