Overview
- Australia’s eSafety Commissioner, which released its first compliance report on Tuesday, said it has shifted to enforcement and is formally investigating Facebook and Instagram, Snapchat, TikTok and YouTube for possible systemic breaches that carry fines of up to A$49.5 million.
- The watchdog found recurring loopholes that let minors stay online, including unlimited retries on age checks, prompts to verify even after a user said they were under 16, weak mechanisms for reporting underage accounts and flimsy barriers that allow quick re‑sign‑ups after removals.
- Platforms reported removing roughly 4.7–5 million suspected underage accounts soon after the Dec. 10 law took effect, yet a parent survey found many children still have access, with about seven in 10 retaining accounts on Facebook, Instagram, Snapchat or TikTok and about half on YouTube.
- Communications Minister Anika Wells said companies were doing the bare minimum and urged tough action if systemic failures are proven, while Meta, Snap and others argued that proving age online is hard and pushed for app‑store or phone‑level age checks with parental approval.
- The law, which puts the burden on platforms to take “reasonable steps” to block under‑16 accounts, is being watched by policymakers abroad as countries from parts of Europe to Indonesia consider or adopt similar restrictions.