Overview
- EU regulators issued preliminary findings that both Meta and TikTok failed to grant researchers adequate access to public platform data, as required under the DSA.
- For Meta, the Commission found that Facebook and Instagram lack simple, user‑friendly tools for reporting illegal content such as child sexual abuse material and terrorist content, and that their interfaces rely on dark patterns.
- The Commission also said Facebook and Instagram do not provide effective appeal mechanisms, because users cannot fully explain their objections or submit supporting evidence when contesting moderation decisions.
- Meta says it disagrees and points to recent changes to its reporting, appeals, and data access tools, while TikTok says it is reviewing the findings and argues that the DSA's transparency duties conflict with GDPR safeguards.
- The probes remain open: the platforms can review the case files and propose remedies, and confirmed breaches could bring fines of up to 6% of annual global revenue.
- Ireland's Coimisiún na Meán contributed 97 complaints, and other issues, including the protection of minors, remain under investigation.