Overview
- The draft formally defines “synthetically generated information” and mandates visible labels covering at least 10% of the display area of images or video and audible disclosures during the first 10% of an audio clip's duration, along with permanent, non‑removable metadata.
- Platforms that provide AI creation or editing tools must embed clear identifiers and are barred from suppressing or altering labels or metadata.
- Significant social media intermediaries with 5 million or more users must require users to declare at upload whether content is synthetic, deploy reasonable technical measures to verify those declarations, and clearly label confirmed synthetic content.
- Obligations apply only to publicly available or published material, while good‑faith removals via grievance processes preserve intermediaries’ safe‑harbour; persistent non‑compliance can forfeit Section 79 protection.
- Procedural changes restrict the authority to issue content‑takedown orders to senior officials starting Nov. 15, with periodic review of such orders, even as experts warn that current detection accuracy may not reliably meet verification demands at scale.