Unsealed Deposition Says 1 in 5 Young Teens Report Unwanted Sexual Images on Instagram

Prosecutors are pressing Adam Mosseri over the years-long delay in launching Instagram’s teen nudity filter.

[Image: Instagram app icon on a smartphone, illustration taken October 27, 2025. REUTERS/Dado Ruvic/Illustration]

Overview

  • A court filing made public in February includes portions of Adam Mosseri’s March 2025 deposition testimony and a 2021 Meta user survey in which 19.2% of Instagram users aged 13–15 reported seeing nudity or sexual images they did not want to see.
  • The same materials state that 8.4% of young teens said they had seen someone harm themselves, or threaten to do so, on Instagram during the prior week of use.
  • Meta says the figures come from users’ self-reported survey responses rather than a review of posts, and Mosseri described such surveys as “notoriously problematic.”
  • Mosseri testified that most explicit images are shared in private messages, stressing that privacy considerations limit how the company can review DMs.
  • Prosecutors highlighted that Instagram did not launch an automatic nudity-blur feature for DMs until April 2024, despite internal discussions dating back to 2018. Meta now also removes explicit and AI-generated sexual content under a late-2025 policy while facing broad U.S. litigation over harms to young users.