Overview
- Researchers report 55.3% of surveyed adolescents created at least one AI “nudification” image and 54.4% received such images.
- Non-consensual misuse was common, with 36.3% saying an image of them was created without consent and 33.2% reporting non-consensual distribution.
- Victims described harms consistent with other forms of child sexual exploitation, including fear, hypervigilance, social withdrawal, and lasting disruption to daily life.
- Usage spanned demographics, though male participants reported higher rates of creating and distributing sexualized AI images.
- The exploratory, U.S.-only study is being cited to inform earlier prevention education, platform moderation, law-enforcement training, and potential legislation, with calls for larger cross-national research.