Particle.news

Apple and Google Pulled Apps After Report Found Stores Steering Users to AI 'Nudify' Tools

The probe spotlights a policy-enforcement gap that allowed child-accessible deepfake tools to thrive.

Overview

  • The Tech Transparency Project, which published its follow-up report Wednesday, said App Store and Google Play search results, autocomplete suggestions, and paid ads steered users to apps that create non-consensual explicit images.
  • Apple removed 15 apps named by the researchers after reporters asked about them, while Google said many of the referenced apps had been suspended and that investigations are ongoing.
  • TTP tests showed that face-swap and image tools could place a real person’s face onto a nude body or strip clothing from photos, even when the apps were listed as generic editors.
  • Apps flagged in the report have about 483 million downloads and roughly $122 million in lifetime revenue, by AppMagic estimates, with some gaining visibility through sponsored search placements.
  • Many nudify-capable apps were rated as suitable for minors, raising child-safety risks, and the findings land as the U.S. Take It Down Act takes effect and the U.K. prepares measures to hold tech executives liable.