The Apple and Google app stores contain dozens of applications that use artificial intelligence to generate images of women with their clothes digitally removed, according to a report released Tuesday by the Tech Transparency Project. The watchdog organization identified 55 such apps on Google Play and 47 on the Apple App Store that alter images to make women appear nude or partially nude.
Collectively, the applications have been downloaded 705 million times and have generated approximately $117 million in revenue, with both Google and Apple taking a share of those earnings. The report noted that users can create nonconsensual, sexualized images of women using these widely available tools.
Both Apple and Google maintain app store policies that prohibit applications from displaying “sexual nudity” and “overtly sexual or pornographic material.” The issue gained additional attention earlier this month when Elon Musk’s AI chatbot Grok faced criticism for following user prompts to digitally remove clothing from images of children.
An Apple spokesperson said Monday that 28 apps identified in the report had been removed from the App Store, though two were later restored after their developers modified them to address guideline violations. The Tech Transparency Project, however, counted only 24 apps removed from Apple’s platform as of Monday.
Google also confirmed that it had suspended some of the applications flagged in the report. The Tech Transparency Project concluded that both companies “have failed to keep pace with the spread of AI deepfake apps that can ‘nudify’ people without their permission.”