Under the EU’s Digital Markets Act, Apple is required to allow developers to freely inform customers of alternative offers outside its App Store.
Gabby Jones | Bloomberg via Getty Images
The Apple and Google Play app stores are hosting dozens of “nudify” apps that can take photos of people and use artificial intelligence to generate nude images of them, according to a report Tuesday from an industry watchdog.
A review of the two app stores conducted in January by the Tech Transparency Project found 55 nudify apps on Google Play and 47 in the Apple App Store, according to the organization’s report, which was shared exclusively with CNBC.
After being contacted by TTP and CNBC last week, an Apple spokesperson said Monday that the company had removed 28 apps identified in the report. The iPhone maker said it also alerted the developers of other apps that they risk removal from the App Store if guideline violations aren’t addressed.
Two of the apps removed by Apple were later restored to the store after their developers resubmitted new versions that addressed the guideline concerns, a company spokesperson told CNBC.
“Both companies say they are dedicated to the safety and security of users, but they host a collection of apps that can turn an innocuous photo of a woman into an abusive, sexualized image,” TTP wrote in its report about Apple and Google.
TTP told CNBC on Monday that its own review of the Apple App Store found only 24 of the apps had been removed by the tech company.
A Google spokesperson told CNBC that the company suspended several apps referenced in the report for violating its app store’s policies, saying that it investigates when policy violations are reported. The company declined to say specifically how many apps it had removed, because its investigation into the apps identified by TTP was ongoing.
The report comes after Elon Musk‘s xAI faced backlash earlier this month when its Grok AI tool complied with user prompts asking it to generate sexualized photos of women and children.
The watchdog organization identified the apps by searching the two stores for terms like “nudify” and “undress,” then tested them using AI-generated images of fully clothed women. The project tested two types of apps: those that used AI to render the images of the women without clothes, and “face swap” apps that superimposed the women’s faces onto images of nude women.
“It’s very clear, these are not just ‘change outfit’ apps,” Katie Paul, TTP’s director, told CNBC. “These were definitely designed for non-consensual sexualization of people.”
CNBC investigated the dangers of nudify apps and websites in a report published in September.
In its investigation, CNBC followed a group of women in Minnesota whose public social media photos were fed into a nudify service to create sexualized deepfakes without their consent. More than 80 women were victimized, but because they were all adults and the man who generated the pornographic deepfakes didn’t necessarily distribute them, no apparent crime was committed.
CNBC found that new AI models have made it easier than ever to generate deepfake nudes and explicit content, with the services bundled into user-friendly apps like the ones found by TTP.
Of the apps reviewed, 14 were based in China, according to TTP. Paul said that adds an extra security concern.
“China’s data retention laws mean that the Chinese government has a right to data from any company anywhere in China,” Paul said. “So if somebody’s making deepfake nudes of you, those are now in the hands of the Chinese government if they use one of those apps.”
After xAI drew scrutiny over Grok’s nudify capabilities, the chatbot acknowledged “lapses in safeguards” that it is “urgently fixing” in a reply to one X user.
On Monday, the European Commission said it opened an investigation into X over Grok’s spreading of sexually explicit content.
In response to CNBC’s request for comment, xAI sent an automated reply that said “Legacy Media Lies.”
In August, the National Association of Attorneys General wrote to payment platforms, including Apple Pay and Google Pay, raising concerns about services that generate non-consensual intimate images and requesting that the platforms remove such services from their networks.
Democratic senators from Oregon, New Mexico and Massachusetts asked Apple and Google to remove X from their app stores in a letter this month, saying that the mass generation of non-consensual sexualized images violates the stores’ distribution terms.
The Google Play Developer Policy Center says the platform doesn’t allow “apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps.” Apple’s app review guidelines ban material that is “overtly sexual or pornographic.”
The apps identified by TTP collectively have over 700 million downloads worldwide and have generated $117 million in revenue, the report says, citing app analytics firm AppMagic. Both Apple and Google take a cut of the revenue generated by apps distributed through their stores.
“The fact that they are not adhering to their own policies, which are designed to protect people from non-consensual nude imagery and non-consensual pornography, raises a lot of questions about how they can present themselves as trusted app platforms,” Paul said.
— CNBC’s Jonathan Vanian and Katie Tarasov contributed to this report