Faked nude images of more than 100,000 women have been created from social media pictures and shared online, according to a new report.
Clothes are digitally removed from pictures of women by Artificial Intelligence (AI), and spread on the messaging app Telegram.
Some of those targeted "appeared to be underage", the report by intelligence company Sensity said.
But those running the service said it was simply "entertainment".
The BBC tested the software and received poor results.
Sensity says the technology used is a "deepfake bot".
Deepfakes are computer-generated, often realistic, images and videos based on a real template. One use has been to create fake pornographic video clips of celebrities.
But Sensity's chief executive, Giorgio Patrini, said the shift to using photos of private individuals was relatively new.
"Having a social media account with public photos is enough for anyone to become a target," he warned.
The artificial intelligence-powered bot lives inside a Telegram private messaging channel. Users can send the bot a photo of a woman, and it will digitally remove her clothes in minutes, at no cost.
The BBC tested multiple images, all with the subjects' consent, and none were completely realistic - our results included a photo of a woman with a belly button on her diaphragm.
A similar app was shut down last year, but it is believed there are cracked versions of the software in circulation.
The administrator running the service, known only as "P", said: "I don't care that much. This is entertainment that does not carry violence.
"No one will blackmail anyone with this, since the quality is unrealistic."
He also said the team reviews the photos that are shared, and "when we see minors we block the user for good".