In new online abuse menace, AI bot turns women’s photos into nudes

Hindustan Times | New Delhi
Oct 21, 2020 10:40 AM IST

Deepfake-tracking agency discovers bot that has generated over 100,000 nude images of women from regular photographs

Unidentified cyber criminals have rolled out a service that allows anyone to create fake nude images of women from a regular photograph, a cyber research agency tracking deepfakes said in a report released on Tuesday, uncovering a growing ecosystem that has already targeted at least 100,000 women and threatens to open a new front in online sexual abuse.


A deepfake is any manipulated media created by an artificial intelligence technique called deep learning. The technology first gained attention in 2017 and has evolved rapidly, with experts warning that it can affect democracy and law since, among other things, deepfakes can be used to create convincing fakes of rival politicians and generate false evidence to implicate someone in a crime.

It has also been used to create deepfake pornography of celebrities, but Netherlands-based Sensity’s report now uncovers its first widespread use in targeting virtually any individual whose images are available. The tool works only on images of women.

“Our investigation of this bot and its affiliated channels revealed several key findings. Approximately 104,852 women have been targeted and had their personal ‘stripped’ images shared publicly as of the end of July 2020. The number of these images grew by 198% in the last 3 months,” said the report.

At present, most of the roughly 104,000 users and most of the victims appear to be from Russia, the report added, citing a poll in one of seven Telegram groups linked to the service – the name of which has been withheld in order to avoid publicity.

At the core is a bot that lets a person upload a photograph of a woman. The bot sends back a version with the clothing removed and replaced by synthetic skin and private parts that at times look authentic but are often evidently fake. The tool is available for free, but the resulting photos are watermarked. Users can pay $1.5 (about ₹110) to remove the watermark, the report said.

“The activity on the bot’s affiliated Telegram channels makes for bleak viewing. On the image sharing galleries, thousands of synthetically stripped images of young women taken from social media and private correspondence are constantly being uploaded,” said Henry Ajder, an expert on deepfakes and the lead author of the report who has since left Sensity.

“The bot’s significance, as opposed to other tools for creating deepfakes, is its accessibility, which has enabled tens of thousands of users to non-consensually strip these images,” Ajder said, adding that the most concerning aspect of the investigation was the discovery of images of underage girls.

“Up until now, we’ve seen a relatively low amount of activity with deepfakes and paedophilic content, but this investigation provides a stark warning that this trend is definitely over,” he added.

According to Sensity’s investigation, the tool appears to be a version of DeepNude, software first released anonymously in 2019 before criticism forced its developer to take it down.

But, “on July 19th 2019, the creators sold the DeepNude licence on an online marketplace to an anonymous buyer for $30,000. The software has since been reverse engineered…” Sensity added. The key difference between the first software and this service is that people no longer need access to powerful graphics processing hardware or technical expertise to create such nudes.

“In terms of photorealism, the level of the technology is still quite primitive and in many cases, it will be possible to distinguish them (the photographs) as a fake. Still, it does not mean that this material isn’t a reputation threat for people,” said Giorgio Patrini, CEO and chief scientist at Sensity. “Imagine someone posts a naked photo that you didn’t even take — the quality of the image is really not what makes the difference,” he added.

According to Patrini, it has not been established with certainty who is behind the new service. “We can state confidently that the individuals involved in the bot creation are very likely to be Russian native speakers, given the presence of this language among the users and the fact that a large part of the victims are Russian nationals,” Patrini said.

Deepfake nudes do not fall under specific legal definitions in most countries, making them uncharted territory for law enforcement. Experts said the closest analogues are the crimes known as revenge pornography and the non-consensual sharing of intimate images.

In India, such crimes are usually tried under Section 499 (criminal defamation) and Section 354C (voyeurism) of the Indian Penal Code, and Section 66E (violation of privacy by capturing or publishing images of a person’s private parts) and Section 67A (publishing or transmitting sexually explicit material) of the Information Technology Act, according to a 2018 analysis by Yesha Paul, then at the Centre for Communication Governance (CCG), National Law University (NLU) Delhi.

“It is unlikely to be the case that the punishment using these laws will fit the crime because we are dealing with cyber crime as a service -- particularly cyber crime against women as a service,” said Gunjan Chawla, programme manager, technology and national security at CCG, NLU Delhi.

There are also concerns about jurisdiction and evidence when the illegal material is generated by an algorithm. “The courts might be significantly challenged to analyse the evidence to see whether or not that meets the standard of proof required in criminal trials - to prove guilt beyond reasonable doubt,” she said.

The use of AI to create fake nudes demonstrates the risk of putting out personal photographs, Patrini added. “People must think twice when sharing visual content online, visible to anyone. Even if the content itself is innocuous, unfortunately one day it might be repurposed and utilised maliciously against you,” he said.

ABOUT THE AUTHOR

Binayak reports on information security, privacy and scientific research in health and environment with explanatory pieces. He also edits the news sections of the newspaper.
