Deepfake software specializing in naked photos of women discovered on...

A report released Tuesday by cybersecurity firm Sensity reveals a vast network sharing fake nude images of women via the Telegram messaging app. At the heart of the revelations is an automated “deepfake” tool that is easy to access and intuitive to use.

A photo of a clothed woman found on the web, on social networks or in personal documents is enough. The user sends it via Telegram to a dedicated channel, where the software, an automated “bot”, performs the transformation and returns an image of the victim stripped of her clothes, completely naked. This is a “deepfake”, an image-forgery technique based on artificial intelligence.

The quality of the edited photos varies, but some are strikingly realistic. The service is free, very easy to use and requires no specific computer knowledge. For a fee of just over a dollar, users can remove the remaining watermarks to make the faked image even more believable.

More than 100,000 photos of women already affected

Based in Amsterdam, the start-up Sensity estimates that at least 104,000 photos of women had been manipulated by the end of July 2020, when the study period ended. While 24% of the images involve actresses, singers, models or Instagram “influencers”, two thirds of the victims are women that users know in their real lives, the company estimates. Notably, the program is trained to perform its task only on photos of women and does not work on men.

More than 70% of the users of this technology come from Russia and the countries of the former Soviet Union. The United States and Europe are also affected, in smaller proportions. This difference is likely explained in part by the laxer regulation of the Russian social network VKontakte compared to its American counterparts, Instagram or Twitter: promotional pages for the private Telegram channels offering this service are indeed flourishing on the Russian platform.

>> Watch the Geopolitis report on the evolution of false information:

From Martians to deepfake, the evolution of false information / Geopolitis / 2 min. / January 20, 2019

Humiliation, extortion and... child pornography

The study also highlights the many risks associated with this software. Once the “deepfake” is created, the user is free to do whatever he wants with the faked photo. Posting, or threatening to post, these images online can serve purposes of humiliation as well as extortion.

In its investigation, Sensity also explains that it discovered several cases in which the victims are underage girls. To a certain extent, the software could therefore be used to feed child pornography networks.

In a report prior to this study, the company estimated that 96% of content related to “deepfake” technology is pornographic. In the case of Telegram, the authors of the report are above all alarmed by how easy the software is to access and use, which points to a real democratization of this kind of tool.

“Until recently, using this software required a computer, a graphics processing unit (editor's note: a chip that performs rapid mathematical processing, especially for image processing) and certain computer skills (…). The software present in Telegram is powered by external servers, which considerably lowers the barrier to use compared to this technology's predecessors,” the document reads.

Few tools to combat the phenomenon

Questioned by the Washington Post, the administrator of the “chatbot” on the Telegram pages offering this software considers it only “a harmless form of sexual voyeurism”, while specifying that its operators take “no responsibility for the women targeted by users”.

While the argument is of course inadmissible, experts find themselves powerless to combat the phenomenon. Since the system's source code has already been widely shared, there is no way to prevent similar software from continuing to create, host and share fake nude images on less regulated parts of the web.

The Sensity study also shows that the software spotted on Telegram is already at work on other sites, notably in paid versions.

Failing to contain these fake images, could they at least be identified as such? Around the world, researchers, scientists, professors and artificial intelligence specialists are already trying to find a solution. In Switzerland, EPFL has been working for three years on software that aims to unmask them.

>> Listen again on this subject to the interview in CQFD with Touradj Ebrahimi, professor in the EPFL Multimedia Signal Processing Group:

Software to unmask fake videos / CQFD / 11 min. / September 23, 2019

Tristan Hertig
