Telegram Deepfake Bot Creates Thousands of Non-Consensual Nude Photos

The unpleasant side of online technology gets even worse

A Telegram deepfake bot is causing issues on the service. (Photo: Rafael Henrique/SOPA Images/LightRocket via Getty Images)

How do you make revenge porn worse? Sadly, that’s not a rhetorical question; it illustrates how two nominally separate strains of very online behavior can combine into something worse than either on its own. In this case, the addition of deepfake technology takes an already morally bleak situation and pulls it further down.

A new article at CNET explores the ramifications of a bot operating on the Telegram messaging service. This bot, as writer Joan E. Solsman puts it, “has victimized seemingly hundreds of thousands of women by replacing the clothed parts of their bodies in photos with nudity.” According to the bot’s website, over 100,000 people have used it to create 700,000 images. That figure suggests that over 100,000 people have very disturbing ideas about consent and morality, along with far too much free time on their hands.

As the article notes, Telegram has taken action in the past to curb atrocious behavior — kicking neo-Nazis off the platform, for instance. Both CNET and Sensity, a deepfake research company that has reported on the bot in question, reached out to Telegram; Telegram did not respond to either one.

The whole article is well worth a read, though it also takes some unsettling turns. This includes the bot’s incorporation of gamification, which adds yet another highly unpleasant wrinkle to the whole thing. It’s a harrowing reminder of what technology is capable of in the wrong hands.
