
QTCinderella livestreams her reaction to hearing that her likeness had been used in deepfake porn. (Photo via Twitch)

The Rise of Deepfake Porn

AI can turn anyone into a porn star without their knowledge. That reality should frighten us all.

Rapid advances in AI technology mean that computers can now write stories, create art, compose music, and even pass the bar. And while this has led to much media hand-wringing over the end of mankind, there is an even more immediate threat: the rise of deepfake AI, which can project anyone’s likeness into a hyperrealistic video without their knowledge or permission. It can even turn a person into an unwitting porn star.

Just last week, this happened to a 28-year-old woman who appears on Twitch and YouTube under the handle QTCinderella.

In this piece, which originally appeared in Pirate Wires, River Page explains why we all need to wake up to the profound and dangerous implications of this new technology. 


On January 30, QTCinderella went live on Twitch, the social media platform for gamers like her. Every month, hundreds of thousands of people pay to watch QTCinderella play video games, bake cakes, and chat with her fans.

She’s used to being watched by strangers. But this time it was different.

Throughout the nearly four-minute clip, she cried, she choked on her words, she repeatedly said that she knew she shouldn’t be doing this, but that she “want[ed] to show people what pain looks like.” The video was raw.

She had just found out that a male Twitch streamer, known as Atrioc, had purchased AI deepfake porn of two other female streamers, Maya Higa and the massively popular Pokimane. After Atrioc inadvertently shared the name of the porn site with his viewers, QTCinderella discovered that her own likeness was also featured in videos on the site.

“Fuck the fucking internet. Fuck Atrioc for showing it to thousands of people. Fuck the people DMing me pictures of myself from that website. Fuck you all!” She wept, her face red, tears streaming.

“This is what it looks like to feel violated. This is what it looks like to feel taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.” 

There was no catharsis. She ended the stream looking just as wounded as she did at the start. You could tell she hadn’t slept much before the stream, and wouldn’t sleep much after. It was brutal.

The video affected me deeply. I felt bad for her, and assumed everyone else would, too. (Atrioc himself quickly apologized for viewing and purchasing the porn, saying, “I got morbidly curious and I clicked something. It’s gross and I’m sorry.”)

Then I logged on to Twitter and realized that many others didn’t share my point of view.

Someone had posted a screengrab from the same video I’d just seen with the caption: “Millionaire internet streamer’s reaction to AI porn of herself. You won’t find more fragile people than popular internet personalities (especially women).” Bewildered, I quote-tweeted it, saying—hyperbolically—that “if you can’t understand why someone would feel violated and upset by this you should be in jail.” To my surprise, the tweet went viral. The sheer number of responses created what amounted to a virtual focus group on deepfake porn, and the results are worth exploring.
