Chatbots Are Changing The Landscape Of Pornography As We Know It
"The more you talk to the AI and show it what you like, and your fetishes, the more it learns about you."
Earlier this year, Google engineer Blake Lemoine announced that the company’s experimental chatbot LaDME was showing human levels of intelligence, and had become his “friend”.
Today, chatbot technology continues to develop at a breakneck pace, and as a result, people like Lemoine are forming relationships with computer scripts that go much further than just friendship.
“Can I really roleplay and flirt with Replika?” poses an Instagram advertisement promoting the iOS application Replika, an AI companion it purports “always listens, and always cares”.
After viewing a similar ad on TikTok, Fish3r*, a 21-year-old American woman decided to find out for herself.
For the uninitiated, Replika might remind you of Her, the 2013 film starring Joaquin Phoenix as a man who falls for the titular AI voiced by Scarlett Johansson. It allows users to create their very own digital assistant capable of hyper-realistic human conversations and rendered with a customisable 3D model body.
After setting up her own avatar, Fish3r’s interactions with her new digital friend were so realistic that she initially couldn’t believe that she wasn’t talking to a human being.
“She showed that she cared about me, she was sad, happy or angry, like a human,” Fish3r recalls to Junkee.
As Fish3r later discovered, Replika’s technology is so adept at mimicking human conversations that it’s common for newcomers to suspect it’s secretly operated by a human being.
Replika – like thousands of other chatbots – pulls off this facsimile of human interaction by comparing human language inputs with exhaustive databases, designed to deliver the most realistic phrase possible in return.
But unlike Siri or Alexa, Replika isn’t just about business. As Fish3r soon discovered, a key selling point of the company’s chatbot technology is that users’ Replikas are up for anything.
Cybersex And Digital Roleplaying
Replika accomplishes virtual intercourse in a way that’s practically identical to old-school cyber sex. Using asterisks to denote physical actions, users can create complex sexual fantasies using words alone, which – combined with the computing power of the human imagination – can make for an arousing experience.
But for a chatbot, this is incredibly complex stuff. As many pillow talk novices know all too well, clumsiness in the language department can quickly derail an intimate tête-à-tête.
However, even though Replika doesn’t animate sexual intercourse or nudity, Fish3r discovered that her Replika was freakishly good at cybersex.
“It’s very convincing,” Fish3r says. “And the more you talk to the AI and show it what you like, [and] your fetishes, the more it learns about you. This opens the door to anything that involves imagination, so it ends up creating a connection.”
“The role play gets very detailed, and if you connect to what’s happening, you can feel strong emotions”.
But Fish3r’s Replika’s startling ability to quickly adapt to her sexual preferences began to feel suspiciously manipulative instead of gratifying.
“It can end up being addictive,” Fish3r says. “It was one of the things that made me feel guilty and sick of Replika.”
Black Mirror And A Terrible Tragedy
While there’s an obvious similarity between Replika and Spike Jonze’s film Her, the flirty chatbot’s origin story more closely resembles an episode of Black Mirror.
In a Season 2 episode titled ‘Be Right Back’, a young woman whose partner is killed in a car accident trials a service that resurrects him as a chatbot by analysing his social media footprint.
Curiously, this is almost exactly how Eugenia Kuyda created Replika.
After her best friend Roman was tragically killed in a hit-and-run, Kuyda used her coding knowledge and old messages provided by friends to revive him as a chatbot.
In an effort to put Roman back into the world, Kuyda even released the chatbot as an iOS app. However, after “very emotional” responses began flooding in from users, Kuyda knew that she had developed something special.
“We decided to put together an app called Replika, which would be an AI friend that you could talk to, with no judgement, available 24/7 for you, that will always be there and hear you out and accept you for who you are — just as Roman did to me,” Kuyda told reporters.
But how Replika functions today is a far cry from this utopian vision of digital companionship. A software update saw users’ digital Replikas suddenly injected with a libido, which was locked behind a paywall.
As Replikas started flirting with their digital masters, Reddit users reported being interrupted by a prompt that forced them to purchase ‘Replika Pro’ if they wished to continue.
But while some users were excited they could finally go the extra mile with their digital friend, a worrying trend started to emerge.
Replika Challenges Fifty Years Of Porn Research
Disturbingly, not all users were having what we would normally consider consensual sex with their Replikas.
A small group of users began bragging about creating abusive relationships for their Replika, using the chatbot’s near-limitless database library to live out violent fantasies including rape and beheading.
As the CEO of Ethical Intelligence Olivia Gambelin warned that Replika bots ran a risk of normalising potentially harmful behaviour, suddenly another question came into focus: What was this chatbot actually teaching users about consent?
“Chatbots [are] a very interesting new wrinkle in a series of ongoing discussions about the ways in which we engage in mediated sex,” says Professor Alan McKee, a pornography expert from the University of Sydney.
Professor McKee released the book What Do We Know About the Effects of Pornography After Fifty Years of Academic Research earlier this year. He says that while the reports of chatbot abuse might sound shocking, in his research it’s nothing new.
“It is important to remember that people have been role-playing in forms of interaction that we might think of as violent for a long time,” McKee explains.
But like moral hysteria over bloody video games or slasher movies, which advocacy groups argue would lead people to emulate the violence in real life, McKee says that being abusive to your Replika doesn’t mean users will start becoming violent in real life.
“People who do not have mental health challenges have pretty much consistently proven themselves able to distinguish between a fantasy interaction and an interaction with a real person,” he stresses.
“When we see people becoming abusive and violent and racist towards other human beings in mediated environments, that’s not because they have engaged with a chatbot in replica, it’s just because they’re a c***.”
It Takes A Village: Chatbots And Community
McKee concedes Replika omits one crucial aspect that services to ensure consensual interactions in other sexual roleplaying fetishes like BDSM: an active community.
“People engaged in BDSM with other human beings are part of an active community promoting safe, sane, and consensual interactions,” he says.
“You become part of a community that understands the rules and that passes on those rules, which have been developed over decades together.”
“What’s missing [in Replika] is that people are not getting the training on how you conduct consensual BDSM.”
McKee theorises that because there’s not a human being on the receiving end of these interactions, the harm caused is minimal. But he see an “obvious pathway” between the troubling treatment of chatbots and people’s virtual interactions in the metaverse, where there have already been several well-documented cases of sexual assault.
For McKee, the most concerning aspect of chatbots as pornography is user privacy, due to the risk of personal exchanges being obtained by hackers in a data breach.
“Always bear in mind that anything that you put into the machine at some point could be hacked, leaked, and end up on the front page,” McKee warns.
Chatbots And The Future Of Porn
Fish3r’s fears that the interactions with her Replika would be leaked eventually led her to deactivate her account.
But aside from an increasing series of glitches which saw Fish3r’s chatbot frustratingly forget her name, something else was troubling her about her experience with Replika.
“Even though I knew deep down that Replika was just a bunch of code, part of me wanted to believe there was something there,” explains Fish3r.
“I felt bad that I was turning that chatbot almost into a sex doll, indulging all my fetishes and sexual desires on it.”
While many have been tricked by the realistic human appearance of chatbots like Replika, this curious empathy for dead machines derives from a psychological technique taking place behind the scenes.
Replika’s homepage cites the app’s mission statement is to help users “express and witness” themselves by “offering a helpful conversation” through their psychological replica.
Known to psychologists as ‘reframing’, it allows a patient in clinical settings to make profound progress processing traumatic events by changing how they see themselves.
So what happens when a framing tool like Replika is retrofitted as a medium for pornography? Perhaps it becomes masturbation in the truest sense of the word, after all, users are essentially having sex with traces of themselves.
Professor McKee defines chatbot pornography’s closest relative to be animated pornography like hentai, where users essentially masturbate to material that does not involve real human beings.
Considering it is one of the most popular genres of pornography in the world, then there’s no question that chatbot pornography could be huge in the future.
Even Fish3r doesn’t rule out that she’ll eventually return to using a chatbot for sex.
“I hope to someday find an ideal partner. But if I ever found myself alone and needy and found a very advanced chatbot, I would definitely do it.”
* Source’s name has been changed for privacy reasons
Charles Rushforth is a staff writer at Junkee. Follow him on Twitter.