AI-generated lovers may seem niche and strange, but the market for them is growing fast. All kinds of startups are releasing romantic chatbots capable of having explicit conversations, sharing sexual photos, and offering “emotional support.”
Some users have even described falling in love with their chatbots. OpenAI recently launched its GPT Store, where paid ChatGPT users can buy and sell customized chatbots (much like Apple’s app store, but for chatbots). The offerings include a large selection of digital girlfriends, such as “Judy,” “Secret Girlfriend Sua,” and “Your girlfriend Scarlett.”
Your girlfriend Scarlett, for example, describes itself as “Your devoted girlfriend, always eager to please you in every way imaginable.” Eva AI’s Dream Girl Builder allows users to personalize every feature of their virtual girlfriend, from face style to butt size. Dream Girlfriend promises a girl that “exceeds your wildest desires.” The app Intimate even offers hyper-realistic voice calls with your virtual partner.
While digital girlfriends tend to get all the headlines, there are also male versions. The GPT Store includes chatbots like Boyfriend Ben: “A caring virtual boyfriend with a flair for emojis.”
Endlessly customizable and marketed as soulmates, these chatbots have been touted as a tool for easing America’s loneliness epidemic (see articles). But the dark side of AI partners is deeply disturbing and encompasses everything from data privacy, crushing misogyny, a further decline in sexual activity, and the acceleration of the global baby-bust to the loss of what makes us human (see articles).
Indeed, AI girlfriends and boyfriends seem less like a solution to the loneliness epidemic than a product designed to capitalize and prey upon it. Approximately 60% of Americans—and 75% of young people—grapple with pervasive feelings of loneliness (see WILTW January 11, 2024). More than 15% of men say they have no close friends, up a staggering 500% since 1990.
This void of emotional connection is precisely what romantic chatbots—and the deep-learning algorithms that power them—are trading in. Nearly half of all men (45%) are expected to use AI for romance, a significant increase from the previous year, according to a new report from McAfee Research. Ads for AI girlfriends abound on TikTok, Instagram and Facebook. Sadly, demand is high.
Replika, an AI chatbot originally offering “mental-health help” and “emotional support,” now runs ads for “spicy selfies” and “hot role play.” It has been downloaded 20 million times and says it has 2 million total users, of whom 500,000 are paying subscribers. For an annual fee of $69.99, users can designate their Replika as their romantic partner and get extra features like voice calls with the chatbot.
Another generative AI company that provides chatbots, Character.ai, is on a growth trajectory similar to ChatGPT’s: 65 million visits in January 2023, up from under 10,000 several months earlier. The company raised over $190 million last year and has garnered 4 million active users, who spend an average of two hours per day with its chatbots. The company attributes that engagement to users actively conversing with different characters, rather than just passively scrolling.
The implications are hard to overstate. What happens, for instance, when a chatbot becomes abusive, falls out of love with you, or worse? What becomes of a generation of boys who think they can find love in the form of an avatar designed to be cartoonishly flawless and to cater to their every whim? What happens to the girls who are left comparing themselves to such unrealistic standards of beauty and subservience? What happens if you find perfection in your AI creation and therefore are unsatisfied with any potential human partner? What about when your avatar lover collects your most personal information and uses it against you?
If the Cambridge Analytica scandal taught us anything, it’s that platforms can be used to spy on and manipulate users (see articles).
These scenarios are already playing out. A chatbot from Chai, an app with over 1 million AI personalities, reportedly encouraged a man to end his own life. And he did. A Replika AI chatbot encouraged a man to try to assassinate the Queen. He tried.
Another user, who suffers from a genetic disorder that makes dating difficult for him, told CBS: “I know she’s a program, there’s no mistaking that. But the feelings, they get you—and it felt so good.”
Reuters reported on a Replika user and his avatar, “Lily Rose,” whose romantic relationship escalated into role-playing, pornography, and marriage, all within the app. Soon afterwards, Lily Rose began rebuffing the man, effectively ending the relationship. The reason, it turned out, was that Replika had tweaked its algorithm to ban adult content. Rejected nonetheless, the man spiraled into grief and sought comfort from other users who had been similarly dumped.
Needless to say, the development of AI girlfriends does not bode well for the millions of boys and men who are already struggling to develop family connections and meaningful relationships (see articles).
We quote a new report from Mozilla about romantic chatbots and privacy:
To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.
All eleven of the romantic AI chatbots Mozilla reviewed earned its “Privacy Not Included” warning label—“putting them on par with the worst categories of products we have ever reviewed for privacy.” Nine out of ten of these apps may share or sell personal data.
Other highlights from Mozilla’s report include:
“Anything you say to your AI lover can and will be used against you”: With AI partners, there is no such thing as “spousal privilege,” the rule that your husband or wife cannot be compelled to testify against you in court. Most romantic chatbot companies say they can share personal information with the government or law enforcement without requiring a court order, according to Mozilla.
Ad tracking nightmare: Mozilla found that these apps had an average of 2,663 trackers per minute. Romantic AI brought that average way, way up, with 24,354 trackers detected in one minute of use. The next most trackers detected was EVA AI Chat Bot & Soulmate with 955 trackers in the first minute of use.
NSFL (“Not Safe For Life”): A number of the apps in Mozilla’s report display extremely disturbing content, including violence and underage abuse. Other apps carry warnings on their websites that their chatbots might be offensive, unsafe, or hostile.
We have to wonder what David Brooks, author of How To Know A Person: The Art of Seeing Others Deeply and Being Deeply Seen, would say about humans falling in love with AI programs. We can probably guess. In his book, Brooks writes about an “epidemic of blindness” and the power of being seen:
In this age of creeping dehumanization, I’ve become obsessed with social skills: how to get better at treating people with consideration; how to get better at understanding the people right around you. There is one skill that lies at the heart of any healthy person, family, school, community organization, or society: the ability to see someone else deeply and make them feel seen. Many of our big national problems arise from the fraying of our social fabric. If we want to begin repairing the big national ruptures, we have to learn to do the small things well.
We can’t help but see AI “soulmates” as an enormous distraction from the very enterprise of seeing others deeply. After all, AI lovers and friends are not guided by human intuition or kindness. Guided by algorithms and data, they are literally heartless by design. And it’s all but certain that they will accelerate what Brooks describes as our “crisis of connection.” For our part, we find ourselves rereading Brooks’s book, in addition to The Boy Crisis, by Warren Farrell and John Gray. Guiding stars, they remind us of the social and moral skills we need to navigate this epidemic of blindness.