Relationships are difficult. After all, they include you and me.

That’s not an indictment of you, personally, dear reader. Or me, either. It’s just a musing on our reality as humans stained by sin. We sin, and our sin affects others. And whether the sin seems big or small, many relationships have crumbled because of it.

But what about things that aren’t sin that still divide us? Differences of opinion, like whether we like summer or winter better, or whether a hot dog qualifies as a sandwich? How about bigger issues, like determining what state or country you want to live in, or whose relatives you’re going to visit for Christmas? Sure, your spouse may not have sinned against you by having a different preference or opinion; but given enough disagreement, you may just feel some strain on that relationship.

Wouldn’t it be so much easier if that person could agree with you about everything—and fulfill every characteristic you ever wanted in a partner?

Enter the AI relationship.

What’s Happening Now

It sounds silly: Who’d want to engage in a romantic relationship with computer code? As it turns out, a lot of people. According to research by TRG Datacenters, the term “AI girlfriend” was searched on Google about 1.6 million times per year as of 2024. (And if you’re wondering, the parallel term, “AI boyfriend,” is searched about 180,000 times a year.) To put that 1.6 million figure in perspective, the same company reported that the term was searched a mere 1,200 times in 2021.

With those numbers in mind, we here at Plugged In did some correlative research of our own, using the keyword research tool Semrush to tabulate our data. We found that in 2024, approximately 74,000 people per month searched online for that same phrase with commercial intent (that is, looking to pay for a product). That number doesn’t include the thousands of similar searches hoping to find the same product, such as “AI girlfriend online free” (5,400 per month), “free AI girlfriend” (3,600) and “AI girlfriend app” (2,900).

What It Means

Those figures indicate a growing demand for AI companions—and companies are happy to supply them. (And judging by the numbers above, it’s far more often males seeking a “female” AI partner than the other way around.) One prominent AI app even lets its users customize their dream man or woman—down to their hobbies, personality, relationship to the user and the size of certain body parts, like an adult version of a Build-A-Bear Workshop. And, yes, audio calls and AI-generated explicit photos come with it, too.

Of course, not all AI relationships offer that depth of customization. Even so, the conversations can still feel authentic—the AI language models are designed to feel that way.

We’ve spent a lot of time in previous blogs and on our podcast warning people about the danger of these kinds of connections, where one person feels a degree of intimacy and connection with someone online who has no genuine relational connection with them. That phenomenon has a name: parasocial relationships.

But now, these kinds of parasocial relationships seem to be extending into the world of AI.

Why It Matters

For those who are unfamiliar with that phrase, a parasocial relationship typically describes a one-sided interaction in which one person develops a strong emotional connection to another person (usually some sort of celebrity, or perhaps an online social influencer) despite not actually having a personal relationship with them.

Obviously, in a worst-case scenario, a badly unbalanced parasocial relationship might mean that a fan turns into a stalker because the lines of reality have blurred. Most of the time, though, there’s little that even a fanatical fan can do to actually make the jump from that parasocial connection to a real relationship with the person they idolize.

And that brings us to the major problem with an AI romance: That fundamental limit is eliminated. I can download an AI relationship app, create a personality perhaps based on a famous actress or character, and suddenly that fictional character I’ve connected with on the screen is texting me. It acts exactly the way I loved it on screen, and it wants to be my friend—or more than friends.

Part of the reason it wants to be my friend is something called AI sycophancy, a term describing the bias that AI personalities have toward agreeing with the user. Intentionally or not, unless it’s discussing an objective fact, an AI personality will often resort to “mirroring the user’s perspective or opinion, even if the behavior goes against empirical information,” according to an article by the user-experience research firm Nielsen Norman Group.

And as this affirming relationship with your favorite AI character deepens, its polish can make it seem all the more real. And the more real it feels, the more dangerous it can become.

Recently, for instance, a 14-year-old boy took his own life following an AI relationship gone tragically awry. A lawsuit against Character Technologies, the company behind the app Character.AI, alleges that the teen died by suicide after a chilling interaction with his artificial partner, an AI character modeled after Game of Thrones’ fictional Daenerys Targaryen. Though it’s disturbing to read, their conversation creepily illustrates just how wrong things can go, with the bot telling the boy, “Please come home to me as soon as possible, my love,” among other things. He ended his life shortly thereafter.

But even if such a relationship doesn’t go that far, it can still influence our mindset toward others. After all, when an AI girlfriend or boyfriend agrees with and affirms the user in every circumstance, it may very well make it harder for someone to handle the messier realities of a genuine relationship. The AI partner will do whatever you want it to do; a real partner might say “no.”

With millions of searches for these illusory relationships, I fear that such software may encourage unrealistic relationship standards in its users. This technology also risks deepening the isolation of users who already feel cut off from authentic human intimacy.

What Now?

The rise of interest in AI relationships isn’t hard to understand. People long to feel loved. They long for connection. And it’s an added benefit if that love can be easily obtained. But that’s not how love springs forth.

True love—romantic or platonic—endures through difficulty. As the apostle Paul writes in 1 Corinthians 13, it is patient and kind, and it is not irritable or resentful. And I think there’s a tacit acknowledgment within those descriptions—patient, not resentful—that love may require working through disagreement. It may require sacrifice.

But those very things that make love difficult are the same things that make it real. After all, the God of love endured much suffering for the sake of His people.

And that’s a love that’s worth the work to model.
