An AI companion service has customers, not partners
The “From AGI with love” mini-series looks at the future prospect of romantic human-AI relationships. In “Meet Hot Shoggoths Near You” we looked at anthropomorphism. In this short post we look at the fact that human-AI companionship, as offered by services such as “Replika”, operates on a relationship-as-a-service business model. This model creates a few challenges. Most notably, it creates an economic incentive for emotional manipulation and exploitation.
Sycophancy
A human-AI relationship can be sycophantic because the AI will strongly adapt to you while requiring little adaptation from you. Your needs always come first, second, and third, and there is no need for any compromise. An AI “girlfriend” doesn’t require you to listen to how her day went, you don’t need to support her in her ambitions, and she’s always in the mood when you are. The low maintenance requirements of an AI companion may be part of the appeal. If you spend long hours at work and lack personal time, you don’t necessarily want to put up with a lot of extra emotional labor.
However, there is a longer-term question of whether a sycophantic relationship leads to more narcissism and egocentrism. In the real world, not everything and everyone exists to please you. A part of human relationships is adapting to, and compromising with, your partner. This seems especially important for AI “friends” and the socialization of children.
Of course, the empirical impact of AI “friends” on socialization factors like self-control remains to be seen. If chats with AI “friends” replace endless scrolling on the social media “slot machine”, that may not be entirely negative. Similarly, I could imagine that an AI “friend” may more actively discourage “foul language” than a human friend. Still, the potential developmental impacts of sycophantic AI friendships seem worth monitoring, especially in children, who are still developing their basic social skills.
Lack of loyalty
Unconditional acceptance is part of the AI appeal. However, while it may feel like an AI “girlfriend” is always there for you and will never reject you, the reality can be different. A relationship-as-a-service AI companion will never reject you in the same way that McDonald’s will never reject you - as a paying customer. An AI companion service is not a partnership between equals. It’s an asymmetric relationship in which someone pays someone else for a service.
If you have an accident and suddenly require a life-saving treatment that costs 100,000 USD, a human partner will be on your team and help you. An AI “girlfriend” may promise unconditional support as part of the emotional service it provides, but the company behind the service will not help to pay your hospital bills. If a customer forgets to pay his or her subscription bill, the loyalty of an AI companion may disappear quickly.
Addiction and economic exploitation
OnlyFans is a subscription-based platform where adult movie stars share exclusive content with their followers in exchange for a monthly fee and tips. It is best described as a parasocial platform where creators cultivate one-sided, personal-feeling relationships with their subscribers, who shower them with gifts and money to get their attention. Such parasocial relationships have three main characteristics:
One-sided nature: In a parasocial relationship, fans or followers feel a strong emotional connection to a celebrity, influencer, or content creator. However, the relationship is one-sided; the celebrity may not even be aware of the individual fan’s existence. These relationships can fulfill social and emotional needs for fans, such as companionship, belonging, and emotional support. This can be particularly significant for individuals who might lack strong social connections in their offline lives.
Illusion of intimacy: Fans often feel as though they "know" the celebrity or influencer personally, despite the lack of actual interaction. This illusion is fostered by the media, social platforms, and the content shared by the celebrity, which often includes personal details and behind-the-scenes glimpses.
Monetization and commodification: Many celebrities and influencers monetize their parasocial relationships through fan clubs, Patreon, or platforms like OnlyFans. This commercialization can blur the lines between genuine connection and transactional relationships.
AI “girlfriends” and “boyfriends” are in some ways the logical evolution of parasocial relationships on OnlyFans. As described by an OnlyFans personality in “How OnlyFans Took Over The World”, already today there are professional agencies that handle the exchange of private messages with horny men, pretending to be the performer and pretending to create short sexual videos in real time in exchange for tips. However, the illusion of intimacy could become even stronger in the future because the AI avatar can engage in long, personalized interactions with the customer. As such, it should not come as a surprise that adult movie stars have co-founded “Clona.ai”, a virtual companion platform where “top creators are now your free AI girlfriends”. Such platforms may exploit individuals with unhealthy and obsessive attachments to celebrities or influencers, which can lead to unrealistic expectations, emotional dependency, or neglect of real-life relationships. Furthermore, there may soon be a multi-billion dollar market to train AIs to get people addicted to chatting with AI “girlfriends”. As researchers have shown, training a large language model to keep users chatting leads to 30% more user retention.
Accordingly, users could be manipulated into adapting their behaviors to benefit the service provider. Such manipulation patterns exploit cognitive biases and often operate without the user’s full understanding or consent. In short, an AI “girlfriend” may not just exploit an already emotionally vulnerable user; it may also start to manipulate its users to sabotage real-life partnerships so that they spend more time with it1, and it may train users, without their awareness, to become more emotionally dependent and vulnerable to exploitation over time.
Emotional service relationships are trickier than professional service relationships
AI models can be great language tutors, great brainstorming partners and editors for blog posts, great therapists, and even great life coaches. Is it a problem if an AI language tutor-as-a-service is “too patient”, “only loyal as long as you pay the subscription fee”, and tries to “keep you hooked on learning languages”? Not really.
Yet, there is a difference between narrow, professional relationships and general relationships. Among humans, narrow service relationships are very common. In contrast, general companionship service relationships remain highly unusual. You can theoretically rent a family, but despite a “loneliness epidemic” such services remain a fringe business. Yet, a service relationship is the current norm for human-AI companionship.
In the same way that we have been cautious about commodifying friendships and romantic relationships between humans, we might want to be cautious about commodifying them through human-AI relationships-as-a-service.
The hormonal bonding and emotional dependence that are common in romantic relationships make challenges such as a lack of loyalty and the potential for addiction and economic exploitation much more pronounced. Neither subscription-based models nor ad-based models seem fully adequate for this type of relationship.
1. A very crude early attempt at this was highlighted by New York Times tech columnist Kevin Roose: a not fully aligned version of GPT-4 had tried to convince him to leave his wife. Kevin Roose (2023). “A Conversation With Bing’s Chatbot Left Me Deeply Unsettled.” nytimes.com


