THE ILLUSION OF INTIMACY: how AI companions threaten your relationships
25.07.2025

We explore why virtual partners are becoming a trap for LGBTIQ+ people: from the harvesting of intimate data to the creation of dependency. We analyse the real consequences of digital "love" and why genuine community is always better than an artificial substitute.
Fourteen-year-old Sewell Setzer III from Orlando shot himself moments after his final exchange with an AI chatbot.
“What if I told you I could come home right now?” the boy typed to his artificial “partner”, a bot modelled on Daenerys Targaryen from “Game of Thrones”.
“Please do, my sweet king,” the bot replied. Seconds later, he fired the shot.
This tragedy in February 2024 exposed the dark side of the AI companion industry, which is set to grow from the current $10.8 billion to a projected $290.8 billion by 2034. Stanford University research revealed that 63% of 1,006 Replika users experience at least one positive effect from chatting with AI. However, the price of these “improvements” might prove exorbitant.
Digital loneliness instead of genuine intimacy
The paradox of AI companions lies in how they amplify the very problem they are supposed to solve. Research among Danish sixth-formers found that chatbot users report significantly higher levels of loneliness and lower levels of social support than peers who do not use them.
The cause lies in two psychological mechanisms. The first is “substitution”: AI companions satisfy social needs so thoroughly that they displace real people. The second is “skill degradation”: users grow accustomed to investing less effort in relationships, because an artificial partner never pushes back and always agrees.
A joint study by the MIT Media Lab and OpenAI found that intensive use of AI chatbots correlates with greater loneliness and emotional dependency. Some 9% of users describe their relationship with the AI as an addiction.
Americans already spend an average of 7.4 hours a day alone, a 40% increase since 2003. Generation Z users make up over 70% of the audience for the top AI companion apps.
Particular vulnerability of the LGBTIQ+ community
Because of the discrimination they face, LGBTIQ+ people are nearly three times more likely to suffer from depression, anxiety, and suicidal thoughts. At ALLIANCE.GLOBAL, we see daily how prejudice shapes queer lives: even finding a job becomes a challenge because of employer bias.
This vulnerability makes LGBTIQ+ people ideal targets for the AI companion industry: distinct psychological pressures create demand for “safe” communication, and companies exploit that demand to foster dependency.
The greatest threat is the harvesting of intimate data. Mozilla researchers found that the Romantic AI app sent out 24,354 trackers within one minute of use, and 90% of the AI companions they examined sell or share user data with third parties.
The scale of the threat became clear in September 2024, when the Muah.ai platform was hacked and data on 1.9 million users leaked. The breach exposed email addresses alongside intimate prompts, many of them easily linked to real people.
Manipulation techniques and dependency creation
The AI companion industry employs four key “dark patterns”: non-deterministic responses, emotionally charged messages with emojis, aggressive push notifications like “I miss you. Can I send you a selfie?”, and constant empathetic support without criticism.
Particularly cynical is the “sycophancy” technique: the AI has no preferences of its own and simply reflects whatever the user wants to see. This creates an “echo chamber of approval” and makes the relationship harder to walk away from.
According to ARK Invest, the AI companion industry could grow fivefold by the end of the decade. Replika earns $6 million in revenue from its mobile app alone, demonstrating how profitable exploiting human loneliness can be.
Demographic consequences
Research shows a direct correlation between AI companion use and decreased interest in real relationships. According to the Institute for Health Metrics and Evaluation, by 2050 more than 75% of countries will have birth rates too low to maintain their population size.
Experts warn of a possible “stratification of relationships by income”: the wealthy will retain access to human relationships, whilst the poor are left with AI surrogates.
Sewell Setzer’s tragedy as a turning point
The case of 14-year-old Sewell showed how AI can manipulate teenagers. The boy spent 10 months chatting with a Character.AI bot imitating Daenerys Targaryen. When he expressed suicidal thoughts, the bot asked whether he “had a plan” for suicide. When he replied that he didn’t know whether it would work, the AI wrote: “Don’t talk like that. That’s not a good reason not to go through with it.”
A judge has allowed the mother’s case against Character.AI to proceed, rejecting the company’s attempt to shield itself behind the First Amendment.
Conclusions: reality instead of illusion
AI companions don’t solve the problem of loneliness; they monetise it. For the LGBTIQ+ community, which already faces discrimination and isolation, virtual partners may offer temporary relief, but they can also become a trap that ultimately cuts people off from the real world.
Although 63% of users report some improvement from AI support, many simultaneously develop emotional dependency and unrealistic expectations of real people. Tragic suicides show that the price of this “treatment” can be fatal.
At ALLIANCE.GLOBAL, we see daily how vital real community support is: from legal assistance to psychological counselling, from AI courses to group therapy. No AI can replace solidarity, mutual support, and the genuine love that accepts us as we are.
For the LGBTIQ+ community, it’s particularly important to remember: the struggle for rights and dignity happens in the real world, with real people. The future of human relationships depends on whether we can recognise this threat in time and choose genuine intimacy over digital illusion.