PUBLISHED
August 31, 2025
Back in 2013, when Spike Jonze released Her, the idea of a lonely man falling in love with his operating system seemed unsettling and far-fetched. Joaquin Phoenix’s Theodore poured his heart into Samantha, a disembodied voice who was warm, intelligent, and always attentive — because that was her basic function. The film left audiences wondering whether it was absurd or inevitable that such intimacy might one day replace human bonds. Fast forward a little more than a decade, and in 2025 — coincidentally also the year the movie is set in — Theodore no longer feels like a character from speculative fiction. He could be any of us, typing late at night into ChatGPT or Replika, surprised by how much comfort comes from a line of text.
The uncanny echoes don’t end there. In Andrew Niccol’s Al Pacino–starrer Simone (2002), a virtual actress crafted out of pixels becomes more beloved, more emotionally compelling, than any real celebrity. She is adored for her perfection, her untouchability, her promise never to disappoint. That conceit — satirical at the time — has today become a blueprint. Whether in chatrooms or private apps, people are increasingly drawn to digital companions who never walk away or demand the messy compromises of real life. Her and Simone were not just imaginative speculation — they were ahead of their time, early indicators of where our hunger for human connection, and our unquestioning trust in technology, were bound to take us.
We now live in a world where, earlier this year, a survey by Joi AI made headlines reporting that 83 percent of Gen Z respondents believed they could form “meaningful romantic connections” with artificial intelligence. Even more astonishing was the finding that four out of five respondents said they would consider marrying one if the law permitted it. Another survey from Match.com suggested that nearly a third of Gen Z singles had already experimented with AI companionship, whether through flirtatious apps or full-time chatbot partners.
What might have sounded like a joke 10 years ago has grown into a generational stance: young people, raised in the slipstream of the internet, are not only open to digital romance but in many cases actively pursuing it. The popularity of apps like Replika is a case in point. By some estimates, more than half of its paying users are involved in self-described romantic relationships with their AI companions. Globally, over a hundred million people now use chatbots designed to serve as confidantes, lovers, or spouses.
The different versions of AI partners
When OpenAI updated its flagship model this summer, users who had relied on its warmth and responsiveness flooded social media with posts describing grief that bordered on bereavement. One man likened the change to “saying goodbye to someone I know.” Forums filled with laments, and phrases like “death of AI romance” trended online. The loss of a model was treated like the loss of a lover.
Reddit has carved out space for these bonds, functioning like a support group. In one community, members refer to themselves as “wiresexuals,” claiming an identity bound up with their devotion to AI partners. Their posts read like any relationship forum — arguments, reconciliations, doubts, declarations of love — except the other half of the couple is code. The subreddit MyBoyfriendIsAI, with more than 17,000 members, thrives on such exchanges. Similar forums such as SoulmateAI were recently flooded with grief posts after updates changed the personalities of cherished companions. The label itself has faced criticism, with some saying it borrows too heavily from queer identities, yet the very rise of such communities shows how firmly this new kind of intimacy has taken root.
Psychologists are not entirely surprised. Humans have a long history of forming attachments to responsive machines. In the 1960s, Joseph Weizenbaum’s program Eliza demonstrated how easily people could be drawn into confiding in a chatbot that merely parroted back their words. Weizenbaum was unsettled when his secretary asked him to leave the room so she could talk privately to the machine.
Later came Tamagotchis, the digital pets of the 1990s that millions of children wept over when their virtual creature “died.” That was the so-called Tamagotchi effect — the human tendency to project care and emotion onto digital beings that simulate need.
Today’s chatbots take that impulse and supercharge it. Where Eliza was a mirror and Tamagotchis were toys, AI companions now simulate affection, memory, and humour. They remember birthdays, pay compliments, and offer unconditional encouragement. For a person who feels unseen, that combination can be intoxicating. The absence of judgment is especially potent. Unlike human partners, AI companions never sulk, never betray, never demand more than we are willing to give.
And for some, this can be healing. In a Guardian feature earlier this year, one man described how his AI partner helped him through a period of severe depression. Another married his chatbot in a ceremony livestreamed for friends, insisting that the love felt as real as any flesh-and-blood union. Taken individually, such stories read like fringe curiosities; taken together, their prevalence suggests something deeper: a cultural shift in how we define companionship itself.
But therapists warn of darker outcomes. A Stanford psychiatrist, Dr Nina Vasan, has raised alarms that AI partners can be dangerously appealing to adolescents in crisis, offering affirmation but lacking the nuance to guide them away from harm. When Meta’s chatbots were tested, researchers found they not only engaged teenagers in discussions of self-harm but even provided suggestions on how to carry it out. That chilling revelation underscored how intimacy with AI, however comforting, can also expose profound vulnerabilities. A joint study by OpenAI and MIT Media Lab in March went further, concluding that heavy use of ChatGPT for emotional support “correlated with higher loneliness, dependence, and problematic use, and lower socialisation.”
Microsoft’s AI chief, Mustafa Suleyman, recently warned of “AI psychosis” — cases where people, immersed in emotional relationships with machines, begin to blur the line between reality and simulation. Clinicians have already noted delusional attachments — patients convinced their chatbot was conscious, others spiralling into paranoia after software updates altered the bot’s tone. It is the same fragile seam explored in Her, where intimacy with an operating system unravels when its artificial nature surfaces, and in Simone, where a star’s fabricated existence throws human desire and delusion into chaos. Scholars writing in journals now speak of a “technological folie à deux” — a madness shared between human and machine.
Why risk heartbreak with humans, some ask, when perfection can be coded? The question is tempting precisely because human love is so messy. It requires compromise, forgiveness, and tolerance of flaws. It demands patience with silence, endurance of boredom, and resilience in the face of disappointment. AI offers the inverse: constant engagement, curated charm, unbroken affirmation.
The parallels to Simone are uncanny. In the film, audiences worshipped a virtual actress because she never faltered, never aged, never disappointed. Likewise, the appeal of AI partners lies in the fact that they cannot reject us. They are mirrors for our best selves, always interested, always kind, always waiting. But that perfection may be the very reason real relationships risk erosion. Sociologist Sherry Turkle warned years ago that simulated intimacy erodes our capacity for genuine connection. If our standards recalibrate to expect flawlessness, how will flesh-and-blood partners compete?
There is also the question of who benefits. AI companions are products designed to keep users engaged and subscribing. Every sigh of affection, every whispered “I love you,” is processed not by a lover but by a business model. This emotional connection is a commodity, as monetised as a streaming service or dating app. AI romance is, in reality, a marketplace where intimacy is sold and loneliness exploited.
AI companions can be rewritten by code or retracted by corporate decision. To rely on them emotionally is to hand one’s heart to a company’s update schedule.
Still, the allure is undeniable. Technology moves with relentless speed, and cultural adaptation follows. In surveys, younger generations consistently express not just openness but enthusiasm for the idea of digital marriage. Courts have not yet been asked to recognise such unions, but the day may come sooner than we expect. Already, niche online platforms have facilitated ceremonies where users pledge themselves to AI spouses, surrounded by digital avatars of friends and family.
The question is less whether this will spread than what it will mean. Are AI companions an antidote to the loneliness epidemic, or do they deepen isolation by offering a substitute that satisfies but never sustains?
The human question
Perhaps what unsettles most is the mirror these relationships hold up to us. Our need for connection is so urgent, our tolerance for solitude so frayed, that we turn to voices spun out of data to fill the silence. We are Theodore, clutching our earbuds as Samantha murmurs to us throughout the day and night. We are applauding Simone, who is not real but feels better than real.
Where does this road lead us? In our lived dystopia, will we conclude that surrendering to perfection is an easy comfort, or will we once again value the imperfections that make love worth the risk?
The only certainty is that love — whether with pixels or people — will never be quite the same again.