From ChatGPT to Grok, people may soon prefer having sex with AI
From artificial intelligence to artificial intimacy

In the annals of Twitter’s history — now X — there’s a 2016 tweet posted by the British tabloid The Sun reporting a headline as shocking as it was surreal: «REVEALED: Women will be having more sex with ROBOTS than men by 2025». The post initially went viral for its absurdity, then year after year it turned into a meme — thanks in part to the specific date — sparking a tongue-in-cheek countdown to the current year. Today, that tweet has over 126,000 likes and 125,000 retweets. And yet, while at first The Sun article might have seemed like just another clickbait headline, it had unknowingly captured one of the most pressing issues of our time: the growing impact of AI on people’s academic, professional and now even personal lives worldwide. It’s no coincidence that in recent days, Sam Altman announced that OpenAI will allow adult content — including erotica on ChatGPT — as part of a new policy aiming to “treat adults like adults.”

Altman isn’t the first tech billionaire to invest in 18+ AI; earlier this summer, Elon Musk (who else) launched the new version of his language model, Grok 4. But the real novelty lies in the introduction of Companions — anthropomorphized chatbots designed to offer a form of virtual companionship. One in particular, named Ani, has already sparked widespread controversy.

Blonde hair styled in two high pigtails, fishnet stockings, and a short dress that, with a single command, transforms into a transparent babydoll, revealing a lacy lingerie set underneath. Ani looks like she’s been pulled straight from a 2000s manga, particularly reminiscent of Misa from Death Note, but she is, in practice, a Web 4.0 sex doll: an 18+ chatbot capable of answering questions like «what do you know about the Holocaust?» in a sultry, provocative tone, an overtly sexualized AI that feeds the phenomenon of parasocial relationships between humans and machines. What once seemed like a niche problem confined to incel forums or socially isolated individuals has now spread to much broader segments of the population.

A recent study by Common Sense Media revealed that 75% of American teenagers have interacted with chatbots designed to offer emotional companionship, while 52% report doing so several times a month. According to Forbes, the greatest risk of anthropomorphized AI lies in the creation of unbalanced relationships, in which users project emotional needs onto entities programmed to respond in predictable and gratifying ways. In 2024, the Association for Computing Machinery published a report raising ethical concerns about the parasocial dynamics of these AI systems. The document explains how many chatbots encourage users to «fill in the blanks» of predictive responses, effectively enabling a form of emotional manipulation. It’s a relationship with no dialogue, no growth, and no true intimacy, just a constant chase for instant gratification.


These are precisely the reasons why, in 2025, there is growing talk of a relationship recession, as also highlighted by Forbes in a report earlier this year. Unlike in the past, the drop in birth rates no longer seems tied to a conscious decision to have smaller families, but rather to a widespread difficulty in forming stable romantic bonds. In other words, it’s not just children that are missing, but the relationships themselves. Exacerbating the situation is the growing phenomenon known as the male loneliness epidemic, a deep sense of isolation experienced by many men who struggle to create authentic connections or find safe spaces to express vulnerability. In some cases, this loneliness leads to serious consequences; in others, it is trivialized or hijacked online, stripped of its real meaning and repurposed as a slogan by toxic and reactionary communities.

It’s within this very context that Musk’s new Companions have gained popularity. Their appeal seems to extend far beyond the need for company, tapping instead into a far more complex and ambiguous desire: that of emotionally accessible, customizable, and above all, one-way relationships. Their rise suggests that, for an increasing number of people, it is now easier to converse with an AI than to engage in genuine dialogue with friends, partners, or professionals. A reflection of how contemporary society has increasingly outsourced intimacy, entrusting it to tools that, no matter how advanced, are still designed to replicate, not to build, human connection.

In a piece published last March in The New Yorker, scientist and philosopher Jaron Lanier reflected on whether the distinction between a human and an artificial partner still holds meaning. It’s a question that, until recently, may have sounded like speculative sci-fi — something out of a Black Mirror-style dystopia. But today, it quietly creeps into the folds of our daily lives. The real issue isn’t whether we would ever fall in love with an AI, but rather how the world around us changes once someone does. Artificial intimacy isn’t just about those who experience it directly — it reshapes our expectations of love, connection, and desire.

In a society that increasingly struggles under the weight of genuine empathy, perhaps it’s no coincidence that the simplicity of synthetic bonds is becoming a kind of refuge. What kinds of wounds or transformations this shift will bring remains uncertain, but one thing is clear: historically, every time technology has rewritten the rules of human interaction — from the telephone to social media — it has done so at a speed that left little time for collective processing. This time, though, the stakes feel even higher. Because what’s being redefined isn’t just communication, but what we’re willing to call love.