
Will AI increase the number of divorces? And, above all, does it count as cheating if it's with a chatbot?
In 2013, with the film Her, director Spike Jonze imagined that a person could form a romantic bond with a virtual assistant. Today, with the increasingly widespread use of artificial intelligence systems, that hypothesis has become surprisingly concrete: many people confide in AI systems, and some even develop a genuine emotional attachment to chatbots.
These tools help many people feel less lonely, giving them a chance to express emotions without fear of being judged and, in some cases, helping them work through traumatic experiences. There are also systems designed specifically for people who struggle to relate to others, which at the same time aim to encourage a gradual return to offline interactions.
Chatbots and relationship crises
But not all relationships with AI, of whatever kind, are free of consequences. The U.S. edition of Wired, for example, recently reported an increase in cases in which one partner regards the other's relationship with a chatbot as a form of infidelity. The constant availability of these systems and their ability to respond empathetically does not help matters: Wired cites the story of a woman who decided to separate from her partner because both came to see her relationship with a chatbot as a kind of betrayal.
Legislation is slowly adapting to the phenomenon. In some U.S. states, an emotional relationship with a chatbot can be considered grounds for divorce, especially if it leads to a deterioration of the relationship or to problematic behavior. According to some lawyers, legal action may also be possible when someone shares sensitive and private information about their relationship with a chatbot. In some cases, judges could weigh the AI relationship in custody disputes: excessive use of chatbots or prolonged intimate conversations could be interpreted as a sign of neglect of the children.
The legal risks of relationships with AI
Regulations vary widely from one jurisdiction to another. California, for example, is introducing rules that define AI as a third party in relationships, not equating it with a human being but recognizing its potential impact. Ohio, by contrast, is attempting to prohibit any legal consideration of romantic relationships with AI.
Experts predict that, as chatbots continue to develop, both emotional relationships with AI and the conflicts they cause within couples will increase. In the United Kingdom, for example, some platforms that handle online divorces are already reporting a rise in cases in which software such as Replika is cited as a triggering factor.
However, interacting with AI, even intimately, is not necessarily a cause for concern. For many people these tools are a source of support, especially in moments of loneliness. The key, experts explain, is to recognize their limits and not let them completely replace human relationships or create problems in one's own relationships.