Judges, Palmer says, already “struggle with what to do about affairs with humans,” and AI companions will only complicate that as judges weigh the broader impact on the relationship. Children raise the stakes further. When it comes to custody battles, “it is conceivable and likely that they would question the parents’ judgment because they’re having intimate discussions with a chatbot,” which “brings into question how they are spending time with their child.”
Although the sophisticated chatbots we use today have been around for just a few years, Yang says the tech will only play a bigger role in marriages and divorces. “As it continues improving, becoming more realistic, compassionate, and empathetic, more and more people in unhappy marriages who are lonely are going to seek love with a bot.”
Yang has not had clients raise the issue yet, but she anticipates a boom in divorces in the coming years as more people turn to AI for companionship. “We’ll probably see an increased rate of divorce filings. When Covid happened a few years ago, the increase in divorces was very significant. We probably saw three times the number of divorces that were filed around 2020 to 2022. After 2022, once things got back to normal, divorce rates were back down. But it will probably go back up.”
It’s already happening in some places. In the UK, a partner’s use of chatbot apps has become a more common factor in divorces, according to the online divorce service Divorce-Online. The platform says it has seen an increase this year in divorce applications in which clients cited apps like Replika and Anima as creating “emotional or romantic attachment.”
Despite the ruptures AI is causing, Palmer says she still believes AI relationships can be positive. “Some people are finding real fulfillment.” But she warns that “people need to recognize the limitations.” In October, California became the first state to pass a law regulating companion chatbots. The law, which goes into effect in January 2026, requires apps to include safeguards such as age verification and break reminders for minors, and makes it illegal for chatbots to act as health care professionals. Companies that profit from illegal deepfakes also face fines of up to $250,000 per incident.
In some ways, Palmer has seen all of this before, only with social media instead of AI. “It could be that a partner connected with someone they haven’t seen in years. Or that there is just a true need to have communication. It is a rare case anymore where social media is not involved.” AI, she says, is the natural evolution of that. “And what I am finding is, AI is turning into exactly that.”