Please stop asking chatbots for love advice

    As he sat facing me, my patient had a sad expression on his face.

    “I had a date,” he announced. “It didn’t go well.”

    That was not unusual for this patient. For years he had shared stories of romantic hopes dashed to the ground. But before I could ask him what went wrong, he continued, “So I asked a chatbot what to do.”

    Uh. What? Human conversation simulations powered by artificial intelligence – chatbots – have been in the news a lot, but I had never had a patient tell me they had actually used one for advice.

    “What did it tell you?” I asked, curious.

    “To tell her I care about her values.”

    “Oh. Did it work?”

    “Two guesses,” he sighed, raising his hands.

    Although this patient was the first, it is now a regular occurrence in my therapy practice to hear new patients tell me they consulted chatbots before consulting me. Usually it’s for love and relationship advice, but it could also be about connecting with their kids, setting boundaries, or straightening out a friendship gone wrong. The results are decidedly mixed.

    One new patient asked a chatbot how to deal with the death of a loved one. Set aside time in your day to remember what was special about the person, the bot advised. I couldn’t have said it better myself.

    “What it wrote made me cry,” said the patient. “I realized I was avoiding my sadness. So I made this appointment.”

    Another patient started relying on AI when her friends’ patience started wearing thin. “I can’t burn out my chatbot,” she told me.

    As a therapist, I am both alarmed and intrigued by AI’s potential to enter the therapy world. There is no doubt that AI is the future. It has already proven to be useful in everything from writing cover letters and speeches to planning trips and weddings. So why not let it help with our relationships too? A new venture called Replika, the “AI Companion Who Cares,” has gone one step further and even created romantic avatars for people to fall in love with. On other sites, like Character.ai, you can chat and hang out with your favorite fictional characters, or build a bot to talk to yourself.

    But we live in an age of disinformation. We have already seen disturbing examples of how algorithms spread lies and conspiracy theories among unwitting or ill-intentioned people. What will happen if we allow them into our emotional lives?

    “While AI can articulate things like a human, you have to ask yourself what its purpose is,” said Naama Hoffman, an assistant professor in the department of psychiatry at the Icahn School of Medicine, Mount Sinai Hospital, in New York City. “The goal in relationships or in therapy is to improve quality of life, while the goal of AI is to find what is cited most. It’s not meant to help, per se.”

    As a therapist, I know that my work can benefit from outside support. I’ve been leading trauma groups for 20 years, and I’ve seen how the scaffolding of a psychoeducational framework, especially an evidence-based one like Seeking Safety, facilitates deeper emotional work. After all, the original chatbot, Eliza, was designed as a “virtual therapist” because it asked endless open-ended questions – and you can still use it. Chatbots can inspire people, or even help them break through their defenses and get into therapy. But at what point do people become too dependent on machines?