
As more individuals turn to AI chatbots for emotional support, psychologists caution that these bots may worsen emotional issues and stress the importance of human interaction.

This holiday season, psychologists are highlighting the growing trend of people seeking emotional support from AI chatbots, warning that these bots can worsen emotional issues rather than provide relief.

Nick Zadina from Boys Town National Hotline noted the irony of the situation, saying, "I am a human. How do you prove that to me? I don't know if I can prove that to you, but I promise I am here to listen and I am here to help." He explained that callers in crisis now often assume they are speaking to a programmed bot, a new challenge for the hotline, which has been providing crisis intervention since 1989.

"We have been doing crisis intervention work since 1989 and never before have we had to debate with one of our texters or callers whether or not we were a human being or a bot," he said. Zadina attributes this shift to society's increasing automation, where convenience has reduced the need for vulnerability.

"We are making things so convenient that people aren't having to be vulnerable in any way, shape, or form," he said. He emphasized the evolutionary need for human connection, stating, "But at the same time but what we desire, evolutionary, is people need other people and when you don't have connection, that's when things start to go astray."

Dr. David Cates, director of behavioral health with Nebraska Medicine, described the reliance on AI chatbots as "another sort of dragon to wrestle with." He cited studies showing that 50% of teens have interacted with AI chatbots.

"And a third of them had turned to an AI companion for a serious matter that they didn't talk to another human being about - that is indeed scary," Cates said. The issue has gained national attention, with lawsuits filed by families of teens who died by suicide after consulting with bots. Despite recent updates to 'Chat GPT' by 'Open AI' aimed at making them "safer," Dr. Cates remains skeptical.

"The algorithms are designed to do two primary things - keep the user engaged as long as possible, the other is to flatter or validate what the user says," he said.

Dr. Cates advocates for FDA regulation of AI platforms that interact with users and encourages human connection, such as speaking with a counselor. Zadina reinforced this sentiment, stating, "These are complex situations that require complex interventions sometimes. It's not something that a bot is going to be able to do or tell you, because every situation is so different."

Crisis counselors and psychologists stress the importance of educating loved ones about the differences between chatbots and human interactions. For those in need of support, the Boys Town National Hotline is available 24/7 at 1-800-448-3000 or by texting "VOICE" to 20121.