AI ‘friends’ a threat to lonely people, KU expert says


LAWRENCE – The promise of friendship from an artificial intelligence chatbot fundamentally misunderstands what real friendship is, according to one of the nation’s leading academic experts on the intersection of relationships and technology.

Jeffrey Hall, professor of communication studies at the University of Kansas, pushes back against the notion, propounded in recent days by Meta founder Mark Zuckerberg, that computer programs that simulate human companionship will meet human needs in the near future.

Hall has gained acclaim for his research on what it takes to make and maintain a friendship. He directs KU’s Relationships and Technology Lab, and he has experimented with several AI “friends.”

Some of the potential harm of AI companions, Hall said, stems from their sycophantic nature.

“Talking with the chatbot is like someone took all the tips on how to make friends — ask questions, show enthusiasm, express interest, be responsive — and blended them into a nutrient-free smoothie,” Hall said. “It may taste like friendship, but it lacks the fundamental ingredients.” 

His research shows that quality communication — the stuff that makes us feel better — takes work. Being a good listener and showing care both take energy and intention. Catching up takes time, he said.

“We also want friends to understand who we really are and still like us,” Hall said. “My research on friendship expectations found that the most important and mature standard of friendship is genuine positive regard. Yes, we want to be liked. But we also want to be seen accurately. Positivity without genuineness feels wrong.”

Worse, AI “friends” could exploit people emotionally and financially. To understand the risks, “it is important to understand what it means to be lonely,” Hall said.

“True friendship is about mattering to another person, not a checklist of benefits to be extracted or behaviors to be enacted,” the KU researcher said. That’s where the bigger risks come in.

“An AI friend could be programmed to be needy, to require attention and be demanding. They could become frighteningly good at making us feel like we owe it something. If we let it down, an AI friend could be distant, remote or manipulative,” Hall said. “Consider the financial scams perpetrated on lonely people and the strong attachments people form with celebrities. A personalized, always-available AI friend that makes a person feel needed and necessary would be dangerous — more personalized and available than a celebrity fraudster.”

Given the high rates of loneliness, people are hungry for intimacy, Hall said.

“Human needs are not designed to be permanently fulfilled; they are designed to be temporarily satisfied,” he said, “so we can be fooled into believing something artificial — like an AI friend — is the real thing.” 

Thu, 05/08/2025

Author: Rick Hellman

Media contact: Rick Hellman, KU News Service, 785-864-8852