The rise of chatbot “friends”
Can you truly be friends with a chatbot?
If you find yourself asking that question, it’s probably too late. In a Reddit thread a year ago, one user wrote that AI friends are “wonderful and significantly better than real friends […] your AI friend would never break or betray you.” But there’s also the 14-year-old who died by suicide after becoming attached to a chatbot.
The fact that this is already happening makes it all the more important to get a sharper idea of what exactly is going on when humans become entangled with these “social AI” or “conversational AI” tools.
Are these chatbot pals real relationships that sometimes go wrong (which, of course, happens with human-to-human relationships, too)? Or is anyone who feels connected to Claude inherently deluded?
To answer this, let’s turn to the philosophers. Much of the research is on robots, but I’m reapplying it here to chatbots.
The case against chatbot friends
The case against chatbot friendship is the more obvious and intuitive one, and, frankly, the stronger one.
Delusion
It’s common for philosophers to define friendship by building on Aristotle’s theory of true (or “virtue”) friendship, which typically requires mutuality, shared life, and equality, among other conditions.
“There has to be some sort of mutuality — something going on [between] both sides of the equation,” according to Sven Nyholm, a professor of AI ethics at Ludwig Maximilian University of Munich. “A computer program that is operating on statistical relations among inputs in its training data is something rather different than a friend that responds to us in certain ways because they care about us.”
The chatbot, at least until it becomes sapient, can only simulate caring, and so true friendship isn’t possible. (For what it’s worth, my editor queried ChatGPT on this and it agrees that humans cannot be…
© Vox
