When Devan Leos’ uncle, Darren Harris, a commander and spokesperson for the Santa Clarita sheriff’s department, took his own life in late 2023, it was a huge blow. “It was just so unexpected because there were just no signs,” Leos told Observer. “We were all just unprepared. We wanted answers. We wanted closure. We just wanted to tell him how much we loved him and cared about him.” In the midst of his grief, Leos turned to ElevenLabs, a generative voice artificial intelligence (A.I.) platform, to help him understand what had happened and find solace. He fed the A.I. some of his uncle’s videos from his time at the sheriff’s department and had a brief conversation with the bot.
“I created some messages just so that I could hear his voice and hear him say things that I would otherwise never get to hear him say and hearing my uncle’s voice through A.I. kind of made me feel like he was still alive. And it kind of provided this strange but somewhat comforting connection to this person. It was just so weird, but at the same time, it was mixed for me because it was a bit therapeutic,” Leos told Observer.
“Deadbots” and “griefbots” are trained on the language patterns and digital footprints of those who are no longer with us. Platforms like Seance AI, StoryFile, Replika, HereAfter and others use videos, voice recordings and written content to simulate real people, both living and dead, and allow users to have conversations with those simulations. The practice is no longer uncommon, but it remains uncanny.
For example, movie producers have resurrected Carrie Fisher and James Dean, while other innovators have tried to recreate the mind of Ruth Bader Ginsburg. OpenAI recently had to pull its voice bot, Sky, because it sounded eerily similar to a very much alive Scarlett Johansson. In a recent paper…