
From voice clones to digital avatars, AI offers new ways to digitally preserve loved ones, raising concerns about data, consent and how this technology may change the way we grieve.
Diego Felix Dos Santos never expected to hear his late father's voice again, until he made it possible. "The tone of the voice is quite perfect," he says. "It feels like, almost, he's here."
After his father died unexpectedly last year, the 39-year-old traveled to his native Brazil to be with family. It was only after returning home to Edinburgh, Scotland, he says, that he realized he had "nothing to remind [me of] my dad." What he did have was a voice note his father had sent him from his hospital bed.
In July, Dos Santos took that voice note and, with the help of ElevenLabs, an artificial intelligence voice generator platform founded in 2022, paid a $22 monthly fee to upload the audio and create new messages in his father's voice, simulating conversations they never got to have.
"Hello son, how are you?" his father's voice rings out from the app, as it would on their usual weekly calls. "Kisses. I love you, Mandón," the voice adds, using the nickname his father gave him when he was a child. Although Dos Santos's religious family initially had reservations about him using AI to communicate with his father beyond the grave, he says they have since come around to his choice. Now, he and his wife, who was diagnosed with cancer in 2013, are also considering creating AI clones of themselves.
Dos Santos's experience reflects a growing trend of people using AI not just to create digital likenesses, but to simulate the dead. As these technologies become more personal and widespread, experts are warning of their ethical and emotional risks, from consent and data protection issues to the commercial incentives driving their development.
The market for AI technologies designed to help people process loss, known as "grief tech," has grown rapidly in recent years. Pioneered by American startups such as StoryFile (an AI-powered video platform that allows people to be recorded for posthumous playback) and HereAfter AI (a voice-based app that creates interactive avatars of deceased loved ones), the technology is marketed as a means to cope with, and perhaps even forestall, grief.
Robert LoCascio founded Eternos, a Palo Alto-based startup that helps people create an AI digital twin, in 2024 after losing his father. Since then, more than 400 people have used the platform to create interactive AI avatars, says LoCascio, with subscriptions starting at $25 for a legacy account that allows a person's story to remain accessible to their loved ones after their death.
Michael Bommer, an engineer and former colleague of LoCascio's, was one of the first to use Eternos to create a digital replica of himself after learning of his terminal cancer diagnosis. LoCascio says Bommer, who died last year, found closure in leaving a piece of himself behind for his family. His family has found closure in it, too. "It captures his essence well," his wife Anett Bommer, who lives in Berlin, Germany, told Reuters in an email. "I feel close to him in my life through the AI because it was his last heartfelt project, and it has now become part of my life."
The goal of this technology is not to create digital ghosts, says Alex Quinn, CEO of Authentic Interactions Inc., the Los Angeles-based parent company of StoryFile. Rather, it is to preserve people's memories while they are still around to share them. "These stories would cease to exist without some kind of intervention," says Quinn, noting that while the limitations of AI clones are obvious (the avatar won't know the weather outside or who the current president is), the results are still worthwhile. "I don't think anyone wants to see someone's story and someone's memory go away completely."
One of the biggest concerns surrounding grief tech is consent: what does it mean to recreate someone who ultimately has no control over how their likeness is used after they die? While some companies, such as ElevenLabs, allow people to create digital likenesses of their loved ones posthumously, others are more restrictive. LoCascio, for example, says Eternos's policy bars creating avatars of people who cannot give their consent, and that the company runs checks to enforce it, including requiring those who make accounts to record their voice twice. "We will not cross that line," he says. "I think, ethically, it doesn't work."
ElevenLabs did not respond to a request for comment.
In 2024, AI ethicists at the University of Cambridge published a study calling for safety protocols to address the social and psychological risks posed by the "digital afterlife industry." Katarzyna Nowaczyk-Basińska, a researcher at the Leverhulme Centre for the Future of Intelligence at Cambridge and co-author of the study, says commercial incentives often drive the development of these technologies, making transparency around data privacy essential.
"We have no idea how these data [of the deceased person] will be used in two or 10 years, or how this technology will evolve," says Nowaczyk-Basińska. One solution, she suggests, is to treat consent as an ongoing process, revisited as AI's capabilities change.
But beyond concerns about privacy and data exploitation, some experts also worry about the emotional cost of this technology. Could it inhibit the way people deal with grief?
Cody Delistraty, author of "The Grief Cure," warns against the idea that AI can offer a shortcut through grief. "Grief is individualized," he says, noting that people cannot run it through the sieve of a digital avatar or AI chatbot and expect "to get something really positive out of it."
Anett Bommer says she did not turn to her husband's avatar during the early stages of her own grieving process, but does not believe it would have affected her negatively if she had. "My relationship with the loss hasn't changed at all," she says, adding that the avatar "is just another tool that I can use alongside photos, drawings, letters, notes" to remember him.
Andy Langford, clinical director of Cruse, a UK-based bereavement charity, says that although it is too early to draw firm conclusions about AI's effects on grief, it is important that those who use the technology to cope with loss do not get "stuck" in their grief. "We need to do a little of both: the grieving and the living," he says.
For Dos Santos, turning to AI in his time of grief was not about finding closure; it was about seeking connection. "There are some specific moments in life … when I would normally call him for advice," says Dos Santos. While he knows AI cannot bring his father back, it offers a way to recreate the "magical moments" he can no longer share.