Can AI Resurrect the Past and Help Us Connect with the Departed?

AI is bringing back the dead, sparking debate about its impact on grieving users.

On May 13, 2024, visitors to Arthur Rimbaud's home in Charleville-Mézières could interact with a digital version of the late French poet, created by the Alsatian company Jumbo Mana using AI.


AI “resurrections” are an emerging technology allowing people to interact with digital versions of deceased loved ones. These systems work by gathering information about the deceased, such as text messages, emails, or responses to personality questions. This data helps the AI mimic the person's manner of speaking and thinking. For instance, Replika can simulate texting styles based on the information provided, while other projects go further by offering video interactions, letting users see a lifelike representation of the deceased. This technology aims to provide comfort but also raises ethical questions about how we handle memories and grief.
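
To give a rough sense of how such a system might work under the hood, here is a minimal, hypothetical sketch in Python: it condenses a person's saved messages and personality notes into a persona prompt and hands it to a placeholder generate_reply function standing in for whatever language model a given service actually uses. None of the names or details below come from Replika, StoryFile, or any other company mentioned in this article; they are illustrative assumptions only.

```python
# Hypothetical sketch: building a "digital persona" prompt from saved messages.
# The model call is a placeholder; real services use their own proprietary pipelines.

from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    sample_messages: list[str] = field(default_factory=list)   # e.g. texts, emails
    personality_notes: list[str] = field(default_factory=list) # questionnaire answers

    def system_prompt(self) -> str:
        """Condense the collected material into instructions for a language model."""
        samples = "\n".join(f"- {m}" for m in self.sample_messages)
        notes = "\n".join(f"- {n}" for n in self.personality_notes)
        return (
            f"You are speaking in the voice of {self.name}.\n"
            f"Match the tone and phrasing of these sample messages:\n{samples}\n"
            f"Keep these personality notes in mind:\n{notes}"
        )

def generate_reply(system_prompt: str, user_message: str) -> str:
    # Placeholder for an actual LLM call; a real service would send the
    # persona prompt plus the user's message to its chosen model here.
    return "(model response would appear here)"

persona = Persona(
    name="Grandpa Joe",
    sample_messages=["See you Sunday, kiddo.", "Don't forget your coat!"],
    personality_notes=["Warm, brief, fond of old jokes."],
)
print(generate_reply(persona.system_prompt(), "I miss you. How was the fishing trip?"))
```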

For instance, the Los Angeles company StoryFile lets people record videos of themselves sharing their life stories and thoughts before they pass away. At the person's funeral, mourners can ask questions, and the AI plays back relevant answers from the recorded videos.

In June, another company, Eternos, created an AI-powered digital version of a person. The project, launched earlier this year, allowed 83-year-old Michael Bommer to leave behind a digital version of himself that his family can continue to interact with.

Do these projects really help people?

In 2020, a South Korean mother met an AI version of her deceased daughter in virtual reality. The emotional video of their encounter led to a big online debate about whether this technology is helpful or harmful.

Creators of these projects argue that they give people a sense of control and help them cope with deep emotional pain. Jason Rohrer, who started Project December, which uses AI to simulate conversations with the deceased, says many users turn to his service because their grief is so overwhelming that they are willing to try anything.

The project lets people chat with AI versions of famous figures or of personal acquaintances who have passed away. Rohrer explains that these conversations often help users find closure and say things they never got the chance to say.

Robert LoCasio, founder of Eternos, created his company to preserve people's life stories and help their families move on. His colleague Michael Bommer, who passed away in June, created his digital legacy solely for his family to cherish and interact with.

Robert LoCasio recalled that just before Michael Bommer passed away, Bommer emphasized, "Remember, this was for me. I’m not sure if it will be used in the future, but it was important to me."

However, some experts are cautious about AI resurrections. They worry that deeply grieving individuals might not make well-informed choices about using this technology and fear it could have negative psychological effects.

Dr. Alessandra Lemma from the Anna Freud National Centre for Children and Families highlights that mourning is crucial for emotional development. She warns that relying too much on AI recreations could prevent people from accepting the loss and leave them stuck in a state of “limbo.” Some AI services even promote ongoing connections with deceased loved ones as a main feature, which raises additional concerns.

“Welcome to YOV (You, Only Virtual), the AI startup at the forefront of digital communication, helping us stay connected with loved ones forever,” the company’s website read before it was updated.

Jason Rohrer explained that his grief bot has a built-in limit: users pay $10 for a set amount of conversation time. The fee covers time on a supercomputer, and each response carries a different computational cost, so $10 might buy one to two hours of conversation. Users are notified as their time runs out so they can say their final goodbyes.
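
Rohrer does not publish the exact accounting, but a metered-credit scheme like the one he describes could look roughly like the sketch below. The dollar-to-credit conversion, per-token costs, and thresholds are invented for illustration and are not Project December's actual figures.

```python
# Hypothetical sketch of a pay-per-compute conversation budget, loosely modeled
# on the description above. All numbers are made up for illustration.

class ConversationBudget:
    def __init__(self, dollars_paid: float, credits_per_dollar: int = 100):
        self.credits = dollars_paid * credits_per_dollar

    def charge(self, response_tokens: int, cost_per_token: float = 1.0) -> bool:
        """Deduct the compute cost of one response; return False when funds run out."""
        cost = response_tokens * cost_per_token
        if cost > self.credits:
            return False
        self.credits -= cost
        return True

    def low_balance(self, threshold: float = 200.0) -> bool:
        """Signal that it is time for the user to say goodbye."""
        return self.credits < threshold

budget = ConversationBudget(dollars_paid=10)
for turn, tokens in enumerate([400, 450, 300], start=1):
    if not budget.charge(tokens):
        print(f"Turn {turn}: out of credits, conversation ends.")
        break
    if budget.low_balance():
        print(f"Turn {turn}: running low -- time for final goodbyes.")
```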

Other AI chat services also charge for their use.

Dr. Alessandra Lemma, who studies the psychological effects of grief bots, is concerned about their use outside of a therapeutic setting. She believes they could be safely used as a supplement to therapy with a trained professional.

Studies worldwide are exploring how AI might provide mental health counseling, especially through personalized conversational tools.

"Are These Tools Unnatural?"

While they might seem like something from a Black Mirror episode, supporters argue that these digital innovations offer new ways to preserve life stories and address gaps left by fading traditional family storytelling.

Dr. Alessandra Lemma compares it to old practices where parents would leave behind items or books for their children if they knew they were dying. She sees AI as a modern version of this tradition, created by people anticipating their passing.

Robert LoCasio from Eternos shares this view, agreeing that AI can serve as a contemporary means of leaving a legacy.

“The ability to share personal life stories with friends and family is the most natural thing,” he said.

How Safe and Private Are AI Resurrection Services?

Experts worry that these services might not keep personal data secure. Information like text messages could be accessed by third parties. Even if a company promises privacy when you sign up, changes in terms or company ownership can jeopardize this, warns Renee Richardson Gosline from MIT Sloan School of Management.

Both Rohrer and LoCasio claim that privacy is a top priority for their projects. Rohrer can only access conversations if users request customer support, while LoCasio’s Eternos restricts access to the digital legacy to authorized family members.

However, both founders acknowledged that these concerns could become more significant if large tech companies or other for-profit businesses enter the space.

One major worry is that companies might use AI resurrections to target users with personalized marketing. Imagine receiving ads or product suggestions in the voice of a deceased loved one.

“When you use AI this way with vulnerable people, it creates a false endorsement from someone who never consented to it. This raises serious issues about agency and power imbalance,” said Gosline.

What Are the Other Concerns About AI Chatbots?

Gosline points out that since these tools primarily target people dealing with grief, they are inherently risky, especially when major tech companies get involved. “In a tech world that often values speed over safety, we should worry because it's usually the most vulnerable who suffer first,” said Gosline. “And it’s hard to find people more vulnerable than those who are grieving.”

Experts are also concerned about the ethics of creating digital versions of deceased people, especially when the deceased never consented and the data is supplied by the users themselves.

There are also worries about the environmental impact of AI tools and chatbots. Large language models (LLMs) used in these applications require massive data centers that emit a lot of carbon, use a lot of water for cooling, and generate e-waste from frequent hardware updates.

A recent Google report revealed the company is falling short of its net-zero goals due to the high demands of AI on its data centers.

Gosline acknowledges that no program is perfect and understands why people might use AI chatbots to reconnect with lost loved ones. However, she urges leaders and scientists to consider the world they’re creating and ask themselves: “Do we really need this?”
