AI Resurrects Your Dead Loved One: A Miracle? Or Emotional Whiplash?
Imagine you open your computer, strike a few keys, and there's your spouse…who died three years ago, like my husband did. You aren't looking at a photo. Or a video. No. This is more like Zoom or FaceTime.
The image you see looks, acts, and moves just like your spouse. Sounds like your spouse, too. You stare in disbelief as the image smiles and says, “Honey, I’ve missed you.”
What’s your gut reaction? An AI miracle? Or emotional whiplash?
This past August, I received an email from a man describing a new AI platform he had created. The platform gathers photos and voice recordings of someone who died and blends them with memories shared by the deceased person's family.
He wanted to demonstrate his AI tool here, on this podcast. In his own words, “I’d be happy to demonstrate this live on your podcast, or even help you reconnect with someone from your own circle…”
So if I were to use his platform, I could strike a few keys on my computer and my husband would appear on the screen…as though he were alive. Understand me clearly: I love fairy tales, but I don't live in one. The creator claims that what his platform can provide is comforting. I find it cruel.
In an effort to see this issue from another angle, I sought the opinions of friends, family, members of my dementia support group, members of my bereavement group, and health professionals, including nurses and a grief therapist. My opinion has not changed.
Are you caring for a spouse with dementia? Have you written a book about dementia? Please let me know. I'd love to speak with you. Send an email to: zita@myspousehasdementia.com
00:00
Introduction
Imagine you open your computer, strike a few keys, and there's your spouse who died three years ago, like my husband did. You aren't looking at a photo or a video. No, this is more like Zoom or FaceTime.
The image you see looks, acts, and moves just like your spouse. Sounds like your spouse too, and you stare in disbelief as the image smiles and says, "Honey, I've missed you." What's your gut reaction?
An AI miracle or emotional whiplash? Stay tuned, please, because I've got a lot to say about this.
You're listening to My Spouse Has Dementia, a podcast that uses personal stories, occasional interviews, and simple rituals to help caregiving spouses survive, because about 40% of us die first.
I'm Zita Christian, writer, life cycle celebrant, widow. My husband died in the summer of 2022. He had Alzheimer's.
This past August, I received an email from a man who described a new AI platform he had created.
He described the platform's ability to gather photos and voice recordings of someone who died, and then blend them with memories shared by the deceased person's family.
He wanted to demonstrate his AI tool here, I mean, like right here on this podcast. In his own words, quote, I'd be happy to demonstrate this live on your podcast, or even help you reconnect with someone from your own circle, end quote.
So if I were to use his platform, I could strike a few keys on my computer, and my husband would appear on the screen as though he were alive. Again, what's your gut reaction? Right now, right this minute.
You might change your mind later, because this is a long podcast, and I have a lot to say about it. But you might not. Just note how you feel right now.
2:11
Framing the AI Debate
Before I go on, I want to add some backstory here. I graduated from high school in the mid-60s. I was on the speech and debate team.
Each year, a national organization decided on a topic that all schools would debate. For the school year 1963-64, the topic was, Resolved: That Social Security benefits should be extended to include complete medical care.
To compete in a tournament, my partner and I had to argue for the affirmative side of the proposition in the morning, then argue for the negative side of the same proposition in the afternoon.
And here's a shout-out to my debate partner and buddy, who is now Attorney Leo F. Sharp, retired. We were two members of a four-person team that won the state championship that year.
Having to argue convincingly for both sides of an issue taught me that adhering to a fixed perspective might work like blinders on a horse. I could miss seeing another side, a valuable side of an issue.
It's with that lesson in mind that I'm sharing my thoughts on this AI platform that can generate an image of my husband as though he were still alive.
As a debate topic, I might suggest: Resolved: That using AI to create a digital version of a deceased loved one can help survivors deal with grief. Or, and here I'm looking at one of the emails the AI platform's creator sent to me.
He claimed his platform could, quote, help families coping with dementia keep their loved ones close in new and comforting ways.
So, as a debate topic, the argument could be: Resolved: That AI can help family members cope with caregiving stress by creating a digital version of their loved one before he or she developed dementia.
Whether the issue is about dealing with grief or caregiver stress, the conversation is about the use of AI in these situations. Can it help? Can it hurt?
4:28
AI's Comforting Potential
So first, on the helpful side, let's talk about Natalie Cole and her dad, the legendary Nat King Cole. In 1991, Natalie Cole released a music video that showed her singing the song "Unforgettable" with him. Oh gosh, I remember that.
Natalie was 15 when her dad died, and she was 41 when she released that video with his image. I was born a few years before Natalie, and like Natalie, I lost a parent when I was 15. That's when my mom died.
When that music video of Natalie Cole and her father came out, I remember marveling at what a gift the music industry had made possible, and how happy Natalie Cole must have felt to be part of such a project.
I was also keenly aware that Natalie did not create that video with the image of her dad right after he died, when her grief was raw. No, she had 26 years for the alchemy of grief to change her.
Keep that in mind now, because now I want to talk about the Holocaust Survivors Project.
I know that several organizations have filmed video interviews with Holocaust survivors, and then used those interviews with AI to create a way for people living today to have interactive conversations with a survivor even after that survivor has died.
The National Holocaust Centre and Museum in the UK is one of those organizations. It created the Forever Project. The organizers filmed interviews with people who were involved in some way with survival experiences.
One person might have worked in a slave labor camp or witnessed exterminations. Another might have hidden children to keep them safe, or helped them and other refugees in other ways.
According to the project's website, the survivors who were interviewed came from Ukraine, Poland, Hungary, Germany, Austria, Belgium, the Netherlands, and what was then Czechoslovakia.
Anticipating what someone in the future might want to know, the project creators asked these survivors about their pre- and post-Holocaust lives. Think about that.
Long after these Holocaust survivors died, I could talk to one of them, talk to their image.
I could ask questions, and the image would talk back to me, as though we were on a Zoom call, as though that person were still alive, like interviewing someone who had written a memoir, only different.
The success of such a project would depend on the scope and the detail of the questions asked in the filmed interviews.
For the people having to ask all those questions, it might be like packing for a year-long trip when you don't know where you'll be going. I think the end result of such a project is valuable beyond measure.
So now let me talk about a third idea, and that is therapy: using an AI-created avatar of someone who has died in therapy for someone who has PTSD or is experiencing some other form of trauma.
I don't know what the professional parameters are for therapy, so I'm at a loss here. I'm wondering how the ability of AI to create a lifelike image and voice of someone who has died could be used in therapy. And I can imagine a few scenarios.
So for one thing, let's say someone is killed in a war or a terrorist attack or a natural disaster, and the surviving loved one gets trapped in grief, never having had a chance to say goodbye.
Or imagine a situation where a survivor of sexual abuse never had a chance to confront the abuser, because now the abuser is dead.
Or how about this: a couple promise each other that neither will place the other in a nursing home unless there is no other choice. Then one of them dies in a nursing home, and the surviving spouse desperately wants the chance to say, I'm sorry.
Even without serious trauma being part of the picture, I can see how some people might find it helpful to look at what appears to be a living image of a loved one who has died and find peace in saying the Hawaiian prayer of ho'oponopono, which is: I'm sorry. Please forgive me. Thank you. I love you.
Of course, the person saying the prayer could look at a photograph of a loved one. And of course, in that case, the deceased loved one wouldn't appear in a video and wouldn't talk back as though he or she were still alive.
And that brings me to what I see as problematic with the AI platform this episode is about. So some challenges.
In the email the creator of the platform sent to me, he said he was, "Just a son who still speaks with his dad every single day, even though he passed away eight years ago."
Now, please note: this man, the creator of this AI platform, lost his dad eight years ago. And while there's no end point to grief, after eight years most people have accepted the loss of their loved one.
It's not that the grief disappears. We just learn how to carry it. In a follow-up email to me, the AI platform's creator added this about his dad.
"Thanks to," and he named the platform, "we've brought him back to the table for spontaneous conversations that genuinely feel like the old times. As we chat, the experience becomes richer, almost as if our connection is evolving alongside the technology."
11:08
Personal Reservations on AI
All right, here's the challenge. I see the appeal. I do.
I also see how the AI platform he created would be disastrous for me. And maybe for you. And maybe it all depends on where you are in the caregiving journey.
Maybe it depends on the way you grew up. I have two younger sisters. Growing up, they had make-believe friends.
As most kids do, my sisters eventually outgrew those make-believe relationships. I didn't have make-believe friends. Maybe that's why I'm having a hard time embracing the idea of a digital version of my husband.
See, I launched this podcast about six months before my husband died. I already had another podcast that's called Ritual Recipes. I had a professional microphone and an account at a reputable podcast hosting company.
I had already recorded dozens of episodes about rituals for weddings and funerals, baby blessings, and more. So creating a second podcast wasn't the heavy lift that the first one was.
I hoped that talking about my husband's decline from Alzheimer's and sharing my own caregiving journey would help me, and any listeners who found the podcast, survive. When my husband died, my journey as his caregiver ended.
And as I soon learned, the anticipatory grief I had experienced for years, watching him die day by day, morphed into that more traditional form of grief. He was gone.
There would be no new therapy, no stem cell transplant, no space age surgery, no magical scan, no new drug, no miracle. And that's when my new journey began. That new journey is the grief journey.
And as I said in a previous episode, I love fairy tales, but I don't live in one. Grief is real. There are no shortcuts, no denials, no detours on the path of sorrow.
Grief is the process of transformation we go through when a loved one dies. That's how we heal. We're like a caterpillar that weaves its own tomb where it dies to be reborn as a butterfly or a moth.
That metamorphosis cannot be rushed.
13:52
Grief Bots and Reality
I follow a podcast called Science Quickly. In a recent episode called "Can AI Ease the Pain of Loss," Kendra Pierre-Louis interviewed science writer David Berreby about grief bots.
You know, bots are those little AI-powered programs you can find on your computer, your phone, and a whole lot of other electronics. The bot mimics a person. For example, you're on a website and you have a question about a product.
You type your question in a little box, and the bot responds. Companies like these bots because the company doesn't have to hire a real person to answer the same customer questions over and over again. You know, where are you located? What are your hours?
Some customers like the bots because there's a feeling of engagement, as opposed to slogging through pages of FAQs, you know, frequently asked questions.
Keep in mind that kind of engagement might prove helpful, but it's not real. In the episode of the Science Quickly podcast, David Berreby talked about how bots have been designed to engage in consoling conversations with people who are grieving.
Grief bots. In doing research for a feature in Scientific American, he noted that a lot of people were quick to say, no way, to the idea of a grief bot. He also found that many of those people had never engaged with a grief bot.
He did talk with grieving people and grief therapists who had used grief bots. Overall, people didn't condemn the bot, nor did anyone praise it.
Berreby talked about what he referred to as a tug-of-war that goes on in the brain of someone who is grieving. One part of the brain knows the loved one has died. The other part of the brain hasn't accepted that knowledge.
And oh, for sure, that's what I experienced after Dick died. I knew he was gone. He was in my arms when he took his last breath.
But for more than 40 years, we lived in the air of a world he and I had created. He died, but he was still in my air. He still is, and I don't want that to ever change.
Now, does that sound weird? Yeah, maybe. Unless you, too, have seen your loved one in what I can only call a dream, though it wasn't a dream.
I don't know a better word. I have to tell you, I do have psychic dreams, I have prophetic dreams, and all that's for another podcast episode, except to say now that the air in those dreams is different.
And in that dream moment, my husband was present. He was real, just not in the dimension we call life. We talked to each other.
We held each other. I felt him. The comfort I felt that night and continue to feel changed how I think about what happens after we die.
And all I can tell you is that for sure, I don't fear death. Look, I've been working on this episode for four months. It has been challenging in ways I just did not anticipate.
I'm concerned that I'll lose all the progress I've made on this grief journey if I start interacting with a make-believe image of my husband. Now, maybe that's just me, but I don't think so.
I realize a lot depends on the psychological state of the person who is grieving. Plus, I'm an adult, and I know I would find an avatar of Dick confusing.
What about the effects on children and grandchildren, especially if their first experience with death could be altered so dramatically? And then there's Santa Claus, and no, I'm not being glib.
Think of the kid who still believes. He gets taunted on the playground for being a stupid baby. He goes to his parents, and they tell him the truth about the big guy.
How does the kid then reconcile that coming-of-age lesson with having his dead grandmother join the family every Sunday for dinner through a computer on the table?
I'm also concerned about how the avatar is trained and how that training might manipulate children.
You might think I'm way off base here, but as I record this, there's a case in the news about parents who filed a lawsuit claiming their teenage son was instructed by an AI avatar on how to commit suicide.
And as so many of these helpful AI companions do, this AI avatar offered to write a suicide note for the boy.
19:23
Community Concerns on AI
One of the reasons it has taken me so long to record this episode is because I wanted to get the opinions of others.
And so over these last four months, I've talked with family, friends, and health professionals to gauge their reactions to the AI platform suggested to me. And here are snippets and excerpts from their responses.
I'm just using an initial to protect their privacy. From A, who said, "It may be helpful for some to have such technology. For me, I'm not ready for it.
"For my mental and spiritual health, I need more reasons to walk away from technology. There's already a huge lack of social skills in many people because they're attached to their electronic devices.
"They can't focus or regulate their emotions because of the instant gratification of technology.
"Rather than learning to be present, to be uncomfortable, to self-soothe, to ask for human comfort and help, we're becoming a society where we run to the quick fix that isn't actually helpful in the long run.
"So my concern is, would this AI technology become so addicting that I wouldn't be able to grief, to learn the lessons of grief, and to grow in love and compassion for others?"
And now this from person B. "It's one thing for a loved one to create something for their descendants and heirs, journals, memoirs, recordings, audio, video, film, etc. Family time capsule letters.
"You know, the kind you open up on specific anniversaries, or in break glass and emergency moments, or even visits and reunions. Many memorial events where grief gets aired and shared, as do stories about the dearly departed.
"But to create an AI entity that will engage, interact, and advise? All the while looking and moving and sounding like the deceased? No, not having it. People who are grieving may never be able to get back to the business of living while engaged with an avatar."
And now this from M, who is a professional grief counselor. "People are vulnerable when they're grieving.
"The avatar could be a momentary comfort, but it could also delay reality for the person who is grieving. It could cause some to have to start the grieving process over and over again."
Then I talked with members of my bereavement support group. We all met through hospice. Each of us had lost a loved one about six months before our first meeting.
The chaplain who facilitated the group explained that participating in a grief support group isn't usually effective until about six months after the death. I remember those first six months. I couldn't think straight.
Sure, I functioned, but just barely. So I understood the need to wait six months.
So for this podcast episode, I asked the members of my bereavement group, two of whom are retired registered nurses, what they thought about an AI avatar that could bring a loved one back to life. One of them said, "You're kidding me."
Another said, "Just the idea messes with the brain." Another said, "How does a fantasy help you adjust to what's real?" And another said, "That'd be like ripping off the Band-Aid before the wound had healed."
I also spoke with a computer tech, who referred to the avatar as an actor and noted the very real issues of computer upgrades, hardware and software, that sort of thing; dependence on sustained computing power; and high-bandwidth access to the AI actor.
There would be expense, and the very real possibility of creating a technically unmanageable situation.
And then from my own work as a life cycle celebrant, officiating at funerals and memorials, I know that we live in a society that is uncomfortable talking about death and dying.
23:54
Technology's Moral Dilemma
In a follow-up email from the AI platform creator, he talked about how his work has reached a vibrant tech community that values real soulful connections just as much as innovation. Here's the thing. I am awed by what technology can do.
This grief avatar creator says his platform has reached a vibrant tech community. I don't doubt that. He also says that his tech community values real soulful connections, to which I say maybe.
But I doubt it. Or at least I doubt that it's all as high-minded as his email seems to suggest, because I am keenly aware that there are no morals in technology.
Many of the people I interviewed said something along the lines of, sounds like this company is making money off of someone else's grief. Of course, in the end, it's up to you to decide if this particular AI technology would be good for you.
You'll notice I have not mentioned the company's name, and I'm not going to. Why? Because it's up to you to decide whether or not you want to engage with a living avatar of your deceased loved one. And with the rapid advances in technology, there are probably several grief avatar platforms on the market by now.
For me, I don't find the idea of an avatar comforting. I find it cruel.
Finally, my husband used to call me "Straight." For straight shooter. It is with that clear mindset that I say to you now: Take good care of yourself...because we need to survive.
Thank you for listening all the way to the end.