KJZZ is a service of Rio Salado College,
and Maricopa Community Colleges

Copyright © 2024 KJZZ/Rio Salado College/MCCCD

Startups are using AI to help connect people with loved ones who've died

A handful of startups are using technology to help connect people with loved ones who’ve died. Called “grief tech” or “grief bots,” the platforms can allow users to "communicate" with people who are no longer alive through artificial intelligence. OpenAI, the maker of ChatGPT, is one of the firms working on this; others have names like Séance AI and HereAfter AI.

The idea is that users can provide the software with texts and other messages written by their loved one who’s died. The platform will then use those messages to recreate that loved one’s voice and respond to future messages.

With The Show to talk about the potential — and potential pitfalls — of this new technology is Joanne Cacciatore, a professor in ASU’s School of Social Work.

Full conversation

MARK BRODIE: Joanne, this is clearly a new thing. What do you think about it?

JOANNE CACCIATORE: It's really an interesting concept if used correctly. I think if used incorrectly, this can be potentially incredibly dangerous, emotionally and psychologically, to people.

BRODIE: So what, to you, then, would be the correct way to use these?

CACCIATORE: I mean, I think there's a way to use the AI therapeutically in, for example, imaginal work with our people who died, if we wanted to say something that we didn't get to say. For example, here at the Selah Carefarm, we have something called the wind telephone. It's an old rotary phone that's, you know, just sort of set outdoors by a canal, and people can pick up the phone and say the things they never got a chance to say to their beloved dead. But it's in a therapeutic setting; there are counselors here who can work with them if something very strongly comes up. The AI actually pretends to be in conversation, pretends to be the person who died, and that can create a real potential danger for someone who isn't getting good social support and is having difficulty adjusting in the aftermath of, especially, catastrophic loss. So, the danger is that it can create a kind of false sense of reality for people, and if they're not working through some of those big feelings, then I think it could potentially have deleterious psychological effects in the long run, particularly if someone's using them as a sole coping mechanism.

BRODIE: Well, just in doing a little bit of reading about this, the thing that struck me was that it would be easy to never be able to move beyond the loss, or come to that stage in the stages of grief of acceptance, and recognize that this is the reality now, if you are able to, you know, "communicate with the person who's died."

CACCIATORE: I think you're on to some of the concern that I have, and it's not necessarily that people have to accept the reality of the loss, which is quite hard, but they have to accept how they feel about the reality of the loss. And if we're not able to stay with those emotions, those very, very big emotions of emptiness, sorrow, anguish, despair, particularly in catastrophic loss. Most of my work is with people whose children die or are dying.

And in the case of homicide and suicide, I work with many, many families whose primary family members die that way. And so creating an environment where people are able to use this as an avoidance measure, to avoid how they feel about their experience, I think can be really dangerous. And in fact, in one of the articles that talked about this particular application of AI, one of the founders was saying we don't ever have to grieve again, because we can always talk to our dead. Well, that's ridiculous. It's ridiculous to think that we can eradicate grief through the use of AI. And I think it's a dangerous move. I think that feeling grief is part of our experience of being here and loving.

BRODIE: So do you think that if somebody who's experiencing loss is using one of these in concert with other kinds of therapies or other kinds of care, could they have beneficial uses? Or could they at least be maybe not as dangerous as if somebody was using just these?

CACCIATORE: Yeah, I mean, the research isn't out yet. Maybe at some point I'll be designing a study; I see one coming down the lane. But my feeling in doing this work for 30 years is that, yes, they can have a potentially therapeutic application. I do a lot of imaginal work with people. I'm a meditation teacher also, and so one of the meditations I take people through is where they sit in a space in their mind where they feel very safe, and they allow their loved one to come into the space imaginally, and they have a conversation with them, which is not dissimilar from this application, though it's more imaginal.

It's a very powerful experience, but then they have someone there to therapeutically process it and to work with what comes up, and the reality of the person not physically being there is something that they're able to confront in that moment. And more importantly, the feelings about that person not being physically there can be therapeutically processed. And it's a very, very slight nuance, but it's a really important nuance for us to really acknowledge: that having someone there present with you, when you're going through this, and you're saying the things that you feel like you never got to say to the person you love who died, or you're experiencing a direct contact with them in an imaginal way, being able to process those really helps us to digest the enormity of the loss. And that's a very different thing than using AI as an avoidance strategy.

BRODIE: I mean, so it really seems like the key here is, if somebody is going to use one of these, having somebody else with them to help talk them through what they're feeling and what they're going through. We hear all the time, you know, you shouldn't process really big things like this alone. But it sounds like especially if you're inclined to maybe try an AI program, you definitely should not be doing that on your own.

CACCIATORE: Well, I would recommend having a therapeutic presence, for sure. I mean, with these kinds of interventions, one of the studies that we conducted looked at good grief support, and we know that grievers don't get good grief support. Mental health providers rated only at, I think, 50-ish percent, around half the satisfaction with grieving people. Family and friends rated in the 30% satisfaction range. Clergy rated in the 30 to 40% satisfaction range. Human beings don't know how to provide good support, generally. OK. So clearly, a therapeutic presence isn't that easy to find, and I think that's probably the push for developing some of these AI techniques: they can be programmed with emotional intelligence, even though they're not emotionally intelligent.

The problem is that there needs to be someone who is emotionally intelligent to process what people experienced as they're going through this therapeutic, or potentially harmful, form of, I guess, interaction with the person who died. So, we know humans aren't doing a very good job; can AI do a better job? This is why I think this could potentially be a really good thing, if someone can find a really good therapeutic source of support. And it doesn't necessarily have to be a therapist; it could be a spiritual provider, it could be a really trusted and wise neighbor. But having a wise therapeutic presence can really be powerful for people. And so there's some real potential that AI has a role in that. We just need to make sure that it's done with a great deal of wisdom and prudence and compassion, because if it's not done that way, it could really serve to harm people in the short term and the long term.

BRODIE: Alright, that is Joanne Cacciatore, a professor in ASU's School of Social Work. Joanne, thank you so much for the conversation. I really appreciate it.

CACCIATORE: Thanks so much for having me, Mark. It's so important to talk about this and I appreciate it.


Mark Brodie is a co-host of The Show, KJZZ’s locally produced news magazine. Since starting at KJZZ in 2002, Brodie has been a host, reporter and producer, including several years covering the Arizona Legislature, based at the Capitol.