The internet has made grieving easier: Website memorials, communal grieving across social media, online messenger services offering support—they’ve all made bereavement more social, more accessible, less taboo.
To Muhammad Ahmad, though, what might seem like the digital age of mourning feels dated. Ahmad wants to radically transform how we mourn. The data scientist believes artificial intelligence will eventually allow us to craft the data left behind by an individual into convincing text-based simulations of that person. Dubbed “griefbots,” they’ll respond when prompted, imitating the deceased’s cadence, tone, and idiosyncrasies. Ahmad thinks these griefbots could make grieving for loved ones an interactive experience.
The concept of griefbots might seem familiar to some: It was popularized in an episode of Black Mirror, “Be Right Back,” in which a pregnant woman, Martha, tries an online service for communicating with the dead after the sudden loss of her fiancé, Ash. True to form, the episode escalates and Martha’s dependency on the griefbot sees her upgrade to the android version of Ash, a Blade Runner-style replica which perfectly imitates his appearance and voice. It’s an ambitious forecast, but we edged closer to Black Mirror’s initial griefbot becoming a reality just last year, when Eugenia Kuyda created a simulation of her friend Roman Mazurenko, after he was killed in a road accident.
Ahmad has now built his own working prototype. When his father died four years ago, Ahmad lamented the fact that any future children he would have would never be able to bond with their grandfather. He drew on his previous research, which focused on modeling human behavior in video games, and spent the last few years collecting data his father had left behind, such as audio or video recordings, text messages, and transcripts of letters. This information has allowed him to create a messenger program that (he claims) can imitate his father.
“Currently, the program can carry out conversations on a limited set of topics,” Ahmad told The Daily Beast. “I am working on how to make the system respond to dramatic changes in its wider context.”
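A "limited set of topics" suggests something like a retrieval-based chatbot: new messages are matched against archived exchanges, and the closest stored reply is returned. The details of Ahmad's system haven't been published, so the following is only a minimal illustrative sketch of that general approach, with hypothetical names and example data.

```python
# Minimal sketch of a retrieval-based chatbot: given a corpus of a
# person's past (prompt, reply) message pairs, answer a new message by
# returning the stored reply whose original prompt overlaps most with it.
# This is an illustration of the general technique, not Ahmad's system.

def tokenize(text):
    return set(text.lower().split())

def jaccard(a, b):
    """Word-overlap similarity between two token sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

class GriefBot:
    def __init__(self, pairs):
        # pairs: list of (prompt, reply) drawn from archived messages
        self.pairs = [(tokenize(prompt), reply) for prompt, reply in pairs]

    def respond(self, message, threshold=0.2):
        tokens = tokenize(message)
        best_score, best_reply = 0.0, None
        for prompt_tokens, reply in self.pairs:
            score = jaccard(tokens, prompt_tokens)
            if score > best_score:
                best_score, best_reply = score, reply
        # Fall back on topics outside the limited corpus
        if best_score < threshold:
            return "I'm not sure what to say about that."
        return best_reply

bot = GriefBot([
    ("how was your day", "Busy as always, but good."),
    ("tell me about the garden", "The roses need pruning again."),
])
print(bot.respond("how was your day today"))  # → "Busy as always, but good."
```

The fallback threshold is what makes the conversation "limited": anything too far from the archived material gets a non-answer, which is also why adapting such a system to new contexts, as Ahmad describes, is the hard part.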
Ahmad now has a 2-year-old daughter, and as he continues to evolve his simulation (he’s currently exploring how to enable it to respond to images and adapt to new contexts), he hopes that one day she’ll form the semblance of a connection with her grandfather.
“Historically speaking, people have told stories of their parents to their grandchildren about their grandparents,” he said. “Now we have come to a point where… it is at least possible to have some sort of an interactive experience of those who have passed. It may not be perfect, but at least we can capture some of that essence.”
Ahmad’s creation was inspired by a desire to give his children an interactive memorial to their grandfather. Eugenia Kuyda’s simulation was borne out of her immediate grief. Automated chatbots are still quite primitive, but as artificial intelligence improves, the technology it enables could have a profound impact on how we grieve.
“There are stages of grief… that technology has been able to interact with in the sense that it allows people to reach out and get social support,” Pamela Rutledge, director of the Media Psychology Research Center, a nonprofit dedicated to media and technology research, said. “In the short term it might be that having these bots, [and] the ability to still make contact in a way that feels meaningful, would alleviate some of that initial distress.”
According to Dr. Sheri Jacobson, a clinical director of Britain’s Harley Therapy, technology has sped up the traditional stages of grief—denial, anger, bargaining, depression, and acceptance. Griefbots could potentially do the same.
“In terms of the natural grieving process, we often think back to our memories, our stories… we’ll bring objects of the deceased, and we will kind of recreate aspects of their lives to help us with the grieving process,” Jacobson told The Daily Beast. “So perhaps to some extent griefbots will be able to activate that a little more easily and a little more accurately than our own memories and objects would.”
To some extent, people are already using social media in this way. Facebook pages can remain active or memorialized long after a person has died, allowing users to post messages on their wall. These interactions are simply a 21st-century display of an internal process known as “continuing bonds,” in which the bereaved maintain an ongoing connection with the person who has died.
“A model that’s being used now called continuing bonds recognizes that when people pass away, those that loved them and are still alive carry on this internal conversation still with the person that’s dead,” Wendy Moncur, a professor at the University of Dundee and author of Charting the Digital Lifespan, said. “What we’re seeing is that conversation playing out on social media, rather than internally or someone standing at a graveside and having a chat with a headstone.”
But this only accounts for the interactions users consciously initiate. Facebook has previously had to apologize for perpetuating a phenomenon Muhammad Ahmad calls “algorithmic grieving,” where it has accidentally reminded users of past traumas. The site’s automatically generated “year in review” videos have come under particular scrutiny. Griefbots couldn’t be guaranteed not to trigger painful memories in the same way unless the AI were first programmed to understand the sensitive social conventions that surround grief.
There are also concerns that griefbots could keep people from moving on, a particular worry for those dealing with more traumatic deaths. “There’s a very well-known phenomenon called para-social relationships,” Rutledge said, referring to one-sided relationships into which one person puts a great deal of energy while the other doesn’t even realize they exist: celebrities and their superfans are a common example.
“If you have a lot of contact with something… [but] you don’t have this awareness that [para-social relationships] can happen, you might end up with a relationship that actually keeps you from grieving the loss of that person,” Rutledge continued. She used virtual reality as an example of how your brain might respond to dialogue with a griefbot. The rational side of the brain would constantly remind you that the bot is not real, a sort of “cognitive override.”
But that cognitive override doesn’t always work.
“Do you name your cars or your computers or do you refer to them as he or she? Or say your toaster is misbehaving? We have a tendency to anthropomorphize things,” Rutledge pointed out. “If you take that tendency to look for human patterns and anthropomorphize technology—with the combination of the para-social relationship—our brain responds to virtual as if it were real.”
So how do we protect those who might be most at risk of developing an unhealthy relationship with a griefbot? Rutledge suggested a mental health test, “because what you don’t want is people taking advice from a bot,” she said.
Along with the ethical and legal implications of using the deceased’s private communications to build a griefbot, there’s also the issue of what information the AI chatbot might choose to make available. Personal data has the potential to reveal a multitude of secrets held by a person: the controversial views they hold, their disagreeable opinions about friends and family, the nefarious activities they may have got up to in their youth. Along with undermining emotional closure, the continual release of new information and revelations could fundamentally change the way a deceased person is viewed by a loved one.
Though Muhammad Ahmad’s wife is fascinated by his AI simulation, he hasn’t yet shown the working prototype to his siblings. There’s no guarantee they’d be as open-minded about talking to a digital imitation of their late father as he is. “This idea of having a simulation of a loved one, I think some people just find it very— it’s not repulsive, it’s not strange, I would say it’s a mix of these two,” he said.
To Ahmad, though, they’re neither strange nor repulsive. We already use photographs and videos as tools to remember the deceased, so employing AI in the grieving process is just the next natural stage in the evolution of how we experience grief. Griefbots would simply digitize the one-way conversations we’re already having.
But we’re more than the data we leave behind. In the near future, it’s unlikely AI will evolve enough to fully replace the emotional support humans can offer the bereaved.
“We are social creatures,” Sheri Jacobson said. “We thrive in the company of others and in supporting one another… that’s why we all need to live in a community, we need friends, we need support. So to what extent griefbots will be able to replicate the human support level remains to be seen.”