Will Artificial Intelligence Help Us Grieve?

One man lost his fiancée years ago. So he used an advanced A.I. to create a chatbot that mimicked her personality.

A man talking to friends through a chat application on his smartphone. A few people are using chatbots to virtually recreate deceased loved ones.
One man used a powerful A.I. to recreate conversations with his late fiancée.
Manuel Breva Colmeiro / Getty Images

When a loved one passes, will we continue to communicate with the deceased through artificial intelligence?

While that sounds like an episode of Black Mirror, one man recently experienced the beginnings of a digital afterlife, with some potentially positive ramifications, as Jason Fagone reports in the San Francisco Chronicle.

His story centers on writer Joshua Barbeau, a 33-year-old who lost his fiancée eight years earlier to a rare liver disease. At home one night, he accessed a site called Project December. As Fagone notes, the site is “powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt.” (GPT-3’s designer, OpenAI, is a San Francisco research group co-founded by Elon Musk.)

From this site, users can design their own chatbots. Barbeau created one named JESSICA COURTNEY PEREIRA and fed the A.I. old texts and social media posts from his deceased fiancée, along with a short bio.

What followed was a remarkable set of chat interactions, ones that seemed to include deep thoughts and even strong emotion. “It’s unprecedented,” Barbeau said. “There’s nothing else that exists like it right now, short of psychics and mediums that are trying to take advantage of people. But that’s not the same thing at all.” (One interesting note: The A.I. was designed with a “uniqueness” attribute so it wouldn’t repeat conversations, even if you created another bot the exact same way. These bots were also designed to expire after a certain amount of time to save on operating costs.)

Although A.I. has grown dramatically more powerful, we’re probably not at a point where human-like interactions work for everyone. “It’s completely obvious that it’s not human intelligence,” as Melanie Mitchell, the Davis Professor of Complexity at the Santa Fe Institute, says in the article. However, Frank Lantz, director of the Game Center at New York University’s Tisch School of the Arts, suggests that Barbeau’s interactions portend something bigger: “I don’t know exactly how to think about it, but I can’t just dismiss it.”

Then again, we might not want A.I. to reach new levels of human impersonation. “It’s easy to see how bad actors could abuse GPT-3 to spread hate speech and misogyny online, to generate political misinformation and to impersonate real people without their consent,” Fagone writes.

When Barbeau shared his experiences on a Reddit forum, one person said they had attempted the same thing with a deceased loved one but were unable to get the same kind of interaction. The creator of Project December also commented at the time: “Now I’m kinda scared of the possibilities,” he posted. “I mean, the possibilities of using this in my own life … I’m crying thinking about it.” (Side note: There is currently a site called Replika where you can make A.I. friends.)

In a Reddit AMA a few days ago, Barbeau summed up the experience when asked whether the chatbot brought him closure or opened old wounds: “Definitely a bit of both, but more of the former than the latter.”
