Chatting with my own HereAfter

Our journalist chats with his digital twin, and ponders the future of loss.

I’m conducting an interview. 

Which isn’t too unusual, until I fail to answer my own question. 

Ok, fair enough, I think — I’m a professional, let’s take another tack.

What do I remember from my childhood?

With that, I’m telling myself about one of my earliest childhood memories, about stern-looking, armed MPs and armored vehicles in the West Point parking lots as the Gulf War began, and my brief story sparked more memories. I remembered skies and military Gothic buildings, the color of the cadets’ covers and greatcoats, of being so small in something that felt so big.

That’s the kind of moment — that innervation, that resurrection of a previous, lost time — the creators aspiring to build our future digital twins are driving towards. 

Granted, they’re looking more to give you that moment from someone deceased — or vice-versa.

While the approaches to personal digital twins vary, the general thrust is this: that someone can “live” forever, through AI.

“Today [my daughter] speaks to Siri. But one day in the future, I want her to speak to me,” Emil Jimenez, the founder of Mind Bank AI, told me when I first began reporting on digital twin AIs.

Mind Bank’s vision of a digital twin is about as ambitious as they get: a deep learning-powered “you” capable not only of sharing information previously gathered, but of using all it has learned to have new conversations, answer new questions, give advice, and react to new situations — to continue on as you, essentially breaking (or at least bending) the chain of loss.

That’s a long way out, though; the digital me I’m talking to right now is incapable of saying something I haven’t previously recorded. Though asking myself questions — that I then answer — still feels quite a bit like living in a futurist’s dream.

Legacy Avatars

I’m talking to my HereAfter AI “Legacy Avatar,” essentially an archive of myself that can be interacted with like you’re talking to Siri or Alexa. 

Legacy Avatars were inspired by founder James Vlahos’ experience with his father. As he chronicled in WIRED, when his father was diagnosed with terminal lung cancer, Vlahos rushed to record everything he could. He used the resulting oral archive to create “Dadbot,” a form of “artificial immortality” for his father.

“This all started out with my own personal Dadbot project, to save and share the stories of my father using conversational AI,” Vlahos tells me. 

HereAfter AI works on a similar principle, taking recordings of your memories, reflections, feelings, and philosophies, and using AI to retrieve your answers in a conversational format.

Originally, the interviews were conducted by journalists HereAfter AI had trained to be life interviewers. With the rollout of their initial commercial version, that human interviewer has been replaced by a chat interface, which pulls from a large pool of random questions to deliver prompts.

Your memories, reflections, feelings, and philosophies are retrieved by AI in a conversational format — your Legacy Avatar.

“We really had to hone this process,” Vlahos says. “How do you organize it? How do you get people to focus? What can you do to maximize the chances that you actually get good, relatively compact stories that other people would want to hear?”

The app takes what HereAfter AI has learned about that process via human interviewers and automates it. 

Your Legacy Avatar is geared towards retrieving those recordings. If the prompt you answered was about your favorite college course, for example, the AI has one piece of labeling right off the bat, Vlahos explains: it knows that if someone later asks about college, this recording may be an appropriate answer.

“A lot of these questions will have multiple labels that are associated with them,” Vlahos says. 

The labeling system means you don’t have to ask the Legacy Avatar the exact question used as a prompt to get to an answer; stories are marked with multiple labels relating to the content of the recording. Natural language understanding (NLU) algorithms and machine learning models then use the labels to pull appropriate answers.
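
To make that concrete, here is a minimal sketch of how label-based retrieval like this could work. Everything in it (the recordings, the labels, the keyword matching) is hypothetical and greatly simplified; HereAfter AI’s actual system relies on trained NLU models rather than keyword lookups.

```python
# Illustrative sketch only: label-based retrieval for a voice archive.
# The recordings, labels, and keyword matching below are hypothetical;
# a production system would use trained NLU models instead.

RECORDINGS = [
    {"audio": "college_course.mp3",
     "labels": {"college", "education", "favorites"}},
    {"audio": "childhood_memory.mp3",
     "labels": {"childhood", "family", "early memories"}},
]

# A pre-recorded response for questions the avatar has no story for yet.
FALLBACK = "no_answer_yet.mp3"

# Toy stand-in for an NLU step: keywords in the question map to labels.
KEYWORD_LABELS = {
    "college": "college",
    "school": "education",
    "childhood": "childhood",
    "kid": "childhood",
    "family": "family",
}


def infer_labels(question: str) -> set[str]:
    """Map a free-form question to the labels it seems to be asking about."""
    q = question.lower()
    return {label for keyword, label in KEYWORD_LABELS.items() if keyword in q}


def pick_recording(question: str) -> str:
    """Return the recording whose labels best overlap the question's labels."""
    wanted = infer_labels(question)
    if not wanted:
        return FALLBACK
    best = max(RECORDINGS, key=lambda r: len(r["labels"] & wanted))
    return best["audio"] if best["labels"] & wanted else FALLBACK


if __name__ == "__main__":
    print(pick_recording("What do you remember from your childhood?"))
    # -> childhood_memory.mp3
    print(pick_recording("What's your favorite pizza topping?"))
    # -> no_answer_yet.mp3
```

The shape matches what Vlahos describes: each story carries several labels, the incoming question is mapped onto labels, and the closest-matching recording (or a canned fallback) is what gets played back.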

The goal is a Legacy Avatar that is more dynamic than, say, a saved voicemail, and more user-friendly — and soothing — than an unwieldy archive of raw recordings. 

Speaking with the HereAfter

Speaking of raw recordings, I’m sitting in my overcast office, an autumnal candle burning and Darjeeling tea steaming, and I’m having a more difficult time than I expected answering HereAfter AI’s prompts. 

I’m not usually shy about sharing parts of myself, but struggle a bit, constantly cycling through prompts. Some are just not applicable, for example, when it asks me about my son, and I can tell the app those prompts mean nothing yet. Others are surprisingly painful; perhaps if I were earnestly building my Legacy Avatar for real, I wouldn’t mind telling it about a time I had to say goodbye.

Presented like a text conversation, the app provides prompts and words of encouragement in an earnest and impeccably mannered way. “Excellent. I’m glad you recorded that,” my faceless, voiceless interviewer tells me. “I’m guessing your family will enjoy these stories.” 

In between telling the app about a night I looked my best (the answer involves a “fur” coat), what I admire about my folks, my early childhood memories, and my uncle, I record bits of dialogue that serve double duty: they help the AI understand how I speak, and they provide the canned responses the interface falls back on when my Legacy Avatar doesn’t pick up on what’s being asked of it, or has no answer yet.

(After recording these, the app is both grateful and apologetic, in a C-3PO-esque way, acknowledging it may feel weird or seem unnecessary. Personally, these were my favorite parts, pretending to be a voice actor; think of it like getting your H. Jon Benjamin on!)

Eventually, I find myself sliding into some kind of slightly uneasy tête-à-tête. I share an inflection point, when a best friend gave me advice so profound I split my life into before and after, and choke up. I think to myself just one more, even as the room goes darker and my post-work bowl calls.

The process serves as a time to reflect, not just recall.

The Sting and the Spark

“I don’t think this type of technology has the ability to erase the sting of death, or soften the sting of death necessarily,” Vlahos says. 

“I still really, really miss my dad.”

But a person isn’t the only thing you lose; memories can fade, a slowly unfolding second loss. It’s this fading that Legacy Avatars are meant to help fix. Being able to hear his dad’s voice on command, hearing him tell a joke, sing a song, or share his love story — this brings the memories back into focus.

“I guess I separate the two things,” Vlahos says. “Can it ease the pain of death? Probably not — and no technological solution may ever exist, or should ever exist, I think, to alleviate that pain.”

But can it provide a richer way to remember? Yes, Vlahos believes.

“That part of the grieving process” — the inevitable fading of memory — “no longer has to be a given.”
