
Think back to yesterday. Maybe you remember waking to an alarm, the smell of coffee, a conversation with a friend, where you parked your car. You didn’t memorize any of it on purpose. Your brain simply wove together the sights, sounds, times and places of your day into a story you can replay on demand.
That kind of memory, known as episodic memory, is what makes you, you. And for all its importance to human identity, scientists still don’t fully understand it.
Dong Song wants to change that. An associate professor jointly appointed in the Department of Neurological Surgery at the Keck School of Medicine of USC and the Alfred E. Mann Department of Biomedical Engineering at the Viterbi School of Engineering, Song has proposed a new way to study how episodic memories form, using artificial intelligence paired with a technology that can both read and write signals directly to the brain.
His newly published perspective paper in Advanced Science lays out a framework for studying episodic memory not in a lab, but as it actually unfolds in everyday life. The ultimate goal: to one day help people who have lost the ability to form new memories.
“You are largely what you remember,” Song said. “Episodic memory is really essential for your identity. What you did yesterday, when you met someone last time, all the important things to maintain a normal life.”
What Is Episodic Memory?
Episodic memory is the brain’s autobiographical record: the mental movie reel of where you were, what happened and when. Scientists have known for decades that a structure deep in the brain called the hippocampus is critical to forming these memories. The evidence traces back to a patient known as H.M., whose hippocampus was removed in the 1950s to treat epilepsy. Afterward, he could recall the distant past but couldn’t retain anything new.
Later research identified specialized neurons that handle different pieces of episodic memory: place cells that track location, time cells that record sequence and concept cells that recognize people and objects. But how those pieces combine into the seamless act of remembering a birthday, a meeting or an ordinary Tuesday has remained unclear.
The obstacle is complexity. Lived experience throws an enormous number of variables at the brain simultaneously: sights, sounds, smells, locations, people and time, all intertwined. Lab experiments sidestep that by stripping memory down to something manageable, like showing a person a word list and asking what they recall minutes later. Clean, controlled and nothing like real life.
“In the lab, you want to simplify things,” Song said. “But in reality, where, when and what are all interacting, all intertwined. Using the old approach, there’s no way to understand the complex interaction.”
Artificial Intelligence Enters the Picture
Song’s solution is to use AI as a translator between two worlds: lived experience and the brain signals that encode it.
Imagine someone going about a typical day wearing a brain sensor while a camera and microphone record what they see and hear. Those recordings are fed into an AI system along with brain data. The system searches for patterns, moments when bursts of neural activity reliably coincide with something in the outside world, such as a familiar face, a place or a sound. Over time, the AI builds a dictionary linking brain signals to real-world events.
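To make the idea concrete, here is a toy sketch of that dictionary-building step. It does not reflect Song's actual models; the data, the time windows and names like `burst_windows` are invented for illustration, and real neural decoding would involve far richer signals and statistics.

```python
# Toy sketch: align time-stamped "events" from a wearable camera/mic
# with windows of neural activity, then count co-occurrences to build
# a signal-to-event "dictionary". All data here are synthetic.
from collections import Counter, defaultdict

def build_dictionary(burst_windows, events, tolerance=1.0):
    """Map each neural pattern ID to the real-world event label that
    most often occurs within `tolerance` seconds of it."""
    cooccur = defaultdict(Counter)
    for t_burst, pattern_id in burst_windows:
        for t_event, label in events:
            if abs(t_burst - t_event) <= tolerance:
                cooccur[pattern_id][label] += 1
    # For each pattern, keep the most frequent co-occurring label.
    return {p: counts.most_common(1)[0][0] for p, counts in cooccur.items()}

# Synthetic day: pattern 7 tends to fire near "familiar face" moments.
bursts = [(10.2, 7), (33.0, 7), (54.8, 3), (70.1, 7)]
events = [(10.0, "familiar face"), (33.4, "familiar face"),
          (55.0, "coffee shop"), (70.0, "familiar face")]

print(build_dictionary(bursts, events))
# → {7: 'familiar face', 3: 'coffee shop'}
```

The point of the sketch is only the logic of the pipeline: repeated co-occurrence between a neural pattern and an external event is what lets the system attach a real-world meaning to a burst of brain activity.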
Beyond decoding, Song wants to move past correlation and test causation. Scientists have long identified brain patterns linked to memory formation. But do those patterns actually create memories?
To test this idea, Song proposes using a brain-machine interface (BMI), a device that can both read brain activity and send small electrical pulses back into the brain. First, AI would identify the specific pattern of brain activity linked to a particular memory. Then the BMI would recreate that pattern in the brain using brief electrical pulses, even though the person never actually experiences the event.
In effect, the device would plant an artificial memory. If the person then says they remember a fictitious event, it suggests that the brain pattern itself helped create the memory.
“You manipulate that code through brief stimulation, you write in that code, and then you ask this person, ‘Did you remember that?’ The person says yes. Then this is conclusive,” Song said. “This is what I mean by causation, not just pure correlation.”
Song’s team has demonstrated pieces of this in rodents, nonhuman primates and human epilepsy patients, where AI-generated stimulation patterns enhanced recall. Next steps include better recording technology, more refined AI models and experiments in real-world settings.
Who Stands to Benefit
The most immediate candidates are people living with Alzheimer’s disease, traumatic brain injury or post-traumatic stress disorder.
Alzheimer’s gradually destroys the brain’s ability to convert short-term experiences into lasting memories. A patient might recall a conversation from 40 years ago but not one from this morning. If scientists can pinpoint the neural patterns responsible for that conversion, a BMI might step in to reinforce them, serving as a prosthetic for a failing memory system.
For PTSD patients, the technology might work in reverse, selectively weakening traumatic memories. Song called that possibility theoretical but plausible and estimated that meaningful improvements for some patients could arrive within five to 10 years.
“Helping people with memory problems, finding therapeutic methods, that’s the main research direction of my lab,” Song said. “The clinical translation is always something extremely important to me.”
The power to read, write and alter human memory raises profound questions about personal identity, the authenticity of one’s recollections and the potential for misuse. Song’s paper argues that advances in memory engineering must be matched by equally rigorous ethical and regulatory guardrails.
“This is a really sensitive topic,” Song said. “This potential needs to be realized very carefully, in a very rigorous ethical framework.”
The research was supported by DARPA’s Restoring Active Memory program, the NIH/NIDA BRAIN Initiative’s Theories, Models and Methods program and DARPA’s Investigating how Neurological Systems Process Information in REality program.
Published on March 24th, 2026
Last updated on March 24th, 2026

