USC researchers are using cutting-edge technologies to accelerate Alzheimer’s discoveries — and transform how those discoveries are made. (Illustrations/Bratislav Milenkovic)
How AI and advanced computing are accelerating Alzheimer’s research
Cutting-edge technologies developed by USC researchers are changing not just the pace of Alzheimer’s disease discoveries but also the ways scientists make those discoveries.
Imagine reading a book letter by letter instead of chapter by chapter. Focusing solely on the alphabet would make it challenging to develop a holistic understanding of the work’s plot, characters and themes.
For decades, scientists have been searching human DNA for markers related to Alzheimer’s disease in much the same way.
“Your DNA has 3 billion letters,” says Paul Thompson, professor and Popovich Chair in Neurodegenerative Diseases at the Keck School of Medicine of USC. “In traditional genome sequencing, every single letter of the genetic code is assessed one at a time — without much big-picture understanding.”
Traditional methods have led to discoveries of genes such as APOE ε4 that strongly contribute to Alzheimer’s risk. But now, a powerful artificial intelligence tool developed by Thompson and his collaborators — called a “genomic language model” — will enable the detection of more subtle influences across thousands of genes at once. These discoveries, in turn, will guide the development of new drugs that can target genetic defects to treat or even prevent Alzheimer’s.
“Genomic language models screen the whole gigantic ‘book’ of DNA from hundreds of thousands of people,” Thompson says. “The model will find more complex patterns that drive brain aging and specific biological processes — very complex patterns that no human could identify.”
Thompson directs the ENIGMA Consortium, a USC-based global network of researchers using imaging and genomics to advance knowledge about brain diseases. ENIGMA’s AI4AD (Artificial Intelligence for Alzheimer’s Disease) initiative focuses on using AI methods to tackle key challenges in Alzheimer’s research.
The initiative is one of many research endeavors led by USC researchers that leverage AI and other advanced computing technologies to accelerate Alzheimer’s discovery. These software and hardware innovations are opening new horizons for better understanding of this complex disease, earlier and more precise diagnostic methods, faster drug discoveries and novel treatment options. Alzheimer’s researchers’ cross-disciplinary embrace of AI demonstrates one of USC President Beong-Soo Kim’s top priorities: for the university to lead and innovate in the AI space.
“It’s like having a new telescope to survey the universe,” Thompson says of AI. “There’s this whole new landscape of discoveries possible.”
AI sees what the eye cannot
During the past couple of decades, advanced imaging techniques have allowed researchers to make great strides in identifying brain changes associated with Alzheimer’s. Amyloid plaques and tau tangles — hallmarks of the disease — can be seen on PET scans. High-resolution MRIs allow scientists to visualize microscopic features of brain organization and brain function implicated in Alzheimer’s pathology.
These tools make it possible to peer inside the human skull without surgery and transcend visual limits. And within the past several years, AI has transformed how researchers interpret and make meaning from advanced imaging.
“AI tools accumulate evidence for tiny irregularities that are too subtle for the human eye to notice,” Thompson says. They can also track patterns across imaging datasets whose large size exceeds the capacity of human memory. For example, AI algorithms developed by AI4AD merge data from hundreds of thousands of MRIs, PET scans and vascular images to identify Alzheimer’s subtypes and relate these subtypes to specific genetic predictors and outcomes.
At the USC Leonard Davis School of Gerontology, Associate Professor Andrei Irimia and his collaborators, including Paul Bogdan, associate professor of electrical and computer engineering at the USC Viterbi School of Engineering, developed a deep neural network that assesses biological brain aging from MRI scans. The AI model evaluates atrophy in key regions of the brain and accurately predicts whether an individual’s brain is “older” or “younger” than their chronological age. Those with a biological brain age older than their chronological age have a heightened risk of neurodegenerative diseases, including Alzheimer’s.
Irimia’s lab has used updated versions of the model to track the pace of brain aging over time, a measure that Irimia says is clinically useful to assess the effectiveness of therapies aimed at slowing neurodegeneration. Deep neural networks have also enabled his research group to map the genetics of how individual regions of the brain age and discover reproductive factors that influence brain aging in women, among other findings.
“When you use these insights from AI from imaging, you’re able to predict the probability of whether a person will convert from normal cognition to Alzheimer’s or not with up to 91% accuracy, which is much better than existing models,” Irimia says in reference to his team’s research to predict future cognitive impairment. “AI allows us to get more done, to make faster progress, and also to look at problems that are more difficult than we used to be able to grasp.”
Speaking volumes about brain health
Advanced imaging offers a cutting-edge snapshot of Alzheimer’s brain biomarkers. But given the high costs and limited availability of PET scans and high-resolution MRIs, these technologies are not practical for widespread screening.
Shrikanth Narayanan — University Professor at USC Viterbi and vice president for presidential initiatives — is using AI to investigate an indicator of brain health that can be measured widely in everyday life: speech and language.
Early Alzheimer’s brain changes are linked to nuanced changes in speech and spoken language patterns: struggling to name familiar objects and people, pausing speech more often and losing one’s train of thought more readily. “The ability to retrieve, plan and produce speech in a social context is going to be affected,” Narayanan says. “Hence, we can use patterns of speech as markers of brain-health status.”
Narayanan is a leading expert in speech and language processing, which enables machines to interpret and analyze human speech and language. His team has designed unobtrusive wearable devices to record and process natural speech activity and developed novel AI methods to detect subtle but clinically meaningful changes in speech patterns.
The wearable devices were recently used in a pilot study of older adults’ speech in India as part of the Longitudinal Aging Study in India — Diagnostic Assessment of Dementia (LASI-DAD), a project aimed at evaluating population-level patterns of dementia among the country’s more than 1.47 billion people. Narayanan is collaborating on LASI with principal investigator Jinkook Lee, professor of economics at the USC Dornsife College of Letters, Arts and Sciences and director of the Program on Global Aging, Health and Policy.
India has hundreds of languages and dialects, and many people there switch among multiple languages within a single conversation. “AI has to track not only what they’re saying and what languages they’re speaking, but also if they’re mixing or switching languages,” Narayanan says.
Narayanan’s team is currently analyzing the speech data collected in the pilot study and evaluating how speech patterns correlate with other measures of health captured by wearable activity trackers, including heart rate, sleep and movement. The goal is to tease out speech signals that reliably indicate dementia risk to support early detection and intervention.
Fast-tracking drug discovery

Developing drugs to treat Alzheimer’s disease has historically been slow going, marked by decades of setbacks and sluggish progress. That tide is now turning, with USC researchers at the forefront of new pharmaceutical breakthroughs, but challenges remain along the path from research bench to bedside, a journey that takes 10 to 15 years on average.
One of the challenges is the painstaking trial-and-error work in which scientists synthesize, screen and test thousands of compounds in the lab to see which ones effectively target the biological mechanisms of a disease.
At the USC Michelson Center for Convergent Bioscience, Vsevolod Katritch has developed a computational platform to speed up the discovery process. The platform, called V-SYNTHES, uses a combination of AI and physics-based tools to evaluate small-molecule candidates for biological targets in quantities that far exceed what humans could synthesize in a lab or even list in a computer database.
“Our approach allows us to streamline discovery because from trillions of possible compounds, we can computationally predict 100 or 200 that can be synthesized and are likely to work for the target,” says Katritch, professor of quantitative and computational biology and chemistry at USC Dornsife.
An example of the platform’s utility for Alzheimer’s drug discovery is a collaboration between Katritch and Hussein Yassine, professor of neurology at Keck School of Medicine. Yassine studies people who carry a gene variant called APOE ε4, the strongest genetic risk factor for Alzheimer’s. He found that APOE ε4 carriers with elevated levels of an enzyme known as calcium-dependent phospholipase A2, or cPLA2, go on to develop dementia, while those without elevated levels remain cognitively healthy.
cPLA2 breaks down protective omega-3 fatty acids in the brain and triggers damaging inflammation. Yassine and Katritch partnered to identify a small-molecule candidate that can suppress cPLA2 without side effects and without inhibiting other essential enzymes. The research team scanned billions of compounds to find those that can penetrate the brain’s protective barrier and fit into the enzyme’s active site, like a key in a lock.
In less than six months, they identified leading candidates to begin testing in the lab. With support from the National Institutes of Health, the researchers are now working with Stan Louie, professor of clinical pharmacy at the USC Alfred E. Mann School of Pharmacy and Pharmaceutical Sciences, to translate their discoveries into a new drug therapy.
In January, Katritch, Yassine and Louie established the Physics and AI Steered Drug Discovery Center (PHAST-DDC), a joint effort of USC Dornsife, the Keck School of Medicine and USC Mann. The center makes the next-generation drug-discovery technologies from Katritch’s lab, including V-SYNTHES2 and V-SYNTHES-DL (Deep Learning), available to researchers across USC. One-third of the targets the center will test this year are for Alzheimer’s disease drug discovery.
Machine-enhanced memory
One of the most devastating features of Alzheimer’s disease is memory loss. Early in the disease, people begin to lose episodic memory, which is the ability to recall personal experiences and their rich, multilayered contexts: sights, sounds, relationships, emotions. That’s because the hippocampus, a brain region crucial for forming new episodic memories, is among the first major structures affected by Alzheimer’s pathology.
The hippocampus functions like a memory “bridge” that gathers electrochemical signals from the brain’s visual and auditory cortexes and emotion centers into coherent memories and transfers them to other brain regions for storage. In Alzheimer’s disease, that bridge is broken by neurodegeneration.
Dong Song, associate professor of neurological surgery and biomedical engineering at USC Viterbi, has designed an implantable brain-machine interface (BMI) that replicates the bridge function of the hippocampus. The device relies on a computational model Song developed from vast amounts of brain data to mimic the natural signaling behavior of the brain.
Song collaborated with Charles Liu, professor of clinical neurosurgery at the Keck School of Medicine, to test the BMI in patients struggling with episodic memory loss due to epilepsy. Patients were shown sets of images and asked to recall them seconds or days later. The device improved their recall by more than 50%.
Now, Song is harnessing the power of AI combined with the BMI to study how the brain encodes more complex episodic memories formed in everyday life. AI systems can find patterns in the multimodal brain signals involved in such memories and in how these signals move from one brain region to another. “The alignment between natural memories and neural signals is highly complex,” Song says. “That’s where AI can play a major role.”
To advance the BMI hardware, Song is collaborating with Ellis Meng — the Shelly and Ofer Nemirovsky Chair in Convergent Biosciences and professor of biomedical engineering and electrical and computer engineering at USC Viterbi. The current BMI relies on rigid electrodes made of metallic microwire that do not conform to the brain’s “Jell-O-like” structure and can only be implanted in the brain for short periods of time. Song and Meng are developing flexible electrode arrays made from polymers that are 100 times softer than existing electrodes.
“The electrodes are suitable for long-term interface with the body,” Meng says. “They can be placed very close to neurons, maximizing the signal you can record from the brain.”
Song anticipates that AI-enabled BMI systems will be clinically available for improving episodic memory in Alzheimer’s patients within a decade.
Innovation at the speed of AI
USC Alzheimer’s researchers now find themselves at an inflection point where AI has rapidly shifted from a curiosity to an indispensable tool at each stage of the scientific process.
“A year ago, people would say, ‘I don’t think AI is capable of too much. It makes a lot of mistakes,’” Thompson says. “Now, it sees features, statistics and patterns with such unbelievable accuracy that scientists and mathematicians are continually shocked.”
“You just can’t get your head around how fast it is moving,” says Arthur Toga, the Ghada Irani Chair in Neuroscience at the Keck School of Medicine and director of the USC Mark and Mary Stevens Neuroimaging and Informatics Institute. He and his collaborators developed the institute’s Image and Data Archive, the most widely used repository of Alzheimer’s disease observational data in the world.
“We don’t use the same query mechanisms to search the archive we would have just six months ago,” Toga says. “The AI system will find the data, analyze it, produce the results in graphic form and present it all back to you. And if it had to write some software to do that, it gives you the code that it wrote for itself.”
“AI research is one of the fastest-changing, if not the fastest-changing, areas of science now,” Irimia says. “It’s amazing to witness how models that were top-of-the-line just months ago or a year ago are now deprecated and superseded by newer models.”
But that doesn’t mean AI is replacing human expertise. “It’s not like you push a button on the computer and you get a drug for Alzheimer’s,” Katritch emphasizes. “AI gives a range of predictions that need to be experimentally tested. The human factor is still very important.”