Facing the heartbreak of memory loss? Dementia, a group of disorders that gradually erode the ability to remember, think, and function, affects millions worldwide. Alzheimer's disease (AD) and frontotemporal dementia (FTD) are two prominent forms, each inflicting its own distinctive damage on the brain. In 2025, approximately 7.2 million Americans aged 65 and older are expected to be living with AD. FTD, while less common, often strikes people in their 40s to 60s, making it a significant cause of early-onset dementia.
These diseases, though distinct in their impact, often present overlapping symptoms, leading to potential misdiagnoses. AD primarily targets memory and spatial awareness, whereas FTD affects areas responsible for behavior, personality, and language. Accurate diagnosis is not just a scientific pursuit but a clinical necessity, profoundly influencing treatment, care, and the quality of life for those affected.
Traditional diagnostic methods like MRI and PET scans are effective but come with significant costs, time constraints, and the need for specialized equipment. Electroencephalography (EEG), on the other hand, offers a more accessible alternative. This non-invasive technique measures the brain's electrical activity through electrodes placed on the scalp, providing a window into the brain's electrical signals. However, these signals are noisy and vary greatly between individuals, making analysis challenging. Even with machine learning applied to EEG data, differentiating AD from FTD has remained a hurdle.
To address this, researchers at the College of Engineering and Computer Science at Florida Atlantic University have developed a deep learning model that aims to revolutionize the detection and evaluation of AD and FTD. The model improves both the accuracy and the interpretability of EEG-based diagnosis by analyzing frequency- and time-based brain activity patterns linked to each disease.
The study, published in the journal Biomedical Signal Processing and Control, revealed that slow delta brain waves are a key biomarker for both AD and FTD, particularly in the frontal and central regions of the brain. However, in AD, brain activity disruption was more widespread, affecting other brain regions and frequency bands like beta, indicating more extensive brain damage. This difference helps explain why AD is often easier to detect than FTD.
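Band-specific findings like these come from estimating how much signal power falls into each canonical EEG frequency band (delta, theta, alpha, beta). The sketch below shows one common way to do that with Welch's method on a synthetic single-channel signal; it is illustrative only, not the study's pipeline, and the sampling rate and band edges are assumptions.

```python
# Illustrative sketch: per-band EEG power via Welch's method.
# Not the authors' pipeline; the signal, sampling rate, and band
# edges below are assumptions for demonstration.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),
}

def band_powers(eeg, fs=FS):
    """Return absolute power per canonical band for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    df = freqs[1] - freqs[0]  # frequency resolution of the PSD
    return {
        name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
        for name, (lo, hi) in BANDS.items()
    }

# Synthetic channel: a strong 2 Hz (delta-band) oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
signal = 3.0 * np.sin(2 * np.pi * 2.0 * t) + 0.5 * rng.standard_normal(t.size)
p = band_powers(signal)  # delta power dominates for this signal
```

In a real analysis this would be computed per electrode, so that elevated delta power could be localized to, say, frontal and central sites.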
The model achieved remarkable accuracy, distinguishing individuals with dementia (AD or FTD) from cognitively normal participants with over 90% accuracy. It also predicted disease severity, with relative errors below 35% for AD and 15.5% for FTD.
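"Relative error" here means the prediction's deviation as a fraction of the true severity score. A minimal worked example, with made-up cognitive-score values (not data from the study):

```python
# Relative error for severity prediction, e.g. against an MMSE-style
# cognitive score. The scores below are invented for illustration.
import numpy as np

true_scores = np.array([20.0, 15.0, 25.0])   # hypothetical true scores
pred_scores = np.array([22.0, 13.5, 24.0])   # hypothetical model outputs

rel_err = np.abs(pred_scores - true_scores) / true_scores
mean_rel_err = rel_err.mean()  # 0.08, i.e. 8% average relative error
```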
Because AD and FTD share similar symptoms and brain activity, distinguishing between them has been a challenge. By employing feature selection, researchers significantly boosted the model's specificity – its ability to correctly identify individuals without the disease – from 26% to 65%. Their two-stage design, which first identifies healthy individuals and then differentiates between AD and FTD, achieved an impressive 84% accuracy, placing it among the top EEG-based methods to date.
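The two-stage idea can be sketched as a pair of chained classifiers: the first screens healthy versus dementia, and the second, trained only on dementia cases, separates AD from FTD. The sketch below uses toy 2-D features and logistic regression purely to show the control flow; the features, classifiers, and cluster positions are all placeholder assumptions, not the authors' model.

```python
# Sketch of a two-stage classification scheme: stage 1 separates
# healthy controls from dementia, stage 2 splits AD vs. FTD.
# Features and classifiers are placeholders, not the published model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_cluster(center, n=100):
    # Synthetic 2-D "EEG features" around a class center.
    return rng.normal(center, 0.5, size=(n, 2))

# Labels: 0 = healthy, 1 = AD, 2 = FTD (cluster centers are arbitrary).
X = np.vstack([make_cluster([0, 0]), make_cluster([3, 0]), make_cluster([3, 3])])
y = np.array([0] * 100 + [1] * 100 + [2] * 100)

# Stage 1: healthy vs. any dementia.
stage1 = LogisticRegression().fit(X, (y > 0).astype(int))
# Stage 2: AD vs. FTD, trained only on dementia subjects.
dem = y > 0
stage2 = LogisticRegression().fit(X[dem], y[dem])

def predict(x):
    x = np.atleast_2d(x)
    if stage1.predict(x)[0] == 0:
        return "healthy"
    return {1: "AD", 2: "FTD"}[stage2.predict(x)[0]]
```

Splitting the problem this way lets each stage specialize: stage 2 never has to model healthy brains, which is one plausible reason staged designs help when the two diseases look similar.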
The model cleverly combines convolutional neural networks and attention-based LSTMs to detect both the type and severity of dementia from EEG data. Furthermore, Grad-CAM provides valuable insights by visualizing which brain signals influenced the model's decisions, helping clinicians understand its inner workings. This approach offers a novel perspective on how brain activity evolves and which regions and frequencies drive diagnosis – something traditional tools often miss.
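The general pattern of that architecture can be sketched in a few lines of PyTorch: a 1-D convolution extracts local patterns from each electrode's signal, an LSTM models how those patterns evolve over time, and an attention layer weights the time steps before classification. This is a minimal sketch of the pattern, not the published architecture; the layer sizes, the 19-electrode input, and the 3-class head are illustrative assumptions.

```python
# Minimal sketch of a CNN feeding an attention-pooled LSTM -- the
# general pattern described above, NOT the published architecture.
# Channel counts, layer sizes, and class count are assumptions.
import torch
import torch.nn as nn

class CnnAttnLstm(nn.Module):
    def __init__(self, n_channels=19, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolution over time extracts local waveform features.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # LSTM models how those features evolve across the recording.
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        # Attention scores one weight per time step, then pools.
        self.attn = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        h = self.conv(x).transpose(1, 2)       # (batch, time', 32)
        out, _ = self.lstm(h)                  # (batch, time', hidden)
        w = torch.softmax(self.attn(out), dim=1)  # per-step attention weights
        pooled = (w * out).sum(dim=1)          # weighted sum over time
        return self.head(pooled)               # (batch, n_classes)

model = CnnAttnLstm()
logits = model(torch.randn(2, 19, 512))  # 2 trials, 19 electrodes, 512 samples
```

The attention weights `w` are also what makes this style of model partially inspectable: large weights mark the time segments the classifier leaned on, complementing Grad-CAM-style visualizations.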
“What makes our study novel is how we used deep learning to extract both spatial and temporal information from EEG signals,” explains Tuan Vo, the study's first author. “By doing this, we can detect subtle brainwave patterns linked to Alzheimer’s and frontotemporal dementia that would otherwise go unnoticed. Our model doesn’t just identify the disease – it also estimates how severe it is, offering a more complete picture of each patient’s condition.”
The findings also shed light on the differences between the two diseases, revealing that AD tends to be more severe, affecting a broader range of brain areas and leading to lower cognitive scores, while FTD's effects are more localized to the frontal and temporal lobes. These insights align with previous neuroimaging studies but add new depth by showing how these patterns manifest in EEG data – an inexpensive and noninvasive diagnostic tool.
“Our findings show that Alzheimer’s disease disrupts brain activity more broadly, especially in the frontal, parietal and temporal regions, while frontotemporal dementia mainly affects the frontal and central areas,” says Dr. Hanqi Zhuang, co-author of the study. “This difference explains why Alzheimer’s is often easier to detect. However, our work also shows that careful feature selection can significantly improve how well we distinguish FTD from Alzheimer’s.”
Overall, this research demonstrates how deep learning can streamline dementia diagnosis by combining detection and severity assessment into a single system. This approach could reduce the need for lengthy evaluations, providing clinicians with real-time tools to track disease progression.
“This work demonstrates how merging engineering, AI and neuroscience can transform how we confront major health challenges,” adds Dr. Stella Batalama, Dean of the College of Engineering and Computer Science. “With millions affected by Alzheimer’s and frontotemporal dementia, breakthroughs like this open the door to earlier detection, more personalized care, and interventions that can truly improve lives.”