
Envision Summit 2025: Retinal imaging to diagnose and identify neurodegenerative diseases using machine learning.

At the Envision Summit 2025 in San Juan, Puerto Rico, Sharon Fekrat, MD, FACS, FASRS talked about multi-modal retinal and choroidal imaging to diagnose and identify neurodegenerative diseases using machine learning.

Video Transcript:

Editor's note: The transcript below has been lightly edited for clarity.

Sharon Fekrat, MD, FACS, FASRS:

My name is Sharon Fekrat, and I am a retina specialist at the Duke University School of Medicine. I am vice chair of faculty affairs and professor of ophthalmology and neurology. I'm also proud to say that I am director of the iMIND Research Group.

Today I had a chance to talk about our research, multi-modal retinal and choroidal imaging to diagnose and identify neurodegenerative diseases using machine learning. And so we have a very exciting group of students and residents, and we started our research in 2017 when we had these 96-year-old identical twins in my clinic. One of them had very advanced Alzheimer's disease, and the other one was cognitively normal—using a smartphone and driving. And I knew that this was an opportunity to take pictures of the retina and look for differences. And boy, did we find some differences.

The twin with advanced Alzheimer's disease had markedly decreased vessel density in her retina. And so we knew we were on to something. We have since developed a convolutional neural network that can identify Alzheimer's disease compared to those who are cognitively normal. We also have a convolutional neural network that can distinguish mild cognitive impairment from cognitively normal individuals. And very excitingly, we have one now that can identify Parkinson's disease in comparison to those who are cognitively normal.

So a lot of exciting work, and I was really glad to share it with everyone. You know, our group is very excited to use machine learning, because traditional statistics alone, looking at the quantitative metrics from the retinal images such as OCT and OCT angiography, only take us so far. The traditional statistics show us differences between the two groups, but we're not exactly clear: should we be looking in this part of the retina? Should we be looking at a different part of the retina? What metrics should we be looking at? And so machine learning levels that playing field. We're looking at, you know, attention maps and trying to figure out what exactly the machine learning models are looking at on the image inputs. But yes, attention maps are still something that is very novel to many of us.
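As a conceptual illustration only (this is not the iMIND group's actual pipeline, and the model here is a hypothetical stand-in for a trained CNN), one simple way to ask "what part of the image is the model looking at?" is an occlusion-sensitivity map: slide a blanking patch across the image and record how much the model's score drops at each location. A minimal NumPy sketch:

```python
import numpy as np

def occlusion_map(image, model_score, patch=4):
    """Occlusion sensitivity: replace each patch-sized region with the
    image mean and record how much the model's score drops there."""
    base = model_score(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = image.mean()
            heat[i // patch, j // patch] = base - model_score(occluded)
    return heat

# Toy stand-in "model" (an assumption for illustration): scores an image
# by the mean intensity of a central region, loosely imagining a metric
# like vessel density computed from a central zone of a retinal scan.
def toy_score(img):
    return img[8:16, 8:16].mean()

rng = np.random.default_rng(0)
img = rng.random((24, 24))          # fake 24x24 "retinal image"
heat = occlusion_map(img, toy_score, patch=4)
```

Regions the model ignores come out near zero in `heat`, while regions it depends on light up; gradient-based attention maps applied to real CNNs follow the same intuition of attributing the output back to image locations.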
