Seattle - New data show that a retinal prosthesis system is capable of providing face detection functionality to blind individuals, said Paulo E. Stanga, MD.
The approved device (Argus II Retinal Prosthesis System, Second Sight) is intended to restore some functional vision for people who have lost their sight.
“The question we asked was whether faces can be detected with a retinal prosthesis,” said Dr. Stanga, professor of ophthalmology and retinal regeneration at the University of Manchester, England.
“In the normal use of retinal prostheses, facial detection is limited by a number of factors,” Dr. Stanga said. “Many Argus II users have had difficulty distinguishing faces from other similar sized and shaped objects.”
Approved by the FDA in February, the device can provide electrical stimulation of the retina, which, in turn, induces visual perception in blind individuals with retinitis pigmentosa. The system can provide visual capabilities to those currently unable to see anything except perhaps extremely bright lights.
To address this question, Dr. Stanga and colleagues conducted two experiments: one to evaluate whether device users could locate human faces using a facial detection algorithm, and another to determine whether the speed of detection would improve if changes were made to the field of view mapped onto the Argus implant.
Two magnifications were compared. In some trials, the image-processing algorithm captured a field of view matching that of the implanted array (20° diagonally); in others, the camera's entire 53° field of view was captured and “zoomed out” to fit the array. The time to locate the target was significantly shorter when the wider field of view was used.
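Conceptually, the two mappings amount to crop-then-downsample (matched 20° mode) versus direct downsampling of the full camera frame (zoomed-out 53° mode). The sketch below illustrates this under stated assumptions; the 6×10 electrode grid, the linear angle-to-pixel model, and block averaging are illustrative simplifications, not details of the actual Argus II software.

```python
import numpy as np

ARRAY_ROWS, ARRAY_COLS = 6, 10   # Argus II electrode grid (assumed 6x10 here)
CAMERA_FOV_DEG = 53.0            # camera's diagonal field of view
ARRAY_FOV_DEG = 20.0             # implant array's diagonal field of view


def crop_to_fov(frame, target_fov_deg, camera_fov_deg=CAMERA_FOV_DEG):
    """Crop the central region corresponding to a narrower field of view.

    Uses a simple linear angle-to-pixel approximation; a real pipeline
    would account for lens geometry and distortion.
    """
    h, w = frame.shape
    scale = target_fov_deg / camera_fov_deg
    ch, cw = max(1, int(h * scale)), max(1, int(w * scale))
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top:top + ch, left:left + cw]


def downsample_to_array(frame, rows=ARRAY_ROWS, cols=ARRAY_COLS):
    """Block-average a grayscale frame down to the electrode grid."""
    h, w = frame.shape
    row_edges = np.linspace(0, h, rows + 1).astype(int)
    col_edges = np.linspace(0, w, cols + 1).astype(int)
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = frame[row_edges[i]:row_edges[i + 1],
                          col_edges[j]:col_edges[j + 1]]
            out[i, j] = block.mean()
    return out


frame = np.random.rand(480, 640)  # stand-in grayscale camera frame

# "Matched" mode: crop to the array's 20° field of view, then downsample.
matched = downsample_to_array(crop_to_fov(frame, ARRAY_FOV_DEG))

# "Zoomed-out" mode: downsample the full 53° frame directly,
# trading spatial detail for a wider search area.
zoomed_out = downsample_to_array(frame)
```

Both modes produce a stimulation pattern of the same grid size; the zoomed-out mode simply squeezes a wider scene into it, which is consistent with faces being found faster at the cost of per-electrode detail.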
“Faces were detected and localized in 100% of the trials,” Dr. Stanga said. “Each patient went through ten trials.”
In the second experiment, they evaluated facial detection in a real-world setting, testing whether, during a conversation, a user could determine when eye contact was lost.
“Both subjects detected the loss of eye contact 100% of the time,” Dr. Stanga said. “The time was 5.5 seconds and 6.4 seconds.”
The investigators concluded that this feasibility study demonstrated that image-processing algorithms can enable device users to perform daily tasks that are not limited by the resolution or sensitivity of the array.