A smartphone user can image the eye using the RGB selfie camera and the front-facing near-infrared camera included for facial recognition. Measurements from this imaging could be used to assess the user's cognitive condition. (CREDIT: Digital Health Lab)
SAN DIEGO, Calif. — You may soon be able to screen yourself for neurological diseases like dementia and ADHD using nothing but a smartphone. All you’d have to do is take a selfie — of your eyes. Researchers at the University of California-San Diego are developing a new app that uses eye recordings to assess cognitive health.
The app uses both a near-infrared camera (built into most new smartphones available today) and a regular selfie camera to track changes in pupil size. Those pupil measurements can then help to assess a person’s cognitive condition, study authors explain.
“While there is still a lot of work to be done, I am excited about the potential for using this technology to bring neurological screening out of clinical lab settings and into homes,” says first study author Colin Barry, an electrical and computer engineering Ph.D. student at UC San Diego, in a university release. “We hope that this opens the door to novel explorations of using smartphones to detect and monitor potential health problems earlier on.”
The pupils offer a peek into cognitive functioning. For example, when someone is thinking hard about a tough mental task, or hears an unexpected loud sound, the pupils tend to expand.
The app keeps close track of changes in pupil diameter through a pupil response test. Researchers believe this eye-selfie test can quickly screen for and even monitor any number of neurological diseases and disorders.
How does the brain disease app work?
Normally, that test requires specialized and expensive equipment, making it hard to perform consistently outside of a lab setting. Luckily, engineers from the Digital Health Lab, led by UC San Diego electrical and computer engineering professor Edward Wang, worked together with researchers at the UC San Diego Center for Mental Health Technology (MHTech Center) to develop an affordable, easier way to administer the test.
“A scalable smartphone assessment tool that can be used for large-scale community screenings could facilitate the development of pupil response tests as minimally-invasive and inexpensive tests to aid in the detection and understanding of diseases like Alzheimer’s disease. This could have a huge public health impact,” adds Eric Granholm, a psychiatry professor at UC San Diego School of Medicine and director of the MHTech Center.
The app conceived by the UC San Diego team makes use of smartphones’ near-infrared cameras to detect and track the pupil. Within the near-infrared spectrum, it’s easy to differentiate the pupil from the iris, even in eyes with dark iris colors. So, the app is able to calculate pupil size with sub-millimeter accuracy across a wide range of eye colors.
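The article doesn’t detail the team’s segmentation algorithm, but the core idea (in near-infrared, the pupil appears much darker than the surrounding iris) can be illustrated with a simple intensity threshold. The function name, threshold value, and synthetic test image below are all illustrative assumptions, not the researchers’ actual method:

```python
import numpy as np


def estimate_pupil_diameter_px(nir_image: np.ndarray, threshold: int = 40) -> float:
    """Estimate pupil diameter (in pixels) from a grayscale near-infrared eye crop.

    In NIR imagery the pupil is markedly darker than the iris, so a simple
    intensity threshold can separate the two regions. Illustrative sketch only;
    the app's real segmentation pipeline is not described in the article.
    """
    # Pixels darker than the threshold are treated as pupil.
    pupil_mask = nir_image < threshold
    area_px = float(pupil_mask.sum())
    # Model the pupil as a circle: area = pi * (d/2)^2  =>  d = 2 * sqrt(area / pi)
    return 2.0 * np.sqrt(area_px / np.pi)


# Synthetic example: a dark disc ("pupil", 30 px radius) on a brighter background.
img = np.full((200, 200), 120, dtype=np.uint8)
yy, xx = np.mgrid[:200, :200]
img[(yy - 100) ** 2 + (xx - 100) ** 2 <= 30 ** 2] = 10
print(round(estimate_pupil_diameter_px(img)))  # ~60 px diameter
```

The same dark-pupil contrast is what makes this approach robust across iris colors: dark brown irises that are hard to distinguish from the pupil in visible light still reflect far more NIR than the pupil does.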
Additionally, the app uses a more traditional color picture taken with the smartphone’s selfie camera to measure the stereoscopic distance between the smartphone and the user. The system then converts the pupil size from the near-infrared image into millimeter units.
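Once the camera-to-eye distance is known, converting a pixel measurement to millimeters is a standard pinhole-camera calculation. The sketch below assumes that model and a hypothetical focal length; the paper’s exact calibration procedure may differ:

```python
def pupil_px_to_mm(diameter_px: float, distance_mm: float, focal_length_px: float) -> float:
    """Convert a pupil diameter measured in image pixels to millimeters.

    Uses the pinhole-camera relation: real_size = pixel_size * distance / focal_length,
    where `distance_mm` is the estimated camera-to-eye distance and
    `focal_length_px` is the camera's focal length expressed in pixel units.
    Simplified illustration; the study's calibration details are not given here.
    """
    return diameter_px * distance_mm / focal_length_px


# Example: a 60 px pupil imaged from 300 mm with a 4500 px focal length.
print(pupil_px_to_mm(60, 300, 4500))  # 4.0 mm
```

This is why the distance estimate from the color selfie camera matters: the same pupil spans fewer pixels the farther the phone is held from the face, so pixel counts alone can’t be compared across sessions.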
The new ‘gold standard’ in the palm of your hand?
It’s worth mentioning that measurements made by the app were comparable to those taken by a device called a pupillometer, which scientists consider the gold standard for measuring pupil size. The development team also included a number of features intended to make the app more user friendly for older adults.
“For us, one of the most important factors in technology development is to ensure that these solutions are ultimately usable for anyone. This includes individuals like older adults who might not be accustomed to using smartphones,” Barry explains.
Study authors worked directly with a group of older individuals to design a simple app interface. Features include voice commands, image-based instructions, and an affordable plastic scope to help direct the user’s eyes into the view of the smartphone camera.
“By testing directly with older adults, we learned about ways to improve our system’s overall usability and even helped us innovate older adult specific solutions that make it easier for those with different physical limits to still use our system successfully,” concludes Prof. Wang. “When developing technologies, we must look beyond function as the only metric of success, but understand how our solutions will be utilized by end-users who are very diverse.”
Moving forward, researchers will continue their work on this project. More specifically, they’re now turning their attention toward enabling similar pupillometry functions on older smartphone models. Future studies involving older adults self-screening for dementia are in the planning phase now.
The team presented their findings at the ACM Computer Human Interaction Conference on Human Factors in Computing Systems (CHI 2022).