The smartphones parents use to snap adorable photos of their children may soon become a powerful tool for early autism detection, one as effective as it is easy to use.
A new mobile application being developed by UB undergraduate Kun Woo Cho will track and analyze the eye movements of a person looking at images on-screen. Because the gaze patterns of children with autism spectrum disorder (ASD) tend to differ starkly from those of typically developing children, the results provide an immediate and reliable indication of a child’s risk. In a pilot study, the analysis achieved an accuracy of 93.96 percent.
Typically developing children tend to look across key areas of interest that relate to the social scene presented. Here, the gaze pattern, marked in yellow, is concentrated around the faces of the people, as well as on what they themselves are looking at—the candles.
Here, the gaze pattern of a child with ASD is shown to be scattered around the background of the scene, with little or no focus on the people pictured or the social situation taking place.
Cho, a junior majoring in computer science and engineering who developed the application’s computational metric, known as Gaze-Wasserstein, took top honors late last year when she presented her work at the IEEE Wireless Health Conference. She has even drawn the attention of tech giant Apple. With the help of her research adviser, lab co-workers and study co-authors, Cho continues to refine the prototype app and prepare it for distribution. Once in the hands of parents, this biofeedback breakthrough will hopefully lead to easier and earlier diagnoses—and better outcomes—for children with ASD.
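The article does not describe how Gaze-Wasserstein works internally, but the general idea behind Wasserstein-based gaze comparison can be sketched: measure how far a child’s gaze distribution sits from a reference distribution of typical gaze behavior, with larger distances suggesting higher risk. The sketch below is purely illustrative, assuming one-dimensional gaze coordinates and equal-size samples; all names and numbers are hypothetical, not Cho’s actual metric.

```python
# Illustrative sketch only: this is NOT the actual Gaze-Wasserstein metric,
# just a toy 1-D Wasserstein (earth mover's) distance between gaze samples.

def wasserstein_1d(a, b):
    """W1 distance between two equal-size 1-D empirical samples."""
    xs, ys = sorted(a), sorted(b)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

def gaze_risk_score(gaze_x, reference_x):
    # Larger distance from the typical-gaze reference -> higher score.
    return wasserstein_1d(gaze_x, reference_x)

# Toy data: horizontal gaze positions normalized to [0, 1], where faces in
# the scene sit near x = 0.5. "Scattered" gaze avoids that region.
typical   = [0.48, 0.50, 0.52, 0.49, 0.51, 0.50]
focused   = [0.47, 0.51, 0.50, 0.52, 0.48, 0.49]
scattered = [0.05, 0.90, 0.15, 0.80, 0.30, 0.95]

print(gaze_risk_score(focused, typical))    # small distance
print(gaze_risk_score(scattered, typical))  # much larger distance
```

In this toy setup, the gaze concentrated around the faces yields a far smaller distance from the reference than the scattered gaze, mirroring the contrast between the two gaze patterns described above.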