research news

AI institute shows NSF, IES how it’s building education tools from the ground up

Alison Hendricks stands at a podium at the National AI Institute for Exceptional Education.

Alison Hendricks, associate professor in the Department of Communicative Disorders and Sciences, discusses her work at the National AI Institute for Exceptional Education. Photo: Meredith Forrest Kulwicki

By TOM DINKI

Published May 7, 2026

“Our partnerships — with industry, schools and nonprofits — are critical for translation. They help us move from prototypes to deployable systems.”
Venu Govindaraju, senior vice president for research, innovation and economic development

To work properly, AI needs data. A lot of data. 

That presented something of a challenge for researchers at the National AI Institute for Exceptional Education. The type of data they needed — from children’s handwriting samples to videos of parents reading to their children — simply didn’t exist at a large enough scale to train AI models. 

“We’ve had to kind of create these datasets ourselves,” said Alison Hendricks, associate professor in the Department of Communicative Disorders and Sciences.

Since its launch, the institute has collected more than 2,000 handwriting samples from elementary schools and produced hundreds of video recordings of children interacting with parents, educators and speech specialists. 

Typically happening behind the scenes, this foundational, infrastructure-building work was on display this week during a site visit for the institute, which is supported by a five-year, $20 million grant awarded in 2023 by the U.S. National Science Foundation (NSF) and the Institute of Education Sciences (IES) at the U.S. Department of Education.

A review panel assembled by NSF and IES received an update on the institute, which is developing AI tools that can help the 3.4 million children in the U.S. who need speech and language services while easing the nationwide shortage of speech-language pathologists (SLPs).

To date, the institute has produced more than 150 studies and built a network of over 70 organizational partners. Altogether, approximately 700 children, 180 SLPs, 80 teachers and 140 parents and caregivers have participated in the research. 

“Our partnerships — with industry, schools and nonprofits — are critical for translation. They help us move from prototypes to deployable systems,” said the institute’s principal investigator, Venu Govindaraju, senior vice president for research, innovation and economic development. “This ensures that what we build is informed by real needs and reaches the people it is intended to serve.”

Researchers are developing both the AI screener, a suite of tools designed to identify children who may need a formal speech or language evaluation, and the AI Orchestrator, a set of virtual teaching assistants that can help SLPs support children with diagnosed speech and language-processing disorders.

“Examples of the tools include InkSight, which can be used to screen for dyslexia and dysgraphia using children’s writing samples, and SLPStudio, which generates intervention materials personalized for a student based on their learning profile,” said Srirangaraj “Ranga” Setlur, managing director of the institute.

Govindaraju and Setlur, who have led the development of these tools, said they will be field tested over the next year in area school districts.

InkSight is based in part on the Dysgraphia and Dyslexia Behavioral Indicator Checklist (DDBIC), which was co-developed by institute researcher Abbie Olszewski, associate professor of literacy studies at the University of Nevada, Reno. 

“Some dysgraphia characteristics involve spacing, use of margin and other writing characteristics. There’s some overlapping characteristics, such as spelling errors and letter reversals. And then you have some dyslexia-only characteristics like lack of vocabulary diversity and idea development,” Olszewski said. 
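As a rough illustration of how a screener might use a checklist like this, the sketch below groups indicators into the three buckets Olszewski describes. The indicator names are paraphrased from her quote, not the official DDBIC items:

```python
# Hypothetical grouping of writing indicators, paraphrased from the
# three DDBIC categories described above; not the official checklist.
DDBIC_INDICATORS = {
    "dysgraphia_only": ["spacing", "margin_use"],
    "overlapping": ["spelling_errors", "letter_reversals"],
    "dyslexia_only": ["vocabulary_diversity", "idea_development"],
}

def flag_sample(observed: set[str]) -> dict[str, int]:
    """Count how many indicators from each category appear in a writing sample."""
    return {cat: sum(1 for item in items if item in observed)
            for cat, items in DDBIC_INDICATORS.items()}

# A sample showing one dysgraphia-only and one overlapping indicator.
flags = flag_sample({"spelling_errors", "spacing"})
```

Separating the overlapping indicators from the disorder-specific ones is what lets a tool like InkSight suggest which formal evaluation a child may need.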

For InkSight, the institute collected over 1,600 handwriting samples from elementary school students in Reno, as well as more than 600 from students in nearby Grand Island.

The collection process proved to be a learning experience for everyone on the team. Olszewski said she initially assumed researchers would need handwriting samples from about 50 students — only to learn from the computer scientists that they would need data from thousands.

Conversely, Olszewski said visiting a Reno elementary school and watching handwriting assessments administered firsthand was something of a wake-up call for the computer scientists.

“We could talk all we wanted when planning how to do these studies, but actually having them come and see and collect some data with us and work through some of the bugs in the technology was super helpful,” she said.

Even after data collection, researchers have had to work to ensure AI systems can properly analyze it. For example, the model would initially autocorrect children’s grammar and spelling mistakes, as well as erase faint handwriting while enhancing irrelevant marks on the page.

“This motivated us to develop a suite of recognition models that are tailored specifically to children’s handwriting,” said Sahana Rangasrinivasan, a dual PhD student in UB’s Department of Computer Science and Engineering and Amrita Vishwa Vidyapeetham in India. “This will enable a more faithful transcription of a child’s handwriting.”

Another one of the institute's screening tools passively monitored 100 children over three months at childcare centers across the U.S. The tool was 90% accurate at identifying which children were at risk of having a language delay and which were not.

“SLPs are wonderful, but they have a lot of work to do. They need support and they need help,” said institute co-PI Julie Kientz, professor and chair of the Department of Human Centered Design & Engineering at the University of Washington.

One part of the AI Orchestrator is PaiCoach, which supports parent-child interactions at home. 

Three families with children who have autism spectrum disorder and limited verbal ability recorded a total of 473 videos of themselves reading to their children. AI then analyzed the videos and gave the parents a score. It also gave timestamped feedback on specific moments, like complimenting them on asking their child a question about the book but reminding them to pause a few seconds to give their child time to think and respond. 
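A tool like this might represent its output as an overall session score plus timestamped feedback entries. The sketch below is a hypothetical record layout for illustration; the field names are assumptions, not PaiCoach’s actual schema:

```python
from dataclasses import dataclass

@dataclass
class FeedbackItem:
    timestamp_s: float   # position in the recorded video, in seconds
    praise: bool         # True for reinforcement, False for a suggestion
    message: str

@dataclass
class SessionReport:
    overall_score: float        # hypothetical session score shown to the parent
    items: list[FeedbackItem]

# Example mirroring the feedback described above.
report = SessionReport(
    overall_score=82.0,
    items=[
        FeedbackItem(41.5, True, "Nice job asking your child a question about the book."),
        FeedbackItem(44.0, False, "Pause a few seconds so your child has time to respond."),
    ],
)
```

Anchoring each comment to a moment in the video lets parents replay exactly what they did well and where to adjust.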

The results thus far suggest PaiCoach can improve parents’ performance and their child’s communicative responses. 

“After they use the tool, look at the parents’ sense of confidence. The parents feel more empowered,” said institute co-PI Jinjun Xiong, a former UB engineering professor who is now founding dean of the College of AI, Cyber and Computing at the University of Texas, San Antonio.

In addition to screening and intervention tools for SLPs, the institute is developing a virtual reality environment to support the training and evaluation of SLPs and special education teachers, Setlur said.

Govindaraju noted that some of this work is being powered by Empire AI, the more than $500 million statewide research consortium whose supercomputing center is located at UB.  

“We’ve become a nexus, bringing together researchers, practitioners and stakeholders around AI in special education,” he said.