Helping Kids Find Their Voice

At UB’s National AI Institute for Exceptional Education, “AI for good” means critical support for millions of kids with speech and language disorders

Guided by PaiCoach, one of the apps under development at the National AI Institute for Exceptional Education, Destiny Harris works on her active reading and engagement techniques with son Nicholas DiRosa.

BY CAITLIN DEWEY

MOST NIGHTS, when 4-year-old Nicholas DiRosa begins winding down after dinner, his mom, Destiny Harris, invites him to read a book with her. He settles into his blue, kid-sized armchair. She sits on the floor beside him. Harris then positions her phone to record them both; she’ll later upload the video to an app that reviews their interactions.

Here, at the two-minute mark, it might flag that she lost Nicholas’ attention.

There, at the five-minute mark, it might praise her for pausing to ask him a question.

The app, dubbed Parent’s AI Coach (or PaiCoach), uses artificial intelligence to help the parents of autistic children perfect their active reading and engagement techniques. Those techniques empower parents, in turn, to model critical communication skills to kids who might find them challenging.

Such apps are widely regarded as a promising new frontier for exceptional education—and University at Buffalo researchers are on the leading edge. In 2023, the university won a highly competitive $20 million grant from the National Science Foundation and the Department of Education’s Institute of Education Sciences to establish the National AI Institute for Exceptional Education. The AI Institute, which is headquartered at UB and convenes researchers from nine universities across the country, seeks to revolutionize the treatment of speech and language disorders in children using artificial intelligence.

Venu Govindaraju meets with UB students inside the National AI Institute for Exceptional Education.

In the past two years alone, researchers affiliated with the institute have piloted tools that speed up the diagnosis of language disorders, personalize treatment plans and automate clinical paperwork so that speech therapists can spend more time with their patients. Eventually, the institute hopes to offer schools, parents and speech-language pathologists a suite of practical, interlocking tools that will help diagnose and treat disorders quickly and cheaply, regardless of geographic location.

Along with initiatives like Empire AI, the $500 million statewide research consortium headquartered at UB, the National AI Institute is part of a much larger push to establish the university as a national engine for applied artificial intelligence research. In recent years, UB researchers have attracted tens of millions of dollars in federal and philanthropic funding to study AI applications in fields ranging from health care and education to data science and engineering, earning national recognition for projects that emphasize both real-world impact and technical rigor. At a ribbon-cutting for the institute in April 2024, New York Senator Charles Schumer heralded UB as a trailblazer not just in AI but in “AI as a force for good.”

“This is not science for science’s sake—we aren’t just publishing papers,” says Venu Govindaraju, PhD ’92, MS ’88, UB’s senior vice president for research, innovation and economic development and the director of the AI Institute for Exceptional Education. Instead, he says, UB researchers are focused on developing practical tools and applications that will measurably improve outcomes for kids, teachers, speech therapists and parents.

THE CHALLENGE IS DAUNTING, but important: Speech and language disorders impact millions of American kids. These children may struggle to speak fluently, understand language or maintain back-and-forth conversation; some also contend with dyslexia or dysgraphia—learning disabilities that disrupt how the brain processes written language.

Traditionally, such conditions are diagnosed and treated by a speech-language pathologist, preferably before the child starts elementary school. Many speech and language problems, including developmental delays and issues related to autism, can improve or even resolve with early intervention.

But there are far more children seeking speech and language support than there are qualified pathologists to provide it. According to the U.S. Bureau of Labor Statistics and estimates from the National AI Institute, there are fewer than 80,000 in-school SLPs for the roughly 3.4 million children seeking services.

In some small and rural school districts, that means kids never get screened for language disorders, says Jinjun Xiong, the founding scientific director of the AI Institute. In other districts, the shortage of SLPs has caused at-risk children to go without treatment, even after they’ve received an Individualized Education Program that legally guarantees them a right to services. These challenges often reverberate across a child’s education, affecting literacy, confidence and long-term academic outcomes. SLPs, meanwhile, already report that their caseloads are too high.

“It’s a difficult problem. More and more people need these services, but we’re unable to get enough SLPs into the profession,” Xiong says. “We began asking: Can we use AI?”

The answer, resoundingly, has been “yes.” After deciding to tackle this issue in 2022, Xiong and Govindaraju began assembling a multidisciplinary team—ultimately bringing together roughly 200 researchers, educators and students from across UB and eight partner universities—to pursue a federal research grant. That effort led to the creation of the institute, now one of 29 National Artificial Intelligence Research Institutes in the country and one of only five devoted to education. Proposals are vetted through a highly competitive review process and must demonstrate not only technical excellence, but clear plans for real-world impact.

To that end, the institute’s leadership includes experts from fields like communication sciences and human-computer interaction, ensuring the development of rigorous, practical solutions that educators, therapists and parents will actually want to use. At the same time, the institute draws heavily on UB’s expertise in applied AI research, dating back to the university’s breakthroughs in automatic handwriting recognition more than 30 years ago.

That work, which Govindaraju led with several colleagues from UB, trained early computer systems to identify and understand handwritten text in part by breaking words and letters down to their component parts. The U.S. Postal Service later adopted the technology to automate mail sorting, while computer scientists and engineers at UB pushed that line of research into other domains, from forensic analysis to document recognition. Today, at the National AI Institute, a team of researchers working under Govindaraju’s direction is using similar concepts to develop a computer model that can diagnose dyslexia and dysgraphia based on a child’s handwriting sample.

“It’s been a nice journey,” Govindaraju says, of watching that research evolve. “And to help a child in that way, so they’re not falling behind—the impact is close to the heart.”

Like the post office work before it, this new model posed some novel technical challenges: Traditional machine-reading systems are trained on printed text or adult handwriting, and they typically correct errors like misspellings or irregular letter forms in order to capture the intended content of the text. But in the context of language disorders and learning disabilities, those errors are important diagnostic indicators; the system has to preserve and interpret them.

UB researchers teamed up with SLPs, occupational therapists and teachers both to train the new model and to tailor it to their real-world needs. In early trials, it successfully identified atypical handwriting patterns—like letter reversals, inversions and irregular spacing—that can signal dyslexia or dysgraphia. In the future, a school using the UB model could, for instance, routinely screen student handwriting samples and flag children with atypical writing patterns for further evaluation. That’s important, because evidence from literacy research suggests that interventions started in early elementary school can be nearly twice as effective at closing the gap with typical readers as interventions started in the third grade or later.

“If we can catch the issue early, we can help children improve their speech and language skills so that they’re able to learn to read—and eventually, to read to learn,” says Brian Graham, the superintendent of the Grand Island Central School District and an advisor to the institute. “It’s a really great idea. By catching them early, you can exit them early, too.”

Destiny and her son laughing.

THIS IS THE ULTIMATE GOAL of the National AI Institute for Exceptional Education: to extend cutting-edge, research-backed interventions to as many students as possible. Over the next several years, in fact, the institute hopes to create a unified suite of diagnostic and therapeutic tools that could be cheaply and easily deployed in classrooms and homes across the country, available to anyone with an internet connection. In addition to the handwriting screener, UB researchers have also demonstrated that AI can administer and score at least one standard assessment for speech and language disorders almost as well as humans can. Scientists hope that, in the future, AI will also be able to analyze video and audio streams of classroom interactions and identify children who need follow-up with a speech therapist.

For now, however, tools like the AI speech assessment, dubbed “AutoRSR,” are closest to deployment. The Grand Island school district plans to pilot the application in 2026, when it will offer the parents of roughly 200 kindergarten students the opportunity to opt in to a free screening in partnership with UB. Participating children will speak 16 sentences into a computer—a process that takes less than 15 minutes—and receive a private report that their parents can share with a teacher or pediatrician. The system does not save the child’s voice or other data without permission.

“Privacy is at the center of all our considerations,” says Xiong, who acknowledges that a lot of Americans remain skeptical of AI, especially where children are involved. Unlike many for-profit tech companies, however, academic researchers follow strict rules around data collection, consent and oversight, and UB researchers have taken particular care to ensure their tools are transparent to the people using them.

“You don’t see the same type of pushback [against AI] when you’re helping people in a genuine sense,” Xiong adds.

Elsewhere at the institute, researchers are working to address the treatment side of the equation by developing tools to assist SLPs, special educators and parents. One project uses generative AI to help speech therapists quickly create personalized teaching materials, such as flashcards, for use in therapy sessions. Another line of research seeks to automate speech therapists’ clinical notes, letting them spend less time on paperwork and more time working directly with children.

None of these tools are designed to replace human therapists or teachers, Govindaraju says. Instead, UB has intentionally focused on innovations that make educators’ jobs easier, allowing them to reach more children or offer higher-quality interventions to the kids they’re already working with. Under a philosophy dubbed “human-centered AI,” researchers include educators, SLPs and parents in every stage of the study, and design processes and tools that respond specifically to their needs and concerns. That approach has informed everything from the interface of individual tools to the composition of the institute’s leadership and advisory board. Before researchers even released a working prototype of PaiCoach, the app that Destiny Harris uses with her son Nicholas, they conducted two rounds of structured interviews with speech-language pathologists, special educators and parents of children with speech challenges.

“Parents and specialists really co-design these tools with us,” says Qingxiao Zheng, a postdoctoral associate in the Department of Computer Science and Engineering who designed the look and feel of PaiCoach. “That’s why we call it ‘human-centered AI’—we always want to build a human into that loop.”

In addition to Zheng, PaiCoach’s interdisciplinary research team includes Zhaohui Li, another postdoctoral associate in the computer science department, and Yusuf Akemoglu, a visiting research scholar focused on early childhood special education. For much of his career, Akemoglu has trained parents of autistic and nonverbal children on early interventions that encourage their kids to speak, gesture or otherwise engage in two-way communication. PaiCoach is designed to support and reinforce that training by watching how parents and kids communicate during one key activity—reading books together—and then offering gentle, real-time feedback that parents can carry into their other interactions. The app also lets parents track their child’s progress and ask questions of a chatbot that’s been trained on Akemoglu’s methods.

Akemoglu stresses that traditional therapy and special education services are still critical. But research shows that consistent, frequent practice is key to successful outcomes in speech therapy, which puts a lot of pressure on caregivers. Because AI tools are available around the clock, and at little or no cost, they can help answer parents’ questions and build their confidence between sessions, ultimately improving their children’s results.

“I’ve worked with parents throughout my research career, and this is something I hear all the time,” Akemoglu says. “People will say that this is the first time they’ve felt someone is really trying to help them as a parent.”

ON A RECENT WEEKDAY AFTERNOON, Akemoglu and Zheng met up in a bright, glass-walled conference room on UB’s North Campus, steps from the Lockwood Library. They both work out of the institute’s new 3,000-square-foot headquarters, which includes a bank of open cubicles, a high-ceilinged reception room and an observation lab outfitted to look like an elementary school classroom.

The two researchers reviewed screenshots from the PaiCoach app on a large smart screen, brainstorming potential tweaks and features. They have already completed three rounds of initial user studies to refine the app’s design and functionality. Last November, they launched a formal clinical study to test PaiCoach’s effectiveness in improving children’s communication skills. Once the researchers demonstrate that the app works, they hope to make it available to the general public. Future versions could include a portal allowing SLPs to track a family’s progress, or an expansion that extends PaiCoach’s guidance into other activities, such as play or bedtime routines.

In a country where roughly one in 31 eight-year-olds has an autism spectrum disorder, the app’s potential reach—and its human stakes—are enormous. That, Govindaraju says, is what makes the institute’s work so compelling. Whereas much current AI research prioritizes technical novelty for its own sake, UB is focused on developing tools that help people live happier, healthier, more productive lives. As Sethuraman Panchanathan, former director of the National Science Foundation, said while opening the institute: “This is the AI that is most important to unleash the potential everywhere across our nation.”

Some people are already beginning to see that potential. In the weeks since Destiny Harris began using the PaiCoach app, she’s observed a striking change in Nicholas. The pair used to struggle to read together: Harris would look up from the page to find her son staring off into the distance, or fight to keep him seated for more than a couple of minutes. Like many children on the autism spectrum, Nicholas has faced major speech challenges. As a toddler, he didn’t speak at all, and he still regularly trails off, babbles or grows frustrated when he can’t come up with the right words, Harris says.

Despite securing multiple therapies for the little boy, including speech, occupational and behavioral services, Harris says she often felt “clueless” about how to help him. She used to get migraines straining to dream up games that might hold Nicholas’ attention.

But with the help of PaiCoach, that sort of engagement “comes really naturally”—especially during story time, Harris says. She feels more at ease getting feedback from the app than she ever did reading in front of a human clinician. And as she and Nicholas have developed a structured routine around reading, he has grown more and more engaged—sitting in his chair, choosing a book and even smiling for Harris’ camera phone when she takes it out to record their session for the app.

“Seeing him actually engage with reading and get excited and answer things the proper way—it really makes me happy,” Harris says. “This technology has surpassed all of my expectations.”