Launched in April 2024 by Governor Kathy Hochul, Empire AI is a bold partnership of New York’s leading public and private universities coming together to establish a state-of-the-art artificial intelligence computing center, housed at SUNY’s University at Buffalo. Empire AI is already facilitating statewide innovation, research, and development of AI technologies.
AI-enhanced medical imaging
A multidisciplinary team will employ Empire AI’s computing power to create an AI tool that helps doctors better diagnose and monitor diseases. It will generate biomedical images by combining pathology reports with image-generating AI models that are trained on medical scans. The tool could lead to more sophisticated models that integrate genetic data with medical imaging.
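To make the idea concrete, here is a minimal sketch of text-conditioned image generation, the core technique described above. It uses a general-purpose Stable Diffusion checkpoint as a stand-in; the project's pathology-trained models and data are not public, so the model name and prompt below are purely illustrative.

```python
# Minimal sketch: generate an image conditioned on text, the pattern a
# pathology-report-driven generator would follow. The checkpoint and prompt
# are illustrative stand-ins, not the project's actual model or data.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # general-domain stand-in checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# In the project, this text would come from a structured pathology report.
report_text = "axial CT slice, small nodule in the right upper lobe"
image = pipe(report_text, num_inference_steps=30).images[0]
image.save("synthetic_scan.png")
```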
Using AI to improve the speed and accuracy of disease detection from tissue samples
This project aims to develop a powerful AI foundation model designed specifically for analyzing histopathology images—microscopic views of tissue samples used to diagnose diseases such as cancer, autoimmune disorders, and infections. Unlike general medical imaging, histopathology requires capturing highly detailed cellular structures and subtle variations that are critical for precise diagnosis. By training on large, diverse datasets and leveraging high‑performance GPU clusters, the model will enable applications ranging from disease classification and automated reporting to advanced image search and retrieval. In partnership with the University at Buffalo’s Pathology Department, this work seeks to advance early detection, improve diagnostic accuracy, and create scalable AI tools for both clinical and research uses.
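One of the applications named above, image search and retrieval, typically works by comparing embedding vectors. A minimal sketch, with random vectors standing in for the embeddings the foundation model would produce:

```python
# Sketch of embedding-based image retrieval: encode each slide tile to a
# vector, then fetch the most similar tiles by cosine similarity. Random
# vectors here stand in for real histopathology embeddings.
import numpy as np

def cosine_top_k(query_vec, bank, k=5):
    """Return indices and scores of the k bank vectors most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    b = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sims = b @ q
    idx = np.argsort(-sims)[:k]
    return idx, sims[idx]

rng = np.random.default_rng(0)
embedding_bank = rng.normal(size=(10_000, 512))  # stand-in for tile embeddings
query = rng.normal(size=512)                     # stand-in for a query tile
idx, scores = cosine_top_k(query, embedding_bank)
print(idx, scores)
```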
Harnessing vision‑language technology to tackle agricultural and environmental challenges
This project is developing a large vision-language model (VLM) tailored for the agriculture and climate sectors. Existing multimodal AI tools can converse across many topics, but they often miss domain-specific details in emerging, specialized areas. By training the model to recognize nuanced agro-climatic concepts and link visual data with complex climate and farming questions, the research aims to create a digital assistant that supports decision-making in sustainable food production and climate resilience.
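The underlying pattern, answering a natural-language question about an image, can be sketched with an off-the-shelf general-purpose model; the project's agro-climate model and training data are not shown here, and the file path is a placeholder.

```python
# Sketch of visual question answering, the interaction pattern a
# domain-tuned agricultural VLM would support. Uses the default
# general-domain model; image path and question are illustrative.
from transformers import pipeline

vqa = pipeline("visual-question-answering")
answer = vqa(
    image="leaf_photo.jpg",  # placeholder path to a field image
    question="Does this leaf show signs of disease?",
)
print(answer)
```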
Using speech, text, and visual cues to support personalized education and developmental assessment
This project, part of the National AI Institute for Exceptional Education, is developing an AI-powered screening tool that can interpret multiple types of information—such as speech, text, and visual cues—to support identification of children’s learning and developmental needs. By training a large language model on a rich, multimodal dataset, the system is designed to recognize patterns in communication and behavior that traditional assessments may not easily detect. The goal is to support educators and specialists in tailoring interventions and strategies for each child, with the potential to inform approaches to assessment and personalized learning.
An interactive multimodal language model for children
UB researchers will use Empire AI to build an agentic, multimodal large language model that delivers real-time, interactive speech-language therapy for children with speech and language challenges. This model will be used by the National AI Institute for Exceptional Education and is designed to plan, deliver, and adapt interventions in real time based on the child’s responses and scene dynamics.
Combining medical imaging with computational physics to deliver fast, personalized insights into neurovascular health
The Measurement and Physics-Driven Generative Models (MPDGM) project combines advanced AI with physics-based modeling to transform standard brain blood flow images into detailed 3D simulations. By integrating medical imaging and computational physics, the research seeks to capture more nuanced patterns in how blood moves through neurovascular systems. Leveraging UB’s Empire AI supercomputing, this approach has the potential to generate faster, more individualized insights that could one day support physicians in understanding complex brain vessel conditions.
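The project’s actual methods are not detailed here, but the general idea of blending sparse measurements with a physics model can be illustrated with a toy physics-informed neural network (PINN) that fits both “measured” velocities and a simple pipe-flow equation. Everything in this sketch, including the flow setting, is an illustrative assumption, not the MPDGM method itself.

```python
# Toy PINN: a network u(r) is trained so that it (1) satisfies a simple
# axisymmetric Stokes (Poiseuille) flow equation, (2) matches a few sparse
# "measurements" standing in for imaging-derived data, and (3) obeys the
# no-slip wall condition. Purely conceptual, not the project's solver.
import torch

torch.manual_seed(0)
R, dpdx, mu = 1.0, -4.0, 1.0  # pipe radius, pressure gradient, viscosity

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Sparse "measured" velocities from the analytic Poiseuille profile
# u(r) = -dpdx * (R^2 - r^2) / (4 * mu), standing in for imaging data.
r_meas = torch.tensor([[0.2], [0.5], [0.8]])
u_meas = -dpdx / (4 * mu) * (R**2 - r_meas**2)
r_wall = torch.tensor([[R]])

for step in range(3000):
    opt.zero_grad()
    r = torch.empty(128, 1).uniform_(0.01, R).requires_grad_(True)
    u = net(r)
    du = torch.autograd.grad(u.sum(), r, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), r, create_graph=True)[0]
    physics = ((mu * (d2u + du / r) - dpdx) ** 2).mean()  # PDE residual
    data = ((net(r_meas) - u_meas) ** 2).mean()           # match measurements
    wall = (net(r_wall) ** 2).mean()                      # no-slip at the wall
    (physics + data + wall).backward()
    opt.step()
```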
Enhancing human-AI interaction
A UB research team within the Department of Computer Science and Engineering and the Department of Communicative Disorders and Sciences focuses on an area of AI research known as personalization of large language models. This involves training models to learn users’ preferences and behaviors so that chatbot experiences can be tailored to each person. The project will leverage Empire AI to develop tools that ensure people with ALS (also known as Lou Gehrig’s disease), cerebral palsy, and other motor disorders have equal access to AI. Researchers plan to enrich augmentative and alternative communication (AAC) devices with conversational AI.
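As a rough illustration of prompt-level personalization (the project itself may fine-tune models on user data; the model, profile, and context below are placeholders):

```python
# Sketch: a stored user profile and recent conversation steer a general
# language model toward AAC suggestions in the user's own style. Model
# choice and profile fields are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

user_profile = "Speaker prefers short, direct sentences about gardening."
context = "Friend: How was your weekend?\nUser:"
prompt = f"{user_profile}\n{context}"

suggestions = generator(prompt, max_new_tokens=20, num_return_sequences=3,
                        do_sample=True, pad_token_id=50256)
for s in suggestions:
    print(s["generated_text"][len(prompt):].strip())  # candidate utterances
```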
AI to study protein structures for drug design
A team of scientists is developing an AI model (SWAXSFold) to improve AlphaFold, a celebrated AI program that predicts the structure of proteins and could lead to the development of new drugs to treat diseases. AlphaFold struggles to account for how proteins change shape in response to pH, temperature, and other conditions. Empire AI’s computing capacity enables SWAXSFold to address this limitation by using experimental data to predict how proteins look in these environments.
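SWAXSFold’s scoring is not described here, but a standard way to bring scattering experiments to bear is to re-rank candidate conformations by the chi-square fit between computed and measured intensity profiles. A toy sketch with synthetic curves:

```python
# Sketch: re-rank predicted conformations by agreement with an experimental
# scattering profile, the general idea behind combining structure prediction
# with SWAXS data. All arrays are synthetic placeholders.
import numpy as np

def chi2(i_exp, i_calc, sigma):
    """Reduced chi-square between experimental and computed intensities,
    with the optimal linear scale factor c fit by least squares."""
    c = np.sum(i_exp * i_calc / sigma**2) / np.sum(i_calc**2 / sigma**2)
    return np.mean(((i_exp - c * i_calc) / sigma) ** 2)

rng = np.random.default_rng(1)
i_exp = rng.lognormal(size=200)          # stand-in measured scattering curve
sigma = 0.05 * i_exp                     # measurement uncertainties
candidates = [i_exp * (1 + 0.02 * rng.normal(size=200)) for _ in range(5)]

scores = [chi2(i_exp, c, sigma) for c in candidates]
best = int(np.argmin(scores))            # conformation most consistent with data
print(best, scores)
```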
AI-powered polypharmacology
UB researchers in the Department of Biomedical Informatics are creating a model (CANDO) that uses AI to analyze billions of interactions between molecules. CANDO’s development will be greatly aided by Empire AI. Its primary goal is to help identify new medicines to treat diseases; however, other potential applications include new research probes, reagents, detergents, pesticides, sensors, and more.
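A common idea behind such platforms, sketched here as a simplified stand-in for CANDO’s actual pipeline, is to give each compound an interaction “signature” across many proteins and rank compounds by signature similarity, so compounds that behave like a known drug become repurposing candidates:

```python
# Sketch of interaction-signature drug repurposing: score every compound
# against a protein library, then rank compounds by how closely their
# signatures match a known drug's. Random values stand in for real
# docking/interaction scores.
import numpy as np

rng = np.random.default_rng(2)
n_compounds, n_proteins = 1_000, 5_000
signatures = rng.random((n_compounds, n_proteins))  # compound x protein scores

def most_similar(idx, k=5):
    """Rank other compounds by RMSD between interaction signatures."""
    diffs = signatures - signatures[idx]
    rmsd = np.sqrt((diffs**2).mean(axis=1))
    order = np.argsort(rmsd)
    return order[order != idx][:k]

print(most_similar(42))  # compounds behaving most like compound 42
```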
Advancing surgical precision and trauma care through AI-enhanced multimodal imaging and data analysis
This research develops advanced AI tools that combine visible and near-infrared imaging to assist surgeons during gastrointestinal procedures. By fusing these multimodal images, the system is designed to support identification of critical anatomy and sterile surgical instruments. Additionally, the AI analyzes electronic medical records and audio communication to potentially inform trauma triage decisions. Together, these technologies aim to advance research into surgical and trauma care applications.
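As a toy illustration of visible/near-infrared fusion (the project’s fusion is learned from data; this hand-written overlay only conveys the idea):

```python
# Sketch: overlay a normalized near-infrared signal on the visible frame so
# strong NIR regions stand out, a simple hand-written stand-in for the
# project's learned image fusion. Frames here are random placeholders.
import numpy as np

def fuse(visible_rgb, nir, alpha=0.6, threshold=0.3):
    """Tint high-NIR pixels green on top of the visible image."""
    nir_norm = (nir - nir.min()) / (nir.max() - nir.min() + 1e-8)
    mask = nir_norm > threshold                      # salient NIR regions
    fused = visible_rgb.astype(float).copy()
    green = np.array([0.0, 255.0, 0.0])
    fused[mask] = (1 - alpha) * fused[mask] + alpha * green
    return fused.astype(np.uint8)

visible = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # placeholder
nir = np.random.rand(480, 640)                                      # placeholder
overlay = fuse(visible, nir)
```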
From transaction data to news archives, AI reveals how information shapes markets
These projects explore how information moves through financial markets and how it can be used to gain a performance edge. One effort applies generative AI to predict the next public market trade—treating sequences of trades like sequences of words—by training large language models on vast transaction and quote datasets. Another uses AI to mine insights from historical financial news and reports, while carefully avoiding “look‑ahead bias” so that back‑tests mirror real‑world conditions. Together, they aim to improve predictive models, uncover hidden patterns in trading behavior, and help investors and researchers better process the immense flow of market information.
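The look-ahead-bias safeguard mentioned above comes down to a simple rule: a backtested decision may only see information time-stamped before the moment the decision is made. A minimal sketch, with illustrative field names and data:

```python
# Sketch of a look-ahead-bias guard: filter the news archive so a simulated
# trading decision sees only articles published before the decision time.
# Column names and rows are illustrative, not the project's dataset.
import pandas as pd

news = pd.DataFrame({
    "published": pd.to_datetime(["2024-01-02 09:15", "2024-01-02 15:58",
                                 "2024-01-03 08:40"]),
    "headline": ["Earnings beat expectations", "CEO resigns",
                 "Regulator opens inquiry"],
})

def usable_news(decision_time):
    """Return only the news that existed before the decision was made."""
    return news[news["published"] < pd.Timestamp(decision_time)]

print(usable_news("2024-01-02 16:00"))  # excludes the Jan 3 article
```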











