Research News

To advance robot swarms, UB engineers turn to video games

Students create a simulated environment to demonstrate on a small scale how teams of autonomous air and ground robots can work together.

Inside UB's SMART Motion Capture Lab, students create a simulated environment to demonstrate how autonomous air and ground robots can work together. Photo: Douglas Levere

By CORY NEALON

Published January 24, 2020


The key to improving robot swarm technology may lie within video games.

That’s according to a research team from UB’s Artificial Intelligence Institute, which received a $316,000 federal grant to study the decisions people make — as well as biometric information such as their brain waves and eye movements — while gaming.

Researchers will use this data to build artificial intelligence they believe can improve coordination among teams of autonomous air and ground robots.

“The idea is to eventually scale up to 250 aerial and ground robots, working in highly complex situations. For example, there may be a sudden loss of visibility due to smoke during an emergency. The robots need to be able to effectively communicate and adapt to challenges like that,” says the grant’s principal investigator, Souma Chowdhury, assistant professor of mechanical and aerospace engineering, School of Engineering and Applied Sciences.

Co-investigators include David Doermann, director of UB's Artificial Intelligence Institute and SUNY Empire Innovation Professor of computer science and engineering; Ehsan Esfahani, associate professor of mechanical and aerospace engineering; and Karthik Dantu, assistant professor of computer science and engineering.

Swarm robotics research draws inspiration from many sources, among them ant colonies and schooling fish. But the potential to improve AI systems by learning from humans is enormous, says Chowdhury, a member of UB's Sustainable Manufacturing and Advanced Robotic Technologies (SMART) Community of Excellence.
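The article does not spell out what those bio-inspired rules look like, but a classic illustration is Craig Reynolds' "boids" model, in which flocking or schooling behavior emerges from three purely local rules: separation, alignment and cohesion. The Python sketch below is a minimal, assumed illustration of that idea; the 2-D setup and every parameter are invented for demonstration and are not the UB team's algorithms.

```python
# A minimal sketch of bio-inspired swarming: Craig Reynolds' classic "boids"
# rules (separation, alignment, cohesion), which produce flocking/schooling
# behavior from purely local interactions. All parameters here are invented
# for illustration; this is not the UB team's algorithm.
import numpy as np

N, STEPS, RADIUS = 20, 100, 2.0            # swarm size, steps, neighbor radius
rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, (N, 2))           # 2-D positions
vel = rng.uniform(-1, 1, (N, 2))           # 2-D velocities

def step(pos, vel, dt=0.1):
    new_vel = vel.copy()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (d > 0) & (d < RADIUS)      # agents within sensing range
        if not nbrs.any():
            continue
        cohesion = pos[nbrs].mean(axis=0) - pos[i]     # steer toward local center
        alignment = vel[nbrs].mean(axis=0) - vel[i]    # match neighbors' velocity
        separation = (pos[i] - pos[nbrs]).sum(axis=0)  # push away from crowding
        new_vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.05 * separation
    speed = np.maximum(np.linalg.norm(new_vel, axis=1, keepdims=True), 1e-9)
    new_vel *= np.minimum(speed, 2.0) / speed          # cap each agent's speed
    return pos + new_vel * dt, new_vel

for _ in range(STEPS):
    pos, vel = step(pos, vel)
print("swarm spread after flocking:", np.linalg.norm(pos - pos.mean(0), axis=1).mean())
```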


The idea is to eventually scale up to 250 aerial and ground robots, working in highly complex situations. Photo: Douglas Levere

The study, funded by the Defense Advanced Research Projects Agency (DARPA), will center on real-time strategy games. Unlike turn-based games, these unfold continuously, with players gathering resources to build units and defeat opponents. Examples include StarCraft, Stellaris and Company of Heroes.

Students will play a basic real-time strategy game developed by the research team. In addition to recording the decisions the gamers make, researchers will track their eye movements with high-speed cameras and their brain activity through electroencephalograms.
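The article does not describe how those three data streams are stored or combined, but a core practical step in any such study is time-aligning each in-game decision with the biometric readings taken at that moment. The sketch below shows one hypothetical way to do that; every class, field and sample rate here is an assumption for illustration, not the team's actual pipeline.

```python
# A hypothetical sketch of time-aligning in-game decisions with biometric
# streams. The team's actual logging pipeline and data formats are not
# described in the article; every name and field here is illustrative.
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class GameEvent:
    t: float          # seconds since session start
    action: str       # e.g., "build_unit", "move_squad"

@dataclass
class BioSample:
    t: float          # seconds since session start
    eeg: list         # per-channel EEG readings at this instant
    gaze: tuple       # (x, y) screen coordinates from the eye tracker

def nearest_sample(samples, t):
    """Return the biometric sample closest in time to a game event."""
    times = [s.t for s in samples]         # samples are sorted by time
    i = bisect_left(times, t)
    candidates = samples[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

# Pair each recorded decision with the biometric state at that moment,
# yielding (biometrics, action) examples for later model training.
events = [GameEvent(1.20, "build_unit"), GameEvent(3.85, "move_squad")]
stream = [BioSample(t / 10, eeg=[0.0] * 8, gaze=(640, 360)) for t in range(50)]
dataset = [(nearest_sample(stream, e.t), e.action) for e in events]
print(len(dataset), "time-aligned training examples")
```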


Inside UB's SMART Motion Capture Lab, researchers measure brain waves from students who play video games that simulate environments for autonomous robots. Photo: Douglas Levere

The team will use the data to create artificial intelligence algorithms that guide the behavior of autonomous air and ground robots.

“We don’t want the AI system just to mimic human behavior; we want it to form a deeper understanding of what motivates human actions. That’s what will lead to more advanced AI,” Chowdhury says.
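In machine-learning terms, the distinction Chowdhury draws is roughly the one between behavioral cloning, which imitates recorded actions directly, and inverse reinforcement learning, which tries to recover the reward (the motivation) that makes those actions sensible. The team's actual methods are not described in the article, so the sketch below is only an assumed, minimal illustration of reward inference: a perceptron-style update that learns weights under which the human's chosen action out-scores the alternatives. All features and data are synthetic.

```python
# A minimal, assumed sketch of reward inference (inverse-RL flavor), as a
# contrast to simply cloning recorded actions. Everything here is synthetic;
# the UB team's actual approach is not described in the article.
import numpy as np

rng = np.random.default_rng(0)
D = 4                                      # reward-feature dimension
w = np.zeros(D)                            # learned reward weights ("motivation")

def features(state, action):
    """Hypothetical reward features of taking `action` in `state`."""
    return np.tanh(state + action)         # stand-in for real game features

actions = rng.normal(size=(3, D))          # 3 candidate actions per decision
states = rng.normal(size=(200, D))         # 200 observed game states
demos = [(s, int(np.argmax(s @ actions.T))) for s in states]  # synthetic "human"

# Structured-perceptron update: nudge w until the human's chosen action
# out-scores the model's current favorite under the learned reward.
for s, chosen in demos:
    scores = [w @ features(s, a) for a in actions]
    best = int(np.argmax(scores))
    if best != chosen:
        w += features(s, actions[chosen]) - features(s, actions[best])

# w now rationalizes the demonstrations: it scores what the human chose
# highly, rather than merely memorizing state-to-action mappings.
print("learned reward weights:", w)
```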

Eventually, the team will integrate the artificial intelligence it develops into more sophisticated virtual environments created by DARPA's partner organizations and evaluate its performance there.

“This project is one example of how machine intelligence systems can address complex, large-scale heterogeneous planning tasks, and how the University at Buffalo Artificial Intelligence Institute is tackling fundamental issues at the forefront of AI,” says Doermann.