Eureka!

The Real Problem with Killer Robots

How we miss the human factor of this high-tech threat

By Bert Gambini

“When we are defining what robots are and what they do, we also define what it means to be a human in this culture and this society.”
Tero Karppi, assistant professor of media study

Killer robots have broken out of the world of science fiction and entered ours.

Such talk may sound like an exercise in fantasy, especially if you’re thinking along the lines of films like “The Terminator.” But agencies really are working to build the operative foundation of fully autonomous weapons that seek, identify and attack enemy targets. An $18 million Pentagon budget line is devoted to developing these technologies, according to The New York Times. Meanwhile, an international coalition, known as the Campaign to Stop Killer Robots, has formed to halt them in their tracks.

But even if the campaign succeeds, banning these instruments may be only a temporary solution. At the root of the problem, according to a UB research team, is the fact that society is entering a new reality in which weapon systems like these have become possible. Killer robots raise big questions that will define the coming age of automation, artificial intelligence and robotics.

“Are humans better than robots to make decisions? If not, then what separates humans from robots? When we are defining what robots are and what they do, we also define what it means to be a human in this culture and this society,” says Tero Karppi, assistant professor of media study, whose paper with Marc Böhlen, professor of media study, and graduate student Yvette Granata appeared in the International Journal of Cultural Studies last October.

Governance and control of systems like killer robots need to go beyond the end products, they caution. “We need to go back and look at the history of machine learning, pattern recognition and predictive modeling, and how these things are conceived,” says Karppi, an expert in critical platform and software studies whose interests include automation, artificial intelligence and how these systems fail us. “We have to deconstruct the term ‘killer robot’ into smaller cultural techniques,” he says.

Tero Karppi

Cultural techniques refer to the elements that give rise to technical developments in the first place. In media theory, the cultural-techniques approach looks at multiple evolutionary chains of thought, technology, imagination and knowledge production in an attempt to understand how all of these come to generate new systems and concepts. Cultural techniques provide insight into the process of becoming—in other words, how we got to now.

“Cultural techniques create distinctions in the world,” says Karppi, noting that the conceptual differences between friend and foe, human and machine, life and death pose not only technical but also ethical and conceptual challenges for robot developers. “Previously, humans have had the agency on the battlefield to pull the trigger, but what happens when this agency is given to a robot and because of its complexity we can’t even trace why particular decisions are made in particular situations?”

Fixating on killer robots themselves could be a dangerous distraction, the authors argue; we need to reconsider where the actual threat lies. “We shouldn’t focus on what is technologically possible,” Karppi says, “but rather the ideological, cultural and political motivations that drive these technological developments.”