Killer robots more likely to be blamed for deaths than other military machines, study finds
When it comes to creative ways of dispatching people, killer robots have long been a go-to for Hollywood producers.
From RoboCop's ED-209 to Blade Runner's replicants, armed automatons are a science fiction favourite in the business of technological termination.
But with rapid advances in Artificial Intelligence and robotics, could fiction soon become science fact?
Researchers at the University of Essex wanted to know how people would feel if robots were deployed by armed forces in the future.
Led by the Department of Psychology's Dr Rael Dawtry, the study found that people perceive robots as more culpable when they are described in more sophisticated terms.
Speaking on the university's website, Dr Dawtry said: “As robots are becoming more sophisticated, they are performing a wider range of tasks with less human involvement.
“Some tasks, such as autonomous driving or military uses of robots, pose a risk to people's safety, which raises questions about how, and where, responsibility will be assigned when people are harmed by autonomous robots.
“This is an important, emerging issue for law and policy makers to grapple with, for example around the use of autonomous weapons and human rights."
Robots have already been deployed in war zones. Unmanned drones have been used by the US and other armed forces.
The Ukrainian Army is using a robotic evacuation platform, a development that lets soldiers, operating it with a games-console controller, send in a robotic device to retrieve fallen comrades from a safe distance.
The platform can also carry a remote-controlled gun. A mobile robotic machine gun may sound outlandish, but it has worked in trials.
And robotic systems have become increasingly sophisticated.
The University of Essex study, published in the Journal of Experimental Social Psychology, comprised three experiments investigating whether blame is assigned to robots involved in accidents. Specifically, the researchers tested whether people blame targets labelled ‘robots’ more than ‘machines’, and whether they do so because robots are assigned humanlike agency.
In one experiment, volunteers were given a description of an armed two-legged robot. The researchers varied the information about the robot; in some instances it was assigned more ‘sophisticated’ capabilities.
They then read a fictional scenario describing a military raid on a compound to capture or kill a terrorist. In the scenario, a commanding officer ordered a ‘technical operator’ to deploy the robot.
The study says: "Under circumstances described as ‘… not fully clear’, the robot discharged its machine gun, and on entering the compound, the team discovered two deceased men and an injured civilian girl who later died."
Volunteers blamed the robot more when it was described in more sophisticated terms, even though the outcomes were the same.
Dr Dawtry added: “These findings show that how robots’ autonomy is perceived (and, in turn, how blameworthy robots are) is influenced, in a very subtle way, by how they are described.
“For example, we found that simply labelling relatively simple machines, such as those used in factories, as ‘autonomous robots’, led people to perceive them as blameworthy, compared to when they were labelled ‘machines’.
“One implication of our findings is that, as robots become more objectively sophisticated, or are simply made to appear so, they are more likely to be blamed.”
It is hoped the study will help inform governments as use of these technologies advances.