Using robots in war zones to examine and disarm hazards or recover objects has become increasingly common.
There is an underlying understanding that it is far more acceptable to lose a robot than to lose a soldier.
However, when robots become valuable members of the team, there is a tendency to treat them like colleagues rather than machines.
According to a recent press release, Professor Mark Billinghurst, Professor of Human Computer Interaction at the University of South Australia, has collaborated with Dr James Wen and other members of the United States Air Force Academy (USAFA) to explore these connections and their impact on team efficiency and productivity on the front line.
Background of the Initiative
Their research shows that for robots to be fully integrated within a human-machine team (HMT), they must first be accepted as teammates.
To facilitate this, a lot of work has been done over the years to make robots more ‘human-like’ by altering their physical characteristics and capabilities.
While humanising robots strengthens the working relationships between soldiers and their robots, it also inflates the value of the robot team members in the minds of military personnel.
This leads to an increased emotional response when the robot is put under stress.
The researchers designed a simulation-based application that tracked the emotional responses of two teams of participants, who undertook a range of simulated tasks with either a personified or non-personified robot.
Findings
The study showed that teams working with a personified robot were 12% less likely to put their robot at risk of destruction than teams working with a non-personified robot.
Moreover, they were more sensitive to the robot’s health and the possibility of seeing the robot ‘killed’ in action.
This is the first time that research has measured how actions can be altered by empathy when potential harm is induced in a simulation.
The results show first-hand how emotional connections can impact decision making in the field.
Evidence shows that teams working with a personified robot are significantly more mindful about limiting damage and harm towards it.
However, this can have significant consequences.
Showing empathy has consequences
Empathy shown by a soldier towards a military robot has the potential to interfere with performance on the frontline.
Rather than sacrificing the robot, participants working with a personified robot increased their own workloads.
They were willing to take more personal risks and would stop before putting the robot in danger, which affected their decision making under pressure.
According to the Professor, such hesitation and empathic responses in these circumstances could have dangerous consequences for military personnel.
Where split-second decisions can mean the difference between life and death, it will become increasingly important to monitor soldiers working in collaboration with robots.
It is expected that military robots will be used more widely in the future, necessitating further research, training and evaluation on the topic.
The research also has implications for a wide range of other human/robot collaborative tasks in non-military settings, such as on the factory floor, in hospitals, or even in the home.