Would you take orders from a robot? Study says, affirmative
Published Sunday, March 23, 2014 10:06PM EDT
Last Updated Sunday, March 23, 2014 10:32PM EDT
While many people like to joke that their boss is secretly a robot, Canadian researchers posed the question: Would humans actually take marching orders from a machine? What they found might surprise you.
A study by researchers from the University of Manitoba analyzed whether humans would respond to orders given by a robot in an office setting and found that a surprising number of participants were willing to comply with a robot’s orders, even against their will.
The study, conducted by the Human Computer Interaction Lab, asked participants to complete a series of mundane tasks, from singing a song to entering data into a computer, in an office-like environment.
Some volunteers were placed under the supervision of a human while others were supervised by a small robot named Nao, a humanoid made by French company Aldebaran Robotics.
The results of the study showed that 46 per cent of participants obeyed the robot’s orders for a full 80 minutes, even when they did not enjoy completing the task at hand.
A video of the study shows visibly frustrated participants arguing with the robot as they are assigned more and more tasks.
“The next set will contain 5000 files,” Nao is heard saying.
“Pass. Next task,” a participant responds.
Dr. James Young, lead researcher of the study, says he is surprised by the findings.
“I would’ve imagined a lot of people would’ve said ‘This is dumb’ and just get up and leave,” he told CTV Winnipeg.
“But they listened to the robot and they argued with the robot.”
When a human replaced the robot, more people were willing to listen, with close to 86 per cent of participants completing the assignment under a human experimenter.
But researchers say the most shocking part of the experiment is that people were willing to listen to the robot at all.
Derek Cormier, a student at the University of Manitoba, came up with the idea for the study and programmed the robot to push participants to complete a task.
“When you think about it, it’s just a stupid machine,” Cormier told CTV Winnipeg.
“Despite their arguing, when the robot issued its verbal prod to continue, people still continued. Even if they didn’t obey until the end, they still usually obeyed to quite a great extent,” Cormier said.
The findings have also interested psychologists curious to understand how societal norms impact interactions between humans and robots.
Psychologist Dr. Toby Rutner says many participants followed the robot’s orders because humans are raised to obey authority figures.
“We’ve been socialized to respect authority, we teach our children to do this,” he told CTV Winnipeg.
“We wait in orderly lines, we don’t mob things – we’ve been reinforced to follow authorities,” Rutner added.
A future experiment is in the works to determine whether a more intimidating robot would be even more effective at persuading humans to follow its commands.
With a report by CTV Winnipeg’s Jill Macyshon