
Can the presence of a robot affect whether humans behave ethically?

A series of studies at Cornell investigates how humans behave in the company of robots. Here are the findings, and what they mean.


Robots are joining us at work, at school, and even at home. But beyond assisting us, can they influence the way we act? Researchers at Cornell have conducted a series of experiments to address whether, and how, the presence of a robot can influence human behavior.

Guy Hoffman, professor at the Sibley School of Mechanical and Aerospace Engineering at Cornell University, and one of the authors of the studies, spoke with TechRepublic about the results.

Rather than calling it "honesty," Hoffman said what he's studying is closer to compliance, akin to insider trading: "It may not be dishonest, but it's definitely breaking the rules." So what happens when there's a robot in the room with a human? Here's what he found.

Study 1: Robot presence in the lab decreases dishonest human behavior.

Hoffman, in collaboration with researchers from IDC Herzliya and Carnegie Mellon University, used a simple robot for the experiment. It had "two eyes, a face on a robotic neck, and is sitting on the corner of the table, looking at you every once in a while." Subjects were given a boring task: to watch a screen and name which side had more dots on it. They had the potential to earn more money if they did not follow the rules.

In one group, subjects were told to alert a person sitting in the corner of the room when they were finished. Another group was instructed to tell the robot when they had finished the task. A third group was alone in the room, with neither a human nor a robot present.

The findings? People cheated most when they were alone. They broke the rules equally often whether the robot or the person was present. Essentially, subjects treated the robot as if it were a human.

Another interesting finding, Hoffman said, is how people felt about what they'd done. "People felt guilty when the person was in the room, but they didn't feel guilty when the robot was in the room or when they were alone in the room," Hoffman said.

Study 2: Robot presence in the field does not decrease dishonest human behavior.

The second study, still under review, was a nine-day field study conducted in an uncontrolled environment. Hoffman, in collaboration with Jodi Forlizzi from Carnegie Mellon University, wanted to see whether the presence of a robot would affect how likely people passing a table full of food labeled "don't touch" were to take some. In another condition, a person was sitting at the table; in a third, there was neither a robot nor a human present. The study took place in a university hallway and was videotaped.

SEE: Angelica Lim: Flutist. Global roboticist. Proud master of a robot dalmatian named Sparky. (TechRepublic)

"We found the robot made people engage more with the situation, trying to understand what was going on," said Hoffman. "They'd wonder: 'Why is there food? Why is there a robot?' It raised their curiosity. In the end, however, they would take as much food with the robot present as without a robot." With a human present, however, people were far less inclined to take the food.

Hoffman was a bit surprised by the results, which differed from those of the first study. "We thought the robot would have some effect on the results. We feel that somehow it comes down to this idea of being judged more than being monitored."

Hoffman spoke to Avital Mentovich, a criminologist at the University of Essex, about the results. "She believes that the differences are due to the fact that with robots, people evaluate the instrumental questions, like 'Will I be punished? Will there be consequences? Will I get fined or incarcerated?', but people do not feel the normative social pressure of 'being judged' by a robot, as they do when a human observes immoral behavior."

Study 3 (in piloting stage): Do dishonest robots encourage dishonest human behavior?

A third experiment, which Hoffman, along with researchers from IDC Herzliya, presented at the International Conference on Human-Robot Interaction in New Zealand in March 2016, looks at what happens when a robot and a subject play a game together. The question is whether the robot will influence human behavior, or rule-breaking, the same way a human peer might. The researchers predict that the presence of a robot can "decrease corruption by providing a monitoring presence, without increasing it by collusion."

"When you feel like the environment is corrupt it makes you more corrupt," said Hoffman. In this dice game, the human and robot need to work together to look at numbers on dice. What number you identify determines the amount you will be paid—if human and bot "see" the same number, they are paid that amount on the dice.

Subjects play the game with a robot, with a computer, and then with a human. Hoffman's theory is that the robot will push subjects to behave in a "more compliant and obedient manner, just like we saw in our first experiment. You have this kind of monitoring presence that makes you feel I should be honest."

SEE: 6 ways the robot revolution will transform the future of work

"It's the best of both worlds," said Hoffman. "The robot makes people feel like they're being watched a little bit, but it doesn't make them feel like if the robot is being corrupt and they can also be corrupt. We can use the non-social aspect of the robot to our advantage of having them be a partner that only makes them better citizens."

Study 4: People seem to blame robots less for doing harm than they blame AI software.

Hoffman and Mentovich are currently working on a study that examines the blame people place on AI software versus robots. What they have found so far is that "people blame AI more than they blame robots for the same damage that was done. They think that the AI is more in control over decisions than robots are," said Hoffman, "which doesn't make any sense from the engineering point of view."

"There is a very different flavor of blame attribution," he said. This could have significant takeaways when it comes to things like autonomous cars, drones—who bears responsibility? They also looked at whether people blame the software itself or the programmer. "It seems that people think the programmer and the software have the same amount of responsibility," said Hoffman, "but the robot has much less responsibility. Is it because, somehow, the robot is dumb?"

What's next?

The Cornell research shows that robots can, and will, have an impact on human behavior. It raises further questions: Could the shape and appearance of the robot affect the outcome? Hoffman said that these studies used a robot that was very mechanical looking, but a more humanoid version could change the results. Previous research, he said, showed that while human behavior didn't change in the presence of a more human-looking robot, people felt more competitive. "They felt more that the robot was smarter and more cunning," said Hoffman. "I think the robot's anthropomorphic form definitely has an effect."

Whether we are prepared or not, robots are about to become a lot more prevalent in the world, and we need to understand how they affect society. How will robots in the office affect us at work? How will they change the way we see things? How will a robot working alongside a fast-food employee affect that employee's behavior? Will it be different from working with another human? Or working alone? Hoffman sees all of these as important questions.

"Especially in places where robots will be seen first," said Hoffman, "like hospitals, nursing homes, schools, and the military, following the rules is very important. When they work with children or patients, or in the military or in government, we should understand how these robot affect human behavior."

Once we fully understand how these robots affect us, Hoffman said, we can begin to think about how to "design these robots to promote behaviors we want to promote."


