(msnbc)
A talking robot car that ran over a child, or a battlefield robot that shot innocent civilians, might never be taken to court, but a new experiment shows how readily humans place blame on their mechanical servants as though the robots were people. The social psychology experiment, which involved a robot programmed to tell a lie, showed college students holding the robot morally accountable for its actions more often than not.

The college students did not consider "Robovie" morally accountable on a human level, but they judged the robot to be somewhere between a human and a vending machine. Many became noticeably upset and confrontational when the robot lied about how many items they had found in a scavenger hunt, preventing them from winning a $20 prize.

"Most argued with Robovie," said Heather Gary, a doctoral student in developmental psychology at the University of Washington in Seattle. "Some accused Robovie of lying or cheating."

About 65 percent of the 40 students said Robovie was at least somewhat morally accountable for lying.

There have been a handful of accidental deaths at robotic hands so far, and in none of those cases was blame placed on the robot. Still, the experiment suggests that future humanoid robots capable of interacting socially with humans will face moral judgments.

Humans could grow upset with their robot servants for stepping on a household pet, for instance, or feel resentful toward their talking robot car if a malfunction led to a deadly accident. On the battlefield, survivors of a robotic rampage might similarly be angry toward a humanoid military robot...