The human operators who control America's killer drones are susceptible to the same psychological stress that infantrymen sometimes experience after combat. But better drones and control systems could help reduce the operators' stress levels — by allowing them to blame the robots for the awful human cost of remote air strikes.

But there’s a downside. Sometimes you don’t want drone operators avoiding feelings of guilt.

At least that’s what Stanford University researcher Ryan Calo has concluded. Calo, one of the country’s top experts on the legal and ethical aspects of robot technology, has written extensively on the subject — and closely tracks the work of other researchers in his field. “It really matters how you design the controls,” Calo tells Danger Room. “Design and interface design … can change incentives and can change the psychological impact.”

When a missile gets fired or a bomb dropped — something that’s happened hundreds of times in America’s fast-expanding robotic air war — someone or something is going to get blamed for any resulting deaths. The question is whether a human being absorbs all of that culpability, which can mean an enormous emotional burden.

For drone operators, many of whom live in the U.S. and steer their armed drones via satellite from air-conditioned trailers, combat stress can be accentuated by the contrast between their jobs and their otherwise peaceful surroundings. “You shoot a missile, you kill a handful of people,” Missy Cummings, an MIT drone developer and former pilot, told Salon. “And then — this is what is strange — you go home. Your shift is over.”