It Is Perfectly Moral To Torture A Robot

[Image: The IT-O Interrogator droid from "Star Wars". Kyle Russell/Business Insider]

Carnegie Mellon University roboticist Heather Knight recently published a paper on the nature of the relationship between humans and robots.

Knight concludes that humans and robots need each other to be at their most productive: robots are efficient workers that never get bored, while humans supply the judgment to give robots well-defined instructions, without which the machines wouldn't do anything at all.

Early in the paper, Knight addresses the question of how humans "ought" to treat machines. She argues that "the more we regard a robot as a social presence, the more we seem to extend our concepts of right and wrong to our [behavior] toward them."

In other words, robots relate to the world entirely differently than people do, but we should treat them the way we'd like to be treated ourselves. It's the golden rule, and Knight suggests it applies to robots as well, not out of any sense of decency owed to the robots, but because of what such behavior would suggest about us as people:

As Carnegie Mellon ethicist John Hooker once told our Robots Ethics class, while in theory there is not a moral negative to hurting a robot, if we regard that robot as a social entity, causing it damage reflects poorly on us. This is not dissimilar from discouraging young children from hurting ants, as we do not want such play behaviors to develop into biting other children at school.

So there's no direct harm in torturing a machine, but acting out torture in the real world is still pretty unpleasant. It reflects poorly on the torturer, no?