You will recall, I'm sure, Isaac Asimov's Three Laws of Robotics, either from his seminal book I, Robot or from the movie of the same name. They are:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
"However, the new Canning Laws are certainly not a carte blanche for homicidal droids to obliterate fleshies without limit; au contraire.
Canning proposes that robot warriors should be allowed to mix it up among themselves freely, autonomously deciding to blast enemy weapon systems. Many enemy “systems” would, of course, be themselves robots, so it's clear that machine-on-machine violence isn't a problem. The difficulty comes when the automatic battlers need to target humans. In such cases Mr Canning says that permission from a human operator should be sought."
Canning has prepared a presentation called “Concept of Operations for Armed Autonomous Systems,” which is available in PDF format here.
As Mr. Lewis aptly notes, under the rules proposed by Canning one wonders whether a killer robot would be allowed to destroy an AK-47 (machinery) that happens to be in the hands of a human enemy. He very drolly concludes, "If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear. Effectively the robot is allowed to disarm enemies by prying their guns from their cold dead hands."
If you enjoyed this post, take a few seconds of your time and subscribe to our feed! Barry's Best is updated daily!