
Wednesday, January 30, 2013

The 3 Laws

Anyone who is a sci-fi lover knows about Isaac Asimov's 3 Laws of Robotics.  These three laws are as follows:

1) A robot cannot take the life of a human, or through inaction cause a human to die.
2) A robot must obey all orders given to it by humans, except when those orders violate the first law.
3) A robot must preserve itself, except when doing so violates the first two laws.

There are many, many flaws in these laws.  What if I were a less-than-scrupulous person who told a robot that wasn't mine that it now belonged to me and must follow all of my orders and only my orders?  Robot theft/slavery would be rampant.  What if I didn't like someone, so I told their robot to kill itself?  An easy way to get revenge.  What if I attacked the robot's owner and told the robot that if it did anything, I would kill its owner?  The robot would probably have a meltdown as it bounced between the laws and tried to interpret them.

Most of these issues can be solved with some simple bylaws, things such as "A robot must obey all orders given to it by its owner or any confirmed law enforcement officer..." and "A robot cannot take the life of a human, or through inaction cause a human to die, unless doing so would save at least as many humans as would die or be killed."  And, of course, this has other problems, as we see in the movie "I, Robot."  In this film, a sentient A.I. of vast intelligence decides that the best way to save humanity is to enslave it, preventing humans from killing each other.  And in the anime Casshan: Robot Hunter, we learn that the "evil" robots which were programmed to save planet Earth did so by killing all humans, the very things causing Earth's destruction.  It is feasible that a robot would murder everyone it deemed capable of killing or likely to kill another human, which would eliminate everyone in the military, every police officer, every spy, every world leader, etc., not to mention every violent criminal.  It would take just one glitch to cause havoc.  This is why it was decided that if these laws ever came to be, there would be a 0th Law (similar to the 0th Law of Thermodynamics).

0) A robot cannot harm humanity, or by inaction cause humanity to come to harm.

This would, of course, also change the other laws so that it takes precedence over them.  But even then, these laws have many flaws that can be exploited.  The biggest flaw of all is the purpose of robotics itself.  In today's world, robots are usually used in industry, to do work humans do not wish to do, and these laws are perfect if robots continue to do such work, and such work alone, in perpetuity.  But we've already started to enter the realm of drone warfare.  If we want robots to take the place of our men and women on the front lines, they had better be willing to kill humans; otherwise, the first human who steps onto the field wins by default.  They had better be programmed not to listen to the orders of anyone other than their superiors, or they'll simply be told by their enemies to kill themselves.  And they had better not think of "humanity" in the same way the other side does, or they'll simply freeze in combat.  If we want robots to gain the rights and freedoms of humans (an idea played around with in many stories, including Bicentennial Man, Star Trek, Blade Runner, Metropolis, and countless more), then we would need to give them the same abilities as us, even if those abilities include breaking the law.
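Just to make that precedence idea concrete, here's a tiny toy sketch in Python.  This is entirely my own illustration, not anything from Asimov or from real robotics; the LAWS table, the first_violated_law helper, and the idea of handing it a set of "violated law numbers" are all hypothetical.  The point is simply that the laws get checked in strict priority order, with the 0th Law sitting above all the rest.

    # Toy sketch: laws listed in priority order, 0th Law first.
    LAWS = [
        (0, "A robot cannot harm humanity, or by inaction cause humanity to come to harm."),
        (1, "A robot cannot take the life of a human, or through inaction cause a human to die."),
        (2, "A robot must obey all orders given to it by humans."),
        (3, "A robot must preserve itself."),
    ]

    def first_violated_law(violations):
        """Return the highest-priority law a proposed action would violate, or None.

        `violations` is the set of law numbers the action would break,
        as judged by some (hypothetical) evaluator.
        """
        for number, text in LAWS:
            if number in violations:
                return number, text
        return None

    # An order that would harm humanity breaks Law 0, so it outranks the duty to obey (Law 2).
    print(first_violated_law({0, 2}))
    # A harmless, lawful action violates nothing, so the robot is free to act.
    print(first_violated_law(set()))

Of course, the hard part isn't the lookup; it's deciding what counts as "harm to humanity" in the first place, which is exactly where the stories get interesting.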

Obviously, the Three Laws of Robotics are not meant as an actual, universal working model.  They can be used as a guideline, or as a plot device in a story.  In truth, Asimov was a writer and a biochemist, not a roboticist or a computer programmer.  He was more interested in the story, and as a writer, so should you be.  These laws can be played around with, or they can be taken as is and followed to their various logical conclusions.

I, for one, have come up with an interesting new set of laws.  Behold, the Three Laws of Robotics... make that the Three Laws of Americans!

1) An American has the right to live and cannot take that right from any other American.
2) An American has the right to liberty, freedom, and choice, provided that exercising such rights does not impinge on any other American's right to life or right to liberty.
3) An American has the right to pursue happiness, provided doing so does not violate any other American's rights to life, liberty, or to pursue happiness.
Violating any of these rights will strip you of all rights of the same rank and below.

What do you think?  Plot-worthy?
