Wednesday, April 23, 2008

Blog 14 : Humanics

I find the laws of humanics somewhat confusing. The laws themselves are easy to understand, but the thought of enforcing them seems a bit ridiculous. I believe that free will is one of the most important distinctions, if not the most important, between a robot and a human, and that same free will makes it nearly impossible, in my opinion, for the laws of humanics to have any significant success. Having said that, I think robots and humans should share a mutual respect if you look at it from an ethical standpoint. My favorite quote from the short passage is when the human character in “The Bicentennial Man” says, “If a man has the right to give a robot any order that does not involve harm to a human being, he should have the decency never to give a robot any order that involves harm to a robot, unless human safety absolutely requires it. With great power goes great responsibility, and if the robots have Three Laws to protect men, is it too much to ask that men have a law or two to protect robots?” (460)

I think it only makes sense to respect and care for the ‘thing’ that has been designed to make your life easier. However, for this mutual respect to happen, people have to be ethical about the situation and choose to respect the robots. The free will that humans possess will make that respect very hard to come by, because most humans will tend to look at robots as inferior. So while the laws of humanics are a wonderful idea, in my opinion they will have very little success and no real relevance to human-robot relationships.