Wednesday, April 02, 2008

Journal 11: A.I. Reflections

Watching the movie Artificial Intelligence (AI) was both an intriguing and disturbing experience that brought up even more questions about AI research. One concept brought up in the movie that struck me was the idea of human responsibilities to robots. When Professor Hobby, a leading researcher in AI, announces that he will create a child robot that can love, one of his colleagues asks "If a robot could genuinely love a person what responsibility does that person hold toward that Mecha (robot) in return?" To this, Hobby responds, "…in the beginning, didn't God create Adam to love him?"
The professor’s answer here disturbs me. He implies two things: first, that a person would have no duty to take care of the Mecha that loved him/her, and second, that God created Adam for the sole purpose of loving Him. Regarding the latter implication, any scholar of Judeo-Christian theology would know that God created Adam out of love, so that God would have someone to love who could choose to love Him back. God created Adam to form a loving, covenantal, two-way relationship with His creation. Thus, if Hobby wants to model the way humans should treat robots on the way God treated Adam, then humans would be required not only to receive the robot’s love but to love the robot in return, unconditionally.
Indeed, I believe that humans have an enormous responsibility to robots in general. Humans make robots; androids are our creations. Thus, as creators, we have responsibilities to what we’ve created. Just as we cannot allow our creations to get out of hand, and cannot watch our robots harm other people and things, so too must we not allow the world (ourselves included) to harm our robots except in self-defense. For robots that can love, the human’s responsibility becomes even greater. The human created the robot, and it is because of the human that the robot has an unconditional, eternal love for him/her. To put a robot in such a situation is of questionable moral integrity in the first place, because even if a person loved the robot that loved him/her for the human’s whole life, that robot would spend the rest of eternity missing the one it loves. Humans cannot help dying (yet), but to abandon, on purpose and for selfish reasons, a robot that loves you is, in my opinion, absolutely abominable.
A demonstration of this behavior toward a loving robot occurs in A.I. when Monica leaves her robot son, David, in the woods because she cannot handle him anymore, as David has done some unacceptable things to her ‘real’ son, Martin. Watching this scene brings tears to the eye as little David positively begs his ‘mother,’ whom he loves very much, not to go, not to leave him in the forest (which, reluctantly, she does anyway). It is difficult not to see David as simply a young child being abandoned. Indeed, what is so inhuman about his cries for his ‘Mommy’? What is it about his pleas for ‘Mommy’ to stay, to not leave him, that makes him a non-person? Surely Monica had a responsibility to this robot (only a child) whom she left, much against his will, to wander the world aimlessly, never again able to be with the one he loved.
To be fair, I must admit I believe that robots and humans can never be the same kind of entity. A world populated by robots could carry some information about humanity into the future, but the human experience itself would be nearly impossible for a robot to preserve. Yet just because robots are not exactly the same beings as humans does not mean that robots that can think, reflect, process, and (most importantly) feel may be treated as lesser creatures. Who knows, the robots could even be God’s next Chosen People! Indeed, robots are made by God, indirectly (God made people, and people make robots). Humans have a hand in creating robots, but humans are also co-creators of organic human beings. Thus, when discussing this controversial subject, it is best (as I have said before) to err on the side of caution. People owe robots responsibilities as their creators (or rather, co-creators). This sense of duty should be heightened when a robot becomes self-aware, reflective, and sentient, for one can never know for certain whether God has placed a message, a soul, within this new robotic body. David certainly seemed to have a soul, at least, as he mourned the loss of his love and went on a quest to become a real boy, for all he wanted in the world was to love and be loved.
