Saw the movie I, Robot last night. I must admit, when I learnt that the storyline was not the same as that of Isaac Asimov's book of the same name, I was a little disappointed. Fortunately, the movie turned out to be quite entertaining, and the special effects are just right for this type of movie. Essentially, Asimov's robot stories (“I, Robot”, “The Caves of Steel”, “The Naked Sun”) read more like detective fiction than conventional sci-fi about technology or alien life forms.

For these stories, Asimov formulated something which is now widely accepted (even in science): the Three Laws of Robotics, circa 1940 (see also the implications of these laws for Information Technology).

  1. First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
  3. Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
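The Laws form a strict hierarchy: the First outranks the Second, which outranks the Third. As a toy sketch (my own illustration, not anything from Asimov or the film; all the field names below are hypothetical), this can be read as a lexicographic preference over candidate actions:

```python
def choose_action(candidates):
    """Pick the action that best satisfies the Three Laws.

    Each candidate is a dict of boolean flags describing its
    consequences. Violations of lower-numbered Laws dominate:
    tuples of booleans compare lexicographically, and False < True,
    so `min` prefers the candidate whose worst violation is least.
    """
    def violations(a):
        return (
            # First Law: harm to a human, by act or by inaction
            a["harms_human"] or a["allows_human_harm"],
            # Second Law: disobeying a human order
            a["disobeys_order"],
            # Third Law: endangering the robot's own existence
            a["endangers_self"],
        )
    return min(candidates, key=violations)


# Example: ordered to do something harmful, the robot prefers to refuse
# (a Second Law violation) over obeying (a First Law violation).
obey = {"name": "obey", "harms_human": True, "allows_human_harm": False,
        "disobeys_order": False, "endangers_self": False}
refuse = {"name": "refuse", "harms_human": False, "allows_human_harm": False,
          "disobeys_order": True, "endangers_self": False}
print(choose_action([obey, refuse])["name"])  # → refuse
```

The interesting part, and the part Asimov mined for plots, is that real situations rarely reduce to clean boolean flags like these.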

These laws were framed to safeguard our fragile human existence by demanding absolute obedience from the robots. However, in his novels Asimov explored how this chain of logic could break down, and offered alternate interpretations of the three laws and their consequences. This central theme of the original stories was, in my opinion, preserved in the movie, at least in spirit.

Without spoiling the movie for those who have not yet seen it: the ultimate form of protection for humanity involves some very drastic measures, since, logically, the greatest danger to humanity is humanity itself. What protection can be offered in that case that remains consistent with the First Law? There is a little irony in the creator (mankind) needing to be protected from itself by the created (the robots).

As a postscript, does this line of argument extend to God?