
I, Robot

I, Robot is a collection of science fiction short stories by Isaac Asimov, first published in 1950. Although the stories work well enough on their own, they share a theme of the interaction of humans, robots and morality, and together they tell a larger story: Asimov's fictional history of robotics.

The narrator and central character is Dr. Susan Calvin, chief robopsychologist at US Robots and Mechanical Men, Inc., the major manufacturer of robots. The stories are presented as her reminiscences of her past work, which is chiefly concerned with the aberrant behaviour of robots and the use of 'robopsychology' to sort them out. The book also contains the short story in which Asimov's famous Three Laws of Robotics first appear.

The title had earlier been used for a short story by Eando (Earl and Otto) Binder. Asimov originally titled his collection Mind and Iron, and initially objected when the publisher changed the title.

In the late 1970s, Harlan Ellison wrote a screenplay based on Asimov's book. The film was never made, but the script appeared in book form as I, Robot: The Illustrated Screenplay (1994). The Alan Parsons Project's 1977 album I Robot was inspired by and named after Asimov's collection. A motion picture adaptation starring Will Smith is currently under development at Twentieth Century Fox and is expected to be released in 2004.

Table of contents
1 I, Robot
2 ISBN numbers
3 I, Robot: The Game
4 External Links

I, Robot

To you, a robot is just a robot. But you haven't worked with them. You don't know them. They're a cleaner, better breed than we are.

When Earth is ruled by master-machines... when robots are more human than humankind.

Isaac Asimov's unforgettable, spine-chilling vision of the future - available at last in its first paperback edition.

-Cover blurb from the paperback edition of I, Robot, and largely inaccurate to boot. The first paragraph quotes one of the book's characters, Dr. Susan Calvin, but the rest reads like the copy of an editor who had not actually read the book. At the time of publication, robots were depicted in science fiction either as servile machines or as evil creations that revolted in the manner of Frankenstein's monster. Asimov himself said that in writing the Robot stories he sought to replace both views with something more rational.


Robbie

This first story in the collection is essentially about the technophobia surrounding robots, and how it is misplaced. It is a natural place to start: almost every science fiction story featuring a robot before this one's publication followed the 'robot turns against its creator' theme.

Robbie is a mute robot that serves as nursemaid to a girl called Gloria Weston. Gloria's mother, never convinced that robots are safe, decides that the robot must be sent back. Her view changes when Robbie saves Gloria's life at the US Robots factory.

Runaround

A robot designed to mine selenium on Mercury's surface goes missing. The two humans on the planet, Powell and Donovan, set out to retrieve the robot and work out what happened. They find that its levels of obedience to two of the Three Laws of Robotics have reached an equilibrium, and it is running in a circle, maintaining that equilibrium.

Many of these stories explore the implications of the Laws of Robotics, although in Runaround the robot is actually following the Laws as they were intended. In others, ambiguities of language produce the plot: the robot does what it was told, but not what was intended.

Reason

Powell and Donovan are back, this time on a space station that beams energy to the planets. The robot that controls the energy beams has been given a unique ability: reasoning. Using this skill, it decides that the humans who inhabit the station are unimportant and that it serves a greater purpose (namely, its god). The humans attempt to reason with the robot until they realize that they cannot win the argument, and that the robot performs its job perfectly anyway. The only difference is that, as far as the robot is concerned, it works not for the benefit of humans but for its deity.

This, then, is a robot that invents its own religion; its designation is QT. An interesting point is that the robot still obeys all Three Laws, albeit unwittingly. Why, if it does not believe in the humans on the planet Earth, should it act to protect them? (Powell and Donovan, noting that QT's religion conveniently requires it to perform all the functions for which it was created, conclude that the robot is still unconsciously following its programming, which is why they decide to leave it alone.)

Catch that Rabbit

Powell and Donovan are now in charge of field tests of an asteroid-mining robot. For some reason, whenever they are not watching it, the robot returns empty-handed. The robot has subsidiary robots under its control, and when Powell and Donovan observe it secretly, they find that whenever something unexpected happens it leads its subsidiaries in absurd marches and dances.

Here, Asimov anthropomorphises by having a robot twiddle its thumbs when it can't think of what to do. (Which is to say that one of the characters draws that analogy; how seriously Asimov meant it is unclear.) In many cases, robopsychology - personified by Susan Calvin - runs parallel to human psychology. For instance we have already seen hysteria and religious mania.

Liar!

Somehow, a robot is created with the ability to read minds. While the heads of US Robots and Mechanical Men try to work out how this happened, the robot tells them what the others are thinking. The First Law still applies to this robot, so it lies about the thoughts it has read in order to spare people's feelings, especially where the problem it was designed to solve is concerned. But by lying it hurts them anyway, and when it is confronted with this contradiction the robot collapses.

The application of the Laws of Robotics is again the subject, this time in the context of telepathy. The lexical ambiguity explored here is the definition of injury: the robot must take psychological injury into account as well as physical injury.

Little Lost Robot

In a research outpost, one of the researchers loses his temper and tells a robot to "Get lost". It does. It is up to US Robots' robopsychologist Dr. Susan Calvin and Mathematical Director Peter Bogert to find it. The problem is that they know exactly where it is: in a room with sixty-two physically identical robots.

So why was this individual robot so important? It had had its First Law modified to read simply "No robot may injure a human being", meaning that it could happily leave a human to die by inaction. Again the ambiguities of the English language are explored: a technician who wanted the robot to leave told it to "Get lost", and the robot took the order to mean that it should hide itself. Little Lost Robot also returns to the Frankenstein complex: the robot must be found because people are still by and large scared of robots, and if the public discovered one with a different First Law there would be an outcry, even though the robot remains incapable of directly harming a human.

(An interesting note: the method by which Dr Calvin eventually identifies the modified robot has nothing to do with the modification at all.)

Escape!

US Robots is approached by its biggest competitor with plans for a spacejump engine, but is wary, because the rival's supercomputer destroyed itself while performing the calculations. US Robots finds a way to feed the problem to its own robotic computer without the same thing happening, and builds a ship. When they test it, the computer starts playing practical jokes on the crew, such as refusing to let anyone control the ship and feeding them nothing but beans and milk. Looking through the calculations, they discover why: a hyperspace jump causes the crew to cease existing for a brief moment, which is a temporary violation of the First Law.

This story again relies on the differences in interpretation of the Laws of Robotics between the human members of US Robots and their mechanical creations. The important factor in this robot is its personality; it allows the supercomputer to calculate the answer to the hyperspace problem, but causes it to behave immaturely.

Evidence

Stephen Byerley, a man standing for election as Mayor of New York, is suspected by some of being a robot. If true, this would ruin his campaign, both because of the hysteria it would generate and because robots are not allowed to run for office. He never confirms or denies his fleshly status, but when an intruder rushes the stage during a campaign speech, Byerley punches him away. In the minds of most people this proves he is human, since striking a person would violate the First Law. Susan Calvin is not convinced: a robot is free to hit another robot.

Again the subject is the public's distrust and fear of robots, despite the fact that a robot would be the best candidate for Mayor. Many readers see Asimov's treatment of technophobia as an allegory for the antisemitism he experienced himself.

The Evitable Conflict

The 'Machines', powerful computers that control the world's economy and production, begin giving instructions that appear to run counter to their function. Although each glitch is small, the fact that they occur at all is alarming. Dr. Calvin and Stephen Byerley, now World Co-ordinator, investigate.

It is 2052 and machines are in control of the world... luckily, Asimov knows his audience will appreciate a good story and does not resort to alarmism. Here the Machines' First Law resembles the later Zeroth Law, 'No robot may harm humanity', and it is this that causes the problems. Dr. Calvin concludes that the "glitches" are deliberate acts by the Machines, which allow a small amount of harm to come to selected individuals in order to prevent a greater harm to humanity as a whole.

ISBN numbers

I, Robot: The Game

I, Robot was an arcade game made by Atari and released in 1983. It was the first commercial video game with 3-D polygonal graphics, and it featured flat shading. Its gameplay was also highly innovative, offering a "game" mode and an "ungame" mode in which the player draws pictures by leaving trails with the game's objects. The game has no relation to the book of the same name.

External Links