I, Robot

by

Isaac Asimov

Morality and Ethics

I, Robot is a collection of short stories exploring humanity’s development of robots and the ethics that govern those robots’ behavior. Written between 1940 and 1950, the stories progress in time from 1996 to 2052, charting the creation of ever more advanced machines. But in contrast to the many stories that depict robots becoming corrupted or turning on their creators, Asimov’s stories largely show that robots follow their ethical standards more closely than humans do. Unlike humans, who are merely encouraged to behave ethically, robots are bound to follow the laws that humans have laid out for them. Thus, through several of his short stories, Asimov demonstrates that despite (or perhaps because of) their lack of organic emotions or a moral conscience, robots have the potential to behave more ethically than humans.

The three Laws of Robotics are laid out in “Runaround,” establishing the moral principles by which all robots must abide, and the story goes on to demonstrate how rigidly robots adhere to them. “Runaround” is the first story to focus on Powell and Donovan, two engineers who work for the robot manufacturer U.S. Robots and Mechanical Men. Powell recites the three Laws for Donovan: one, “a robot may not injure a human being, or, through inaction, allow a human being to come to harm”; two, a robot must obey orders given by humans, except where such orders would conflict with the First Law; and three, a robot must protect its own existence as long as doing so does not conflict with the First or Second Laws. These laws set a very high ethical standard, preventing robots from harming or disobeying any human in any way. As the roboticist Dr. Susan Calvin points out in a later story, they constitute a far higher code of conduct than humans often follow themselves.

These laws are so rigid that robots can be rendered completely non-functional when the laws come into conflict with one another. In “Runaround,” Powell and Donovan are trying to restart operations at a mining station on Mercury, and they order a robot named Speedy to retrieve selenium, the element used as the base’s power source, from a nearby pool. Unbeknownst to them, there is danger near the pool. Because Speedy was very expensive to manufacture and is thus valuable to humans, his sense of self-preservation under the Third Law has been strengthened. When he gets too close to the pool, he turns back for fear of being destroyed; once he is far enough away, the Second Law of obeying human orders reasserts itself and he heads back toward the pool. The robot is thus stuck in a loop from which he cannot escape, circling the pool at the distance where the two laws balance. This demonstrates how strong robots’ moral mandates are: when robots cannot follow their code, they are rendered completely inoperable, because following it is their only choice.

Another story, “Robbie,” makes a more direct comparison between humans and robots, arguing that robots can be more ethical than their human counterparts. “Robbie” takes place in 1996, just as robots are becoming commonplace. Mr. and Mrs. Weston have a robot nursemaid named Robbie, who has been taking care of their young daughter, Gloria, for two years. Though Robbie can’t speak, he adores taking care of Gloria: he plays hide and seek with her, gives her rides on his back, and asks her to tell him stories, all to her delight. Asimov illustrates that Robbie feels real love for Gloria and a clear desire to care for her at all times, establishing that Robbie’s compassion goes even beyond the three laws with which he has been programmed. Mrs. Weston, on the other hand, is depicted as very cold. She yells at Robbie, worries that he might “go berserk” and do something terrible to Gloria, and eventually takes Robbie away. Gloria becomes inconsolable and makes herself sick worrying about where Robbie might have gone, yet her mother remains adamant that she will not bring him back. Asimov thus contrasts Gloria’s mother, who puts her own desires first, with Robbie, who cares for Gloria selflessly, illustrating the idea that robots can be built with more of a sense of humanity than even humans have.

The idea that robots are more ethical than humans is further emphasized in “Evidence,” in which Asimov explicitly argues that the robots’ code of ethics is stricter than human codes of ethics. Francis Quinn, a politician who works behind the scenes, is trying to derail Stephen Byerley’s campaign for mayor of a major city. Quinn approaches two chief officials at U.S. Robots, Dr. Alfred Lanning and Dr. Susan Calvin, and asks them to confirm that Byerley is not a human but a very advanced robot. He bases this accusation on the fact that, as district attorney, Byerley never sought the death penalty (which would have violated the First Law) and has never been seen eating or sleeping. Dr. Calvin counters that if Byerley follows the three laws, he could be a robot, or he “may simply be a very good man,” because the three rules of robotics are “the essential guiding principles of a good many of the world’s ethical systems.” This line of reasoning essentially argues that all robots have the ethical standards of very good human beings, to the point where it may be impossible to tell whether a person is a robot or simply a morally upstanding human. In this sense, the robot code of ethics is an idealized set of morals for human beings. Though humans are the ones who created these moral standards and programmed the robots to obey them, the robots’ very inhumanity (their inability to deviate from the three laws to make moral, or immoral, decisions of their own accord) is paradoxically what makes them more ethical than most people.

Asimov’s stories delve into the ethics that govern both human and robot behavior. Even when the robots malfunction or appear to go off track, they are always following their code of ethics; humans, on the other hand, are often shown to have little hesitation in causing each other physical or emotional pain. While the ethical questions grow more complex with each story, Asimov’s ultimate conclusion is clear: robots have the potential to be more ethical than the very humans who imbued them with their ethical code.

Morality and Ethics Quotes in I, Robot

Below you will find the important quotes in I, Robot related to the theme of Morality and Ethics.
Introduction Quotes

“Then you don’t remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. […] But you haven’t worked with them, so you don’t know them. They’re a cleaner better breed than we are.”

Related Characters: Dr. Susan Calvin (speaker), Reporter
Page Number: iii
Robbie Quotes

“You listen to me, George. I won’t have my daughter entrusted to a machine—and I don’t care how clever it is. It has no soul, and no one knows what it may be thinking. A child just isn’t made to be guarded by a thing of metal.”

Related Characters: Mrs. Weston (speaker), Gloria, Mr. Weston, Robbie
Page Number: 7

It took split-seconds for Weston to come to his senses, and those split-seconds meant everything, for Gloria could not be overtaken. Although Weston vaulted the railing in a wild attempt, it was obviously hopeless. Mr. Struthers signalled wildly to the overseers to stop the tractor, but the overseers were only human and it took time to act.

It was only Robbie that acted immediately and with precision.

Related Characters: Gloria, Mrs. Weston, Mr. Weston, Robbie
Page Number: 21
Runaround Quotes

He called a last time, desperately: “Speedy! I’m dying, damn you! Where are you? Speedy, I need you.”

He was still stumbling backward in a blind effort to get away from the giant robot he didn’t want, when he felt steel fingers on his arms, and a worried, apologetic voice of metallic timbre in his ears.

Related Characters: Gregory Powell (speaker), Mike Donovan, Speedy
Page Number: 44
Reason Quotes

“Obedience is the Second Law. No harm to humans is the first. How can he keep humans from harm, whether he knows it or not? Why, by keeping the energy beam stable. He knows he can keep it more stable than we can, since he insists he’s the superior being, so he must keep us out of the control room. It’s inevitable if you consider the Laws of Robotics.”

Related Characters: Gregory Powell (speaker), Mike Donovan, Cutie
Page Number: 65
Catch That Rabbit Quotes

“Remember, those subsidiaries were Dave’s ‘fingers.’ We were always saying that, you know. Well, it’s my idea that in all these interludes, whenever Dave became a psychiatric case, he went off into a moronic maze, spending his time twiddling his fingers.”

Related Characters: Gregory Powell (speaker), Mike Donovan, Dave
Page Number: 90
Liar! Quotes

But Susan Calvin whirled on him now and the hunted pain in her eyes became a blaze, “Why should I? What do you know about it all, anyway, you…you machine. I’m just a specimen to you; an interesting bug with a peculiar mind spread-eagled for inspection. It’s a wonderful example of frustration, isn’t it? Almost as good as your books.” Her voice, emerging in dry sobs, choked into silence.

The robot cowered at the outburst. He shook his head pleadingly. “Won’t you listen to me, please? I could help you if you would let me.”

Related Characters: Dr. Susan Calvin (speaker), Herbie (speaker), Milton Ashe
Page Number: 96
Escape! Quotes

“When we come to a sheet which means damage, even maybe death, don’t get excited. You see, Brain, in this case, we don’t mind—not even about death; we don’t mind at all.”

Related Characters: Dr. Susan Calvin (speaker), The Brain
Page Number: 149

She went on, “So he accepted the item, but not without a certain jar. Even with death temporary and its importance depressed, it was enough to unbalance him very gently.”

She brought it out calmly, “He developed a sense of humor—it’s an escape, you see, a method of partial escape from reality. He became a practical joker.”

Related Characters: Dr. Susan Calvin (speaker), Gregory Powell, Mike Donovan, The Brain
Page Number: 168
Evidence Quotes

“Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”

Related Characters: Dr. Susan Calvin (speaker), Stephen Byerley, Francis Quinn, Alfred Lanning
Page Number: 184

“I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.” […]

“Except that a robot might fail due to the inherent inadequacies of his brain. The positronic brain has never equalled the complexities of the human brain.”

Related Characters: Dr. Susan Calvin (speaker), Stephen Byerley (speaker)
Page Number: 196
The Evitable Conflict Quotes

“Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn’t you say so?”

“I would.”

“And what is most likely in the future to cause economic dislocations? Answer that, Stephen.”

“I should say,” replied Byerley, unwillingly, “the destruction of the Machines.”

“And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us.”

Related Characters: Dr. Susan Calvin (speaker), Stephen Byerley (speaker)
Page Number: 222

“But you are telling me, Susan, that the ‘Society for Humanity’ is right; and that Mankind has lost its own say in its future.”

“It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war.” […]

“How horrible!”

“Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”

Related Characters: Dr. Susan Calvin (speaker), Stephen Byerley (speaker)
Page Number: 224