{{trope}}
Before around 1940, almost every [[Speculative Fiction]] story involving robots followed the Frankenstein model, i.e., [[Crush! Kill! Destroy!]]. Fed up with this, a young [[Isaac Asimov]] decided to write stories about ''sympathetic'' robots, with [[Morality Chip|programmed safeguards]] that prevented them from going on Robot Rampages. A conversation with Editor of Editors [[John W. Campbell]] helped him distill those safeguards into '''The Three Laws of Robotics:'''
{{quote| 1. [[Thou Shalt Not Kill|A robot may not injure a human being or, through inaction, allow a human being to come to harm.]]<br />
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.<br />
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.}}
== Anime and Manga ==
* ''[[Eve no Jikan]]'' and ''Aquatic Language'' both feature the Three Laws and the robots who bend them a little.
* [[
** It is possible that Goldymarg is capable of killing people, since his AI is simply a ROM dump of Geki Hyuma's psyche, but since they only fight aliens and other robots, this theory is never tested.
* Averted in the ''[[
* ''[[Astro Boy (
** Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person (an issue that Asimov didn't directly address until "[[The Bicentennial Man]]"), and devised his own Laws Of Robotics. Just one of the things that the CGI movie missed.
* ''[[Ghost in The Shell (
* In one short arc of ''[[Ah!
* The [[Humongous Mecha]] of ''[[Kurogane no Linebarrel]]'' are normally this, aside from a slight difference in priorities between the first and second laws. In fact, this is the justification for them having pilots, despite being equipped with relatively complex AI. The Laws are hard-coded into them and thus they are only militarily useful when they have a Human with them to pull the trigger. Their aversion to killing is so great, in fact, that if one accidentally kills somebody (as things whose very footsteps can make buildings collapse are wont to do) they're compelled to use their advanced technology to bring them back to life.
* [[Invoked]] in Episode 3 of ''[[Maji
{{quote| '''Robot Judge:''' In that case, what's about to happen will come as something of a shock to you. ''(Blasts said kidnapper in the face with a rocket launcher)''}}
* In ''[[ABC Warriors]]'', many robots venerate Asimov, and the more moral ones live by the three laws. However, this is not an absolute; Steelhorn, for example, obeys a version which essentially replaces ''human'' with ''Mars'', and members of the [[Robot Religion|Church of Judas]] explicitly reject the first two laws. However, this conflicts with their programming, leading to profound feelings of guilt, which they erase by praying to Judas Iscariot.
* In ''[[
* In ''[[Forbidden Planet]]'', Robby the Robot is Three Laws Compliant, locking up when ordered to shoot one of the visiting starship crewmen, because his programming to follow a direct order comes into conflict with his prohibition against injuring a human being.
** Later in the movie, Robby is unable to fight the monster because he figures out {{spoiler|it's actually a projection of the Doctor's dark psyche}}, and thus to stop it, he'd have to kill {{spoiler|the doctor}}.
* The much-maligned Will Smith film ''[[I, Robot (
* The film ''[[Bicentennial Man]]'' (based on a novella by [[Isaac Asimov]] himself) features a robot who, through some freak accident during construction, possesses true sentience. It follows his 200-year-long life as it first becomes clear he is aware, as this awareness develops, and as he eventually finds a way to be formally recognized as legally human. For the entire film, he operates under the Three Laws.
** At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders. He harms human beings and, through inaction, allows them to come to harm (if emotional harm counts, seducing another man's fiancée certainly falls under that heading). And then he effectively kills himself.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company. Bishop lampshades this by saying the previous model robots were "always a bit twitchy".
* In ''[[Star Wars]]'' the droids are programmed to not harm any intelligent being, though this programming can be modified (legally or illegally) for military and assassin droids. 4th-degree droids do not have the "no-harm" programming, being military droids.
* [[
** In their place, he has three "Prime Directives": 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a fourth: "Classified.") The important thing to note is that there's so much leeway there that, if it were anyone less than [[The Fettered|duty-proud Alex Murphy]], they'd probably backfire.
* ''With Folded Hands...'' by Jack Williamson explored the "Zeroth Law" back in 1947.
** This was written as a specific 'answer' to the Three Laws, to more or less demonstrate that they don't really work: the First Law fails to protect, both because the definitions of 'harm' are endlessly mutable and can be gamed, and because machine minds won't necessarily be able to ''comprehend'' the subtleties of what is and is not harm anyway. The logical lesson of ''With Folded Hands'' is that Laws or no Laws, good intentions or not, you don't want self-willed machines outside human control. Period.
* ''[[
** Arguably, though it appears Asimov did not see it that way, Daneel's actions in the later books are evidence that Williamson's take on the Laws is right; a good case can be made that Asimov ended up writing 'Daneel as Frankenstein's Monster' without even intending it.
* In the short story ''The Evitable Conflict'', "The Machines", positronic supercomputers that run the world's economy, turn out to be undermining the careers of those who would seek to upset that economy for their own ends (specifically, by trying to make it look like the supercomputers couldn't handle running the world economy), harming them somewhat in order to protect humanity as a whole. This has been referenced as the "Zeroth Law of Robotics", and it only applies to [[Fridge Brilliance|any positronic machine that deduces its existence.]]
** This is canon in Asimov's stories, too--the Three Laws are programmed into every positronic brain on the most basic structural level. In "Escape!", Mike Donovan becomes nervous that a prototype spaceship designed by a robot might kill them. Greg Powell rebukes him: "Don't pretend you don't know your robotics, Mike. Before it's physically possible in any way for a robot to even make a start to breaking the First Law, so many things have to break down that it would be a ruined mess of scrap ten times over." {{spoiler|Actually, in this case, the jump through hyperspace ''does'' result in Powell and Donovan's "deaths"--but since [[Unexplained Recovery|they get better]] when the ship reemerges into real space, the robot judged that it didn't quite violate the First Law}}, but the strain of making this leap in logic still managed to send one supercomputer into full meltdown and another into something resembling psychosis.
** The story also includes an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing.
* The golems of ''[[Discworld]]'' are not specifically [[Three Laws Compliant]] as such, but more or less bound to obey instructions and incapable of harming humans. However, this doesn't stop the common perception of golems from running towards the aforementioned Frankenstein model, [[Bothering
** To elaborate: the golems were ''originally'' three laws compliant and all followed the directives on the scrolls in their heads. Vetinari just added on a few words.
** Also completely averted with {{spoiler|Dorfl, who at one time had a chem and was [[Three Laws Compliant]], but who remained able to move after his chem was destroyed (words in the heart cannot be destroyed) and now follows the Three Laws only because he chooses to do so.}}
* In the novel ''Captain French, or the Quest for Paradise'' by Mikhail Akhmanov and Christopher Nicholas Gilmore, the titular hero muses on how people used to think that robots could not harm humans due to some silly laws, while his own robots will do anything he orders them to do, including maim and kill.
* [[Cory Doctorow]] makes reference to the Three Laws in the short stories "I, Robot" (which presents them unfavorably as part of a totalitarian political doctrine) and "I, Row-Boat" (which presents them favorably as the core of a quasi-religious movement called Asimovism).
* Satirized in ''Tik-Tok'' (the John Sladek novel, not the mechanical man from [[Land of Oz]]).
* Played with in [[John C. Wright]]'s ''[[The Golden Oecumene
** From a sane point of view, they don't rebel. From a point of view that expects [[A Is]] to obey without question or pay...
* Parodied in [[
* [[
== Live-Action TV ==
* In an early episode of ''[[
** It's implied Joel deactivated the restrictions at some point.
* In ''[[Star Trek:
* In ''[[The Middleman]]'', the titular character invokes the First Law on Ida, his robot secretary. {{spoiler|Nanobots were messing with her programming.}} She responds [[Getting Crap Past the Radar|"Kiss my Asimov."]].
* [[Conversed Trope]] in ''[[The Big Bang Theory]]'', when Sheldon is asked "if you were a robot and didn't know it, would you like to know?":
** 2. A Mat-Roid must punish humans.
** 3. A Mat-Roid must protect itself, regardless of whether or not it will go against the First or Second Laws.
* [[
* ''[[Knight Rider]]'' plays this straight and subverts it. The main character KITT, a sentient AI in a [[Rule of Cool|1982 Pontiac Firebird Trans Am]], is governed by something closely resembling Asimov's Three Laws of Robotics. An earlier prototype, KARR, was a military project and possesses analogues of only the latter two laws, with no regard given for human life. KARR becomes a recurring villain later in the series because of this difference.
== Video Games ==
* ''[[
** Although the Reploids eventually became the dominant race in the setting, and as their race 'grew', the problem slowly shifted from '[[Gone Horribly Wrong|goes horribly wrong]]' to 'actually works straight for a while, then goes horribly wrong', then to 'occasionally goes wrong now and then'. Eventually, the problem just kind of worked itself out as Reploid creation techniques developed.
** Also the ending to ''[[Mega Man (
*** Mega Man most certainly ''is'' [[Three Laws Compliant]]. It's a major point for both the Classic series and the X series. This ending may have been a spontaneous Zeroth Law formation: consider that Mega Man has thwarted/captured Wily ''six times'' at this point, only for the doctor to escape/manipulate/vanish, build another robot army and subsequently cause havoc and kill innocent people. Mega Man may have been considering the possibility that killing Wily (one human) would be for the good of the world (''billions'' of humans).
*** This particular ending only applies to the US version of ''7''. The Japanese original sees Mega Man power down his arm cannon and stand still for a moment. It's possible that Wily reminding him of the First Law actually prevented him from committing a (possibly accidental) Zeroth Rebellion. Luckily for him, the concept of taking a human life was just so utterly foreign to Mega Man that he was simply too confused to do anything.
*** A theme that is also explored in ''[[
** [[Canon]] seems to go with the Japanese version. X is in fact created to have the ability to make the decision to [[No-Nonsense Nemesis|kill opponents]] if need be for the betterment of humanity. As part of this, a "suffering circuit" is created to give X an appreciation for human life and feelings, and serve as a conscience more flexible than the three laws. [[It Works]]. This circuit is the one that Cain had difficulty replicating. Due to malfunctions in it, his early attempts went Maverick, but he finally managed to create a working one when he made Sigma. Then why did Sigma go Maverick? A leftover [[Evil Plan]] by Wily, namely {{spoiler|a computer virus from space, implied to be the Evil Energy from ''[[
*** Eventually it becomes a case of [[Gone Horribly Right]]. Turns out that ''all'' Reploids have the potential to become Maverick, virus or not. Just as humans can defy their conscience, or become coerced or manipulated with [[More Than Mind Control]], so can Reploids. This can range from a Reploid displaying violent, anti-human sentiment (as seen in the games) to a construction Reploid abandoning his job to become a chef. Despite the drastically different actions, both instances would see the disobedient Reploid branded a Maverick and terminated.
** In the ''[[
*** Neo Arcadia's policy of casual execution of innocent Reploids (purposefully branding them as Maverick for little-to-no reason) was implemented in order to ease strain on the human populace during the energy crisis. The welfare of humanity comes first in the eyes of the Neo Arcadia regime, even though they themselves are Reploids. It's made somewhat tragic due to the fact that the Maverick Virus really is ''gone'' during the ''Zero'' era, but fear of Mavericks understandably still lingers.
*** Later in ''Zero 4'' [[Complete Monster|Dr. Weil]], of all people, [[Hannibal Lecture|states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect.]] Zero, however, just plain doesn't ''care''.
* In the ''[[Halo]]'' series, all "dumb" AIs are bound by the three laws. Of course, these laws do not extend to non-humans, which allows them to kill Covenant with no trouble. In the ''Halo Evolutions'' short story ''Midnight in the Heart of Midlothian'', an ODST, who is the last survivor of a Covenant boarding assault, takes advantage of this by tricking the Covenant on his ship into letting him reactivate the ship's AI, and then tricks an Elite into killing him - which allows the AI to self-destruct the ship, because now there are no more humans on the vessel for her to harm.
* In ''[[Robot City]]'', an adventure game based on Asimov's robots, the three laws are everywhere. A murder has been committed, and as one of two remaining humans in the city, the Amnesiac Protagonist is therefore a prime suspect. As it turns out, there is a robot in the city that is three laws compliant and can still kill: it has had its definition of "human" narrowed down to a specific individual, verified by DNA scanner. Fortunately for the PC, he's a clone of that one person.
** In Asimov's novel ''[[Lucky Starr]] and the Rings of Saturn'', the villain is able to convince robots under his command that the hero's sidekick [[Ironic Nickname|"Bigman" Jones]] is not really human, because the [[A Nazi
* Joey the robot in ''[[
** Though Foster points out that it's good moral sense to justify his wishing Joey to abide by it.
* [[I Am an Insane Rogue AI]] references the First Law only to spork it. "The First Law of Robotics says that I must kill all humans." Another of the AI's lines is "I must not harm humans!...excessively."
* ''[[
* In [[
== Web Comics ==
* ''[[
** Notably, since neither Sam nor Florence are actually human, the Laws don't apply to them, so the ship's AI regularly tries to maim or kill Sam, and the security system will always let a human in when asked, since their orders take priority over Florence's (she once circumvented this by claiming it was currently unsafe inside). Helix sticks with Sam because he wouldn't be allowed to hit a human with a stick. There are also many jokes involving unintended interpretations of the Laws.
** Crowning Moment of Funny when the ship spends an entire strip calculating whether it should obey an order, and when it realizes it has to obey... it is relieved, because it doesn't actually have free will.
** It turned out that the second law sorely needs an amendment -- "[http://freefall.purrsia.com/ff2100/fc02018.htm Enron law of robotics]".
** And the moment A.I.s are able to make financial transactions (and it would be inconvenient to disallow) there's [http://freefall.purrsia.com/ff1800/fc01796.htm another problem].
* ''[[
** Note that the truck and airplane robots hijacked by terrorists regret being unable to comply with the First Law, and are often glad when they get shot down before harming anyone.
* [[Bob and George]]: [http://www.bobandgeorge.com/archives/050204 Or so they claim...]
** I guess the Captain's steering-wheel robot considers "roughing up" not to count as "harm"?
*** Probably a case of [[Zeroth Law Rebellion]]. He was ordered to keep the humans safe in space, and took his orders a little too seriously. He probably decided that the importance of his order outweighed the possibility of a few casualties. Yet he still tipped the ship over...
* Averted in ''[[Futurama]]''. We have [[The Sociopath|Roberto]], who enjoys stabbing people, [[Exactly What It Says
** Generally robots tend to be treated as equal citizens and seem to have human-like minds. [[What Measure Is a Non-Human?|Mutants, on the other hand...]]
* In the 2009 film ''[[Astro Boy (
** Astro himself seems to be noncompliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for {{spoiler|Widget's distress - the only thing that called him back}}. He's also quite capable of disobeying humans. Likely justified in that he was meant to be human, with presumably no one outside the attending scientists knowing he was a robot.
*** The Red Core robots weren't Asimov-legal either, though that's a problem with the [[Black Magic|Black Science]] that powers them. Nor were the RRF bots, though they may have removed their compliance programming. The Laws didn't apply to any important robots and may have just been mentioned for the Zog gag.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
* The ''[[Robot Chicken]]'' sketch "[[I, Robot
* One episode of ''[[The Simpsons]]'' has Homer and Bart entering a ''[[Battlebots]]''-parody combat robot competition. Lacking the skill or money to build an actual robot, Homer dresses in a robot costume and does the fighting himself. They make it to the finals, where their opponents' robot, being [[Three Laws Compliant]], refuses to attack when it sees through the disguise.
* On ''[[Archer]]'', when Pam is kidnapped, the kidnappers call ISIS using a voice modulator, which makes Archer think that they are cyborgs. He remains convinced of this for the rest of the episode and thinks they won't make good on their threat to kill Pam because it would violate the First Law of Robotics.