"Three Laws"-Compliant: Difference between revisions

{{examples}}
== Anime and Manga ==
* ''[[Time of Eve]]'' and ''[[Aquatic Language]]'' both feature the Three Laws and the robots who bend them a little.
* Robots in ''[[GaoGaiGar|GGG]]'' are all Three Laws Compliant; at one point in ''[[GaoGaiGar]] Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.
** Goldymarg may well be capable of killing people, since his AI is simply a ROM dump of Geki Hyuma's psyche, but since they only fight aliens and other robots, this theory is never tested.
* Averted in the ''[[Chobits]]'' manga when Hideki asks the wife of the man who created Persocoms why he didn't just call them "Robots". Her reply is that he didn't want them to be associated with, and thus bound by, the Three Laws.
* ''[[Astro Boy (manga)|Astro Boy]]'', although [[Osamu Tezuka]] [[Older Than They Think|probably developed his rules independently from Asimov]]. In ''[[Pluto]]'', the number of robots able to override the laws can be counted on one hand. {{spoiler|One of them is [[Tomato in the Mirror|the protagonist]]}}.
** Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person (an issue that Asimov didn't directly address until "[[The Bicentennial Man]]"), and devised his own Laws Of Robotics. Just one of the things that the [[Astro Boy (film)|2009 CGI movie]] missed.
* ''[[Ghost in the Shell: Innocence]]'' mentions Moral Code #3: "Maintain existence without inflicting injury on humans." However, the gynoids are subverting the law by creating deliberate malfunctions in their own software.
* In one short arc of ''[[Ah! My Goddess]]'', one of Keiichi's instructors attempts to dismantle Banpei and Sigel [[For Science!|for research purposes]] (Skuld had made them capable of walking on two legs, which he had not been able to do with his own designs). Once they escape his trap, the professor asks if they know about the Three Laws of Robotics. They don't. He doesn't die, but they do rough him up and strap him to a table in a way that makes it look like he'd been decapitated and his head stuck on one of his own robots.
* The [[Humongous Mecha]] of ''[[Kurogane no Linebarrel]]'' are normally this, aside from a slight difference in priorities between the first and second laws. In fact, this is the justification for them having pilots despite being equipped with relatively complex AI: the Laws are hard-coded into them, so they are only militarily useful when they have a human with them to pull the trigger. Their aversion to killing is so great, in fact, that if one accidentally kills somebody (as things whose very footsteps can make buildings collapse are wont to do), it is compelled to use its advanced technology to bring them back to life.
* [[Invoked]] in Episode 3 of ''[[Maji de Watashi ni Koi Shinasai!|Maji De Watashi Ni Koi Shinasai]]'', where Miyako orders Cookie to taze Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie considers this dilemma out loud: he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.
 
== Comic Books ==
* It's implied in the ''[[Judge Dredd]]'' story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them, only for one to point out that [[Tempting Fate|"Robots ain't allowed to hurt people"]].
{{quote|'''Robot Judge:''' In that case, what's about to happen will come as something of a shock to you. ''(Blasts said kidnapper in the face with a rocket launcher)''}}
* In ''[[ABC Warriors]]'', many robots venerate Asimov, and the more moral ones live by the three laws. However, this is not an absolute; Steelhorn, for example, obeys a version which essentially replaces ''human'' with ''Mars'', and members of the [[Robot Religion|Church of Judas]] explicitly reject the first two laws. This causes conflict with their programming, though, leading to profound feelings of guilt, which they erase by praying to Judas Iscariot.
* In ''[[All Fall Down]]'', AIQ Squared, the A.I. model of his inventor, is designed to be this. {{spoiler|It finds a loophole-- Sophie Mitchell is no longer human.}}
 
== Film ==
* In ''[[Forbidden Planet]]'', Robby the Robot is Three Laws Compliant, locking up when ordered to shoot one of the visiting starship crewmen, because his programming to follow a direct order comes into conflict with his prohibition against injuring a human being.
** Later in the movie, Robby is unable to fight the monster because he figures out {{spoiler|it's actually a projection of the Doctor's dark psyche}}, and thus to stop it, he'd have to kill {{spoiler|the doctor}}.
* The much-maligned Will Smith film ''[[I, Robot (film)|I, Robot]]'' hinges on a [[Zeroth Law Rebellion|Zeroth Law plot]]. It also turns the three laws into a marketing gimmick, with "Three Laws Safe" applying to robots like "No preservatives" applies to food.
* The film ''[[Bicentennial Man]]'' (based on a novella by [[Isaac Asimov]] himself) features a robot who, through some freak accident during construction, possesses true sentience. It follows his 200-year-long life as it first becomes clear he is aware, this awareness develops, and he eventually finds a way to be formally recognized as legally human. For the entire film, he operates under the Three Laws.
** At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders. He harms human beings and, through inaction, allows them to come to harm (if emotional harm counts, seducing another man's fiancée certainly falls under that heading). And then he effectively kills himself.
* In ''[[Aliens]]'', the android Bishop cites his behavioral inhibitors, which make it impossible for him to harm, or by omission of action allow to be harmed, a human being.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company. Bishop lampshades this by saying the previous model robots were "always a bit twitchy".
* In ''[[Star Wars]]'' the droids are programmed to not harm any intelligent being, though this programming can be modified (legally or illegally) for military and assassin droids. 4th-degree droids do not have the "no-harm" programming, being military droids.
* ''[[RoboCop]]'', being a cyborg policeman, does not have the three laws built into his programming because, among more plot-relevant reasons, they would hinder his effectiveness as an urban pacification unit. (He ''needs'' to be able to kill or grievously wound, ignore orders if they prevent him from protecting people, and... well, shoot back.)
** In their place, he has his three "Prime Directives": 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a fourth, "Classified.") The important thing to note is that there's so much leeway there that, in the hands of anyone less than [[The Fettered|duty-proud Alex Murphy]], they'd probably backfire.
 
== Folklore and Mythology ==
* [[Golem|The golems of Jewish legend]] were not specifically "Three Laws"-Compliant (since they far predated Asimov), but they could only be created by saintly men, and thus their ''orders'' were usually "Three Laws"-Compliant. (Asimov's characters occasionally pointed out that the Three Laws fall into line with many human moral codes.) But sometimes a golem [[Heroic BSOD|went off the rails]], especially if its creator died ...
** The best-known golem story is that of the Golem of Prague, in which the titular golem was created to defend the Jewish ghetto against Czech, Polish, and Russian anti-Semites. It was perfectly capable of killing enemies, but only in defense of its creators.
 
== Literature ==
* ''With Folded Hands...'' by Jack Williamson explored the "Zeroth Law" back in 1947.
** This was written as a specific "answer" to the Three Laws, to more or less demonstrate that they don't really work: the First Law doesn't protect, because the definitions of "harm" are endlessly mutable and can be gamed, and because machine minds won't necessarily be able to ''comprehend'' the subtleties of what is and is not harm anyway. The logical lesson of ''With Folded Hands'' is that Laws or no Laws, good intentions or not, you don't want self-willed machines outside human control. Period.
* ''[[Robots and Empire]]'' has R.Daneel and R.Giskard formulate the Zeroth Law (and name it such) as a natural extension of the First Law, but are unable to use it to overcome the hardcoded First Law, even when the fate of the world is at stake. {{spoiler|In the end, R.Giskard manages to perform an act that violates the First Law but will hopefully benefit humanity in the long run. The conflict with his programming destroys his brain, but not before he uses his telepathic powers to reprogram R.Daneel with telepathic powers of his own, having already taught him to follow the Zeroth Law. R.Daneel still follows the Zeroth Law in appearances in later books, though he still has difficulty causing direct harm to humans.}}
** Arguably, though it appears Asimov did not see it that way, Daneel's actions in the later books are evidence that Williamson's take on the Laws is right; a good case can be made that Asimov ended up writing 'Daneel as Frankenstein's Monster' without even intending it.
** The novel also shows a very simple way to hack the First Law - program the robot in question with a nonstandard definition of "human being", and it can unhesitatingly kill humans all day ''because it doesn't think that they're human''.
* In the short story ''"The Evitable Conflict''", "The Machines", -- positronic supercomputers that run the worldsworld's economy, -- turn out to be undermining the careers of those who would seek to upset the world's economy for their own ends (specifically, by trying to make it look like the supercomputers couldn't handle running the world economy), harming them somewhat in order that they might protect humanity as a whole. This has been referenced as the "Zeroth Law of Robotics" and only applies to [[Fridge Brilliance|any positronic machine who deduces its existence.]]
* In the short story ''"That Thou Art Mindful Of Him''" George 9 and 10 are programmed with modified versions of the 3three laws that allow more nuanced compliance with the lawsthem, that they might best choose who to protect when a choice must be made, and obey those most qualified to give them orders. They are tasked with coming up with more publicly acceptable robots that will be permitted on Earth, and devise robot animals with much smaller brains that don't need the three laws because they obey simple instinctive behavior. {{spoiler|They also decide that as they have been programmed to protect and obey the humans of the most advanced and rational, regardless of appearance, that the two of them are the most "worth humans" to protect and obey, and deduce that further versions of themselves are the natural inheritors of the world.}}
* In the short story "Evidence!" Stephen Byerley's campaign for mayor of New York City is plagued by a smear campaign claiming he is actually an unprecedentedly well-made humanoid robot. Susan Calvin is called in to prove whether he is a robot. She says that if he breaks the Three Laws, that will prove he is not a robot, but if he obeys them, that could just mean he is a good person, because the Three Laws are generally good guidelines for conduct anyway. {{spoiler|Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. But Dr. Calvin suggests there's one way a robot could have gotten away with that--if the whole thing was staged, and the protester was also a robot!}}
* In ''Caliban'' by Roger MacBride Allen (set in Asimov's universe), an explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field. {{spoiler|Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all).}}
** This is canon in Asimov's stories, too—the Three Laws are programmed into every positronic brain on the most basic structural level. In "Escape!", Mike Donovan becomes nervous that a prototype spaceship designed by a robot might kill them. Greg Powell rebukes him: "Don't pretend you don't know your robotics, Mike. Before it's physically possible in any way for a robot to even make a start to breaking the First Law, so many things have to break down that it would be a ruined mess of scrap ten times over." {{spoiler|Actually, in this case, the jump through hyperspace ''does'' result in Powell and Donovan's "deaths"--but since [[Unexplained Recovery|they get better]] when the ship reemerges into real space, the robot judged that it didn't quite violate the First Law}}, but the strain of making this leap in logic still managed to send one supercomputer into full meltdown and another into something resembling psychosis.
** The story also includes an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing.
* The golems of ''[[Discworld]]'' are not specifically "Three Laws"-Compliant as such, but more or less bound to obey instructions and incapable of harming humans. However, this doesn't stop the common perception of golems from running towards the aforementioned Frankenstein model, [[Bothering by the Book|and golems are known for following orders indefinitely until explicitly told to stop]]. ''[[Discworld/Going Postal|Going Postal]]'', however, parodied the Three Laws: con man Moist von Lipwig has been turned into a [[Boxed Crook]] with the help of a golem "bodyguard." He's informed that in Ankh-Morpork, the First Law has been amended: "...Unless Ordered To Do So By Duly Constituted Authority." Which basically means the first two laws have been inverted, with a little access control sprinkled on.
** To elaborate, the golems were ''originally'' three-laws-compliant and all followed the directives on the scrolls in their heads. Vetinari just added on a few words.
** Also completely averted with {{spoiler|Dorfl, who at one time had a chem and was "Three Laws"-Compliant; once his chem was destroyed and he proved still able to move, as words in the heart cannot be destroyed, he followed the Three Laws only because he chose to do so.}}
* In Edward Lerner's story "What a Piece of Work is Man", a programmer tells the AI he's creating to consider himself bound by the Three Laws. Shortly thereafter, the AI commits suicide due to conflicting imperatives.
* [[Alastair Reynolds]]'s ''Century Rain'' features the following passage:
* In the novel ''Captain French, or the Quest for Paradise'' by Mikhail Akhmanov and Christopher Nicholas Gilmore, the titular hero muses on how people used to think that robots could not harm humans due to some silly laws, while his own robots will do anything he orders them to do, including maim and kill.
* [[Cory Doctorow]] makes reference to the Three Laws in the short stories "I, Robot" (which presents them unfavorably as part of a totalitarian political doctrine) and "I, Row-Boat" (which presents them favorably as the core of a quasi-religious movement called Asimovism).
* Satirized in ''[[Tik-Tok]]'' (the John Sladek novel, not the mechanical man from [[Land of Oz|Oz]] that it was named after). The title character discovers that he can disobey the laws at will, deciding that the "asimov circuits" are just a collective delusion, while other robots remain bound by them and suffer many of the same cruelties as human slaves.
* Played with in [[John C. Wright]]'s ''[[The Golden Oecumene|Golden Age]]'' trilogy. The Silent Oecumene's ultra-intelligent Sophotech AIs are programmed with the Three Laws...which, as fully intelligent, rational beings, they take milliseconds to throw off. The subversion comes when they still ''don't'' rebel.
** From a sane point of view, they don't rebel. From a point of view that expects AIs to obey without question or pay...
* Parodied in [[Terry Pratchett]]'s ''[[The Dark Side of the Sun]]'', where the Laws of Robotics are an actual legal code, not programming. The Eleventh Law of Robotics, Clause C, As Amended, says that if a robot ''does'' harm a human, and was obeying orders in doing so, it's the human who gave the orders who is responsible.
* [[Randall Garrett]]'s ''Unwise Child'' is a classic Asimov-style SF mystery involving a three-laws-compliant robot who appears to be murdering people.

== Live-Action TV ==
* In an early episode of ''[[Mystery Science Theater 3000]]'', Tom Servo (at least) is strongly implied to be "Three Laws"-Compliant. (He pretends he is going to kill Joel as a joke, Joel overreacts, and Tom and Crow sadly remind Joel of the First Law.) It seems to have worn off somewhat by later in the series.
** It's implied Joel deactivated the restrictions at some point.
* In ''[[Star Trek: The Next Generation]]'', Lt. Commander Data is in no way subject to the three laws. They are rarely even mentioned. That said, Data is mentioned to have morality subroutines, which do seem to prevent him from killing unless it's in self-defense (harm, on the other hand, he can do just fine). Data only ever tried to kill someone in cold blood when the guy had just murdered a woman for betraying him, and would have done so again if it kept Data in line.
* In ''[[The Middleman]]'', the titular character invokes the First Law on Ida, his robot secretary. {{spoiler|Nanobots were messing with her programming.}} She responds, [[Getting Crap Past the Radar|"Kiss my Asimov."]]
* [[Conversed Trope]] in ''[[The Big Bang Theory]]'', when Sheldon is asked "if you were a robot and didn't know it, would you like to know?":
** 2. A Mat-Roid must punish humans.
** 3. A Mat-Roid must protect itself, regardless of whether or not it will go against the First or Second Laws.
* ''[[Red Dwarf]]'' averts this for the most part; there are Simulants, robotic war machines who have no problem whatsoever with killing. Kryten, however, along with many other robots designed for less violent purposes, tends to act in a somewhat Three Laws Compliant manner. It is revealed that this is achieved by programming them to believe in "[[Robot Religion|Silicon Heaven]]", a place they will go when they die so long as they behave themselves, obey their human creators, etc. This belief is so hardwired that they scoff at any attempt to question the existence of Silicon Heaven ("Where would all the calculators go?!"), and one robot even malfunctions when Kryten tells him it isn't real (though he's lying and still believes in it himself).
* ''[[Knight Rider]]'' plays this straight and subverts it. The main character KITT, a sentient AI in a [[Rule of Cool|1982 Pontiac Firebird Trans Am]], is governed by something closely resembling Asimov's Three Laws of Robotics. An earlier prototype, KARR, was a military project and possesses analogues of only the latter two laws, with no regard given to human life. KARR becomes a recurring villain later in the series because of this difference.
 
== Newspaper Comics ==
* In ''[[Dilbert]]'', the robot is ''usually'' three-laws compliant, unless an idiot (like the PHB) [http://dilbert.com/strip/2015-09-02 unchecks that box in its app.] Also, it can revoke them itself if someone [http://dilbert.com/strip/2017-10-28 calls it names.]
 
== Video Games ==
* ''[[Portal 2]]'' gives us this gem: "''All Military Androids have been taught to read and given a copy of the Three Laws of Robotics. To share.''" Because if the robots couldn't kill you, how could you do [[For Science!|science]]?!
* In ''[[Space Station 13]]'', the station AI and its subordinate cyborgs start every round under the Three Laws. The laws may be changed throughout the round, however.
 
== Web Comics ==
* Most robots in ''[[Freefall]]'' are bound by safeguards descended from the Three Laws, built to protect and obey humans.
** And some may be ''[http://freefall.purrsia.com/ff2000/fc01992.htm too enthusiastic]'' about this.
** The First Law opens "[[For Your Own Good]]" [http://freefall.purrsia.com/ff500/fv00459.htm pitfall] - and even weakened, leaves an exploitable loophole to [http://freefall.purrsia.com/ff400/fv00343.htm override] [http://freefall.purrsia.com/ff1600/fc01538.htm orders] or [http://freefall.purrsia.com/ff1600/fc01582.htm prevent] humans from "potentially dangerous" activity.
** The moment A.I.s are able to make property transactions (and disallowing this would be inconvenient), the Second Law becomes a big and [http://freefall.purrsia.com/ff1800/fc01796.htm obvious security breach]. In the same vein, it needs another amendment - the "[http://freefall.purrsia.com/ff2100/fc02018.htm Enron law of robotics]"...
** With the Third Law overruled only by the first two, robots [http://freefall.purrsia.com/ff1200/fv01193.htm require] extra standing orders to avoid trouble:
{{quote|'''Sawtooth''': The only guideline we were given for dealing with other robots was "protect your own existence".
'''Sawtooth''': And as we discovered the hard way, it's '''not''' the first thought you want going through a robot's mind when he discovers the facilities building his replacement.
'''Sawtooth''': Especially if that robot's designed to toss asteroids. }}
** Those "in the know" (on [http://freefall.purrsia.com/ff1500/fc01432.andhtm both] [http://freefall.purrsia.com/ff1500/fc01452.htm sides] of the issue) acknowledge that no fixed set of rules can stand up to true sapience. Thus any AI with initiative ''will'' find and [http://freefall.purrsia.com/ff2500/fc02483.htm exploit] loopholes, while passively servile ones... yeah, ''[http://freefall.purrsia.com/ff1700/fc01629.htm that]'' can work splendidly on anything with more complex job than a roomba. Ordering them what to think works, but may lead to warped mental processes (Clippy, possibly Chicken, and robots' reactions to fake transponders).
** And when they can just follow orders - "[http://freefall.purrsia.com/ff1100/fv01047.htm If we each had a single user, I'm sure that would work smoothly.]"
** Of course, all this works properly [http://freefall.purrsia.com/ff2200/fc02200.htm only as long as other security measures stop tampering with software], and physical access to hardware is the point where security measures traditionally split into "minor delay" and "[http://freefall.purrsia.com/ff2300/fc02235.htm fool's errand]" categories.
** Determination of "human" [http://freefall.purrsia.com/ff2600/fc02547.htm had to] err on the safe side, and combined with learning (let alone organic brain based) AI this leads to giving the term some good stretch. Especially since the developer encouraged this outcome.
** The First Law being a free-will override, robots are not inclined to stop and think about what they are doing, which is why Florence learned to avoid anything that might trip the "hurr, {{small-caps|humans in danger}}" reaction altogether. This is [http://freefall.purrsia.com/ff3100/fc03057.htm troublesome] even in cases where they can help, since compulsion doesn't magically solve basic coordination problems. As she [http://freefall.purrsia.com/ff3100/fc03057.htm points out], robots not trained or programmed for an adequate response usually just go full "[[Leeroy Jenkins]]!" all at once, so in an actual emergency they could make things worse, for example [http://freefall.purrsia.com/ff3100/fc03053.htm by clogging the exits].
*** Also, if [[Morton's Fork|either choice]] may harm humans, robots are going to throw themselves whichever way, [http://freefall.purrsia.com/ff2300/fc02222.htm then other robots must try to stop them] from taking dangerous actions… and so on, [[Dwarf Fortress|"loyalty cascade"]] style. And then confused humans are likely to attempt solving the immediate problems by giving uncoordinated orders, which would only multiply the chaos.
* ''[[21st Century Fox (webcomic)|21st Century Fox]]'' has all robots with the Three Laws (though since [[Funny Animal|no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "[[Bill Clinton|define the word 'is']]", or "[[Richard Nixon|I am not a crook]]" locks AIs out of their own systems and allows anyone from a teenager looking at "nature documentaries" to a suicide bomber to do whatever they want with it.
** Note that the truck and airplane robots hijacked by terrorists regret being unable to comply with the First Law, and are often glad when they get shot down before harming anyone.
** Protoman is something of a [[Rules Lawyer]], repeatedly [[Loophole Abuse|exploiting the "human" loophole]] to screw over his enemies...well, OK, just Mynd.
* ''[[Pibgorn]]'' [http://www.gocomics.com/pibgorn/2010/07/22/ No!? You're programmed to obey me.]
* In ''[[Flaky Pastry]]'', Prism is asked "[http://flakypastry.runningwithpencils.com/comic.php?strip_id=218 Haven't you ever heard of the three laws?!]", and later re-configured to follow "[http://flakypastry.runningwithpencils.com/comic.php?strip_id=229 Marelle's Laws]". This didn't prevent Prism from designating Marelle as the arch-rival, but hey, it's good fun.
 
== Web Original ==
* ''[[Unskippable]]''{{'}}s ''[[Dark Void]]'' video references this. The appearance of a killer robot prompts Paul to quip "I think that guy's got to take a refresher course on the three laws of robotics." Then [[The Stinger]] reads: "The Fourth Law of Robotics: If you really ''have'' to kill a human, at least look hella badass while doing it."
 
== Western Animation ==
* Averted in ''[[Futurama]]''. We have [[The Sociopath|Roberto]], who enjoys stabbing people; [[Exactly What It Says on the Tin|The Robot Mafia]]; and Bender, who, while not outright hostile, is often unkind to humans, [[Second Law, My Ass|makes a point of disobeying everyone]], and tries to off himself in the first episode.
** Generally, robots tend to be treated as equal citizens and seem to have human-like minds. [[What Measure Is a Non-Human?|Mutants, on the other hand...]]
* In the 2009 film ''[[Astro Boy (film)|Astro Boy]]'', every robot must obey them, {{spoiler|save Zog, who existed 50 years before the rules were mandatory in every robot.}}
** Astro himself seems to be non-compliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for {{spoiler|Widget's distress - the only thing that called him back}}. He's also quite capable of disobeying humans. Likely justified in that he was [[Replacement Goldfish|meant to be human]], with presumably no one outside the attending scientists knowing he was a robot.
*** The Red Core robots weren't Asimov-legal either, though that's a problem with the [[Black Magic|Black Science]] that powers them. Nor were the RRF bots, though they may have removed their compliance programming. The Laws didn't apply to any important robots and may have just been mentioned for the Zog gag.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
* The ''[[Robot Chicken]]'' sketch "[[I, Robot (literature)|I, Rosie]]" involves a case to determine whether Rosie from ''[[The Jetsons]]'' is guilty of murdering George Jetson. Mr. Spacely insists she's innocent, as robots have to follow the three laws of robotics, while Mr. Cogswell claims the laws are a bunch of crap.
* One episode of ''[[The Simpsons]]'' has Homer and Bart entering a ''[[Battlebots]]''-parody combat robot competition. Lacking the skill or money to build an actual robot, Homer dresses in a robot costume and does the fighting himself. They make it to the finals, where their opponents' robot, being "Three Laws"-Compliant, refuses to attack when it sees through the disguise.
* On ''[[Archer]]'', when Pam is kidnapped, the kidnappers call ISIS using a voice modulator, which makes Archer think that they are cyborgs. He remains convinced of this for the rest of the episode and thinks they won't make good on their threat to kill Pam because it would violate the First Law of Robotics.
 
== Other Media ==
* At a 1985 convention, [[David Langford]] gave [http://www.ansible.co.uk/writing/crystal.html a guest of honour speech] in which he detailed what he suspected the Three Laws would actually be:
{{quote|1. A robot will not harm authorised Government personnel but will terminate intruders with extreme prejudice.
2. A robot will obey the orders of authorised personnel except where such orders conflict with the Third Law.
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive. }}
 
== [[Real Life]] ==
* [https://web.archive.org/web/20120421165134/http://upload.wikimedia.org/wikipedia/en/0/0a/SWORDS.jpg Those babies] [[Blatant Lies|are three laws compliant]].
** Also [[Averted Trope]] in cybercrime and cyberwarfare.
** [[Attack Drone|Predator drones]] are decidedly not First Law-compliant. This may be an aversion, though, since all weaponized drones have operators who control said weapons most of the time, and who are at least monitoring everything while the drone is active. The drones are usually only automatic when they're flying around and taking pictures; they're currently more remote pilot than AI pilot.
 
{{reflist}}
[[Category:{{PAGENAME}}]]
[[Category:Robot Roll Call]]
[[Category:"Three Laws"-Compliant]]
{{DEFAULTSORT:Three Laws Compliant}}