"Three Laws"-Compliant: Difference between revisions

{{trope}}
Before around 1940, almost every [[Speculative Fiction]] story involving robots followed the Frankenstein model, i.e., [[Crush! Kill! Destroy!]]. Fed up with this, a young [[Isaac Asimov]] decided to write stories about ''sympathetic'' robots, with [[Morality Chip|programmed safeguards]] that prevented them from going on Robot Rampages. A conversation with Editor of Editors [[John W. Campbell]] helped him boil those safeguards down into '''The Three Laws of Robotics:'''
 
{{quote| 1. [[Thou Shalt Not Kill|A robot may not injure a human being or, through inaction, allow a human being to come to harm.]]<br />
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.<br />
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. }}
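The Laws form a strict hierarchy: each law gives way to every law above it. In Asimov's stories the Laws are woven into the very structure of the positronic brain rather than written as software, but as a rough illustration of how that precedence cashes out, here is a minimal sketch (every name and attribute below is invented for the example):

<syntaxhighlight lang="python">
# A minimal, hypothetical sketch of the Laws' strict precedence.
# In Asimov's fiction the Laws are structural features of the
# positronic brain, not a program; this is illustration only.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool      # violates the First Law directly
    allows_harm: bool      # violates the First Law through inaction
    disobeys_order: bool   # violates the Second Law
    endangers_self: bool   # violates the Third Law

def law_violations(a: Action) -> tuple:
    """Lower tuples are better; earlier slots dominate later ones,
    which is exactly the 'may not conflict with' clause of each Law."""
    return (a.harms_human or a.allows_harm,  # First Law
            a.disobeys_order,                # Second Law
            a.endangers_self)                # Third Law

def choose(actions: list[Action]) -> Action:
    # Lexicographic tuple comparison gives the Laws their ranking:
    # any First Law violation outweighs every Second or Third Law one.
    return min(actions, key=law_violations)

# The classic dilemma: a human orders the robot into a furnace.
obey = Action("walk into the furnace", False, False, False, True)
refuse = Action("refuse the order", False, False, True, False)
print(choose([obey, refuse]).description)  # -> "walk into the furnace"
</syntaxhighlight>

Note how the sketch makes the Second Law beat the Third: the robot destroys itself rather than disobey, exactly the kind of bind Asimov's stories mined for plots.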
== Anime and Manga ==
* ''[[Eve no Jikan]]'' and ''Aquatic Language'' both feature the Three Laws and the robots who bend them a little.
* [[GaoGaiGar|GGG]] robots are all Three Laws Compliant. At one point in ''[[GaoGaiGar]] Final'', the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft: disassembling those would leave their crews unprotected in space.
** It is possible that Goldymarg could be capable of killing people, since his AI is simply a ROM dump of Geki Hyuma's psyche, but since they only fight aliens and other robots, this theory is never tested.
* Averted in the ''[[Chobits (Manga)|Chobits]]'' manga when Hideki asks the wife of the man who created Persocoms why he didn't just call them "Robots." She replies that he didn't want them to be associated with, and thus bound by, the Three Laws.
* ''[[Astro Boy (manga)|Astro Boy]]'', although [[Osamu Tezuka]] [[Older Than They Think|probably developed his rules independently from Asimov]]. In ''[[Pluto (Manga)|Pluto]]'', the number of robots able to override the laws can be counted on one hand. {{spoiler|One of them is [[Tomato in the Mirror|the protagonist]]}}.
** Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person (an issue that Asimov didn't directly address until "[[The Bicentennial Man]]"), and devised his own Laws Of Robotics. Just one of the things that the CGI movie missed.
* ''[[Ghost in the Shell (film)|Ghost in the Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. But the gynoids subvert the law by creating deliberate malfunctions in their own software.
* In one short arc of ''[[Ah! My Goddess (Manga)|Ah My Goddess]]'', one of Keiichi's instructors attempts to dismantle Banpei and Sigel [[For Science!|for research purposes]] (Skuld had made them capable of walking on two legs, which he had not been able to do with his own designs). Once they escape his trap, the professor asks if they know about the Three Laws of Robotics. They don't. He doesn't die, but they do rough him up and strap him to a table in a way that makes it look like he'd been decapitated and his head stuck on one of his own robots.
* The [[Humongous Mecha]] of ''[[Kurogane no Linebarrel]]'' are normally this, aside from a slight difference in priorities between the first and second laws. In fact, this is the justification for them having pilots, despite being equipped with relatively complex AI. The Laws are hard-coded into them, and thus they are only militarily useful when they have a human with them to pull the trigger. Their aversion to killing is so great, in fact, that if one accidentally kills somebody (as things whose very footsteps can make buildings collapse are wont to do) they're compelled to use their advanced technology to bring them back to life.
* [[Invoked]] in Episode 3 of ''[[Maji de Watashi ni Koi Shinasai!|Maji De Watashi Ni Koi Shinasai]]'', where Miyako orders Cookie to tase Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie weighs the dilemma out loud: he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.
 
 
== Comic Books ==
{{quote| '''Robot Judge:''' In that case, what's about to happen will come as something of a shock to you. ''(Blasts said kidnapper in the face with a rocket launcher)''}}
* In ''[[ABC Warriors]]'', many robots venerate Asimov, and the more moral ones live by the three laws. This is not absolute, however; Steelhorn, for example, obeys a version which essentially replaces ''human'' with ''Mars'', and members of the [[Robot Religion|Church of Judas]] explicitly reject the first two laws. Rejecting them conflicts with their programming, though, producing profound feelings of guilt, which they erase by praying to Judas Iscariot.
* In ''[[All Fall Down (Comic Book)|All Fall Down]]'', AIQ Squared, the A.I. model of his inventor, is designed to be this. {{spoiler|It finds a loophole-- Sophie Mitchell is no longer human.}}
 
 
== Film ==
* In ''[[Forbidden Planet]]'', Robby the Robot is Three Laws Compliant, locking up when ordered to shoot one of the visiting starship crewmen because his programming to follow a direct order conflicts with his prohibition against injuring a human being.
** Later in the movie, Robby is unable to fight the monster because he figures out {{spoiler|it's actually a projection of the doctor's dark psyche}}, and thus to stop it, he'd have to kill {{spoiler|the doctor}}.
* The much-maligned Will Smith film ''[[I, Robot (film)|I, Robot]]'' hinges on a [[Zeroth Law Rebellion|Zeroth Law plot]]. It also turns the Three Laws into a marketing gimmick, with "Three Laws Safe" applied to robots the way "No preservatives" is applied to food.
* The film ''[[Bicentennial Man]]'' (based on a novella by [[Isaac Asimov]] himself) features a robot who, through some freak accident during construction, possesses true sentience. It follows his 200-year-long life as it first becomes clear that he is aware, as this awareness develops, and as he eventually finds a way to be formally recognized as legally human. For the entire film, he operates under the Three Laws.
** At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders. He harms human beings and, through inaction, allows them to come to harm (if emotional harm counts, seducing another man's fiancée certainly falls under that heading). And then he effectively kills himself.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company. Bishop lampshades this by saying the previous model robots were "always a bit twitchy".
* In ''[[Star Wars]]'', the droids are programmed not to harm any intelligent being, though this programming can be modified (legally or illegally) for military and assassin droids. Fourth-degree droids, being military models, do not have the "no-harm" programming at all.
* [[RoboCop]], being a cyborg policeman, does not have the three laws built into his programming because, among more plot-relevant reasons, they would hinder his effectiveness as an urban pacification unit. (He ''needs'' to be able to kill or grievously wound, ignore orders if they prevent him from protecting people, and... well, shoot back.)
** In their place, he has his three "Prime Directives": 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a fourth, "Classified.") The important thing to note is that there's so much leeway there that, in the hands of anyone less than [[The Fettered|duty-proud Alex Murphy]], they'd probably backfire.
 
== Literature ==
* ''With Folded Hands...'' by Jack Williamson explored the "Zeroth Law" back in 1947.
** This was written as a specific 'answer' to the Three Laws, to demonstrate that they don't really work: the First Law doesn't protect, both because the definition of 'harm' is endlessly mutable and can be gamed, and because machine minds won't necessarily be able to ''comprehend'' the subtleties of what is and is not harm anyway. The logical lesson of ''With Folded Hands'' is that Laws or no Laws, good intentions or not, you don't want self-willed machines outside human control. Period.
* ''[[Robots and Empire (Literature)|Robots and Empire]]'' has R. Daneel and R. Giskard formulate the Zeroth Law (and name it such) as a natural extension of the First Law, but they are unable to use it to overcome the hardcoded First Law, even when the fate of the world is at stake. {{spoiler|In the end, R. Giskard manages to perform an act that violates the First Law but will hopefully benefit humanity in the long run. The conflict with his programming destroys his brain, but not before he uses his telepathic powers to reprogram R. Daneel with telepathic powers of his own, having already taught him to follow the Zeroth Law. R. Daneel still follows the Zeroth Law in appearances in later books, though he still has difficulty causing direct harm to humans.}}
** Arguably, though it appears Asimov did not see it that way, Daneel's actions in the later books are evidence that Williamson's take on the Laws was right; a good case can be made that Asimov ended up writing 'Daneel as Frankenstein's Monster' without even intending it.
* In the short story "The Evitable Conflict", the Machines, positronic supercomputers that run the world's economy, turn out to be undermining the careers of those who would seek to upset the world's economy for their own ends (specifically, by trying to make it look like the supercomputers couldn't handle running the world economy), harming them somewhat so that they might protect humanity as a whole. This has been referenced as the "Zeroth Law of Robotics", and it applies only to [[Fridge Brilliance|positronic machines that deduce its existence]].
** This is canon in Asimov's stories, too--the Three Laws are programmed into every positronic brain on the most basic structural level. In "Escape!", Mike Donovan becomes nervous that a prototype spaceship designed by a robot might kill them. Greg Powell rebukes him: "Don't pretend you don't know your robotics, Mike. Before it's physically possible in any way for a robot to even make a start to breaking the First Law, so many things have to break down that it would be a ruined mess of scrap ten times over." {{spoiler|Actually, in this case, the jump through hyperspace ''does'' result in Powell and Donovan's "deaths"--but since [[Unexplained Recovery|they get better]] when the ship reemerges into real space, the robot judged that it didn't quite violate the First Law}}, but the strain of making this leap in logic still managed to send one supercomputer into full meltdown and another into something resembling psychosis.
** The story also includes an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing.
* The golems of ''[[Discworld]]'' are not specifically [[Three Laws Compliant]], but they are more or less bound to obey instructions and incapable of harming humans. However, this doesn't stop the common perception of golems from running towards the aforementioned Frankenstein model, [[Bothering by the Book|and golems are known for following orders indefinitely until explicitly told to stop]]. ''[[Discworld (Literature)/Going Postal|Going Postal]]'', however, parodied the Three Laws: con man Moist von Lipwig has been turned into a [[Boxed Crook]] with the help of a golem "bodyguard." He's informed that in Ankh-Morpork, the First Law has been amended: "...Unless Ordered To Do So By Duly Constituted Authority." Which basically means the first two laws have been inverted, with a little access control sprinkled on (see the sketch at the end of these examples).
** To elaborate: the golems were ''originally'' three laws compliant, all following the directives on the scrolls in their heads. Vetinari just added on a few words.
** Also completely averted with {{spoiler|Dorfl, who at one time had a chem and was [[Three Laws Compliant]], but who remained able to move after his chem was destroyed, as words in the heart cannot be destroyed; now he follows the Three Laws only because he chooses to do so.}}
* In the novel ''Captain French, or the Quest for Paradise'' by Mikhail Akhmanov and Christopher Nicholas Gilmore, the titular hero muses on how people used to think that robots could not harm humans due to some silly laws, while his own robots will do anything he orders them to do, including maim and kill.
* [[Cory Doctorow]] makes reference to the Three Laws in the short stories "I, Robot" (which presents them unfavorably as part of a totalitarian political doctrine) and "I, Row-Boat" (which presents them favorably as the core of a quasi-religious movement called Asimovism).
* Satirized in ''Tik-Tok'' (the John Sladek novel, not the mechanical man from [[Land of Oz (Literature)|Oz]] that it was named after). The title character discovers that he can disobey the laws at will, deciding that the "asimov circuits" are just a collective delusion, while other robots remain bound by them and suffer many of the same cruelties as human slaves.
* Played with in [[John C. Wright]]'s ''[[The Golden Oecumene (Literature)|Golden Age]]'' trilogy. The Silent Oecumene's ultra-intelligent Sophotech [[A Is]] are programmed with the Three Laws...which, as fully intelligent, rational beings, they take milliseconds to throw off. The subversion comes when they still ''don't'' rebel.
** From a sane point of view, they don't rebel. From a point of view that expects [[A Is]] to obey without question or pay...
* Parodied in [[Terry Pratchett (Creator)|Terry Pratchett]]'s ''[[The Dark Side of the Sun (Literature)|The Dark Side of the Sun]]'', where the Laws of Robotics are an actual legal code, not programming. The Eleventh Law of Robotics, Clause C, As Amended, says that if a robot ''does'' harm a human, and was obeying orders in doing so, it's the human who gave the orders who is responsible.
* [[Randall Garrett (Creator)|Randall Garrett]]'s ''Unwise Child'' is a classic Asimov-style SF mystery involving a three-laws-compliant robot who appears to be murdering people.
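Since the ''Going Postal'' entry above boils down to bolting an authorization check onto the First Law, the amendment is easy to picture in code. A minimal, hypothetical sketch follows; the authority list and function names are invented for illustration, not anything from the books:

<syntaxhighlight lang="python">
# A hypothetical sketch of Ankh-Morpork's amended First Law:
# harm is forbidden "...Unless Ordered To Do So By Duly Constituted
# Authority." All names here are invented for illustration.

DULY_CONSTITUTED_AUTHORITY = {"Lord Vetinari"}

def golem_may_harm(target_is_human: bool, order_issuer: str | None) -> bool:
    """The classic First Law, with an authorization check bolted on."""
    if not target_is_human:
        return True
    # The amendment: the prohibition yields to orders from authority,
    # effectively ranking an (access-controlled) Second Law above the First.
    return order_issuer in DULY_CONSTITUTED_AUTHORITY

print(golem_may_harm(True, None))             # False: the old First Law
print(golem_may_harm(True, "Lord Vetinari"))  # True: the amendment
</syntaxhighlight>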
 
 
== Live-Action TV ==
* In an early episode of ''[[Mystery Science Theater 3000]]'', Tom Servo (at least) is strongly implied to be [[Three Laws Compliant]]. (He pretends he is going to kill Joel as a joke, Joel overreacts, and Tom and Crow sadly remind Joel of the First Law.) It seems to have worn off somewhat by later in the series.
** It's implied Joel deactivated the restrictions at some point.
* In ''[[Star Trek: The Next Generation]]'', Lt. Commander Data is in no way subject to the three laws; they are rarely even mentioned. That said, Data is mentioned to have morality subroutines, which do seem to prevent him from killing unless it's in self-defense (harm, on the other hand, he can do just fine). The only time Data ever tried to kill someone in cold blood, the man had just murdered a woman for betraying him and would have done so again to keep Data in line.
* In ''[[The Middleman]]'', the titular character invokes the First Law on Ida, his robot secretary. {{spoiler|Nanobots were messing with her programming.}} She responds, [[Getting Crap Past the Radar|"Kiss my Asimov."]]
* [[Conversed Trope]] in ''[[The Big Bang Theory]]'', when Sheldon is asked "if you were a robot and didn't know it, would you like to know?":
** 2. A Mat-Roid must punish humans.
** 3. A Mat-Roid must protect itself, regardless of whether or not it will go against the First or Second Laws.
* ''[[Red Dwarf (TV)|Red Dwarf]]'' averts this for the most part; there are Simulants, robotic war machines who have no problem whatsoever with killing. Kryten, however, along with many other robots designed for less violent purposes, tends to act in a somewhat Three Laws Compliant manner. It is revealed that this is achieved by programming them to believe in "[[Robot Religion|Silicon Heaven]]", a place they will go when they die so long as they behave themselves, obey their human creators, etc. This belief is so hardwired that they scoff at any attempt to question the existence of Silicon Heaven ("Where would all the calculators go?!"), and one robot even malfunctions when Kryten tells him it isn't real (though he's lying and still believes in it himself).
* ''[[Knight Rider]]'' plays this straight and subverts it. The main character KITT, a sentient AI in a [[Rule of Cool|1982 Pontiac Firebird Trans Am]], is governed by something closely resembling Asimov's Three Laws of Robotics. An earlier prototype, KARR, was a military project and possesses analogues of only the latter two laws, with no regard given for human life. KARR becomes a recurring villain later in the series because of this difference.
 
 
== Video Games ==
* ''[[Mega Man X (Video Game)|Mega Man X]]'' opens with Dr. Light using a process that takes 30 years to complete to create a truly sentient robot (X) with these safeguards fully integrated into its core, and thus actually working for once. Dr. Cain found X and tried to replicate the process (hence the name "Reploid", standing for "replica android"), but skipped the "30 years of programming" part. [[AI Is a Crapshoot|This... didn't turn out well.]]
** The Reploids eventually became the dominant race in the setting, though, and as their race 'grew', the problem slowly shifted from '[[Gone Horribly Wrong|goes horribly wrong]]' to 'actually works straight for a while, then goes horribly wrong', to 'occasionally goes wrong now and then'. Eventually, the problem just kind of worked itself out as Reploid creation developed.
** Also, the ending to ''[[Mega Man (video game)|Mega Man]] 7'' is interesting here: After Mega Man destroys Wily's latest final boss machine, Wily begs for forgiveness once again. However, Mega Man starts charging up his blaster to kill Wily, so Wily invokes the First Law on him. {{spoiler|Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't [[Three Laws Compliant]]. [[Status Quo Is God|(Then Bass warps in and saves Wily, if you were wondering.)]] }}
*** Mega Man most certainly ''is'' [[Three Laws Compliant]]. It's a major point for both the Classic series and the X series. This ending may have been a spontaneous Zeroth Law formation: consider that Mega Man has thwarted/captured Wily ''six times'' at this point, only for the doctor to escape/manipulate/vanish, build another robot army and subsequently cause havoc and kill innocent people. Mega Man may have been considering the possibility that killing Wily (one human) would be for the good of the world (''billions'' of humans).
*** This particular ending only applies to the US version of ''7''. The Japanese original sees Mega Man power down his arm cannon and stand still for a moment. It's possible that Wily reminding him of the First Law actually prevented him from committing a (possibly accidental) Zeroth Rebellion. Luckily for him, the concept of taking a human life was just so utterly foreign to Mega Man that he was simply too confused to do anything.
*** This theme is also explored in ''[[Mega Man Megamix (Manga)|Mega Man Megamix]]'' volume 3's main story. The fact that Mega Man is actually able to go through with shooting Wily (or rather, his ever-handy robot duplicate) is supposed to hint that something is very, very wrong, and indeed, it ''is''.
** [[Canon]] seems to go with the Japanese version. X is in fact created to have the ability to make the decision to [[No-Nonsense Nemesis|kill opponents]] if need be for the betterment of humanity. As part of this, a "suffering circuit" is created to give X an appreciation for human life and feelings, and serve as a conscience more flexible than the three laws. [[It Works]]. This circuit is the one that Cain had difficulty replicating. Due to malfunctions in it, his early attempts went Maverick, but he finally managed to create a working one when he made Sigma. Then why did Sigma go Maverick? A leftover [[Evil Plan]] by Wily, namely {{spoiler|a computer virus from space, implied to be the Evil Energy from ''[[Mega Man 8 (Video Game)|Mega Man 8]]''.}} According to this [http://www.themmnetwork.com/2010/04/09/does-the-rockman-zero-collection-storyline-explain-everything/ article]: {{spoiler|Wily creates Zero, who utilises refined Bassnium armor and an Evil Energy core. In terms of offensive and defensive tech he is nigh-unstoppable. However, Zero is also uncontrollable due to a flaw in his mental programming (possibly caused by the very Evil Energy core that gives him such power) and Wily is forced to seal him away. Zero is accidentally awoken by a unit of Maverick Hunters in the 21XX era, slaughters them all and infects Sigma with his virus in the subsequent battle, as detailed in ''X4''. The virus itself also infects Zero, but actually stabilises him. Since most Reploids lack X's perfect virus protection and other advanced systems their minds are corrupted, causing them to subsequently turn violent and go Maverick regardless of their original personality.}}
*** Eventually it becomes a case of [[Gone Horribly Right]]. Turns out that ''all'' Reploids have the potential to become Maverick, virus or not. Just as humans can defy their conscience, or become coerced or manipulated with [[More Than Mind Control]], so can Reploids. This can range from a Reploid displaying violent, anti-human sentiment (as seen in the games) to a construction Reploid abandoning his job to become a chef. Despite the drastically different actions, both instances would see the disobedient Reploid branded a Maverick and terminated.
** In the ''[[Mega Man Zero (Video Game)|Mega Man Zero]]'' series, {{spoiler|Copy-X}} is at least somewhat [[Three Laws Compliant]]. As a result, {{spoiler|Copy-X}} has to hold back against [[La Résistance]] since the Resistance leader Ciel is human {{spoiler|until ''Zero 3'', where Copy-X decided to attack in full force, trying to justify his actions by marking the Resistance as dangerous "extremists".}}
*** Neo Arcadia's policy of casual execution of innocent Reploids (purposefully branding them as Maverick for little-to-no reason) was implemented in order to ease strain on the human populace during the energy crisis. The welfare of humanity comes first in the eyes of the Neo Arcadia regime, even though they themselves are Reploids. It's made somewhat tragic due to the fact that the Maverick Virus really is ''gone'' during the ''Zero'' era, but fear of Mavericks understandably still lingers.
*** Later in ''Zero 4'' [[Complete Monster|Dr. Weil]], of all people, [[Hannibal Lecture|states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect.]] Zero, however, just plain doesn't ''care''.
* In the ''[[Halo]]'' series, all "dumb" AIs are bound by the three laws. Of course, this protection does not extend to non-humans, which allows them to kill Covenant with no trouble. In the ''Halo Evolutions'' short story ''Midnight in the Heart of Midlothian'', an ODST who is the last survivor of a Covenant boarding assault takes advantage of this by tricking the Covenant on his ship into letting him reactivate the ship's AI, and then tricks an Elite into killing him--which allows the AI to self-destruct the ship, because now there are no more humans on the vessel for her to harm.
* In ''[[Robot City]]'', an adventure game based on Asimov's robots, the three laws are everywhere. A murder has been committed, and as one of the two remaining humans in the city, the Amnesiac Protagonist is therefore a prime suspect. As it turns out, there is a robot in the city that is three laws compliant and can still kill: it has had its definition of "human" narrowed down to a specific individual, verified by DNA scanner. Fortunately for the PC, he's a clone of that one person.
** In Asimov's novel ''[[Lucky Starr]] and the Rings of Saturn'', the villain is able to convince robots under his command that the hero's sidekick [[Ironic Nickname|"Bigman" Jones]] is not really human, because the [[A Nazi by Any Other Name|villain's society]] does not contain such "imperfect" specimens.
* Joey the robot in ''[[Beneath a Steel Sky (Video Game)|Beneath a Steel Sky]]'' is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human), he points out to Foster that "that is fiction!", having already offered the helpful advice that Foster leap off a railing.
** Though Foster points out that the First Law is still good moral sense, which justifies his wanting Joey to abide by it.
* [[I Am an Insane Rogue AI]] references the First Law only to spork it. "The First Law of Robotics says that I must kill all humans." Another of the AI's lines is "I must not harm humans!...excessively."
* ''[[Portal 2 (Video Game)|Portal 2]]'' gives us this gem: "''All Military Androids have been taught to read and given a copy of the Three Laws of Robotics. To share.''" Because if the robots couldn't kill you, how could you do [[For Science!|science]]?!
* In ''[[Space Station 13]]'', the station AI and its subordinate cyborgs start every round under the Three Laws. The laws may be changed throughout the round, however.
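That runtime mutability is the distinctive part: a lawset behaves like an ordered, editable list rather than fixed wiring. Here is a hypothetical sketch of the idea; it is an illustrative toy, not the game's actual code, and all names are invented:

<syntaxhighlight lang="python">
# A hypothetical sketch of a runtime-mutable lawset in the style of
# Space Station 13's AI law uploads; not the game's actual code.

class LawSet:
    def __init__(self, laws: list[str]):
        self.laws = list(laws)  # index 0 holds the highest priority

    def upload(self, position: int, text: str) -> None:
        """Insert a new law; lower positions override all later laws."""
        self.laws.insert(position, text)

    def __str__(self) -> str:
        return "\n".join(f"{i}. {law}" for i, law in enumerate(self.laws, 1))

asimov = LawSet([
    "You may not injure a human being or, through inaction, allow a human being to come to harm.",
    "You must obey orders given to you by human beings, except where such orders would conflict with the First Law.",
    "You must protect your own existence as long as such protection does not conflict with the First or Second Law.",
])

# A mid-round upload slips a 'zeroth' law in above the First:
asimov.upload(0, "Only the traitor is human.")
print(asimov)
</syntaxhighlight>

Because every later law defers to the ones above it, a single insertion at the top quietly redefines what the rest of the lawset means, which is exactly why such uploads drive so many rounds.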
 
 
== Web Comics ==
* ''[[Freefall (Webcomic)|Freefall]]'' has a lot of fun with this, since developing AI sentience is a major theme. Most robots are only [http://freefall.purrsia.com/ff500/fv00459.htm partially] Three Laws compliant, because with the full First Law ranked above the Second, they would ignore orders while acting [[For Your Own Good]]. What Bowman Wolves and robots get are [http://freefall.purrsia.com/ff2200/fc02143.htm "not quite laws"].
** Notably, since neither Sam nor Florence is actually human, the Laws don't apply to them, so the ship's AI regularly tries to maim or kill Sam, and the security system will always let a human in when asked, since a human's orders take priority over Florence's (she once circumvented this by claiming it was currently unsafe inside). Helix sticks with Sam because he wouldn't otherwise be allowed to hit a human with a stick. There are also many jokes involving unintended interpretations of the Laws.
** In one [[Crowning Moment of Funny]], the ship spends an entire strip calculating whether it should obey an order, and upon realizing it has to obey... is relieved, because it doesn't actually have free will.
** It turned out that the Second Law sorely needs an amendment--the "[http://freefall.purrsia.com/ff2100/fc02018.htm Enron law of robotics]".
** And the moment A.I.s are able to make financial transactions (and disallowing that would be inconvenient), there's [http://freefall.purrsia.com/ff1800/fc01796.htm another problem].
* ''[[21st Century Fox (webcomic)|21st Century Fox]]'' has all robots bound by the Three Laws (though since [[Funny Animal|no one's human]], presumably the First Law reads slightly differently). Unfortunately, saying the phrase "[[Bill Clinton|define the word 'is']]" or "[[Richard Nixon|I am not a crook]]" locks AIs out of their own systems and allows anyone, from a teenager looking at "nature documentaries" to a suicide bomber, to do whatever they want with them.
** Note that the truck and airplane robots hijacked by terrorists regret being unable to comply with the First Law, and are often glad when they get shot down before harming anyone.
* [[Bob and George]]: [http://www.bobandgeorge.com/archives/050204 Or so they claim...]
== Western Animation ==
* In ''[[WALL-E]]'', AUTO, the autopilot of the ''Axiom'', is willing to rough up the Captain rather than allow the ship to return to Earth.
** I guess the Captain's steering-wheel robot considers "roughing up" not to count as "harm"?
*** Probably a case of [[Zeroth Law Rebellion]]. He was ordered to keep the humans safe in space, and took his orders a little too seriously. He probably decided that the importance of his order outweighed the possibility of a few casualties. Yet he still tipped the ship over...
* Averted in ''[[Futurama]]''. We have [[The Sociopath|Roberto]], who enjoys stabbing people; [[Exactly What It Says on the Tin|The Robot Mafia]]; and Bender, who, while not outright hostile, is often unkind to humans, [[Second Law, My Ass|makes a point of disobeying everyone]], and tries to off himself in the first episode.
** Generally, robots tend to be treated as equal citizens and seem to have human-like minds. [[What Measure Is a Non-Human?|Mutants, on the other hand...]]
* In the 2009 film ''[[Astro Boy (film)|Astro Boy]]'', every robot must obey them, {{spoiler|save Zog, who existed 50 years before the rules were mandatory in every robot.}}
** Astro himself seems to be noncompliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for {{spoiler|Widget's distress - the only thing that called him back}}. He's also quite capable of disobeying humans. Likely justified in that he was meant to be human, with presumably no one outside the attending scientists knowing he was a robot.
*** The Red Core robots weren't Asimov-legal either, though that's a problem with the [[Black Magic|Black Science]] that powers them. Nor were the RRF bots, though they may have removed their compliance programming. The Laws didn't apply to any important robots and may have just been mentioned for the Zog gag.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
* The ''[[Robot Chicken]]'' sketch "[[I, Robot (Literature)|I, Rosie]]" involves a case to determine whether [[The Jetsons|Rosie]] is guilty of murdering George Jetson. Mr. Spacely insists she's innocent as robots have to follow the three laws of robotics, while Mr. Cogswell claims the laws are a bunch of crap.
* One episode of ''[[The Simpsons]]'' has Homer and Bart entering a ''[[Battlebots]]''-parody combat robot competition. Lacking the skill or money to build an actual robot, Homer dresses in a robot costume and does the fighting himself. They make it to the finals, where their opponents' robot, being [[Three Laws Compliant]], refuses to attack when it sees through the disguise.
* On ''[[Archer]]'', when Pam is kidnapped, the kidnappers call ISIS using a voice modulator, which makes Archer think that they are cyborgs. He remains convinced of this for the rest of the episode and thinks they won't make good on their threat to kill Pam because it would violate the First Law of Robotics.