"Three Laws"-Compliant

The laws cover most obvious situations, but they are far from faultless. Asimov got a great deal of mileage writing a huge body of stories about how the laws would conflict with one another and generate unpredictable behavior. A recurring character in many of these was the "robopsychologist" Susan Calvin (who was, not entirely coincidentally, a brilliant logician who hated people).
 
It is worth noting that Asimov didn't object only to "[[A.I. Is a Crapshoot|the robot as menace]]" stories (as he called them) but also to "[[What Measure Is a Non-Human?|the robot as pathos]]" stories (ditto). He thought that robots attaining self-awareness and full independence were no more interesting than robots going berserk and [[Turned Against Their Masters|turning against their masters]]. While he did, over the course of his massive career, write a handful of both types of stories (still using the three laws), most of his robot stories dealt with robots as tools, because it made more sense. Almost all the stories surrounding Susan Calvin and her precursors are really about malfunctioning robots, and the mystery of investigating their behavior to discover the underlying conflicts.
 
Alas, as so often happens, Asimov's attempt to avert one overused trope gave birth to another that has been equally overused. Many writers (and readers) in the following decades would treat the Laws of Robotics as if they were as immutable as [[Space Friction|Newton's Laws of Motion]], [[Faster-Than-Light Travel|the Theory of Relativity]], [[Artificial Gravity|the Laws of]] [[Gravity Is a Harsh Mistress|Gravity]]... wait... you know, they treated these laws better than they treated most ''real'' scientific principles.
* ''[[Astro Boy (manga)|Astro Boy]]'', although [[Osamu Tezuka]] [[Older Than They Think|probably developed his rules independently from Asimov]]. In ''[[Pluto]]'', the number of robots able to override the laws can be counted on one hand. {{spoiler|One of them is [[Tomato in the Mirror|the protagonist]]}}.
** Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person (an issue that Asimov didn't directly address until "[[The Bicentennial Man]]"), and devised his own Laws of Robotics. Just one of the things that the CGI movie missed.
* ''[[Ghost in the Shell (1995 film)|Ghost in the Shell: Innocence]]'' mentions Moral Code #3: maintain existence without inflicting injury on humans. The gynoids, however, subvert the law by deliberately creating malfunctions in their own software.
* In one short arc of ''[[Ah! My Goddess]]'', one of Keiichi's instructors attempts to dismantle Banpei and Sigel [[For Science!|for research purposes]] (Skuld had made them capable of walking on two legs, which he had not been able to do with his own designs). Once they escape his trap, the professor asks if they know about the Three Laws of Robotics. They don't. He doesn't die, but they do rough him up and strap him to a table in a way that makes it look like he'd been decapitated and his head stuck on one of his own robots.
* The [[Humongous Mecha]] of ''[[Kurogane no Linebarrel]]'' are normally this, aside from a slight difference in priorities between the first and second laws. In fact, this is the justification for them having pilots despite being equipped with relatively complex AI: the Laws are hard-coded into them, and thus they are only militarily useful when they have a human with them to pull the trigger. Their aversion to killing is so great, in fact, that if one accidentally kills somebody (as things whose very footsteps can make buildings collapse are wont to do), it is compelled to use its advanced technology to bring the victim back to life.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company. Bishop lampshades this by saying the previous model robots were "always a bit twitchy".
* In ''[[Star Wars]]'', droids are programmed not to harm any intelligent being, though this programming can be modified (legally or illegally) for military and assassin droids. Fourth-degree droids, being military models, lack the "no-harm" programming entirely.
* ''[[RoboCop]]'', being a cyborg policeman, does not have the three laws built into his programming because, among more plot-relevant reasons, they would hinder his effectiveness as an urban pacification unit. (He ''needs'' to be able to kill or grievously wound, ignore orders if they prevent him from protecting people, and... well, shoot back.)
** In their place, he has three "Prime Directives": 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a fourth, "Classified.") The important thing to note is that there's so much leeway there that, if it were anyone less than [[The Fettered|the duty-proud Alex Murphy]], they'd probably backfire.
 
 
== Live-Action TV ==
* In an early episode of ''[[Mystery Science Theater 3000]]'', Tom Servo (at least) is strongly implied to be "Three Laws"-Compliant. (He pretends he is going to kill Joel as a joke, Joel overreacts, and Tom and Crow sadly remind Joel of the First Law.) It seems to have worn off somewhat by later in the series.
** It's implied Joel deactivated the restrictions at some point.
* In ''[[Star Trek: The Next Generation]]'', Lt. Commander Data is in no way subject to the three laws; they are rarely even mentioned. That said, Data is mentioned to have morality subroutines, which do seem to prevent him from killing unless it's in self-defense (harm, on the other hand, he can inflict just fine). The only time Data ever tried to kill someone in cold blood, the man had just murdered a woman for betraying him and would have done so again to keep Data in line.
 
== Video Games ==
* ''[[Mega Man X]]'' opens with Dr. Light using a process that takes thirty years to complete to create a truly sentient robot (X) with these laws fully integrated into its core, and thus actually working for once. Dr. Cain found X and tried to replicate the process (hence the name "Reploid", standing for "replica android"), but skipped the thirty years of programming. [[A.I. Is a Crapshoot|This... didn't turn out well.]]
** The Reploids eventually became the dominant race in the setting, and as their race 'grew', the problem slowly went from '[[Gone Horribly Wrong|goes horribly wrong]]' to 'works straight for a while, then goes horribly wrong' to 'occasionally goes wrong now and then'. Eventually, the problem just kind of worked itself out as Reploid creation developed.
** Also of note is the ending of ''[[Mega Man (video game)|Mega Man]] 7'': after Mega Man destroys Wily's latest final boss machine, Wily begs for forgiveness once again. However, Mega Man starts charging up his blaster to kill Wily, so Wily invokes the First Law on him. {{spoiler|Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't "Three Laws"-Compliant. [[Status Quo Is God|(Then Bass warps in and saves Wily, if you were wondering.)]]}}
* [[Big Bad|Dr. Beruga]] of ''[[Terranigma]]'' directly references all three laws, except his interpretation of the Zeroth Law rewrote "Humanity" as "Dr. Beruga", meaning that any threat to his person was to be immediately terminated.
* In the ''[[Halo]]'' series, all "dumb" AIs are bound by the three laws. Of course, these laws do not extend to non-humans, which allows them to kill Covenant with no trouble. In the ''Halo Evolutions'' short story ''Midnight in the Heart of Midlothian'', an ODST who is the last survivor of a Covenant boarding assault takes advantage of this by tricking the Covenant on his ship into letting him reactivate the ship's AI, then tricking an Elite into killing him - which allows the AI to self-destruct the ship, because now there are no more humans on the vessel for her to harm.
* In ''[[Robot City]]'', an adventure game based on Asimov's robots, the three laws are everywhere. A murder has been committed, and as one of the two remaining humans in the city, the Amnesiac Protagonist is therefore a prime suspect. As it turns out, there is a robot in the city that is three laws compliant and can still kill: its definition of "human" was narrowed down to a specific individual, verified by a DNA scanner. Fortunately for the PC, he's a clone of that one person.
** In Asimov's novel ''[[Lucky Starr]] and the Rings of Saturn'', the villain is able to convince robots under his command that the hero's sidekick [[Ironic Nickname|"Bigman" Jones]] is not really human, because the [[A Nazi by Any Other Name|villain's society]] does not contain such "imperfect" specimens.
* Joey the robot in ''[[Beneath a Steel Sky]]'' is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human), he points out to Foster that "that is fiction!", having already offered the helpful advice that Foster leap off a railing.
* In the 2009 film ''[[Astro Boy (film)|Astro Boy]]'', every robot must obey them, {{spoiler|save Zog, who existed 50 years before the rules were mandatory in every robot.}}
** Astro himself seems to be noncompliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for {{spoiler|Widget's distress - the only thing that called him back}}. He's also quite capable of disobeying humans. Likely justified in that he was meant to be human, with presumably no one outside the attending scientists knowing he was a robot.
*** The Red Core robots weren't Asimov-legal either, though that's a problem with the [[Black Magic|Black Science]] that powers them. Nor were the RRF bots, though they may have removed their compliance programming. The Laws didn't apply to any important robots and may have just been mentioned for the Zog gag.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
* The ''[[Robot Chicken]]'' sketch "[[I, Robot (literature)|I, Rosie]]" involves a case to determine whether [[The Jetsons|Rosie]] is guilty of murdering George Jetson. Mr. Spacely insists she's innocent as robots have to follow the three laws of robotics, while Mr. Cogswell claims the laws are a bunch of crap.
* One episode of ''[[The Simpsons]]'' has Homer and Bart entering a ''[[Battlebots]]''-parody combat robot competition. Lacking the skill or money to build an actual robot, Homer dresses in a robot costume and does the fighting himself. They make it to the finals, where their opponents' robot, being "Three Laws"-Compliant, refuses to attack when it sees through the disguise.
* On ''[[Archer]]'', when Pam is kidnapped, the kidnappers call ISIS using a voice modulator, which makes Archer think that they are cyborgs. He remains convinced of this for the rest of the episode and thinks they won't make good on their threat to kill Pam because it would violate the First Law of Robotics.
 
 