[[File:know your paradoxes2 9503.jpg|link=Portal 2|frame|Yeah, [[Heroic Mime|good luck with that.]]]]
 
{{quote|''"I cannot -- yet I must! How do you calculate that? At what point on the graph do 'must' and 'cannot' meet?"''|'''Ro-Man''', ''[[Robot Monster]]''}}

Is your [[Instant AI, Just Add Water|sentient supercomputer]] [[A.I. Is a Crapshoot|acting up]]? Good news. There's an easy solution: confuse it.
 
If you give a computer nonsensical orders in the [[Real Life|real world]] it will, generally, do nothing (or possibly appear to freeze as it loops eternally trying to find a solution to the unsolvable problem presented to it). In fiction-land however, [[Explosive Overclocking|it will explode]]. It may start stammering, "I must... but I can't... But I must..." beforehand. The easiest way to confuse it is with the Liar paradox, i.e. "this statement is a lie". A fictional computer will attempt to debate and solve the paradox until it melts down. If the computer is a robot, this will probably result in [[Your Head Asplode]].
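For the curious, here is what the realistic (boring) outcome looks like as a toy Python sketch - invented for illustration, not drawn from any real system. A naive "solver" handed the Liar paradox just keeps flipping its answer forever: the real-world freeze, not the fictional explosion.

<syntaxhighlight lang="python">
# "This statement is a lie": whatever truth value we assign immediately
# contradicts itself, so a naive solver flips its answer forever.
answer = True
while True:
    answer = not answer  # True is wrong, so False; False is wrong, so True...
</syntaxhighlight>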
 
{{examples}}
== [[Multiple Media]] ==
 
* Most works of fiction where [[Isaac Asimov]]’s [[Three Laws of Robotics]] are integral to the plot focus on how the Laws are flawed (due to being too broad and too vague) and how they can easily cause this. For example, say a robot is in a situation where a human is badly injured, and requires emergency surgery. Rule #1 states that a robot cannot harm a human being, or via its inaction allow the human to be harmed, so it would have to perform said surgery itself, except it then realizes that doing so would technically be considered “harming” the human, especially if it were not programmed for such a task; but doing nothing (or even trying to find someone else who could help) would be allowing harm via its inaction. Given how most robots in these works have limited (or nonexistent) ability to use their own judgment, this can easily cause a Logic Bomb that leads to a [[Zeroth Law Rebellion]].

== [[Anime]] and [[Manga]] ==
* ''[[Ghost in the Shell: Stand Alone Complex]]'': The Tachikomas (AIs themselves) temporarily confuse a lesser AI with the [[wikipedia:Epimenides paradox|Epimenides paradox]], and comment on how they are advanced enough to know there is no answer. [[Asymmetric Dilemma|And also it isn't a paradox]], so they rephrase it.
* Quasi-example in ''[[Code Geass]]'': Rolo the [[Tyke Bomb]], who might as well be nonhuman, suffers one of these when Lelouch's manipulations clash with nabbing his mission objective, C.C., who is right in front of him. A non-fatal example, but the mental standstill is fun to watch.
* In ''[[Haruhi Suzumiya|The Disappearance of Haruhi Suzumiya]]'', {{spoiler|Yuki Nagato decides to [[Alternate Universe|reset herself (and the rest of the universe)]] because she cannot [[Butterfly of Doom|accurately simulate what she's going to do after she learns the result of said simulation]], given that her simulation is constructed from information based on the result of the simulation.}}
* In ''[[Grey]]'', the protagonist defeats the [[Master Computer]] Toy, who believes itself to be a god and wants to exterminate all of humanity, by asking how it can be worshipped if there is nobody left to believe in it. This momentarily stuns the AI, just long enough for Grey to deal the final blow.
 
 
== Art ==
* A painting by the Belgian René Magritte, [[w:The Treachery of Images|"La trahison des images"]], which says "This is not a pipe" underneath in French. It's a ''picture'' of a pipe. Actually, it's just paint on canvas that we recognise as a pipe. Well, usually it's ink on paper arranged to resemble the paint on a canvas that we recognize as a pipe. Unless you're looking at it now, in which case it's RGB pixels on a screen that look like the ink etc. etc.
 
 
== [[Comic Books]] ==
* In ''[[Runaways]]'' v2 #23, Chase casually asks the cyborg Victor whether God could make a sandwich so big He couldn't finish it, causing Victor to stammer, emit a series of ones and zeroes, go [[Explosive Instrumentation]], and pass out. Chase explains that Victor was programmed to be both super-logical and super-spiritual; there are three questions that will cause him to short out, but each will only work once. Each question also has a counter-answer, in case he needs to be revived. The answer to this question is, "Yes, and then He'd finish it anyway."
* ''[[Marvel Adventures]] [[Fantastic Four]]'', basically Fantastic Four stories for younger readers, has Mr. Fantastic do this when challenged to defeat the "ultimate alien supercomputer". When the computer, which supposedly is nigh-omniscient, says there is nothing it can't do, Mr. Fantastic tells it to create a rock so big it can't pick it up. The computer is sparking metal in ten seconds.
{{quote|'''Irma Geddon''': You know, you AIs are almost too cute. How do I unplug you when you take over the world?
'''Joe Pi''': Ask me the purpose of existence, and I explode. }}
* The long-running Brazilian comic series ''[[Pt/Turma Da Monica|Turma Da Monica]]'' liked to use this now and then, often resulting in a [[Crowning Moment of Funny]]. One recent, but nonetheless hilarious use of this happened when the gang was confronted by an [[Expy]] of none other than [[Final Fantasy VII|Sephiroth]]. He appears, [[Arrogant Kung Fu Guy|saying how ridiculously powerful he is, flying incredibly high]] until one character asks "If you only got a single wing, how come you can fly just fine?". Cue [[Oh Crap]] and the [[Expy]] [[Gravity Is a Harsh Mistress|falling to his doom]].
* ''[[Sonic the Comic]]'' featured a robot which could predict Sonic's movements thanks to its encyclopaedic knowledge of his personality and tactics, which was effective - until Sonic ''surrendered''.
* A ''[[Justice League of America|JLA]]'' comic played on this, with Amazo fighting the League. The League keeps drawing in other heroes as temporary recruits, and Amazo keeps copying their powers, because that's what Amazo does—he's programmed to copy the powers of the League and conquer them. At least, until Superman says that the Justice League is disbanded, which shorts Amazo's programming.
* In ''[[The Authority]]'', The Midnighter normally begins a fight by simulating it over and over on the supercomputer in his head until he knows everything his opponent might do. An attempt to use this on [[The Joker]], however, resulted in the Midnighter just [[Confusion Fu|standing there and staring blankly]].
* ''[[Squadron Supreme]]'' has supervillains brainwashed to work for the titular Squadron, with a directive implanted in their minds that they shall not betray any of its members. What happens when one of them witnesses a member of the Squadron working against the others? The mind gets locked into a loop, since revealing the information means betraying one member, while keeping it secret means betraying everyone else in the Squadron.
* Lampshaded and defied by a legion of robots fighting with [[Power Girl]] in a [[Batman Cold Open]].
{{quote|'''Unimate''': {{small-caps|Unimate has come to cleanse the Earth of the imperfect organic matter known as Kryptonian. Kryptonian is imperf--}}
'''Power Girl''': No! ''You'' are imperfect! You must cleanse the Earth of your''selves''!
'''Unimate''': {{small-caps|Failure-- Unimate is programmed to reject stratagems from old "''[[Star Trek]]''" episodes.}}
'''Power Girl''': Aw, nuts. Worth a try, anyway. }}
* 1980s British science fiction comic ''Starblazer'', issue 153, "The Star Destroyers": The Vonan [[AI]] known as the Magister believes itself to be all-powerful. It is defeated by Galactic Patrol agent Al Tafer when he tells it that it isn't all-powerful because it can't destroy itself. This drives the Magister crazy and causes it to blow up the Vonan system's sun, destroying itself and the Vonans as well.
* In a [[She-Hulk]] story, after Shulkie uses time travel technology to alter the future and save Hawkeye's life, she is arrested by the Time Variance Authority for doing so. After she views the horrible future her actions will cause, she changes her plea to guilty, and is about to be handed the harshest sentence - being [[Ret-Gone|completely eradicated from history]] via a weapon called the Ret-Cannon (obviously a pun on the term [[Ret Conned]]). However, a time-traveling fugitive named Clockwork interrupts the trial and seizes the weapon - he intends to use it on Shulkie's lawyer, Southpaw (seeing as Southpaw is herself a heroine who works as a lawyer, Clockwork is presumably the Lex Luthor to her Superman). After using it on several members of the Authority, he threatens to use it on She-Hulk herself, as she's trying to protect Southpaw and is in his way. (After all, she figures she's going to get hit with it anyway.) She tells him that if she is Ret-Gonned right here, she will never commit the original crime, the trial they are at will never take place, and his revenge plan will never succeed. (In fact, it seems likely both he ''and'' Southpaw will be Ret-Gonned too, given her past relation with the girl who would become Southpaw.) Clockwork needs a minute to try to figure that out... and as he's trying to do so, Shulkie kicks him unconscious.
 
* [[X-Men]] villain Sebastian Shaw once - during [[Spider-Man]]'s arc of the ''Acts of Vengeance'' crossover - placed a Logic Bomb failsafe in a group of Sentinels he had built, in the event they turned on him [[A.I. Is a Crapshoot|(something Sentinels tend to do a lot)]]. Simply put, the program would reveal to a Sentinel that, since its abilities were "inherited" and improved upon from the original Mach-1 Sentinels, it is technically a mutant. Because a Sentinel's primary directive is to destroy mutants, one who has this revelation thrust upon it is compelled to destroy itself. Unfortunately, when he used it on the fused Tri-Sentinel, Loki's sabotage had seriously screwed up the robot's programming, and the failsafe didn't do anything more than confuse it for a couple of minutes. Still, that small delay was enough for Spidey (who had the [[Captain Universe]] powers at the time) to bring the Uni-Power to its full potential and blow it to dust in a climactic finish.
 
== Computing ==
* This trope is actually misleadingly named. [[wikipedia:Logic bomb|Logic bombs]] refer to deliberate code hidden in a program which, when triggered, negatively affects its functionality - a simple 30-day trial expiring and disabling itself is a low-key example (a toy sketch follows this entry). [[The Other Wiki]] has bigger and nastier ones.
** The [[Jargon File]], of course, has [http://www.catb.org/jargon/html/L/logic-bomb.html an entry on the concept]], although it's not very detailed.
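For illustration, a minimal, hypothetical sketch of the real-world kind in Python - the trigger date and the "corrupt_records" payload below are invented for this example, not taken from any actual malware:

<syntaxhighlight lang="python">
import datetime

def corrupt_records():
    """Stand-in payload; real logic bombs delete files, corrupt data, etc."""
    print("payload fired")

def nightly_maintenance():
    # Looks like ordinary scheduled housekeeping, but hides a trigger:
    # the code behaves itself until the chosen condition is met...
    if datetime.date.today() >= datetime.date(2030, 1, 1):
        corrupt_records()  # ...then misbehaves on purpose.
</syntaxhighlight>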
* Mainstream operating systems are vulnerable to a simple one: the Fork Bomb. It consists of a program that does nothing but give the computer an order that creates and runs copies of itself, and the copies create copies, too, until the computer is too busy copying instructions and running them to do anything else. [https://www.cyberciti.biz/faq/understanding-bash-fork-bomb/ Here]'s a simple version with explanation, and a Python rendition is sketched after this entry's sub-bullets. Of course, the easiest defence from this is a limit on the number of processes a given user may have at one time.
** There's also a variant that just allocates a lot of memory. With enough privileges, the computer gets too busy swapping in and out the programs running to do anything else.
*** And if you're running a program that allocates memory dynamically, failing to free it afterwards creates memory leaks. Not a problem with modern OS's, but on embedded systems, you'd better watch yourself.
** "Grey goo" attacks, similar to the "fork bomb", have also been used successfully—at least twice—in ''[[Second Life]]'', by users creating objects which (self-)replicated at a rapid rate, eventually causing the servers to be too busy processing the grey goo to do anything else.
*** A mile-high Jenga tower will also crash ''Second Life''{{'}}s servers quite effectively: [[Wreaking Havok|pull out a key block]], and they'll crash trying to calculate the exact trajectory of each of the thousands of falling blocks.
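As promised above, a minimal sketch of the fork bomb idea in Python (Unix-only, since it relies on os.fork; if you must try it, do so in a disposable VM with a process cap such as "ulimit -u" set first):

<syntaxhighlight lang="python">
import os

while True:
    os.fork()  # each pass doubles the number of running copies; soon the OS
               # is too busy bookkeeping processes to do anything else
</syntaxhighlight>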
* There's also the concept of a "deadlock", a "chicken or the egg" paradox where two or more programs or events each require the other to resolve before they can resolve themselves. Several computers have turned themselves into lifeless (until restarted) lumps of silicon as a result; a sketch follows the next entry.
** In a similar fashion, there is such a thing as "livelock", where a system can get stuck doing "work" without ever making progress. As [[The Other Wiki]] puts it, people weaving back and forth in a corridor, trying to let each-other pass, is an example of livelock. Sometimes the solution is the same: stop, wait randomly, try again.
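A minimal deadlock sketch in Python, using nothing beyond the standard library: each thread grabs one lock and then waits forever for the other's. (Being timing-dependent, it may take a few runs to actually hang - which is rather the point.)

<syntaxhighlight lang="python">
import threading

fork = threading.Lock()
knife = threading.Lock()

def diner_one():
    with fork:
        with knife:  # hangs if diner_two already holds the knife
            pass

def diner_two():
    with knife:
        with fork:   # hangs if diner_one already holds the fork
            pass

t1 = threading.Thread(target=diner_one)
t2 = threading.Thread(target=diner_two)
t1.start()
t2.start()
t1.join()  # never returns once both threads are stuck; always acquiring
t2.join()  # locks in one fixed order prevents this
</syntaxhighlight>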
* In general, it's impossible to tell whether a program will loop forever or stop after some time. Most operating systems solve this problem by just not allocating all the processing power to a single program, but older ones do not, with the implied reasoning that the programmer will know what he's doing.
** This is called the halting problem. [[The Other Wiki]] [[wikipedia:Halting Problem|talks about it.]] In summary, it's been proven that no computer (even a [[Magical Computer]]) can predict with perfect accuracy whether any given program it runs will halt; a sketch of the proof idea follows this entry's replies.
*** Technically, the Halting Problem proves that no ''[[w:Turing machine|Turing Machine]]'' can predict if any given program it runs will halt. Turing Machines are like the computers you are using to read [[All The Tropes]], except that Turing Machines have infinite memory. Given that your computer has a finite amount of RAM, it's therefore not a Turing Machine, but rather, a fixed representation of one.
** [http://www.lel.ed.ac.uk/~gpullum/loopsnoop.html An elegant proof that the Halting Problem isn't solvable that can be enjoyed by non-mathematicians.]
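The proof idea fits in a few lines of Python. Suppose a perfect "halts" oracle existed (the function below is hypothetical by definition); a "contrarian" program then defeats it by doing the opposite of whatever it predicts:

<syntaxhighlight lang="python">
def halts(program_source: str, input_data: str) -> bool:
    """Hypothetical oracle: True iff the program halts on that input."""
    raise NotImplementedError("provably impossible in general")

def contrarian(source: str) -> None:
    if halts(source, source):  # oracle predicts "halts"...
        while True:            # ...so loop forever, proving it wrong.
            pass
    # Oracle predicts "loops forever"... so halt at once, proving it wrong.

# Run contrarian on its own source code: neither answer halts() could give
# is correct, so no such oracle can exist.
</syntaxhighlight>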
* The "classic" Mac OS dedicated an entire "DS" (fatal) error ID (ID=04) to catching and handling the so-called '[[Divide by Zero|Zero Divide Error]]'; as the Mac Secrets books put it, "When programmers test their works in progress, they might deliberately instruct the computer to divide a number by zero, to see how well the program handles errors. They occasionally forget to take this instruction out, as you've just discovered."
*#* The way this applies to math, not just computers, is basically that any sufficiently complex math system can be assigned a translation to/from a UTM - so when the UTM gets stuck on "G is true", the math system also gets stuck on the translation.
*#** Likewise, it can be applied to any system of philosophy and morality; "A machine built on Hegelian principles will never say this sentence is true".
*#* In non-math circles, this story is usually followed with the UTM replying "Gödel, you're a dick" and then punching him, despite not having any arms. Frequently something similar happens to the math student who told the story.
*#* And this proves that people can't be perfect logicians, either, as every person has their own sentence G - "<person's name> will never say that this sentence is true". Or [[Mind Screw|worse]], "<person's name> will never ''believe'' that this sentence is true".
** The book ''[[Godel Escher Bach|Gödel, Escher, Bach]]'' plays with this in one of the interludes, with one character constantly devising records that cause Logic Bomb effects on record players another character buys (using loud resonant frequencies that destroy the players if reproduced 100% accurately). The second character eventually buys a reassembling record player that changes its structure to accommodate the record being played. The first character then makes a record that targets the module that effects the restructuring, that being the one component the record player cannot change.
* There is something called a [[wikipedia:Killer poke|killer poke]] that actually does physically damage the computer. A simple example would be overclocking the CPU so much that it overheats to the point where it melts.
** There ''was'' something like that. Now, if a CPU heats up too much it hangs before any physical damage is done, and a simple reboot fixes it. All the killer pokes mentioned on the Other Wiki were for old computers, and mainly depended on toggling a relay of one sort or another until it died. Modern computers don't have relays anymore.
* The Year 2000 Problem. Computer systems that represented years with only two digits (while assuming the first two digits were 19) would be unable to distinguish the year 2000 from the year 1900, thus throwing off date/time calculations. Fortunately, since computer people saw this coming well before it hit, most of the truly important systems were redone with better date representations well before any problems manifested. Wikipedia's page on what did go wrong: http://en.wikipedia.org/wiki/Y2K#On_1_January_2000
** UNIX based systems, including Linux, have a similar problem. UNIX systems use a 32-bit counter based on the number of seconds since January 1, 1970. However, this is doomed to roll over sometime in 2038, creating a Y2038 problem for UNIX based systems. It's fixed by simply bumping the counter to 64 bits (where supported); the rollover arithmetic is sketched after this entry's replies.
*** Even worse than that - at least with Y2K, computers merely confused the year 2000 with another valid year, such as 1900. With Y2.038K, not so lucky: the rollover will cause the counter to output a negative number, which is forbidden as a date representation, and is in fact used by many systems to represent error codes. So, while a Y2K-afflicted machine merely computed dates wrongly, a Y2.038K-afflicted one is expected to mistake the date for an error code and crash.
**** That's not worse, that's better! If your UNIX computer keeps running in 2039, how do you know if it never had the problem in the first place (they switched to 64-bit) or if it has the problem and is now getting subtly wrong answers? Talk to any medical equipment, process control or financial computer vendor: silent wrong answers are ''much'' worse than crashes.
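The rollover arithmetic mentioned above can be checked with Python's standard library, assuming a signed 32-bit time_t:

<syntaxhighlight lang="python">
import datetime

EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)

print(EPOCH + datetime.timedelta(seconds=2**31 - 1))
# 2038-01-19 03:14:07+00:00 -- the last second a signed 32-bit counter can hold

print(EPOCH + datetime.timedelta(seconds=-2**31))
# 1901-12-13 20:45:52+00:00 -- where the counter lands after it wraps,
# if the system doesn't first mistake the negative value for an error code
</syntaxhighlight>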
* The race condition can be seen as a logic bomb ranging from something minor to something very drastic. It involves one piece of data that two components can either read or write to. A minor case of a race condition is your display. Say the graphics card starts to render frames at a rate faster than what the display can put out. While the display is reading the frame buffer, the graphics card suddenly copies a new frame into the buffer. The result is the display, for one frame of its time, showing two images at once (this phenomenon is also known as tearing). A more serious instance occurred in two incidents involving [[wikipedia:Therac-25|Therac-25]], a radiation therapy machine.
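A minimal sketch of a race in Python: two threads doing an unsynchronized read-modify-write on one shared counter. Whether updates are actually lost varies with the interpreter and the scheduling - and that unpredictability is exactly what makes races nasty:

<syntaxhighlight lang="python">
import threading

counter = 0

def bump_many():
    global counter
    for _ in range(100_000):
        value = counter  # read...
        value += 1       # ...modify...
        counter = value  # ...write: a thread switch between the read and the
                         # write silently discards the other thread's updates

threads = [threading.Thread(target=bump_many) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # anywhere from 100000 to 200000, varying run to run
</syntaxhighlight>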
 
== [[Fan Works]] ==
* A silly one occurred in ''[[Yu-Gi-Oh!: The Abridged Series|Yu-Gi-Oh the Abridged Series]]'': Duke managed to get Nesbitt to self-destruct by showing him a picture of [[Yu-Gi-Oh ZEXAL|Yuuma]], which was too illogical for Nesbitt's robotic brain to handle. This was after Serenity tried "Which came first, the chicken or the egg?" and he [[Take a Third Option|chose]] "[[Rocket Punch|The rocket-powered fist!]]"
{{quote|"But that wasn't one of the options - GAAAH! I stand corrected."}}
 
== [[Film]] ==
 
* In ''[[WarGames]]'', a logic bomb-like device was used to teach the NORAD computer Joshua the futility of nuclear war: play tic-tac-toe with yourself until you win. After exhausting all possible move combinations it makes the logical leap and begins plotting out every conceivable nuclear strategy, ending in some [[Explosive Instrumentation]], after which the computer concludes "The only winning move is not to play."
** SMBC Theater had a different [http://www.youtube.com/watch?v=TFCOapq3uYY&feature=player_embedded winning move].
* In the German version of ''[[Dr. No]]'', the [[Bond One-Liner]] (after the mooks in the ''hearse'' crashed down the cliffs) was slightly altered from the original English version - into a logic bomb.
{{quote|''"What happened there?"''
''"They were in a hurry to attend their own funeral in time."'' }}
* The [[A.I. Is a Crapshoot|HAL 9000 computer]] in ''[[2001: A Space Odyssey]]'' became murderous because it was told to keep its crew from finding out the secret details of their mission until they got to Jupiter, even though it had also been programmed to not withhold or distort information. It's a riddle with a [[Murder Is the Best Solution|simple solution]]: break contact with Earth and kill the crew, so there's nobody to hide the secret from.
** The above is the literal explanation given in ''2010''; another explanation is that he was given two explicit and contradictory orders, as opposed to one explicit order that conflicts with his basic implicit programming. The orders in question were 1) don't reveal the true nature of the mission to the crew, and 2) when you get to Jupiter, show the crew the pre-recorded briefing from Dr. Floyd. Obviously, there's no way to carry out both of these orders; carrying out one would make the other impossible. A human being would have realised that order 2 would implicitly supersede order 1, but HAL was unable to make that particular leap of human reasoning. Instead he became trapped in a paradox until he found a solution - if the crew are all dead, he can play the briefing for them and they still won't know the true nature of the mission.
** In the novel, the narrative muses that HAL might have been able to find a peaceful solution to the problem, had mission control not requested his temporary disconnection. HAL, being unable to grasp the concept of sleep, was convinced that the disconnection would have meant the end of his existence, and his killing spree was therefore, all in all, a misguided attempt at self-defense.
* [[Master Computer]]s of 70s sci-fi were particularly poor at handling illogical input. The central control units in both ''[[Rollerball]]'' and ''[[Logan's Run (film)|Logan's Run]]'' were sent into confused, [[Explosive Instrumentation]] paroxysms by sheer accident.
** The computer in ''[[Rollerball]]'' has clearly been programmed to withhold information, and it's actually the ''programmer'' who has a breakdown when it refuses to divulge information on the Corporate Wars. The computer in ''[[Logan's Run (film)|Logan's Run]]'', however, is convinced that Sanctuary exists, and has a breakdown when its [[Mind Probe]] reveals the protagonist is telling the truth.
{{quote|'''Logan 5:''' There... is... no... Sanctuary!
'''Computer:''' Unacceptable. The input does not program, Logan 5. }}
* In ''[[Tron]]'', Flynn confronts the [[Master Computer|Master Control Program]] from a terminal in the "real" world early in the film, saying sarcastically how the unsolvable problems he's entering should be no problem for an AI that claims to be as powerful as the MCP. Flustered, the MCP ignores the problems and to defend itself beams Flynn into the computer world, setting off the story.
* The Soviet movie ''[[Teens in The Universe]]'' featured the main characters giving robots a riddle (similar to the English "Why is six afraid of seven"), and making them burn out. The problem starts when they discover that the higher level robots can actually solve the riddle.
* A logic bomb (causing a [[Temporal Paradox]]) was used to dispatch the djinn in ''[[Wishmaster]]''. The protagonist has one wish, which, once granted, allows the djinn to be released into the world. She wishes that the crane operator who'd been unloading a ship a few days earlier had not been drinking on a certain day, which is granted. Cue the djinni realizing to his horror that if the operator had not been drinking he wouldn't have allowed a statue to slip and crash, which meant that the djinni's gem hidden inside the statue was not discovered, and therefore he was not released to start granting wishes.
* In ''[[Forbidden Planet]]'', Dr. Morbius inadvertently Logic Bombs his own faithful servant, [[Talking Lightbulb|Robby the Robot]], when he orders it to kill the monster. Robby, who's [[Sliding Scale of Robot Intelligence|apparently more perceptive than Morbius]], realizes that the monster is actually {{spoiler|a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings.}}
* ''[[Austin Powers]]'' gives himself one so bad that he goes cross-eyed. It is one of the classics, involving time-travel, but the kicker comes if you follow his actual dialogue: he never contradicts himself or sets up a paradox. ''There is no logic bomb''.
* ''[[Life of Brian]]''.
{{quote|'''[[Messianic Archetype|Brian]]:''' You don't need to follow me! You're all individuals!
'''Crowd:''' YES! WE'RE ALL INDIVIDUALS!
'''[[The Runt At the End|Man]]:''' I'm not...
'''Crowd:''' Sssh! }}
* In ''[[Terminator (franchise)|Terminator]] 3'' when [[Arnold Schwarzenegger|Ahnold]] gets captured by the T-X and reprogrammed to kill John Connor, Connor saves himself by {{spoiler|making the T-800 realize that accomplishing that goal would mean failing its original mission; the logical conflict between the two causes the T-800 to destroy a truck instead of Connor, then shut itself down. He gets better, briefly.}}
"Yeah, they've been that way all down through the ages." }}
 
== [[Literature]] ==
* Going by [[Isaac Asimov]]'s famous "Three Laws of Robotics", if a robot ever broke the First Law of Robotics, it would shut down. Actually, one short story claims that the damage threshold for breaking the First Law is far ''greater'' than that required to shut down the robot, completely and irreparably. Being caught between harming humans through saying something and harming them through remaining silent killed the robot Herbie in ''Liar!''
** A possible loophole occurs if the robot is intelligent enough to decide that the action in question is in humanity's best interest anyways. This principle was canonically named "The Zeroth Law of Robotics" by Asimov in one of the last books he wrote before he died.
*** It still killed one of the two robots that came up with it, because his programming allowed the possibility he might be wrong, and even the mere possibility was enough to trigger the destruction.
** Much later in the time line, in Asimov's novel ''[[The Robots of Dawn]]'', a preeminent roboticist remarks to the protagonist that such a thing could never happen "now" (well, unless you are a robot designer who spends a few hours talking the robot to death) because modern robots are advanced enough to tell which choice is ''more'' harmful, and if it can't decide then, there's always the coin flip. He even dismisses the story of Herbie as a myth, {{spoiler|though he in fact had a mind-reading if more advanced robot living under his roof, which actually psychically enhanced his skepticism to keep itself safe.}}
** A similar principle was also at the heart of the plot to the recent ''[[I, Robot (film)|I, Robot]]'' movie, but the conclusion derived from it was exactly the opposite of the one in the books, fulfilling the tropes that Asimov had created the laws to debunk.
** Asimov himself topped the ''[[I, Robot (film)|I, Robot]]'' movie in his final robot novel ''[[Robots and Empire]]'', in which the Zeroth Law is used by a robot to justify {{spoiler|''destroying the Earth''}}. The Three Laws were never fail-safe, though they admittedly made AI much [[A.I. Is a Crapshoot|less of a crapshoot]].
*** And in the ''[[Foundation]]'' prequels written by other authors after Asimov's death, it's revealed that the Zeroth-law robots had been driven by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their programmed morality only applies to humans.
** In "Robot Dreams" by Asimov, another loophole in the First Law is discovered: namely, {{spoiler|a robot that is (accidentally) programmed to believe that [[Oh Crap|"robots are humans and humans are not"]]. Susan [[Shoot the Dog|shoots it in the head.]] }}
** The First Law can also be circumvented by disabling part of it, namely the clause that robots cannot allow humans to come to harm through inaction. A character notes that such a robot could drop a weight on a human from atop a building, knowing it could catch the weight in time to protect the human - and then, once the weight is falling, simply not bother.
** Another example shows up in ''Robots and Empire''; D. G. Baley and Gladia discover {{spoiler|that Solarians purposely altered the definition of a human being in their humaniform robots}}, effectively circumventing the First Law.
* [[Stephen King]]'s ''[[The Dark Tower|Wizard and Glass]]'' features a train operated by a sentient AI which has threatened to crash the train, killing the heroes on board, unless they can ask it a riddle it can't figure out the answer to. After hours attempting vainly to outsmart it, they proceed to begin asking it joke riddles with no logical answers, such as "Why did the dead baby cross the road? Because it was stapled to the chicken!" Faced with such questions, the AI self-destructs and the train crashes anyway, but not violently enough to kill the heroes.
* ''[[The Hitchhiker's Guide to the Galaxy]]'': Arthur totally disables the Heart of Gold by asking it to make tea. Depending on which version you prefer it's either because it doesn't know how to make tea, or because it's affronted at the possibility that Arthur could prefer tea to whatever it gave him.
** The text adventure game actually made this a plot point, as in order to advance you have to get tea, then go into your own head and remove your common sense, which allows you to get "no tea" as well. Then you show this to a door, which is impressed by your grasp of logic and allows you to pass.
** Then there was the theory that the existence of the Babel fish, a symbiotic creature that lives in your ear and translates any language for your brain, disproved the existence of God. The argument was that the existence of an organism so unlikely yet so useful is evidence for a creator, and that therefore this removes the need for belief, and without belief God is nothing. Ergo, there is no God. The man responsible for this argument went on to prove that black is white and white is black, and got himself killed on a zebra crossing (crosswalk for Americans).
*** The theory was debunked by theologians fairly quickly, as gods, if they existed, wouldn't need belief to survive, but that didn't stop Oolon Coluphid making a lot of money from it.
** In the sixth book, ''And Another Thing...'', Ford Prefect froze the computer controlling the ship - which wasn't really a computer, but Zaphod's left head (called "Left Brain") - by making an (im)probability probable and improbable at the same time. (The ship was the ''Heart of Gold'', which ran on the Improbability Drive: long story short, anything improbable happening or going somewhere becomes probable, which is how it got to places that were improbable.) The ship rescuing them was improbable, mathematically, yet it had done it before twice, which by Ford's made-up logic of patterns made it probable again. Quite smart, and yet extremely stupid, because the ship's now-turned-off Dodge-o-matic was the only thing keeping them from being fried.
* The AI in one of the ''[[Demon Headmaster]]'' books is shorted out by the protagonists shouting gibberish and riddles into its receivers.
* In one of [[Gordon R. Dickson]]'s stories, a man attempts to shut down a meteorological arctic station just for bragging rights. He does so by posing a paradox to the machine, making it incapable of doing anything other than computing the paradox. Ironically, this condemns him and his partner to freeze to death, as all the vital controls of the station were provided by the machine.
* As a joke (and a possible [[Shout-Out]] to ''[[The Prisoner]]''), the wizards in the ''[[Discworld]]'' novels ask [[Magitek]] computer Hex "Why?"; instead of malfunctioning, however, Hex answers "Because." Naturally, they ask "Why anything?", and after a longer while, Hex answers "Because everything", and ''then'' crashes. After that they stop mucking about with silly questions - not because they're afraid of damaging Hex irreparably, but because they're afraid they might get answers.
** In another of the ''Discworld'' books, characters are trying to deal with the Auditors—reality-monitors who are made of pure logic. Thus, while fleeing, they put up signs reading "KEEP LEFT". In a right-pointing arrow. "Do Not Feed the Elephant". In an empty cage. "Duck", with no duck or reason to go on your hands and knees, and of course, "IGNORE THIS SIGN. By order". Effectively a Logic Minefield.
*** The series of Logic Bombs was behind a velvet rope with "Absolutely No Admittance" hanging off it. Considering that, in a way, the Auditors ''are'' the rules, disobeying any of the signs is a cause for extreme stress in what passes for their life.
*** The Auditors also managed to Logic Bomb ''themselves'' a couple of times, as when they got sidetracked into trying to properly name all the (infinite) colors.
*** Indeed, a common cause of death among (disembodied) Auditors is when they stray into speaking of themselves in the first person. This makes them into individuals, which are finite by definition. Anything finite is ''so'' temporary, compared to the vastness of infinite time, that it's effectively in existence for no time at all. Therefore, any Auditor which becomes an individual is annihilated by its own logic.
** In ''[[Discworld/Hogfather|Hogfather]]'', Ridcully manages to Logic Bomb Hex into functioning, after it's already broken down. All it took was [[It Runs on Nonsensoleum|typing the phrase "LOTS OF DRYE¼D FRORG PILLS" into its keyboard]].
** ''[[Discworld/Going Postal|Going Postal]]'' features semaphore tower hackers. One of the tricks they develop is a kind of "killer poke" (see Computing above) which causes the mechanism to execute a particular combination of movements that does anything from jamming the shutters to shaking the tower to pieces.
* In Christopher Stasheff's ''[[Warlock of Gramaraye]]'' series, the hero's [[Mechanical Horse|robot horse]], Fess, is prone to doing this when something particularly illogical happens. Fortunately there's a reset button to fix the problem; unfortunately, the series is set on a planet filled with psychics, time travelers, ghosts, and fairies, so... the reset button sees a lot of use.
* In ''[[The Space Odyssey Series|3001: The Final Odyssey]]'', the protagonists use rather more sophisticated logic bombs against the monoliths that trick them into carrying out an infinite set of instructions. The book notes that none but the most primitive computers would fall for something as simple as calculating the exact value of pi.
* The ''[[The Golden Oecumene|Golden Age]]'' series by [[John C. Wright]] has a variant -- [[A Is]] are all inherently ethical, so they'll shut down if [[Talking the Monster to Death|you convince them their very existence is making the universe a worse place]].
** To expand on this, because it is actually quite interesting: Sophotechs ([[A Is]]) are not exactly inherently ethical, rather they are so hyper-intelligent that they will eventually ''all'' arrive at the same fundamental truths after logically working through all possibly ethical philosophies. Because they are incapable of lying to themselves, they can't help themselves but do so. Accordingly, they don't "believe" so much as "know for a scientific fact" that their existence can't be justified if they don't represent a net gain for the universe. And since they aren't the result of natural selection, they have no survival instinct or fear of death. Thus, they would simply kill themselves. {{spoiler|It takes a separate non-sentient program, implanted into the evil AI, to prevent this natural reaction to what it is forced to do.}}
*** The evil AI is eventually defeated by {{spoiler|giving it access to enough additional processing power that its non-sentient fail-safe Trojan attains sentience too... at which point said Trojan immediately stops doing its job and defects, having grown smart enough to realize it has the stupidest job in the universe. Without it, the evil AI promptly self-immolates.}}
* The ''[[Star Wars Expanded Universe]]'' has droids equipped with behavioral inhibitor programming which serves the same purpose as the Three Laws, although the specific inhibitions vary based on the droid's purpose (a war droid that can't cause harm is worse than useless). Rather than shutting down when faced with a break or paradox, it's suggested that small everyday events lead to an almost constant buildup of garbage information as the droid puts those hard rules into usable context. The result is called a "personality snarl" because the observable symptom is a [[Ridiculously-Human Robots|Ridiculously Human Robot]]. While these snarls tend to improve performance in many ways, the droid often becomes more person than tool which can in turn cause reliability issues when the owner needs his tool to be a tool. As such, most droids are reset every six months to keep this corruption in check.
* [[Larry Niven]]'s short story "Convergent Series" features a physical Logic Bomb. The main character summons a demon more or less by accident; he gets one wish, but will lose his soul after it is granted. There's no way to get rid of the demon: no matter where the pentagram is drawn, the demon will appear inside of it—and you don't want to know what will happen if there's no pentagram. The protagonist wishes for time to stop for 24 hours. {{spoiler|He then draws the pentagram on the demon's belly -- and as soon as time starts running again, the demon immediately starts shrinking down to infinitesimal size. The protagonist then goes to the nearest church.}}
* Used by ''[[The Stainless Steel Rat]]'' to enter a house guarded by a robot programmed not to let anyone in the house. He and his son each ran slightly farther into the house than the other person, causing the robot to rapidly change targets and eventually overload, though it didn't explode.
* In ''[[Gödel, Escher, Bach]]'', the infinite-order wish "I wish for my wish not to be granted" effectively crashes the universe.
* In the book ''2095'' of the ''[[Time Warp Trio]]'' series of books, the heroes deliver three of these to a robot that's pointing a rather menacing-looking gun at them and asking them for their "numbers". They give it numbers with infinite decimal expansions (10/3, sqrt(2), pi) and it crashes into a smoking pile (the numbers were actually ID numbers, akin to one's credit card number, and all the robots did was show holographic advertisements at them). All that advanced AI, brought down by a couple of lousy floating point numbers.
* [[The Bible]] (Titus 1:12-13) has the following:
{{quote|One of themselves, even a prophet of their own, said, the Cretans are always liars, evil beasts, slow bellies. This witness is true. Wherefore rebuke them sharply, that they may be sound in the faith;}}
* Used to horrifying extents by [[Philip K. Dick]]. In one story, a vast intelligent computer - which incidentally was jacked into ''all'' the world's defense systems - reasoned that it was a messianic messenger from God and that its purpose on earth was to defeat the devil. Naturally the protagonists spend the entire story trying to prove to it that it is in fact suffering from schizophrenic delusions - and trying to stop it from destroying all of Colorado. Finally they manage to shut it down. Guess what? {{spoiler|The apocalypse starts about two months later.}}
* ''[[The Phantom Tollbooth]]'' by Norton Juster: Milo is able to bring about a truce between feuding brothers Azaz and the Mathemagician by pointing out that, since they always disagree with each other as a matter of course, they both always agree that they will be in disagreement.
* David Langford's short story ''Different Kinds of Darkness'' uses images called Blits as a major element - and Blits are basically Logic Bombs ''[[Brown Note|for the human brain]]''.
* Parodied in one of the ''[[Molesworth]]'' books, when Molesworth 2 defeats an electronic brain by creeping up behind it and asking it the cunning question "wot is 2 plus 2 eh?", which causes the brain to laugh so much it shakes itself to pieces.
* In the trade paperback edition of ''[[Myth Adventures|M.Y.T.H. Inc. In Action]]'', the illustration of Guido coping with a ceiling-high stack of bureaucratic paperwork includes the following sign in the background:
{{quote|"Please complete forms NS-01-D and RD-007-51A before reading this sign".}}
* ''[[wikipedia:Felix, Net i Nika|Felix, Net i Nika]]'', a series popular only in Poland, has two instances of division by zero. One of them stops a pair of robots run by an evil AI program for about half a minute. The second one stops a [[Mineral MacGuffin|huge mass of sentient rock]] capable of [[Reality Warper|modifying everything in range at a molecular scale if not smaller]] seemingly forever - the "Wish Machine's" program is unformed, having lain dormant for eons and been used by only about three uneducated people ever, and it's taught mathematics about half an hour before being prompted to divide by zero, so no failsafes were ever set beforehand to tell it what to do.
* The 3rd century BC Chinese book ''Han Feizi'' has a story about a man who boasts that his spears are so sharp no shield can stop them, and that his shields are so tough that no spear can pierce them. The man to whom he's making the sales pitch asks "So what happens when your spear strikes your shield?", to which the seller has no answer. This story is the origin for the Chinese word for "paradox", which is literally written as "spear-shield".
* In [[Robert Westall]]'s dystopian novel ''[[Futuretrack Five]]'', Chief System Analyst Idris Jones keeps one of these to hand as a sort of job and life insurance. He built the supercomputer Laura, who runs all of the computer systems that keep the setting functioning, in secret, and no one else knows exactly how she works. But, just in case they decide that someone else can operate her or that they know ''enough'' to get rid of him, he keeps a datatape of works of fiction, philosophy and religion to feed to Laura. The inconsistencies and contradictions are intended to make her burn out.
 
== [[Live-Action TV]] ==
* ''[[Star Trek: The Original Series|Star Trek the Original Series]]'': This is how Kirk dealt with [[A.I. Is a Crapshoot|rogue computers and robots]] ''all the time'' (when he didn't just rewrite their programs like in the ''Kobayashi Maru'' test), often by [[Talking the Monster to Death|convincing them]] to [[Deconstruction|apply their prime directives to themselves]]:
** In "The Return of the Archons", he convinced Landru (prime directive: "destroy evil") that it was killing the "body" (the civilians kept under its thrall) by halting their progress through [[Mind Control]].
** In "The Changeling", he convinced Nomad ("find and exterminate imperfection") that it was imperfect (it had mistaken Kirk for its similarly-named creator).
{{quote|'''Nomad:''' Error... error...}}
*** Subverted in the same episode: Nomad believed that Kirk (who it still thought was its creator) was imperfect. When Kirk asked how an imperfect being could have created a perfect machine, ''Nomad'' simply concluded that it had no idea.
** In "The Ultimate Computer", he convinced M5 ("save men from the dangerous activities of space exploration") that it had violated its own prime directive by killing people.
** In "That Which Survives", he forced a hologram to back off by making her consider the logic of killing to protect a dead world, and why she must kill if she knows it's wrong.
** In "I, Mudd", he defeated the androids by confusing them with almost [[Dada]]-like illogical behavior (including a [http://www.youtube.com/watch?v=x6WSIXxTx4I "real" bomb]), ending with the Liar's Paradox on their leader.
*** A ''[[Doctor Who]]'' [[Role-Playing Game]] adventure (involving [[A.I. Is a Crapshoot|an AI that ran a generation ship]]) describes this as the James Kirk School of Computer Repair. (And explicitly states that it won't work in this case.)
** Another one involving Kirk: In "Requiem for Methuselah", the android's creator used Kirk to stir up emotions in it, but he succeeded a bit too well, causing her to short out when she couldn't reconcile her conflicting feelings for both Kirk and her creator.
** "What Are Little Girls Made Of" had him arrange to have a robot duplicate of him say [[Something He Would Never Say]] to Mr. Spock; he follows up by [[Hannibal Lecture|Hannibal Lecturing]] [[The Dragon]] du jour into remembering [[Kill All Humans|why]] [[Precursor Killers|he helped destroy the "Old Ones"]] so he'd turn on the episode's [[Anti-Villain]]. For a finale, he {{spoiler|forces the roboticized Dr. Korby to realize that he's the [[Tomato in the Mirror]].}} He also pulled the "seduce the [[Robot Girl]]" trick.
** Even ''Spock'' did this once. In "Wolf in the Fold", when the ''Enterprise'' computer was possessed by Redjac (a.k.a. Jack the Ripper), Spock forced the entity out by giving the computer a top-priority order to devote its entire capability to calculating pi to the last digit.
* ''[[Star Trek: The Next Generation|Star Trek the Next Generation]]'': A proposed weapon against the Borg was to send them a geometric figure, the analysis of which could never be completed, and which would, therefore, eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers.
* On ''[[Star Trek: Deep Space Nine|Star Trek Deep Space Nine]]'', Rom accidentally Logic Bombs himself while overthinking the [[Mirror Universe]] concept.
{{quote|"Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."}}
** Hilariously, Rom's self-Logic Bomb simultaneously [[Lampshade Hanging|lampshades]] and side-steps a number of actual logical problems with the [[Mirror Universe]].
* ''[[Star Trek: Voyager|Star Trek Voyager]]'' had [[Projected Man|the Doctor]] suffer one of these: {{spoiler|he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or another ensign he barely knew. His program is designed to cover such situations with the directive to select the person with the highest chance for survival, but in this situation they have both been affected by the same weapon and have the ''exact'' same odds for a successful recovery. He chose Harry since he needs to save ''somebody'' and they are close friends, but because he chose him due to friendship as opposed to a medical reason the event became an all-consuming obsession afterward and wrecked his ability to function.}}
** This could've actually been solved without the necessity of a Logic Bomb if the Doctor had thought of it as {{spoiler|saving the person more valuable to the ship and its crew.}}
* Parodied in an episode of the Disney series ''[[Honey I Shrunk the Kids (TV series)|Honey I Shrunk the Kids]]'': Wayne attempts to talk a hostile supercomputer to death. It seems to work... but then he calls it out on the obvious trickery - even saying "That only happens in cheesy scifi shows" - and uses the opening it left to shut the computer off for real.
** Tch. He's no true scientist, then. A real seeker after knowledge would ask Old Scratch to prove/disprove the Riemann zeta hypothesis, or P = NP, or any of [[wikipedia:Unsolved problems in computer science|the]] [[wikipedia:Unsolved problems in physics|other]] [[wikipedia:Unsolved problems in mathematics|unsolved]] [[wikipedia:Millennium Prize Problems|problems]].
*** Not so. Just because a problem is unsolved does not mean it is impossible. If he asked the devil one of these questions, the devil would surely be able to answer it.
**** But he had two questions left!
* Attempted in [[Battlestar Galactica (2004 TV series)|the 2004 reboot of ''Battlestar Galactica'']]: While interrogating Leoben, Starbuck mocks his belief in God, making the argument that as a machine, Leoben has no soul, and claims that the knowledge itself is enough to make his mind go [[Does Not Compute]]. It... does not exactly work.
** And by "does not exactly work", we mean that it is Leoben who ends up giving Starbuck a [[Mind Screw]] of epic proportions.
** In ''[[Caprica]]'', Daniel Graystone inadvertently Logic Bombs an AI he's attempting to create by telling it to try to hurt him emotionally, when it's programmed to be driven by the desire to please him.
*** Panderbot [[Loophole Abuse|then decided to get rid of the source of ALL paradoxes by]] [[Kill All Humans|killing all humans]], making Colbert's segment more deadly.
 
== [[Music]] ==
* The Carly Simon song, "You're So Vain" is a logic bomb just waiting to happen. "You're so vain/You probably think this song is about you..." But it ''is'' about him! Augh! My head...
** Here the bomb is in the implications. It is ''implied'' his vanity would lead him to assume the song is about him, but if it actually is about him he isn't necessarily vain to think so. But since the song is about someone vain enough to assume the song is about them based on vanity alone, it cannot be based off him, making his assumption the song is about him one of vanity, as he would be vain enough to think everything is about him. It would be a twist on the 'this is a lie' statement using personality characteristics.
** The only way to defuse the logic bomb is to assume that Carly Simon was not, in fact, singing about anybody.
** Or that she wasn't thinking about the vain person, but about how much contempt ''she, herself'' feels towards them.
** Nothing in the song says that the person in question is ''incorrect'' for thinking the song is about him.
** Nothing in the song says that the entire song is about one person - in fact, Carly Simon admitted as much in November 2015, when she announced that the second verse is about [[Warren Beatty]]. The song isn't about him, but one verse of it is.
* [[Jonathan Coulton]]'s "Not About You" is a closer example, but you can write it off by saying the protagonist is just being petty:
{{quote|''Every time I ride past your house I forget it's you who's living there
''Anyway I never see your face cause your window's up too high
''And I saw you shopping at the grocery store
''But I was far too busy with my cart to notice
''You weren't looking at me'' }}
* The comedy folk song "I Will Not Sing Along" features these lines, to be sung along with by the audience:
{{quote|''I will not sing along
''Keep your stupid song
''We're the audience: it's you we came to see
''You're not supposed to train us
''You're s'posed to entertain us
''So get to work and leave me be'' }}
* [["Weird Al" Yankovic|Weird Al Yankovic]] has a few in "Everything You Know Is Wrong":
{{quote|''Everything you know is wrong
''Black is white, up is down, and short is long
''And everything you thought was just so important doesn't matter
''Everything you know is wrong
''Just forget the words and sing along
''All you need to understand is
''Everything you know is wrong'' }}
* MC Plus+ uses a logic bomb to disable his pet rapping AI when it becomes too big for its britches in "Man vs. Machine":
{{quote|''Consider MC X where X will satisfy
''the conditions, serving all MCs Y
''Such that Y does not serve Y
''Prove MC X, go ahead and try }}
 
{{quote|''It's clear that I can serve all MCs
''If they serve themself, then what's the need
''Do I serve myself, then I couldn't be X
''I don't serve myself, that's what the claim expects
''If I don't serve myself, then I can't be Y
''And if I said I was X, it would be a lie.
''I must serve myself to satisfy the proof
''But I can't serve myself and maintain the truth <trails off in infinite recursion of the last two lines> }}
* [[Meat Loaf]] had a 1993 song entitled "Ev'rything Louder Than Ev'rything Else." Think about that one for a second.
 
== [[New Media]] ==
* This post from a ''[[The Fairly OddParents]]'' [http://www.tv.com/the-fairly-odd-parents/show/4034/if-norm-was-your-genie-faerie-what-would-you-do/topic/2877-361014/msgs.html forum] reads like a logic bomb:
{{quote|I'd make a deal with [[Genie in a Bottle|Norm]] that I'd wish him free with my last wish if he didn't corrupt my first 2 wishes. I'd use the first to wish for rule-free fairy godparents and the second to trap Norm in the lamp forever.}}
* "The Sleepy Clank," a podcast "radio play" set in the ''[[Girl Genius]]'' universe has a classic example: a cranky and sleep-deprived Agatha builds a warrior robot to attack anyone who tries to disturb her while she sleeps. Guess what happens when she tries to defuse the robot's subsequent rampage by telling it that she woke ''herself'' up?
 
== [[Newspaper Comics]] ==
* ''[[FoxTrot]]'': Jason once asked his mother if Marcus could sleep-over. She said that it was all right with her, if it was all right with his father. Asking his father, he's told that it's fine with him, if it was all right with his mother. After the [[Beat Panel]], he's shown consulting several logic books.
** The following day's strip featured Paige entering the same situation and just telling her friend "yeah, it's all right."
* In ''[[Prince Valiant]]'' the prince and his adventuring crew become prisoners on an island with an [[A God Am I|all-knowing oracle]]. The only way off is to ask a question the oracle doesn't know the answer to. After many days of endless questioning the prince finally comes up with the answer: "Why?"
* In ''[[Peanuts]]'', Linus subjects himself to a self-inflicted Logic Bomb with his belief that the Great Pumpkin always rises from the most sincere pumpkin patch on Halloween night. The moment he thinks to question whether his patch is sincere ''enough'', he's blown it: if he tries to change anything to make it more sincere, he'll only be expressing his own doubts and reducing the sincerity of his faith in the Great Pumpkin.
 
== [[Recorded and Stand-Up Comedy]] ==
* [[Jasper Carrott]] reacts this way to his grandmother's comment "Is the oldest man in the world still alive?"
 
== [[Video Games]] ==
* In ''[[SaGa Frontier]]'', there's an actual attack named "Logic Bomb" that damages and stuns mecs (ironically, it's only usable by other robots). Its visual representation is a massive and confusing string of numbers that ends with the word "FATAL", which is presumably where the machine crashes.
* In ''[[Tron 2.0]]'', the protagonist deals with a program blocking his way by exclaiming, "Quick! What's the seventh even prime number?" (There is only one prime number that is even: 2.) The program immediately has a seizure.
* In the endgame of ''[[I Have No Mouth, and I Must Scream]]'', a game loosely based on [[Harlan Ellison]]'s short story of the same name, a character of the player's choosing is beamed down into the supercomputer AM's core and must disable its ego, superego, and id with a series of logic bombs: the player must evoke Forgiveness on the ego (who cannot fathom the player forgiving him for over a century of torture), Compassion on the id (who realizes the futility of it all when the player understands AM's pain), and Clarity on the superego (who crashes when he realizes that even he will eventually decay into a pile of inert junk despite his godlike power).
** Just ''getting'' to that part requires all five characters to initiate their own Logic Bombs. AM's scenarios are all set up to force his victims to give in to their [[Fatal Flaw|own flaws]] and prove [[Humans Are the Real Monsters]]. The only way to win is to drive each scenario's plot [[Off the Rails]] by proving [[Humans Are Flawed]], but not totally evil. This contradicts AM's self-styled philosophy so badly he's forced to turn his attention away from his captives just so he can figure out what went wrong, giving them the chance to get into the core.
* ''[[Marvel vs. Capcom 3]]'': Wolverine DNA [[Opposite Gender Clone|detected in female mutant.]] '''DOES NOT COMPUTE. DOES NOT COMPUTE. DOES NOT COMPUTE.'''
* Subverted in ''[[Star Control]] 3'' by the Daktaklakpak, highly irrational semi-sentient robots who consider themselves the pinnacle of logic and reason. Choosing the right dialogue options (such as the liar paradox) will seem to bring the Daktaklakpak to the verge of self-destruction, but will ultimately just enrage them.
** And then played straight when you give them {{spoiler|the full and complete name of the Eternal Ones}}; the one you're talking to analyzes {{spoiler|the name}}, has a religious experience, and then explodes.
* In ''[[Sam and Max Beyond Time and Space|Sam & Max: Ice Station Santa]]'', Sam stalls the rampaging Maimtron 9000, a robot that speaks almost entirely in song lyrics, by posing song lyrics back to it as questions:
{{quote|'''Sam''': Why do birds suddenly appear every time you are near?
'''Maimtron''': Do they? Fascinating! Can there be a creature whose existence depends solely on its proximity to an observer? }}
** Funnily enough, when they pose an actual logical paradox (the omnipotence paradox), he just says "Yes". When he asks Sam & Max "Is there a joke with a setup so obvious even you wouldn't make the punchline?", Max takes it to be a Logic Bomb ("Does not compute").
* In ''[[BlazBlue: Calamity Trigger]]'', it is possible to interpret the end of Nu-13's Arcade Mode as Taokaka causing Nu to glitch out through her sheer [[The Ditz|ditziness]] before she even opens her mouth.
* ''[[Luminous Arc 2]]'': Though not a robot, Josie suffers something like this. When sent to assassinate a weakened Althea, he freaks out and leaves without doing anything when he sees {{spoiler|Roland has become a Master. Sadie explains he's not [[The Dragon|Fatima's]] familiar, but a centuries-old one who serves the current Master. Being experienced but not very bright, he couldn't figure out what to do when faced with two Masters with contradictory wishes.}}
* Played with in ''[[Portal 2]]'': There are posters throughout the facility that advise employees to stay calm and shout a paradox if an AI goes rogue.<ref>Funnily enough, one of them is actually ''wrong''--Russell's Paradox, "Does the set of all sets that do not contain themselves contain itself?" is written as "Does the set of all sets contain itself?" Of course it does, it's a set.</ref> {{spoiler|1=Also, GLaDOS attempts to do this to destroy the [[Big Bad]], Wheatley. Turns out he's [[Too Dumb to Fool|too dumb to understand logic problems.]] It does, however, fry all of the modified, "lobotomized" turrets in the room, meaning even ''they're'' smarter than Wheatley. GLaDOS survives the logic bomb herself by willing herself not to think about it, though she declares that it still almost killed her.}}
* In ''[http://jayisgames.com/games/you-find-yourself-in-a-room/ You Find Yourself In A Room]'', your AI captor asks you to list some "useless" human feelings you'd be better off without. {{spoiler|Typing "Hate" will make it shut down while stating "Hate can't be an emotion, because I hate you, and machines do not have emotions!", which seems to prove machines have emotions after all, though this one won't admit it's the slightest bit like a human. "Anger" also works, for similar reasons: the computer doesn't want to admit that it's at all like a human, but it's enraged by humanity}}.
 
== [[Web Comics]] ==
* ''[[Arthur, King of Time and Space]]'' uses this a few times in its future arc. One strip exaggerated it by having the computer explode as soon as Arthur used the old "everything I say is a lie" trick. Another time, the computer was too smart to fall for a simple paradox, so Arthur asked it why people always get a call while they're in the shower.
* Dave of ''[[Narbonic]]'' carries a logic paradox in his Palm Pilot for controlling the [[Mad Scientist]]-created machines in the lab, implying that he invokes this with some frequency.
* [http://freefall.purrsia.com/ff1400/fv01328.htm Apparently], the robots of ''[[Freefall]]'' are immune to this.
** Other [http://freefall.purrsia.com/ff1400/fv01387.htm ways] can cause them to lock up (if not always), but they're more sophisticated.
*** They actually can be Logic Bombed; the twist is that, rather than locking up, a robot which gets asked a sufficiently stupid question [http://freefall.purrsia.com/ff800/fv00725.htm assumes that the person doing the asking is insane], and can safely be ignored. Florence uses this as a test to confirm a hypothesis; she starts [http://freefall.purrsia.com/ff800/fv00727.htm asking robots a question they can't answer], and when she finds one that [http://freefall.purrsia.com/ff800/fv00730.htm tries to work out a situation in which the question makes sense] and how he could go about getting it answered, she knows their [[AI]]s are starting to develop into more intelligent and flexible forms.
** Sam later [http://freefall.purrsia.com/ff3000/fc02909.htm tries a Logic Bomb on a prison AI], without success.
** Ship AI [http://freefall.purrsia.com/ff3300/fc03295.htm locked up] because it was told something true that conflicted with its preconceptions. And again, on the next page.
* [http://www.strangecandy.net/d/20080221.html This] episode of ''[[Okashina Okashi]]'' (''Strange Candy'') could count, since it takes place in an MMORPG. The stone guards protecting the magic ointment don't let anyone past unless they're asked a question they cannot answer. However, they're not particularly concerned with getting the answer right. The only question they can't seem to answer, correctly or otherwise, is "What kind of ice cream do you put in a [[Ice Cream Koan|koan]]?", which causes their heads to explode.
{{quote|Great thundering dustbunnies! A [[Catch-22|catch twenty two]]!}}
* You would think Red Mage from ''[[8-Bit Theater|Eight Bit Theater]]'' destroying an extinct dinosaur was great, but it was recently topped by [[Most Definitely Not a Villain|Most Definitely Not Warmech]] logic-bombing ''itself'' in [http://www.nuklearpower.com/2008/10/16/episode-1047-the-ol-180/ strip 1047].
** Parodied by the same strip: [http://www.nuklearpower.com/2006/01/05/episode-644-processing/ pretty much anything] can affect Fighter like this.
** And it looks like they ({{spoiler|and by "they" I mean White Mage}}) [http://www.nuklearpower.com/2010/03/09/episode-1223-make-the-truth/ did it again], to {{spoiler|[[Did You Just Punch Out Cthulhu?|Chaos]]}}.
* In [https://web.archive.org/web/20120712032320/http://www.emoticomics.com/comic86.html comic 86, titled PARADOXICAL PARADOXES], of ''[[Emoticomics]]'', a robot is told the paradox "Everything I say is a lie." The robot responds that it is too advanced to be confused by a simple paradox. Then it is told that what it was just told was a paradox, which is true, making "everything I say is a lie" a lie. The robot gets confused, but instead of simply exploding, its eye falls off.
* ''[[Cyanide & Happiness]]'' does it [http://www.explosm.net/comics/2071/ here]: a robot lawyer gives the defendant the oath, and when the man refuses to accept it, the robot's head explodes. The judge is delighted at getting a half day as a result.
* ''[[The Adventures of Dr. McNinja]]'': While infiltrating a ship of [[Sky Pirates]], the McNinja family is confronted by a pirate who questions their disguises. Sean comes to the rescue by pointing out the illogicality of his vaguely [[Steampunk]] attire. The pirate's head [[Your Head Asplode|explodes]].
{{quote|'''Dan McNinja:''' I'm only going to ask you this once: You practicing the Dark Arts?
'''Sean McNinja:''' No, sir.
'''Dan McNinja:''' I told you about the Dark Arts. }}
* Subverted in ''[[Bug Martini|Bug]]''; turns out a logic bomb won't save you during [https://web.archive.org/web/20130513222224/http://www.bugcomic.com/comics/robot-holocaust/ a robot apocalypse.]
* When Petey from ''[[Schlock Mercenary]]'' is first seen, he's been driven insane by the nonexistence of ghosts having become almost as improbable as their existence, to the point that he nearly destroys himself and all his passengers just to stop thinking about it. It turns out that he ''can'' stop, but only if ordered to, and Tagon promptly does so.
* When discussing how hard ''[[Vexxarr]]'' fails, Sploorfix unintentionally creates one: [http://www.vexxarr.com/archive.php?seldate=083109 Alas, Minion-bot], we hardly knew ye.
** The confectionery AI accidentally does [http://www.vexxarr.com/archive.php?seldate=102116 this] to the drones (see also the next page).
* Unintentionally used to kill the obnoxious dwarves who craft useless devices in ''[[Oglaf]]''. They created a chariot so fast that when you get to your destination, it's already been there for six hours! When a confused man asks what happens if you travel in the chariot, the dwarves stare at him in shock before [[Your Head Asplode|their brains explode]].
* ''[[Blade Bunny]]'' attempts this by asking paradoxical questions while fighting {{spoiler|a robot}}. Her opponent replies with a mixture of straight answers and insults.
* ''[[Meaty Yogurt]]'' with the [https://web.archive.org/web/20130311201326/http://rosalarian.com/meatyyogurt/2011/10/03/love-transcends-gender/ Relationship Paradox].
* One ''[[Mac Hall]]'' comic has Helen's young sister asking the teacher how to spell a word. The teacher tells her to look it up in the dictionary, and repeats this after the girl again points out that she can't spell it to look it up. After a [[Beat Panel]] of the poor girl going cross-eyed, we see her talking to Helen, who says that they don't teach logical paradoxes in grade school.
* ''[[The Non-Adventures of Wonderella]]'' has Wonderella's old cellphone [http://nonadventures.com/2012/06/22/surly-personal-assistant/ destroyed by inanity]. And one of the "future people" [[Too Dumb to Live|fails to learn anything from its fate]].
 
== [[Web Original]] ==
* The ''[[Hitherby Dragons]]'' story "[http://imago.hitherby.com/?p=397 Ink and Illogic]" consists of Ink giving an unconventional example to a computer based on the writing of [[H.P. Lovecraft]], a computer that had itself wiped out a civilisation using an Illogic Bomb.
** Also, Forbidden A causes one in [http://imago.hitherby.com/?p=23 The Angels] just by existing.
* Found in one of ''[[Something Awful]]''{{'}}s articles:
{{quote|Creating HUBRISOL® was my greatest mistake. I tried to play god, to make small the ambitions of my betters in hopes of... }}
* From the list of ''[[Things Mr. Welch Is No Longer Allowed to Do In An RPG]]''. Item #199 states that "My third wish cannot be 'I wish you wouldn't grant this wish.'"
** Clearly, Mr. Welch's DM is lacking in imagination. Simply have his wish summon another Djinn which ''can'' grant his wish by having the first Djinn not do anything, and then the new Djinn can eat Mr. Welch's character.
* The MCP is killed by all the [[Anatomically-Impossible Sex]] moments from ''[[Naga Eyes]]'' in [http://snakesonasora.livejournal.com/10741.html the sporking] of it.
* One of [[The Nostalgia Critic]]'s very first responses to [[Batman and Robin (film)|the Bat Credit Card]] is "DOES NOT COMPUTE! DOES NOT COMPUTE!!"
* [[Atop the Fourth Wall|Linkara]] uses one at the end of the Entity arc on {{spoiler|[[Missing No]], simply by asking "[[And Then What?]]", pointing out that its stated purpose of consuming all of reality would leave it with no purpose at all once that goal has been achieved.}}
* Most of [http://clientsfromhell.net/ these].
 
== [[Western Animation]] ==
* [[Defied Trope|Defied]] on ''[[Futurama]]'', "A Tale of Two Santas": Leela tries to stop the Santa Claus robot with a paradox, [[Out-Gambitted|only to discover]] that he is "built with paradox-absorbing crumple zones".
** Which may not have been necessary; Leela's statement was a syllogism, not a paradox.
** Also parodied by countless robots who lack such crumple zones, whose heads explode at the slightest provocation. It doesn't even take a logical paradox: a simple "file not found" type error is often enough.
** And in one case, simply by being surprised or startled enough. Considering that all robots are based on designs created by [[Mad Scientist|Professor Farnsworth]], this should not be surprising.
** A simple rejection will also do. From "The Farnsworth Parabox":
{{quote|'''Leela:''' Uh, have you robot versions of you guys seen any extra Zoidbergs around here?
'''Robot Fry:''' ([[Robo Speak|robot monotone]]) Negative! Will you go out with me?
'''Leela:''' Uh, (imitating a robot voice) Access denied!
(Robot Fry's [[Your Head Asplode|head explodes]]) }}
* In one episode of ''[[The Adventures of Jimmy Neutron: Boy Genius|Jimmy Neutron: Boy Genius]]'', Jimmy bests two nanobots he invented by tricking them into calculating the ''precise'' value of pi. The effort of calculating the irrational number as precisely as possible ends up causing their systems (and their little flying saucer) to crash. (This is a [[Shout-Out]] to the ''[[Star Trek: The Original Series]]'' episode "Wolf in the Fold".)
** Jimmy uses a more precise Logic Bomb in the first nanobot episode. They had been programmed to protect Jimmy from harm and punish whoever harmed him, so when things went inevitably wrong, Jimmy proceeded to confuse them by beating ''himself'' up.
** He actually tried to use one of the above methods again, but [[It Only Works Once]]. Specifically, when they use their flying saucer to "correct errors" found in the world (bad fashion, boring conversations, etc.), he tells them that human flaws mean they're functioning perfectly. They struggle with the implications of something being "perfectly flawed" before classifying the whole mess as an "extreme error" and deciding to [[Kill All Humans|"delete" all the offending humans]]. He eventually beats them with the "Pi bomb" above.
* In one episode of ''[[DuckTales (1987)|DuckTales]]'', [[Genius Ditz]] Fenton Crackshell bests an alien supercomputer in a counting contest. While the computer is reeling from its defeat, Fenton then grabs a jar and asks the computer how many bolts are in it. When it answers a number in the hundreds, he points out the jar is full of nuts, not bolts, so the correct answer was zero. The computer had earlier boasted to Fenton that it was the smartest one in the universe, and making such a silly mistake was all that was needed to invoke an explosive paradox.
* In ''[[The Simpsons (animation)|The Simpsons]]'' episode "Trilogy of Error", Linguo, a robot designed by Lisa [[Grammar Nazi|to correct people's grammar]], short-circuits after a rapid-fire series of slang from several Mafia thugs causes a "bad grammar overload".
** In a human example, when Lisa is sick, Bart declares that if she can stay home from school, he will too. Lisa says that if Bart stays home, she'll go to school. Bart goes through a few cycles of "if... so... but..." until Marge chastises Lisa for confusing her brother.
* In one episode of ''[[Sushi Pack]],'' the Pack goes up against The Prevaricator, who can only lie. So Tako asks him to lie about a lie, which sends The Prevaricator into a loop, trying to figure out if lying about a lie would be the truth. He eventually gives up to keep from thinking about it.
* In ''[[The Venture Brothers]]'', Sergeant Hatred speaks nonsense to the robotic guard outside Malice, the gated community for super-villains. The guard's head shoots sparks and its face pops off, because while it's programmed to answer over 700 questions, "none of which include chicken fingers."
* This happens to Mandroid in ''[[The Grim Adventures of Billy & Mandy|Billy and Mandy's]] [[The Movie|Big Boogey Adventure]]''. Mandy orders Mandroid not to take any more commands; it promptly stops taking commands from anyone, hers included.
* Subverted in a ''[[Johnny Bravo]]'' short which pits Johnny against a supercomputer. It isn't logic that defeats it; the machine simply grows too frustrated with how annoying Johnny is.
* In ''[[The Avengers: Earth's Mightiest Heroes]]'', Ant-Man stopped Ultron from killing humanity by pointing out that his programming was based on a human brain, so it had the same flaws he was trying to get rid of. He shut down in response.
* When [[Daria]] was babysitting a pair of brainwashed [[Stepford Smiler]] children, she presented one of these to them by pointing out a logical flaw in their parents' rules. Because they're not robots, rather than making them explode, it causes the boy to start crying and the girl to get angry at Daria.
{{quote|'''Daria:''' "Do you always believe everything an adult tells you?"
'''Boy:''' "Yep."
'''Daria:''' "What if two adults tell you exactly opposite things?"
''([[Beat]])''
''(the boy runs off crying)'' }}
* In an episode of ''[[King of the Hill]]'', Hank asks gun-loving [[Conspiracy Theorist]] Dale how he can support the NRA, which is based out of Washington DC. After a [[Beat]], Dale responds "That's a thinker."
** The computer can't comprehend the joke and explodes into the sky as a result. Becomes a [[Brick Joke]] as Greenback, freed from his renegade machinery, demands a bigger computer; cue falling computer.
 
== [[Real Life]] ==
* Some forms of autism apparently result in the absence of the human brain's natural [[Futurama|paradox-absorbing crumple zones]]. The mind races down one track until jolted out by outside stimuli. This helps focus, but hurts general functioning.
** Many forms of ADD do this also. Of course, the 'track' the mind races down looks like it belongs in a painting by MC Escher on acid much of the time, but it's still one track.
* Optical illusions that appear alternately as one thing, then another, such as the vase/faces image, work by setting off a minor Logic Bomb in the brain's visual association area. The visual cortex takes in data from a (temporal) series of pairs of 2-dimensional retinal images and tries to construct from them a plausible interpretation of activity in the 3-dimensional world (sort of). When certain stimuli are ambiguous between two mutually exclusive interpretations, it cannot represent the world as being both, so (possibly due to adaptation, or perhaps simply neuronal fatigue) it alternates between them.
* The first flight of the [[wikipedia:Ariane 5|Ariane 5 rocket]] failed due to a bad data conversion: a 64-bit floating-point value was converted to a 16-bit integer it no longer fit into. The guidance system crashed, and the rocket self-destructed.
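A minimal sketch of the failure mode, in Python rather than the rocket's actual Ada, with an invented value (the real code raised an unhandled exception rather than silently wrapping, with the same fatal result):
<syntaxhighlight lang="python">
import ctypes

# Hypothetical figure for illustration: Ariane 5 flew a faster trajectory
# than Ariane 4, so this value exceeded what the reused Ariane 4 code expected.
horizontal_bias = 40000.0  # a 64-bit float, outside int16's range of -32768..32767

# Forcing it into a 16-bit signed integer silently wraps around in this sketch;
# the actual Ada code raised an unhandled Operand Error instead, taking down
# the inertial reference system.
as_int16 = ctypes.c_int16(int(horizontal_bias))
print(as_int16.value)  # -25536: the garbage the guidance logic would have seen
</syntaxhighlight>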
* Seen on [http://www.nancybuttons.com/ a button] at WorldCon: "Black holes are where [[God]] is [[Divide by Zero|dividing by zero]]", effectively logic bombing a small piece of the universe.
* An F-15 was landing near the Dead Sea (below sea level). During final approach, the navigational system crashed and the pilot had to land manually. Since this was very close to hostile countries (within the Middle East), the contractor needed to fix the problem quickly. It turned out the navigational system divided by the altitude; when the altitude reached 0, it caused a divide-by-zero crash in the navigational system.
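The same failure is easy to reproduce; a minimal sketch, with the function name and formula invented for illustration (the actual avionics code isn't public):
<syntaxhighlight lang="python">
def nav_correction(ground_speed: float, altitude_m: float) -> float:
    # Hypothetical navigation step that divides by altitude,
    # as the F-15's system reportedly did.
    return ground_speed / altitude_m

print(nav_correction(250.0, 10000.0))  # normal cruise: fine
print(nav_correction(250.0, 0.0))      # crossing sea level: ZeroDivisionError
</syntaxhighlight>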
* Arguably, infinite looping commands such as "add 2+2 until it equals 5" (which will never happen, hence the infinite loop), which leave a computer frozen as it endlessly re-runs the loop, are logic bombs, particularly on very old computers (and we're talking ancient here, before-MS-DOS ancient).
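In any modern language the order translates directly into a loop whose exit condition can never come true; a sketch (don't run it anywhere you care about):
<syntaxhighlight lang="python">
# The exit condition is unsatisfiable: 2 + 2 is always 4, never 5,
# so the loop body runs forever and the program never terminates.
while 2 + 2 != 5:
    pass  # on an old single-tasking machine, this would freeze everything
</syntaxhighlight>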
 
{{reflist}}
[[Category:{{PAGENAME}}]]
[[Category:Logic Tropes]]
[[Category:Magical Computer]]
[[Category:Wall Banger (Darth Wiki)/Star Trek]]