Failsafe Failure



"Out of order?! Fuck! Even in the future nothing works!"
Dark Helmet (on finding out his ship's self-destruct override is broken), Spaceballs

Thanks to Finagle's Law (or just ignorant writers), on TV a system's failsafe will never work when it's needed the most, nor will it actually be failsafe—usually it'll be quite the opposite, sometimes referred to as 'fail deadly'. The only reference to an emergency shutdown you'll be likely to hear is a panicked tech yelling "It won't shut down!" as the system runs wild. It's supposed to make the phenomenon of Explosive Instrumentation more plausible, by acknowledging it's not supposed to blow up in your face, but a failure elsewhere of a key safety lockout means it can, and will. It also justifies how something that is supposedly governed by industry-wide standards, regulatory law, and years of engineering refinements could go so horribly wrong in the first place.

What's a failsafe? Well, the world is full of a lot of dangerous machinery and devices. Huge electrical turbines, nuclear reactors, trains speeding down the tracks at 300 km/h, semi trucks that weigh in excess of 40 tons loaded rolling down the freeways. And that's just the stuff that isn't designed to kill anyone; there's plenty of stockpiled bombs, missiles and such out there too. These are all things that would cause some spectacular collateral damage if they suddenly went haywire.

Thus, in the real world, things that have the potential for very destructive damage not only undergo strict maintenance procedures, but usually have circuit breakers, password protection, arming/firing keys, backups for redundancy, and prominent big bright red emergency handles that can shut the whole system down if pulled—and, more to the point, they usually have a totally separate set of safety features, designed to trigger automatically when the system's operating parameters get too far outside safe norms, which will (ideally) shut down the whole shebang without making the situation worse than it already is.

If something is described as "fail safe", it means that it has been designed and built so that a critical mechanical or operator failure will cause the system in question to default to its safest possible state, quickly and automatically, without any human intervention. It's worth noting that while a triggered failsafe is generally designed to be safe for people, it can be amazingly destructive to the equipment in question. Up to and including "your multi-million-dollar installation is not just wrecked, it's also a toxic-waste site". Unfortunately, this can provide a motive for the operators to sabotage the failsafe....
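For the technically inclined, the "default to the safest possible state" principle boils down to toy logic like the following Python sketch. The names and the valve scenario are made up for illustration; no real control system looks like this, but the shape of the logic is the point: the dangerous state requires every precondition to be positively confirmed, and everything else falls through to the safe default.

```python
from enum import Enum

class ValveState(Enum):
    OPEN = "open"
    CLOSED = "closed"  # the safe state: no flow

def command_valve(sensor_ok: bool, operator_cmd: ValveState,
                  power_ok: bool) -> ValveState:
    """Fail-safe logic: the valve only opens when every precondition
    is positively confirmed. Any failure -- bad sensor, lost power,
    anything short of an explicit OPEN command -- falls through to
    the safe default."""
    if power_ok and sensor_ok and operator_cmd is ValveState.OPEN:
        return ValveState.OPEN
    return ValveState.CLOSED  # default: fail safe, not fail deadly
```

Note that the unsafe state is the one that takes effort to reach; a "fail deadly" design would invert this, defaulting to OPEN unless everything worked.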

Failsafe measures can range from the simple to the complex. From automobile safety glass (it's not intended to shatter at all, but when it does, it shatters into relatively harmless little crumbs instead of huge deadly shards with edges like scalpel blades) to the safety key on a "jet ski" (it's tied to the operator by a lanyard, so that should they fall off, it will pull out the key and stop the craft instead of leaving it driverless) to the modern air brake system on a train (air pressure is used to keep the brakes off, so that a loss of pressure causes the brakes to come on; likewise, the "dead man's handle" in the locomotive will automatically apply the brakes if the engineer is somehow incapacitated).
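The train air brake is worth sketching, because it inverts the intuition most fiction writers seem to have: energy (here, air pressure) is needed to keep the brakes off, not to put them on. A minimal Python model, with a made-up threshold purely for illustration:

```python
def brake_force(line_pressure_kpa: float,
                deadman_held: bool,
                threshold_kpa: float = 350.0) -> float:
    """Simplified train air-brake logic: pressure in the brake line
    holds the brakes OFF, so a leak, a decoupled car, or a failed
    compressor drops the pressure and the brakes come on by
    themselves. The dead man's handle works the same way: releasing
    it vents the line. Returns 0.0 (brakes off) or 1.0 (full brakes)."""
    if not deadman_held:
        line_pressure_kpa = 0.0  # handle released: vent the line
    if line_pressure_kpa < threshold_kpa:
        return 1.0   # braking is the default state
    return 0.0       # brakes held off only while pressure is maintained
```

A Runaway Train scenario thus requires actively defeating this logic (as some of the film examples below acknowledge), not merely a brake "failure".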

Modern nuclear reactors are possibly the most thorough example of the 'fail safe' principle available. In current designs, excess heat will interrupt the fission reaction and shut down the reactor simply by heat expansion of some key components; the core is designed so that a sufficient degree of heat expansion results in the fuel elements being too widely separated to sustain a reaction, so that the reactor cools down instead of overheating until the core melts. If that's not enough, the SCRAM (emergency shutdown) system is usually implemented as a separate set of control rods, dedicated to emergency-shutdown use and suspended above the reactor by electromagnets, or by mechanical clamps sprung to pop open when electrical power is removed; that way, even a complete power failure will still release the rods to drop into the core and starve out the reaction (some designs even include a spring-loaded backup for that system, just in case gravity stops working). Strongly safety-oriented designs, such as the Canadian CANDU, also include the ability to inject a neutron-absorbing liquid into the core, so that even if the SCRAM rods become completely inoperable—say, if there's a fire within the containment building that warps the rods or their channels so that they get stuck instead of dropping into the core—there's still a way to bring a runaway reaction under control before it turns into a catastrophe. This is very likely to completely wreck the reactor core, of course—but, most often, by the time things are bad enough that "fail safe" comes into play, whatever device is failing is already a lost cause, and the idea is to limit the extent of the damage as much as possible.
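The electromagnet trick described above is the same inversion as the air brake, and it reduces to a few lines of logic. This is a toy sketch of the idea, not any real reactor's control code:

```python
def scram_rods_inserted(magnet_powered: bool,
                        scram_commanded: bool) -> bool:
    """SCRAM rod logic: the rods hang from electromagnets, so they
    drop into the core (True) on an explicit command OR whenever the
    magnets lose power. Energy must be continuously supplied just to
    keep the reactor running; cutting that energy, by choice or by
    accident, always lands in the safe state."""
    if scram_commanded:
        return True   # operator hit the button: drop the rods
    if not magnet_powered:
        return True   # power failure: gravity does the job anyway
    return False      # magnets energized, rods held out of the core
```

There is no input combination that keeps the rods out during a power failure, which is exactly the property the fictional examples below tend to lack.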

All of this is ignored in fiction-land, where the hero will have to go into that burning building or board that Runaway Train and manually stop the catastrophe themselves, since the folks at Mission Control have already tried to stop it but every emergency system failed to respond. Of course, all of this is based on a completely ass-backwards understanding of the concept, but what else can you expect from Hollywood?

In Real Life, most disasters are caused by a combination of different failures, or more commonly different errors, which when combined manage to defeat normal safety measures. This is where 'fail safe' can really shine; a truly fail-safe design takes human factors into account, which is a nicer way of saying that sometimes people royally screw up and it's necessary to engineer for that kind of failure too. Remember, plane and train crashes tend to make the news because they don't happen every day.

The "human-proof" failsafe design is getting more and more prominent nowadays exactly because the biggest techno-catastrophes in history had operating errors on a Too Dumb to Live level as key precursors. Things hardly "just blow up". The infamous Chernobyl disaster was only made possible by operators intentionally disengaging all of the reactor's safety features to conduct an ill-advised experiment. Later investigation concluded that even a fraction of those systems left online would likely have prevented the catastrophe—as they were designed to. The only slightly less famous Three Mile Island disaster had a faulty critical component that was discovered in time but neither replaced nor properly bypassed. The Bhopal toxic spill happened after literally years of negligence by the operators regarding both the physical condition of the equipment and the established safety protocols for handling poisonous materials—basically operating unsafely and relying on luck until it ran out. Fukushima Daiichi was being operated well past the point when safer reactor designs had been invented, was built with fewer precautions against both earthquakes and tsunami than it should have been, and had several parts (that would be broken in the earthquake) in disrepair or lacking inspection.

And so on...

Compare the way Hollywood treats personal vehicles: when the owner is always Driving Like Crazy, bribing the traffic cops and leaving the car in a state of neglect, a falling-apart car that endangers its occupants and everyone around it is frequently treated as comedy.

See also No OSHA Compliance, Override Command, Deadfoot Leadfoot, Inventional Wisdom. Often invoked in a chain of Disaster Dominoes.

Examples of Failsafe Failure include:

Anime & Manga[edit | hide | hide all]

  • Neon Genesis Evangelion abuses this trope. Every second angel attack, someone has to push molly-guarded buttons, smash in the protective glass over a handle, or cut the power. Most of the time, the girl at the controls ends up shouting, "The EVA is rejecting the signal!"
    • The one and only time that the failsafe system does actually work and manages to automatically eject the entry plug, it makes things considerably worse for the pilot. The mechanical systems of Eva apparently have it in for the pilots just as much as everything else.
    • Another example of it actually working would be in episode 13, when the viral angel attacks Nerv. When one of the Simulation Evas reaches for the Pribnow Box, an emergency shatter-and-pull mechanism blows the arm off, protecting the crew in the Box. The makers probably only allowed that because the alternative would be killing off some main characters all too soon; dying that way is not painful enough in Hideaki Anno's twisted, twisted imagination.
      • Protecting the crew? It fired the severed arm straight into the glass viewing window, which started to crack and leak... forcing the team to abandon the situation AND the room before they were drowned in plague-infected water.
  • Averted in Naruto. Minato and Kushina both implant their chakra in Naruto's seal, so that if he tries to break it they will show up to stop him and repair the seal. This actually works. Both times.
  • In Code Geass, we see one instance of mechanical failure in the standardized Ejection Seat, and that's presumably because it's being hit with microwaves that short out the electronics (and make the pilot pop like a potato in the microwave). In another episode, when Lelouch's mecha is shot down, the ejection seat has a terrible launch vector and thus ends up skipping along the ground at high speeds; it's a wonder he didn't suffer whiplash.

Comics[edit | hide]

  • In Watchmen, Dr Jon Osterman is trapped inside the Intrinsic Field test chamber by the door closing behind him when the automatic timer starts up the generators for that afternoon's experiment. As Dr Glass puts it, "I'm sorry, Osterman. The program's locked in and we can't over-ride the time lock. It's a safety feature."
    • Not being able to disengage the lock is less of a big deal than not being able to stop the experiment, given what it does to matter in general.
    • Why an experiment of this type actually even needs an ordinary door is a pretty significant question. It would only need a small hatch that is closed at all times except when experimental material is being put in, and an even more securely sealed larger hatch for necessary repair work, with interlocks that completely disable the system while it's open.
    • Early nuclear research was full of blunders like this, though.
  • Green Lantern actually does have several failsafes which kick in, shutting down if the wearer breaks Lantern Code, reserving a small supply of energy the lantern normally can't access to protect the wearer from mortal injury, and so forth. But Lanterns have been able to override the latter failsafe to continue fighting after their normal reserve is depleted (and given that Lanterns are selected for fearlessness, it seems silly to allow that). Also, Hal Jordan was able to override the former failsafe, after his ring was depleted for insubordination, by drawing energy from a Guardian's construct.
    • The Alpha Lanterns seemed to lack sufficient failsafes. While their minds were linked to the Book of Oa to make sure they faithfully executed their duties as Internal Affairs, there was no failsafe in place to stop them from being hijacked by Hank Henshaw, the Cyborg Superman, whose Kryptonian-based technology has traditionally been billions of years behind the Guardians. Possibly justified by Cyborg's machine empathy.


Fan Works[edit | hide]

  • One engineer was so annoyed by the persistent number of Failsafe Failures in Star Trek that he wrote a fanfic about a leading engineer in the Star Trek universe being put on trial for negligence.


Films -- Live-Action[edit | hide]

  • Dr. Strangelove has a pretty much identical plot to Fail-Safe (see Literature below), but the attack is the result of human intervention rather than mechanical failure (although a mechanical failure, caused by damage from an attack, is what prevents one of the bombers from being recalled).
    • It also includes a slight variant on the Trope. When the bomber plane is damaged after getting hit by a missile, the pilot's survey of the problems ends with, "I think the self-detonation device got hit and blew itself up."
  • This is parodied in Spaceballs when, after the Big Red Button is pushed activating the self-destruct, the computer says in the last few seconds that they can stop it by pressing a button to cancel. The button, of course, has a big "out of order" sign hanging on it, which prompts Dark Helmet to shout, "FUCK! Even in the future, nothing works!"
    • This probably specifically parodies a scene in Alien, where Ripley attempts to abort the self-destruct sequence, but can't do it in time because it takes as many steps for her to abort it as for her to start it. The "self-destruct" of the Nostromo was not your traditional sci-fi Big Red Button affair, but rather depended on deliberately disabling the fail-safes that prevent the drives from overloading. This is deliberately a long and difficult process because you wouldn't ever want to be able to activate it by accident. Pity no-one thought to have a snap-back quick restore feature installed, but there you go.
  • Speaker for the Dead got that bit right on a planet-busting bomb: "Disarming it is easy. Arming it is near impossible."
  • In Live Free or Die Hard the bad guys blow up an entire natural gas facility by routing all the gas to it, never mind the fact that there would be dozens of failsafes to prevent the necessary overpressure from breaking anything at all much less exploding. Of course the bad guys used the power of Hollywood Hacking to pull it off and since computers are magical it all makes sense.
    • And apparently none of the failsafes are purely mechanical, either.
    • Not so ridiculous after all: I knew a guy who worked with the Italian natural gas agency and he told me that when the distribution system was computerized (about 1970) the master mainframe was put in an underground vault with armed guards at the door, because causing large accidents and property damage by inputting purposely wrong commands wasn't outside the realm of possibility. He may have been exaggerating, and automatic protection systems have come a long way since then, but still.
  • A textbook example from The Machinist: a worker is repairing a broken machine when someone accidentally leans on the On button (which is only possible because the workshop has No OSHA Compliance whatsoever). Hammering on the Off button does absolutely nothing; the repair worker is dragged into the machine and loses his arm. It's not clear what was wrong with the machine to start with, but it might have been a good idea for someone to disconnect the power before sticking his arm in there.
    • It was made clear that the machine was supposed to be locked out, but the manager had previously reprimanded employees for taking too much time to get equipment fixed. That kind of pressure is definitely illegal, but happens more often than we'd like to think (especially in small shops with narrow profit margins).
  • The end of Speed involves a Runaway Train with the emergency brake disabled, and no Dead-Man Switch. And it didn't trip any overspeed controls either.
    • Then there's the elevator at the beginning. Payne destroys the emergency brakes with bombs, then it turns out the crane Jack and Harry hooked to the car to secure it couldn't hold the weight either.
  • In Capricorn One, a government agency specifically defeats every failsafe on Robert Caulfield's car. Not only do the brakes fail, the throttle gets stuck wide open, the gearshift is locked so he can't go into neutral, and even turning the key off won't cut the ignition.
    • This also happens in an episode of The Professionals. The components of Doyle's car are set to fail one by one as he's going down a hill. Justified though as the killer isn't trying to fake a car accident; he's just playing with Doyle before he kills him.
  • In the notorious Irwin Allen disaster flop The Swarm (1978), the killer bees attack a nuclear power station and cause it to blow up almost instantly when one of the technicians falls across a random instrument panel. Also, the actual core is completely exposed to the air without any evident shielding.
  • In the James Bond film Thunderball, an assassin tries to kill Bond by turning up the setting on a spine-stretching exercise machine he's strapped into. Bond blacks out and is only saved by a nurse happening to enter the room just in time. Leaving us to wonder why the hell the machine was even designed to be able to go that fast.
    • Because human beings come in different shapes and sizes. A setting that will inflict painful bodily harm on Bond might be the bare minimum needed to successfully do spinal traction on, oh, Andre the Giant.
    • In Moonraker a mook tries to kill Bond by disabling the chicken switch on a centrifuge and cranking the spin rate to unsafe levels. Not so much Failsafe Failure as intentional tampering, but why would a piece of equipment designed to test human endurance have the wires to the safety switch connected to a plug easily removable by the controller, and why would it go up to speeds considered unsafe for humans in the first place?
  • Michael Bay's The Island has two notable examples of this. A truck carrying a massive load of train wheels loses its entire load when a single strap is released; then at the climax, throwing a single breaker switch causes the entire mechanism to explosively fail. But then, this being a Michael Bay film, having things blow up is to be expected.
  • Justified in The Taking of Pelham One Two Three where the safety devices on a New York Subway train are actually a plot point. The police believe the Dead Man's Handle will prevent the villains jumping off the train while it's moving, but they've actually rigged up a system to hold down the lever. Later as the train appears to be careening out of control, it's eventually stopped by the safety devices built into the track.
  • In Outbreak, a lab technician is infected with The Plague when he carelessly opens and reaches into a centrifuge while it's still spinning, breaking a vial of infected blood and cutting his hand. In Real Life, lids on most (but not all) centrifuges lock until the spinning has completely stopped; for these models, it's impossible to open one while it's still in motion.
  • In the 1971 film version of The Andromeda Strain, the lab has a nuclear self-destruct device with three substations (to disarm the bomb) per floor. It's discovered that five per floor are actually needed; they're in the process of being added, but haven't been finished (this is a government installation, of course). When the self-destruct countdown is activated, team leader Stone, along with the only team member who has the shut-off key, is trapped in a section with an unfinished nuclear destruct shut-off substation. Stone cries out, "When the bomb goes off, there'll be a thousand mutations! [The virus] Andromeda will spread everywhere, they'll never be rid of it!" He touches the other team member and points at the exposed, unfinished shut-off substation. "The defense system is perfect, Mark, it'll even bury our mistakes."
    • The fail-safe here does make sense (unfinished shut-offs notwithstanding), but somehow the designers weren't smart enough to anticipate a virus which was spread by being nuked.
    • In fact, you would actually want the bomb to not be able to be turned off by the people in the facility. If they have been infected by something, but not yet dead, what if they panic and try to leave? You would want them to not be able to stop the bomb and escape. Nuking a small number of people deep underground is a small price to pay to stop a dangerous pathogen like this one (other than its ability to feed on nuclear energy) from escaping.
  • For something less high-tech, Sylvester Stallone's 1993 film Cliffhanger starts with Gabe (Stallone's character) climbing up a mountain to rescue friends Hal and Sarah. To get to the rescue helicopter, they have to pull themselves along a line stretched across a chasm, suspended by their climbing harness and a carabiner (a big metal clip). When Sarah is in the middle of the crossing, the carabiner starts to buckle; Gabe goes out on the line to catch her, but he is too late and she falls to her death. The problem with that scene is that a carabiner is designed to withstand the force of a falling climber: a standard one has a rated strength of 23 kilonewtons, while the static load of a Hollywood starlet would be around 0.5 kilonewtons. At that point, viewers who knew their climbing may feel their Willing Suspension of Disbelief shatter along with the carabiner.
    • The studio was actually sued by the carabiner manufacturer as a result of this scene and the inaccuracy of it breaking in such a scenario (when it snaps there is a close up of the carabiner with the manufacturer's logo prominently displayed).
  • The film version of Wing Commander begins with the Kilrathi attacking an outpost in order to capture its navigational data. Realizing why the enemy is attacking, the commander of the outpost orders the navcom, which is located in a sealed room, destroyed. However, the Self-Destruct Mechanism refuses to work, due to sabotage.
  • This sort of thing seems to happen all the damn time in the Final Destination franchise, to the point that it's a wonder that horrific freak accidents don't happen constantly. Some of these failures, though apparently outlandish, are still within the realm of possibility, however unlikely.
  • Averted, subverted and justified in quick succession in the climactic scene of the movie version of The Hunt for Red October.
    • The Aversion: To 'sell' the appearance of having destroyed the submarine Red October to its recently-evacuated crew, the US Navy attack it with an air-dropped torpedo. The torpedo is successfully aborted before impact in the scene which became Trope Namer for I Was Never Here.
    • The Sub Version: The commandeering of the Red October is interrupted by the arrival of a Russian Alfa-class attack sub, which launches a torpedo. Captain Ramius orders the sub steered into the torpedo's path at full throttle, closing the distance before the torpedo's warhead can arm itself - its safety features work a little too well.
    • The Justified Version: In response to this failed attack, the commander of the Alfa orders all safety features disabled on his remaining torpedoes. As a result, after playing tag with the next torpedo he fires, the commander of Red October is able to decoy it into locking onto the vessel that fired it, destroying the Alfa.
  • The B-movie Evolver involves a failed military robot (which ended up killing dozens of soldiers during a training exercise) being re-purposed as a household laser-tag toy. Needless to say, the robot reverts to its original programming and starts killing people. When its creator attempts to use the verbal shutdown code that worked during the training exercise, the robot simply rejects the override and kills the guy.
  • Silver Streak ends with a classic Runaway Train scenario, evading all possible safety features. Justified in that the controls for the emergency brakes were deliberately sabotaged by the villains and the dead-man switch for the engine was disabled by placing a heavy toolbox on the pedal. (There was at least one way still remaining for our heroes to stop the train -- use one of the bad guys' pistols to shoot a hole in a brake line and thus cause the train brakes to automatically engage due to the loss-of-pressure condition being tripped -- but as neither of the protagonists knew a damn thing about trains, it's justifiable that they did not think of this.)

Literature[edit | hide]

  • Obviously Fail-Safe (the book, movie and TV drama) qualifies. The American strategic nuclear forces have a system in place to prevent bombers from attacking the Soviet Union without clear authorization - bomber crews are conditioned to turn back at the fail-safe line no matter what is happening around them, if they haven't received the go signal themselves. In this case it's an "active go" system, designed to send out an attack order - and it does so incorrectly, due to a subtle and unnoticed technical fault.
    • Norman Moss, in the book "Men Who Play God", makes it clear that this is not how Real Life works - the "go" signal is a voice order that must be given by a human being and cannot be transmitted accidentally. Nevertheless it remains a cautionary tale against the dangers of too much automation in military systems, a thing the American President and Nikita Khrushchev are left bemoaning to each other near the end of the book, as the last bomber closes unstoppably on Moscow...
    • However, at the time period the book is set in -- the 1950s -- US nuclear forces actually did use a "fail-deadly" system, wherein the bombers were supposed to attack Moscow unless they periodically received orders not to attack Moscow. It wasn't until 1961 that we switched to "fail-safe" in everything.
  • Mentioned/subverted in Mostly Harmless; it is explained that a set of special bulletproof windows are not designed to be shot at from inside. They can also be jimmied open with just a credit card. This is because of 'The Great Ventilation and Telephone Riots of SRDT 3454'.
    • To explain: the main cause of the riots was a building environment control system. Part of the installation process involved sealing the windows shut, to make it easier for the system to do its work. One particularly hot day, many of these systems broke down, resulting in the overheated office workers taking to the streets. As a result of the riots, buildings were required to have windows that opened.
    • In the same book, it's mentioned that the difference between something that might go wrong and something that "cannot possibly go wrong" is that when something that cannot possibly go wrong goes wrong, it's usually impossible to fix.
  • Averted and lampshaded a bit in the book 2001: A Space Odyssey. The makers of the failsafes of the airlock doors had mentioned, "We can protect you from stupidity, we can't protect you from malice."
  • In both the book and the movie of The Taking of Pelham 1-2-3, the criminals hijack a subway train and demand that all of the signals all the way to South Ferry be set to green. The train then rolls through every signal and station, at speeds exceeding 80 miles per hour. Note that the train does stop when it gets to South Ferry, as it is going too fast for the turn and trips the overspeed control. However, it's interesting that it never tripped any overspeed control along the way.
  • In the climax of The Shining (the Stephen King book), lead character Jack Torrance desperately tries to cool down the main boiler of the Overlook Hotel, while Danny, Wendy and Mr. Hallorann escape on a snowmobile. At first, it looks as though the boiler (which has to be constantly maintained) will return to acceptable levels, but the pressure is already too great, and the boiler blows up, taking Jack, the hotel and the topiary animals with it. Justified in this case as the boiler is explicitly described as both very old and very dangerous; the hotel manager has been bribing the safety inspector for years to keep it from being forcibly replaced.
  • As above, The Andromeda Strain. The bacterium mutates and destroys gaskets... that are protecting the lowest, most secure level from being contaminated. A nuclear bomb is set to destroy the base, and since the bacterium mutates with levels of energy, it'll cause a worldwide outbreak.
  • In The Stand, the engineered superflu virus nicknamed "Captain Trips" is accidentally released from a top secret installation in the High Mojave, and, unfortunately for the rest of the world, a security guard is able to escape because the doors to his station (which he thought erroneously to be "clean") did not magnetically lock at the moment of the installation's containment breach. The guard takes his family and flees, making it all the way to East Texas before dying. General Billy Starkey, the man charged with the containment operation, later comments on this fact.
  • In Dave Barry's Big Trouble, it is mentioned that the corrupt Mega Corp built a new prison in downtown Miami using off-the-shelf garage door openers to power the cell doors open and shut. Someone accidentally hit their garage door opener button while driving by soon after the jail was filled, and every door in the place opened. Hilarity Ensued.
  • Justified in the Thursday Next book Lost in a Good Book. The nanomachines that unstoppably convert all organic matter into Dream Topping are contained in an extremely strong electro-magnetic field. The field is maintained by three generators, all of which would have to fail simultaneously in order to release it, an astronomically unlikely possibility. They do fail, though, because the villain Aornis Hades has the ability to manipulate coincidences.
  • Averted in the DS9 novel Valhalla. A sentient, suicidal starship tries to blow itself up by running its fission pile too hot. (Un)fortunately, a mechanical failsafe triggers, wrecking the drive in the process.
  • In the Star Trek: Deep Space Nine novel Time's Enemy, it is shown that the self-destruct command for Jem'Hadar ships is simply "Destruct" in their own language. There is no override, there is no countdown. This is simple and, with the mindset of the Jem'Hadar, the perfect method.
    • It's also fairly secure. Jem'Hadar learn foreign languages ridiculously quickly and never use their own language in the presence of aliens, so there's no risk of an intruder arming the self-destruct.
  • In the Starfleet Corps of Engineers series, a Federation space probe in one story (actually entitled Failsafe) suffers a Failsafe Failure, requiring the crew to undertake a mission to retrieve it from a pre-warp planet. Sonya Gomez even seems to lampshade the improbability.
  • A Warhammer 40,000 short story by Graham McNeill had an ultra-maximum security prison in which all the doors automatically unlocked in the event of a power outage.
  • Starships in the Honorverse, especially warships, are built with numerous failsafes that usually work as intended. Occasionally though, Weber falls into this trope, such as when one of the circuit breakers protecting a fusion plant from power surges is itself knocked out, destroying the ship.
    • In fairness, the fusion reactor in question is being hit with anti-ship missile warheads at the time, and the circuit breaker that failed is the one that controls the circuit that's supposed to eject the reactor vessel outside the ship if it's about to break open.
  • Averted in Children of the Mind, where an active MD Device is quickly and easily disarmed by a technician. The tech notes that the planet-atomizing superweapon was deliberately designed to be easy to turn off... turning it on, on the other hand, is really hard.
  • In Sergey Lukyanenko's Emperors of Illusions (part of the Line of Delirium trilogy), Arthur van Curtis holds the command crew of an Imperial cruiser at gunpoint while the ship is in hyperspace. He orders the crew to prepare to drop the ship out of hyperspace without first slowing down. In this case, the ship enters normal space at relativistic speeds, and by the time it slows down, decades or even centuries will have passed for the rest of the universe. This has happened before, and yet nobody decided to make deceleration a standard part of dropping out of hyperspace instead of simply a step that a crewmember might one day forget.
  • French Sci Fi novel Malevil briefly mentions this. World War III occurs, and nobody is certain why it happened; the characters lived through it, yet the lack of information and details turns it into the Great Offscreen War. One of the possible, never-to-be-confirmed theories as to why the world ended was a Failsafe Failure.
  • In Time Scout the many systems Time Tours and BATF have in place are ... completely useless. And then some.
  • In the novel The Dorset Disaster, a nuclear reactor explodes due to a Failsafe Failure, because someone tampered with the settings that controlled when the reactor should SCRAM. In a bit of similarity to the Real Life Chernobyl incident, it was SCRAMing too often and annoying the people running the plant. So they changed the settings, and that led to a big kaboom.

Live-Action TV[edit | hide]

  • This happens all the time in Star Trek, where fail-safes are almost never shown or mentioned unless they fail. One fan group came up with the name "SINEW" for this phenomenon: "Somehow it never, ever works."
    • The warp core is chronically incapable of being shut down in case of emergency. This gets to the point where, in the later series, systems are built into ships to eject the core from the ship when it's about to explode. These also always fail. Fridge Logic says they would have to first cut off the antimatter fuel input to safely eject the core, which should mean the reactor shuts down and doesn't need ejecting anymore... but apparently the only time the writers remember the ship even has fuel is when they do a running-out-of-fuel episode.
      • Just twice in the history of the franchise[1] has core ejection worked. Both times were plot points.
    • The classic Holodeck Malfunction requires four conditions that you'd think would be unlikely to all happen at once: the exit door, the off-switch, and the safety protocols must fail, and everything else must continue to work perfectly. Apparently, this happens almost every time the holodeck is used on-screen. There is no easy way to turn it off even from the outside, as for some inexplicable reason the holodeck is the only system on a Starfleet ship to have an independent power supply, and they can't shut it down any better than the warp core. And why is the holodeck designed to accurately simulate dangerous things like bullets and mustard gas anyway?
      • It usually requires the Transporter to be inactive (or prevented from working by a shield, which just raises more questions) to get them out of danger.
    • Particularly ridiculous is the season one finale of Voyager, where the "manual" override on a door lock is shut down by a power failure, negating the very purpose of a manual override in the first place. As SF Debris put it, "That's like having an emergency light that plugs into a wall socket, or a parachute with a rope attached back to the airplane."
      • Well, at least the doors have an emergency back-up, even a pointless one. The power circuits of Voyager run on "gel packs," living, biological components that the ship carries only 48 replacements for. These gel packs can't be replicated, the ship has no other usable substitute, and the gel packs can actually be killed by viruses.
    • In the Star Trek: Enterprise episode "Unexpected", the handrail on the lift in engineering is capable of severing limbs, as there is no cut-off if it meets resistance.
    • The brigs on Starfleet ships default to open in case of power failure. Even if they need the forcefield to prevent "jail break" via transporter, there's no reason they couldn't also have a thick steel door. There isn't even a backup power supply. The same applies to the medical quarantine in sickbay, and to the force fields that are frequently placed around hazardous life-forms and other suspicious specimens. This is not to mention the fact that force fields are also used to seal hull breaches in combat, meaning that in the event of power loss, which is likely to occur if one's shields are down, the ship is open to space.
  • In one episode of the new Battlestar Galactica, two characters stuck in a leaking airlock are told that the "manual override" for the door had failed. This provided an excuse to space the pair in an over-the-top CG-fest. It's good to know that no major spacecraft will ever carry a crowbar.
    • The fact that the Galactica was old when the series began and has gone several years (and multiple battles) without adequate maintenance at this point provides a thin veneer of justification, but it's very thin.
    • Well, that might be going a bit far. The aforementioned battles had involved the Galactica being hit with a nuclear warhead at one point, and I personally wouldn't fancy my chances of getting through a bulkhead door on a heavily armored warship with a crowbar. If Gordon Freeman were on board, on the other hand...
    • Forgetting the fact that the thing is heavily armored and intended to take battle damage, just the simple fact that it's an airlock door is going to screw you. As an airlock door, it's designed to withstand a pressure differential of at least one atmosphere (i.e., the difference between normal pressure and zero). That's 14.7 pounds per square inch. Now multiply that "per square inch" by the surface area of the door and you get an amount of force measured in tons. That is the amount of force the door is designed to withstand, plus a hefty safety margin... and you're going to open it with a crowbar? It seems unlikely.
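For anyone who wants to check that back-of-the-envelope figure, here is a quick sketch in Python (the door dimensions are hypothetical, picked purely for illustration):

```python
# Rough force on an airlock door holding back one atmosphere of pressure.
# 14.7 psi is standard sea-level atmospheric pressure; the door size is
# a made-up example, not taken from any particular ship.

def door_force_lbf(width_in: float, height_in: float, psi: float = 14.7) -> float:
    """Force in pounds-force on a flat door from a uniform pressure differential."""
    return width_in * height_in * psi

# A modest 24 in x 72 in hatch:
force = door_force_lbf(24, 72)
print(f"{force:,.0f} lbf, or about {force / 2000:.1f} short tons")
# → 25,402 lbf, or about 12.7 short tons
```

So even a small hatch is pressed shut with over twelve tons of force, which is why "just pry it open with a crowbar" is a non-starter.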
  • MythBusters: If they're not going to ridiculous lengths to defeat the failsafes on some common household item so they can replicate a myth's results (translation: make something explode), chances are you'll find them putting out a fire or running to catch a driverless vehicle because one of the failsafes on something they made has failed.
    • Specifically, the tendency for their remote controlled cars to run wild and take out the chain link fence of their test area has become a Running Gag. The failsafe is supposed to apply brakes to the car if it loses radio contact. Some of them were genuine failures, some of them were somebody forgetting to set it. One time Jamie forgot to set it before Adam jumped in for a ride...and scratch another fence.
    • Sometimes they do consider a myth "confirmed" if the failsafes preventing the results they wanted could reasonably be disabled by a normal person. For example, the "water heater rocket" myth required them to close off a safety valve that would otherwise prevent the thing from launching through the building and into the sky, something a homeowner might plausibly do, since to a layman the valve's dripping can look like nothing but water (and money) going down the drain.
    • In one instance they had to disable the safeties on an elevator. Adam eventually reported, "Anticlimactically enough, I believe I've disabled the entire mechanism by removing this simple pin." Granted, it was a very old elevator in a condemned building already slated for demolition. Wonder why...
  • Played absolutely straight in Season 4 of 24, where Marwan The Wonder Terrorist manages to steal a MacGuffin that can cause every nuclear reactor in America to go into simultaneous meltdown. How one electronic device could untraceably hit over 100 sites at once AND bypass the dozens of failsafes, manual breakers, and shunts present at every site they don't even try to explain (let alone WHY an American company would make such a thing).
    • It's called SCADA security and is a very, very BIG issue amongst both black hats and white hats. Most SCADA-equipped facilities (including O&G installations, infrastructure, utilities and manufacturing plants) were designed with availability, timeliness and reliability first, and security very, very far down the list of priorities. Great when they were mostly expensive, proprietary, and isolated; not so great when they're cheap, IP-enabled devices being remote-controlled from who-knows-where.
    • Inverted in Day 8 where a suicide bomber blows up because of the failsafe.
  • It's actually incredibly common in SG-1, as well as its sister series Stargate Atlantis. This was lampshaded in the season eight episode "Avatar", when the weekly reverse-engineered alien tech malfunctions and General O'Neill responds with "I thought failsafes were supposed to be safe from failure."
    • The Stargate system itself has plenty of failsafes, however, as presumably it was designed to be used by other races, without such devil-may-care attitudes about personal safety. Unfortunately for the SGC, they're operating their Stargate without the standard control device, and feel free to completely disregard any warning signals from the Stargate itself.
      • Even with the "safe" Stargates, ones operating with the original control computer, a gate activation completely vaporizes anything standing in front of the ring with only a few seconds' warning. Also, there's nothing to stop the gate from closing if an object is halfway through, although that only happens if the gate has been open 38 minutes, at which point it runs out of power.
    • Also commonly inverted in Stargate Atlantis, where Atlantis often activates failsafes during bad situations. It inevitably does exactly the wrong thing for the given situation, forcing McKay to waste precious time overriding the failsafe, or killing someone when he fails to.
    • The Pegasus Galaxy also has Spacegates, and there's no way to tell whether you dialed an address for a gate on the ground or a gate in orbit besides stepping through (or checking the Ancient database, but not every civilization has that option). The Pegasus network was designed for Lantean spaceships, but if a failsafe was ever needed in their designs, it's here.
    • Failsafes built into the Stargate network, as well as other Atlantean systems, have been mucked around and hacked so much by the Atlantis expedition (in order for Earth PCs and computers to interoperate) that they might as well not exist. Consider "Avenger 2.0" where Felger's code is none too secure, or where McKay's software updates "broke the Gate". Of course, it wasn't really broken, but nevertheless, there were glitches because of this (hence Sheppard's 40,000 year trip into the future and then some).
    • Also, the Ancients aren't big fans of failsafes. Ancient devices need to have a function as complex as time travel before someone will even consider putting in a failsafe (The failsafe that prevents Atlantis from being crushed underwater after it has run out of energy wasn't even included until someone traveled to the past from a future where the city and its occupants were flooded, for instance...)
    • Atlantis itself is the mother of all failsafe failures, a spaceship without an airtight interior. If the shield fails in space, everyone dies, as the crew unpleasantly discovers when they run into power problems while moving it to another planet.
    • The fact that you can remove the safeties on a ZPM is a biggie; you'd have thought that those things had built-in safeties.
  • In Stargate Universe:
    • The Destiny's emergency atmosphere-retention forcefields are only strong enough to reduce the flow of air through gaping holes in the hull, not stop it. It's possible that they used to be stronger but are now past their best-by date. They also may not have been designed to compensate for such extensive deterioration.
    • One jammed-open airlock door is apparently enough to drain the air from all of the remaining habitable areas of the ship, with no other doors capable of being sealed to further compartmentalize the area. Granted, the ship has already had a lot of sections sealed off for this very reason, so perhaps this is simply a case of failsafes being driven past their limits by repeated failures. But one would think that having an airlock and the Stargate in the same emergency-seal compartment would be a bad idea.
    • In a case where a working failsafe caused problems, the shuttle docked at said airlock wouldn't close its own airlock door without someone inside it to operate the controls. This prevents you from locking yourself out, I suppose, but in this case meant a Heroic Sacrifice was needed to stop the shuttle from leaking all of Destiny's air.
    • Destiny's atmosphere recyclers packed it in over the millennia and the ship automatically stopped off at a planet where needed chemicals could be found to repair it. But the ship's autopilot was still set to take the ship back to hyperspace after a fixed period of time, whether or not the chemicals had been recovered by then.
    • This trope is part of Stargate Universe's stock in trade. Most of the drama is derived either from yet another of Destiny's millennia-old, unmaintained systems failing catastrophically, or from the failsafes working but none of the human crew knowing what the failsafe procedure actually is, and everyone panicking. When the proper operating procedures include diving into a sun to refuel, the panic is understandable.
    • Averted in one case with an electrified corridor. There's a handy manual breaker just to make sure you can kill the power. However, the fact that a corridor can, without warning, turn into an electrified deathtrap, invokes another trope.
    • It gets better, the ship's capable of getting into your head. In "Trial and Error" when Destiny sensed Young was going through a mental breakdown, it initiated a program that caused Young to continually have vivid dreams and hallucinations of the ship being destroyed, evaluating if he had the ability to continue commanding the vessel. Let that sink in... the ship saw someone suffering a mental breakdown and felt the best solution was to do something that nearly drove him insane.
      • Even better, until they found a way to turn off the scenario, the ship began shutting down its controls and refused to move. The million-year-old spacecraft that routinely gets shot full of holes decided that, in hostile space, its best recourse was to pull over.
  • In the Red Dwarf episode "Demons And Angels", Lister says the ship is full of fail-safes; "The actual chances of it exploding are one in--" Red Dwarf explodes. "... one."
    • Rimmer also managed to flood the entire crew compartment of the ship with lethal radiation, because he conducted a repair without proper assistance. Ironically, the cargo decks were safely sealed during the event. Why Rimmer was even allowed to touch the ship's nuclear reactor, especially without assistance, is never clarified...
    • Even better, one of the reasons given is that he didn't have Lister to help him. Who in their right mind thought that nuclear reactor repair is a job that screams Lister?!
  • Averted in Angel, where Lindsay tried to activate Wolfram and Hart's failsafe, but Angel's people stopped it from getting loose.
  • Averted in Terminator: The Sarah Connor Chronicles wherein a nuclear power plant's failsafes DO kick in, but a T-888 deliberately sabotages them one by one.
  • Inverted in an NCIS episode, almost as a direct refutation of the Cliffhanger reference above. When a carabiner fails and kills a Navy SEAL during a cliff-climbing exercise, Gibbs rapidly realizes that there's simply no way that part should have failed under normal load. An investigation rapidly reveals that the carabiner was deliberately substituted for a fake made out of much flimsier metal, and that the accidental death was no accident at all.

Puppet Shows[edit | hide]

  • In most episodes of Thunderbirds, disasters were caused, or at least not averted, by faulty safety equipment or poor engineering. Examples included bridges that collapsed as soon as their maximum load limit was exceeded, aircraft whose nuclear reactor shielding failed if the flight was delayed, failure to survey sites properly before beginning major engineering projects, and numerous vehicles without a Dead-Man Switch or equivalent.


Video Games[edit | hide]

  • Ace Combat 5 The Unsung War: During an airshow turned firefight, Chopper gets shot up by enemy forces, damaging most of his plane's internal systems. He still flies for a minute or so to find a safe place to land, but by that time, his eject system is damaged too heavily for him to eject. He dies in the following crash.
  • Half-Life: When things go awry at the Anomalous Materials research department of the Black Mesa science facility, the classic exchange is heard:

"Shut it down!"
"It's not-[BANG] it's not-[BANG] it's not shutting down! It-[screams]"

    • Justified example, as it is heavily implied that the incident was orchestrated by the G-man.
    • And a whole lot of safety protocols were overridden or just plain ignored because Breen said so (possibly related to the above). There was at least some arrogance at work as well, however, as one of the scientists comments on how "We've assured the Administrator [Breen] that nothing will go wrong (cue meaningful look at other scientist)", implying that some of them thought that the test would go fine.
    • Plus, most of the Black Mesa systems were old and were never meant to be used the way they were being used.
    • In Half-Life 2: Episode 1, a failsafe on the Combine's dark energy reactor is reactivated (the Combine had deliberately deactivated it so the reactor would Go Critical), but fails to stop the catastrophic meltdown (or whatever the dark energy equivalent of a meltdown is), because the reactor is too far gone to be stopped. However, the failsafe does slow the reaction long enough for most of the city to evacuate.
  • Subverted in the opening of Xenogears. When an alien threat is taking over the ship, the command crew attempts to cut power using the emergency blocker, which is a 3-foot section of the power cables that jettisons itself out, creating a massive break in power and electronic communication. Unfortunately, the alien threat manages to arc the gap and continue its takeover. Their second failsafe (self-detonation) works.
  • The game 7 Days a Skeptic features, on an advanced space ship, an escape pod door that opens whether or not there is an escape pod that can be boarded behind it. Almost needless to say, if there's no escape pod, one is greeted by hard vacuum.
    • The entire ship was a deathtrap, really. Not only did the escape pod door open despite having no pod behind it, it required 12 hours to prep the pod for use in the first place. An EMERGENCY ESCAPE POD requires 12 hours to use. Also, it was very easy to fall off the catwalk in the engineering deck, next to the reactor core that has no (visible) shielding. Also, the Comm array controls are in the captain's cabin, which locks from the inside. What truly makes these features mind-boggling is that the game's creator is Yahtzee Croshaw, the kind of person you'd expect to snark the hell out of such idiocy.
      • The door to the captain's quarters has an emergency override... on the inside of said quarters. There's also a fancy observation deck with a transparent dome allowing full view of space, with no visible safety measures should a stray rock smash it. And the ship's engineer spends most of the game lurking in the mess hall, sending the counselor to fix problems, or just ignoring them - when the ship's engines turn off without explanation, he just assumes they'll turn back on eventually.
  • Not actually a failsafe, but certainly a countermeasure: in the X Wing Series, it's mentioned that X-Wings have a backup system, triggered when the fighter loses power in flight, that should turn the power back on. It is never shown working in the games. It works in the books, though.
  • The Resident Evil series has at least one in every single game, mainly for the purpose of locking you into whatever room has the latest mutated/undead boss. As well, even the Failsafe Failure fails as the door lock mechanism invariably releases the second you kill the Big Bad du jour. The best example of a needlessly complex failsafe comes from the train braking system in Resident Evil 0. To activate the brakes, you must:

1: Find the brake instruction manual in the front car.
2: Pick up the card key for the brake system.
3: TRAVEL THROUGH THE ENTIRE TRAIN to the brake-lock system located outside of the caboose.
4: Insert card key.
5: SOLVE AN ADDITION PUZZLE.
6: TRAVEL BACK THROUGH THE ENTIRE TRAIN to the front car.
7: SOLVE THE SAME ADDITION PUZZLE A SECOND TIME.
8: (really, the only step that should be here) Pull brake lever.

    • Slightly handwaved in the RE storylines in that the designers of pretty much everything in Umbrella Corp. and Raccoon City were utterly insane.
  • An Interactive Fiction game titled Fail-Safe puts the player in the role of a computerized emergency help system. Someone in a crisis is calling for help on the radio, and you have to consult your database and give him instructions on what to do. There is a twist of the Tomato Surprise variety.
  • Metal Gear Solid contains a Failsafe Failure that is actually part of FOXHOUND's plans. You receive a key which is part of the override, but the game wants three keys for the override. Thus, you need to heat the key up to make it take the shape of the second slot, then cool it down for the third shape. It's kind of lucky that the base holding Metal Gear REX contains a foundry and is in Alaska; otherwise, how else would you use the key? This is made even more baffling when you consider that the nuclear missile will launch with only two codes, yet three keys are needed to override; even worse, said override key will actually arm the weapon if it's in a disarmed state.
    • And then that the key can only ever be used once.
    • More intelligently integrated into Peace Walker, which is a "fail-deadly" nuclear device designed to guarantee a retaliatory nuclear strike in the event of an attack, creating true MAD (the idea being that humans might not be able to launch a retaliatory strike, knowing they'd wipe out an entire country). Unfortunately, the flaw is that it doesn't personally check to see if nukes are being launched, and the story revolves around someone feeding the machine fake data in order to make it strike.
  • Fallout 2 and 3. In the second game, a thorough letter can be found describing what would happen if someone disabled the Oil Rig's reactor cooling systems: a megaton-sized meltdown (not that that's possible in real life, but whatever). Guess what you need to do to proceed: hack the control system or simply blow it up. Gecko's reactor also counts, since you are ordered to shut it down; you can interpret that as fixing the coolant leak, or as running into the active zone and turning a big red valve to shut down the cooling system altogether, causing a nasty meltdown and forcing a whole city to relocate. Now then, why would anyone make the cooling system controlled by a single valve, and more importantly, why would anyone place that valve somewhere you can't reach it without being roasted alive?! Fallout 3 had Project Purity: when you finally mow down the Enclave defending it, Li warns you that it was damaged in the fighting; the tanks are experiencing critical overpressure and the whole thing will explode unless the purifier is turned on to relieve the pressure. Thing is, another failsafe blew out and the control room has a fuckton of radiation inside, so it's going to be a one-way trip. Earlier, you can read a message that Vault 87's GECK room has radiation purge systems, but they can't deal with all the radiation coming from the outside and are constantly failing (that's another example of Fridge Logic: 200 years have elapsed and the rest of the wasteland is safe, so why would the vault still have 700 rad/sec at the entrance?).
    • In Fallout 2, there's a robot you can control in the Gecko reactor, so it's not all that mind-boggling. And the ghouls are immune to radiation, so again they wouldn't mind waltzing in and turning the valve.
    • That's not unusual; ambient fallout lingering in the environment gets dispersed more quickly and lowers radiation levels to acceptable levels, while sealed, enclosed spaces can remain radioactive for quite some time. This is the case with Chernobyl today; you can walk through it, but you should stay out of most of the buildings.
    • Or the Fallout 3 DLC "Broken Steel": two confirmations and no password protection whatsoever are NOT enough to prevent the Mobile Base Crawler from calling down an orbital strike on itself, is that too hard to realize? If you have a really determined foe on a Roaring Rampage of Revenge toting a BFG, a platoon of infantry with air support isn't foolproof either. Clue: the Enclave learned this the hard way.
    • The worst part about the Fallout 3 example is that you can get three followers that are completely immune to radiation (a robot, a super mutant, and a ghoul), plus you can buy a slave. Despite this fact, all four will refuse to go into the purifier in your place and turn it on.
      • This is fixed in the "Broken Steel" DLC, where they will agree to go. Either way, you survive (despite the ending cutscene).
    • Fallout: New Vegas has one implemented due to carelessness before the bombs dropped in the Dead Money DLC. Before the bombs fell, the Sierra Madre was set up to broadcast a distress signal on its external radio antennas if something catastrophic happened. Unfortunately, since all the radios were set to broadcast music for the Grand Opening until after it was finished (three guesses as to what happened before the Grand Opening), the 'emergency' broadcast is still a broadcast inviting people to the Grand Opening Gala Event at the casino proper. Since the broadcast was designed to repeat until help arrived, this has led hundreds of unwary explorers to their deaths over the years...
  • Brave Fencer Musashi: Whoever designed Steamwood is NOT an engineer. Any damage at all to it results in catastrophic pressure build-up, which can only be released with eight separate valves on separate floors, each with the most convoluted method of operation seen even in a video game. It's a wonder Grillin' Village is more than a steaming crater.
  • In the Infocom game Suspended, the player wins a lottery to function as a fail-safe for the global weather control systems. Of course, when everything goes wrong, it turns out your robots are broken, the repair center is a mess, your provided documentation contains errors, and nobody actually told you how to find out what the problem is or how to fix it. And if you take too long to fix the problem while the casualties pile up, actual repairmen will show up to deal with the issue. They start by turning you off, as it's assumed these failures could only occur if you were causing them.
  • Might and Magic VIII plays with it. From your perspective, the failsafe itself[2] is the failure (as it will cause the destruction of the world, all for no gain). From the perspective of the ones that implemented the failsafe, the failsafe works perfectly, you just happen to be collateral damage (as the reason why the failsafe is there is to stop the Kreegan from subverting Escaton).


Web Animation[edit | hide]

  • Sarge's quote in the quotes section from Red vs. Blue highlights his tendency to build machines with "failsafes" that end up backfiring on him somehow.
    • For example, the bomb that he built into Lopez could be armed remotely, but Sarge designed it so that he himself couldn't disarm it, just in case he was captured and brainwashed into helping the Blues. Brilliant.
  • Parodied in a Homestar Runner cartoon. In a Strong Bad e-mail that parodies Star Trek, Strong Bad activates "the forward humbuckers" to prevent his ship from colliding with a comet. A message comes up on a screen saying "the forward humbuckers have never worked."


Web Comics[edit | hide]

Web Original[edit | hide]

  • Doctor Grordbort's Contrapulatronic Dingus Directory (a mock catalogue of Steampunk rayguns and other non-existent devices of the scientific-romance era) is full of warnings about involuntary sterilization or the loss of "only some limbs" through mishandling of the devices, which is frightfully easy to do due to their needlessly complex controls and lousy human-engineering.


Western Animation[edit | hide]

  • In Ben 10: Secret of the Omnitrix, the titular device gets messed with in such a way as to cause it to start a countdown to an explosion that will destroy the universe. The subversion is that that is the failsafe. The creator figured that destroying the universe itself was better than having the thing fall into the wrong hands. Which raises a number of questions that have never really been resolved.
  • During the first Season Finale of Megas XLR, Coop frantically searches his dashboard for a button that will save the world, only to discover that the button actually labeled "Save the World" was marked "Out of Order".
    • A more serious example occurs when Megas breaks its protonic stabilizer, a part that seems to be the ONLY thing keeping its reactor core from immediately Going Critical with enough force to destroy a planet. A SCRAM or shutdown of the reactor is apparently impossible, as it's never mentioned as an option.
      • The protonic stabilizer is the power core of the mech; with it shut down, Megas wouldn't be operable.
  • In one episode of Archer, a computer virus infects the mainframe and threatens to upload all the spies' names to the virus' creator. They get the idea to just unplug the mainframe until everything can be sorted out, but it turns out the mainframe has a battery backup. Behind a nearly indestructible locked door. Whose lock is controlled by the mainframe.
  • My Little Pony Friendship Is Magic, season 2, episode 1 almost invokes this by name. Magical chaos is running wild. Twilight Sparkle, having seen this happen before (mainly from her own spells), has developed a failsafe spell for just this sort of occasion, and sees no reason to not use it at the first sign of major trouble. But this time, the source is stronger than she is used to, so...

Twilight: My failsafe spell...failed.


Real Life[edit | hide]

  • The Moorgate Tube crash of 1975 occurred when the driver of a London Underground train failed to release the Dead-Man Switch that would have stopped the train had he done so. With grim irony, this was reportedly because he had had a stroke and, as a result, was holding onto the handle with a death grip.
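The design lesson from Moorgate is that a plain hold-to-run handle cannot distinguish a live grip from a death grip, which is why later systems pair it with a vigilance timer that trips the brakes if the handle is held too long without being reset. A toy sketch of the combined logic (the class name and the timing window are invented for illustration, not taken from any real rolling stock):

```python
class VigilanceDeadman:
    """Hold-to-run handle plus a vigilance timer: brakes apply if the handle
    is released, OR if it is held continuously for too long without a reset."""

    def __init__(self, vigilance_window_s: float = 60.0):
        self.window = vigilance_window_s
        self.held_since = None  # None means the handle is released

    def press(self, now: float) -> None:
        self.held_since = now   # (re-)pressing resets the vigilance timer

    def release(self) -> None:
        self.held_since = None

    def brakes_demanded(self, now: float) -> bool:
        if self.held_since is None:
            return True                             # released: classic dead-man
        return now - self.held_since > self.window  # death grip: vigilance trip

handle = VigilanceDeadman(vigilance_window_s=60.0)
handle.press(now=0.0)
print(handle.brakes_demanded(now=30.0))  # False: normal driving
print(handle.brakes_demanded(now=90.0))  # True: held too long, vigilance trips
```

A handle like Moorgate's implements only the first check; the second is what catches an incapacitated driver who never lets go.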
  • A combination of corporate negligence and incompetent design meant that the Therac-25 radiation therapy machine would occasionally kill or maim a patient with a massive overdose of radiation. Engineers now cite the Therac-25 as a textbook example of how not to design a safety-critical system.
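The usual engineering takeaway from the Therac-25 is that software flags replaced independent hardware interlocks, so a race condition could leave the machine believing a shielding target was in place when it was not. A heavily simplified, purely illustrative sketch of the kind of independent check that was missing (all names and structure are invented; this is not the actual Therac code):

```python
from enum import Enum

class Mode(Enum):
    XRAY = "xray"          # high-power beam, requires the attenuating target
    ELECTRON = "electron"  # low-power beam

def fire_beam(mode: Mode, target_sensor_in_place: bool) -> str:
    """Refuse to fire unless an independently-sensed precondition holds.
    The point is that the check reads a physical sensor, not a software
    flag that might be stale after a race condition."""
    if mode is Mode.XRAY and not target_sensor_in_place:
        return "INTERLOCK: target not verified, beam inhibited"
    return f"beam fired in {mode.value} mode"

print(fire_beam(Mode.XRAY, target_sensor_in_place=False))
# → INTERLOCK: target not verified, beam inhibited
```

The Therac-25's predecessors had exactly this kind of independent hardware interlock; removing it and trusting shared software state is the design decision the accident reports single out.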
  • Darwin Awards are often awarded to people who go to extreme lengths to override failsafes in their determined effort to get from point A to B or retrieve a fallen object or some similar minor objective. Like this man, who had to try really hard before he could get run over.
    • Don't try to unjam a woodchipper without turning it off first, or you might Fargo-ify yourself, as at least one Darwin Award winner did.
  • The sinking of the Titanic on 15 April 1912 could have been largely averted, since the ship was built to accommodate more than enough lifeboats for its passengers. The ship set out on its maiden voyage with only one-third of the available lifeboat capacity equipped, because regulations at the time allocated lifeboat space based on the tonnage of the ship (for some bizarre reason) rather than the number of people aboard. As a consequence, only a small number of passengers actually made it into lifeboats. This was the standard shipping practice at the time and was hastily changed afterward. The idea that the designer and crew were "so confident of it being unsinkable" is an urban legend.
    • The number of lifeboats on the Titanic was within the standards of the day. At the time, it was believed that any passenger ship would take long enough to sink that other ships would be able to reach it before it sank completely. The only ships that might have enough lifeboats for everybody were warships, which would be inclined to sink a lot faster under the conditions that would be expected to make them sink.
    • In fact, the ship might have survived to return to port if she hadn't tried turning to avoid the iceberg. As it happened, the iceberg scraped along the side of the ship and opened leaks in multiple watertight compartments. Had she hit head-on, only the foremost compartment would have been compromised, and the ship could have stayed afloat with just one compartment flooded. Slowing down as much as possible first would still have been advisable, mind you.
  • The failsafe systems at the Chernobyl nuclear power plant didn't do any good, because the operators had deliberately disabled them to increase power.
    • Worse, they were trying to increase the power because the safety procedure they were testing wasn't working. Even worse, the test had been scheduled for the previous day, when the well-trained day shift was on duty, but minor problems on the grid forced a postponement. The inexperienced night shift first made a mistake while lowering the reactor's power, then, trying to correct it, put the reactor into a completely untested and unstable mode, and then decided to proceed with the test rather than shut down and try again another time.
    • The problem was exacerbated by several design features that were clearly less than ideal. For example, the control rods were poorly designed: in an emergency reactor shutdown (a.k.a. a scram), the control rods are supposed to be dropped en masse into the reactor to block neutron emissions and shut down the chain reaction. Due to a decision to put graphite tips on the control rods (the same material used to moderate the reaction in the first place), a scram caused a sudden power spike before damping the reaction. Making the problem worse, the control rods at Chernobyl were manually operated.
      • The original design intended a whole sequence of failsafes that would engage one after another, allowing enough time for the "graphite effect" to wear off during a scram. With all of them disengaged, however, the scram rods ended up as not just the last but the only available safety feature—and, because of the aforementioned design flaw, failed at it quite spectacularly.
      • On the other hand, the fact that RBMK reactors—the same type as at Chernobyl—were operated successfully for many years after the catastrophe suggests the design itself was nowhere near as dangerous as the decision to switch off its built-in safety features.
    • The Canadian CANDU reactors are considered among the safest in the world, specifically because of the sheer number of redundant failsafes built into their safety systems.
  • On the Boeing 747's first flight, the backup batteries that would have powered the hydraulics in case of engine failure failed on takeoff. That doesn't sound like much until you learn that the newly introduced high-bypass engines were very finicky, and the engineers had little idea whether they would stall on takeoff due to the change in angle of attack once airborne (these engines stalled very easily—at the time, a tailwind could readily induce a stall). Engine stall = no power = no hydraulics = no control over flight surfaces = guaranteed crash = a 700,000 lb bomb loaded with jet fuel. So the engineers strapped on batteries to power said hydraulics—and the batteries failed. Fortunately, the first flight went according to plan. Source: Wide-Body: The Triumph of the 747 by Clive Irving.
  • The tale of the Gimli Glider. On July 22, 1983, a combination of underfueling Air Canada's brand-new Boeing 767 and a faulty fuel level sensor meant its pilots didn't know they were low on fuel until they ran out—at 41,000 feet. To top it off, most of the instruments in the cockpit were electronic and powered by generators driven by the same engines that had just flamed out, leaving the pilots with dark screens and very little ability to control the aircraft. Fortunately for all concerned, there were a few battery-powered backup systems, a ram air turbine to keep the hydraulics working, a nearby decommissioned landing strip (the former Royal Canadian Air Force Station Gimli), and Captain Pearson happened to be an experienced glider pilot. The Gimli Glider landed safely (though it blew out a few tires on the landing gear and skidded to a stop on its nose) with no fatalities and only minor injuries. It was repaired and flew for 25 more years until it was retired in 2008.
  • On Aug. 18, 2003, Hitoshi Nikaidoh, a surgical resident at Christus St. Joseph Hospital in Houston, Texas, was decapitated by an elevator with faulty door failsafes. The car was supposed to be "out of order", but some jerk removed the sign. Did anybody think to cut the elevator's power?
  • Pressurized or liquid gas cylinders are nasty things if not treated nicely. There are very good reasons why cylinders have pressure release valves and rupture disks, and why they shouldn't be diked out.
    • When MythBusters tested this with a gas cylinder, the cylinder punched a nice, clean hole through the cinder-block wall they'd built for the test—and shoved the wall back noticeably in the process.
    • It's worse with butane. If a can is refilled while cold (and boiling butane itself chills it) and not enough "empty" headspace is left (which is required for this very reason), then as the can warms up the pressure can rise enough to condense all the gas. A little more warming, and the expanding liquid pops the can. Safety valves help, but not necessarily enough—since the whole problem arises because the gas bubble was too small, pressure can rise much faster than the gradual thermal expansion the valves are designed to handle—and even a perfectly working valve doesn't make things entirely safe, since butane is heavier than air and tends to stick around as a fuel-air cloud.
  • In the original Space Shuttle design that caused the Challenger disaster, the joints between booster rocket segments were sealed by two thin rubber O-rings, the second ring supposedly holding secure in case the first burnt through. Thanks to near-freezing temperatures outside, both rings failed to seal and were vaporized. There was nothing else to stop the leaking flame from burning away one of the support brackets holding the booster to the external fuel tank, which subsequently ruptured and broke apart under aerodynamic forces when the insufficiently secured booster smashed its nose into the tank. Also, pressure suits and cockpit ejection seats had been discarded after the first few missions, since a crew of seven couldn't all be ejected during launch and NASA didn't think they'd be needed after the initial "test" missions. By contrast, two different cosmonaut crews have been saved from certain death by the escape systems launching their Soyuz capsules away from an exploding rocket.
    • The SRB design itself wasn't flawed. It worked fine—so long as you maintained mission parameters and didn't try to operate a solid-fuel rocket dependent on o-rings well below temperatures for which it was rated, or reused parts that were obviously deteriorating. The Morton-Thiokol engineers who designed it knew this and objected to NASA and their own higher-ups overriding their recommendations. As for the lack of ejection seats, because of the orbiter design only the pilot and commander could have ejected anyway, and possibly the two crew seated behind. Because they were a deck down, other passengers (in Challenger's case, including teacher-in-space Christa McAuliffe) would have died anyway. Apollo and Mercury were both equipped with escape towers, as the Soviet capsules still are, that would pull the entire capsule free, while the Gemini design allowed for ejection seats.
      • The Soviet Buran shuttle was launched some two years after the Challenger catastrophe, but included ejection seats for the whole crew from the very start of the design process. In fact, Soviet designers had long (and quite vocally) criticized the Space Shuttle cockpit layout, calling it a throwback to WWII-era bomber cockpits, and made a point of putting all the crew seats on the same deck. Had the Shuttle used the same layout, several more crewmembers might have been saved.
  • The crew of Soyuz 11 died when a pressure equalization valve opened prematurely during reentry, venting the cabin's air to space; the cosmonauts were not wearing pressure suits.
  • Apollo 1 and Liberty Bell 7. The latter case came first. After Virgil Grissom's Mercury capsule splashed down, the explosive bolts on the hatch (which were there for emergency egress purposes) went off prematurely, allowing water to rush into the tiny capsule and sink it. Grissom was very much in danger of drowning, as the recovery helicopter crews, following their training, concentrated on trying to save the sinking capsule and only belatedly realized the astronaut himself was struggling to stay afloat and needed rescuing. After losing a spacecraft, NASA decided explosive bolts were a bad idea and did not incorporate them into subsequent designs, instead opting for doors that required much more deliberate effort to open. On the Apollo 1 spacecraft, the door opened inward; this was claimed to be safer. While the crew of Apollo 1 was conducting a test on the ground, a fire started in the capsule. The atmosphere inside was pure oxygen at greater than sea-level pressure, and NASA had also managed to put all kinds of flammable materials in the cockpit. As the fire rapidly grew, the pressure inside the cockpit grew with it, making it impossible for any human being to open the inward-opening door—which, due to the lack of explosive bolts, could not be blown open either. Smoke and fire turned the cockpit into a fiery tomb from which there was no escape, and all three astronauts died. One of them was Virgil Grissom. I'd say that fits this trope pretty well, wouldn't you?
    • The fact that the cabin was pressurized with pure oxygen to slightly above sea-level pressure (about 16.7 psi, to simulate the net outward pressure the capsule would experience in space) was also a major contributing factor. It meant there was both far more oxygen available to accelerate the fire and that an inward-opening door was physically impossible to open until the internal pressure was reduced. Just as there were no emergency bolts, there was also no means of depressurizing rapidly.
    • In case you flunked chemistry: since oxygen is what lets a fuel burn, higher concentrations of oxygen make it easier to burn things, and pure oxygen (without inert gases to get in the way and carry off waste heat) cranks any flame in it Up to Eleven. One atmosphere of pure oxygen is enough to let iron nails burn much like magnesium, if more slowly. In a pressurized pure-oxygen environment, nearly anything is a "flammable material".
    • A similar story happened in the Soviet program too, though it wasn't really a failsafe failure—just crew error. A cosmonaut on a week-long test in an oxygen chamber decided to brew himself some tea and turned on a hot plate. As he was scheduled to have some medical tests that day, he needed to clean and disinfect the electrodes' attachment points on his skin, which he did with an alcohol-soaked cotton swab—which he then unthinkingly threw in the general direction of the garbage bin. Unfortunately, the swab landed right on the hot plate and, in the chamber's pure-oxygen atmosphere, combusted immediately, starting a major fire. Due to the design of the chamber door, it could only be opened some 20 minutes later, by which time the cosmonaut in question, Valentin Bondarenko, had already sustained third-degree burns, from which he died a couple of days later.
  • The Apollo 13 Failsafe Failure was even more spectacular (the fact that NASA managed to bring the command module home with all three men alive is often considered NASA's Crowning Moment of Awesome). It was a whole series of Failsafe Failures.
    • The faulty oxygen tank on Apollo 13 was jarred during the craft's assembly: someone forgot to remove a screw that was holding it in place, so a machine arm pulled it up a few inches and dropped it. This knocked the drainage system in the tank out of alignment, which prevented the tank from draining its liquid oxygen properly. The tank was designed with electrical coils that could be turned on to heat the oxygen inside for use in flight, so during a test run, someone turned on the heat... and since the tank wasn't draining properly, the oxygen didn't heat up and drain out normally, so the temperature inside just kept climbing. There was a thermostatic switch inside that should have broken the electrical circuit if it got too hot, but it was rated for a much lower voltage than the ground equipment supplied, and the excess current welded it shut. There was also a human watching an outside temperature gauge, but the gauge only went up to 80 degrees Fahrenheit (roughly 200 degrees hotter than the sub-zero temperature the oxygen was supposed to be stored at). Since the needle never went above 80, he didn't realize the tank was reaching 1,000 degrees inside, and as a result the insulation on the electrical wires inside the tank completely burned away... leaving them ready to spark. The faulty tank was launched on schedule as part of the Apollo 13 craft, and four days into the flight, when the wiring in the tank was powered up, it sparked, ignited the tattered remains of the insulation, and the resulting fire promptly blew the tank apart.
    • To make matters even worse, the Apollo craft carried two oxygen tanks for extra safety, but they shared some plumbing, and when tank #2 exploded, it took several critical parts of tank #1 with it. Result: most of the oxygen in both tanks was lost, and the astronauts inside would have died in a few hours if they hadn't been carrying around a healthy lunar module with its own oxygen supply.
    • At least the nuclear fuel rod cask was fine. A miniature nuclear reactor was built to power some instruments that were to be left permanently on the moon, but just in case the mission never got to the moon, a ceramic cask was built to contain the nuclear fuel, and it was designed to survive a fiery reentry to Earth... just in case. And survive it did.
      • The SNAP-27 carried on Apollo 13 (and Apollos 12, 14, 15, 16, and 17) was a Radioisotope Thermoelectric Generator—essentially an atomic battery, not a reactor (there's no fission chain reaction going on; that's the clue). It turns the heat of spontaneous radioactive decay into electricity via thermocouples, and is similar in design to the RTGs carried on the Pioneer, Voyager, Cassini, and New Horizons probes. RTGs have also been used to power lighthouses, Antarctic science experiments, and anywhere else you need a power source that stays reliable for decades. It was probably the most reliable component flying on Apollo 13. Even if the cask had ruptured during reentry, the plutonium inside would have remained intact and ended up at the bottom of the same 20,000-foot-deep trench, doing absolutely no harm to anyone for the next 5,000 years.
  • As shown on MythBusters: plugged safety valve on water heater + thermostat failure = steam-powered ballistic missile. As reported all over the news, one such incident occurred in a strip mall in Burien, WA on July 28, 2001.
  • The original DC-10 airliner cargo door fault that caused the 1974 Paris disaster. Firstly, the cargo door opened outwards, as opposed to inwards. This meant that air pressure inside the plane would naturally try to force it open, requiring a complex set of locking hinges and pins to keep it closed. Secondly, the door handle was supposed to be impossible to close unless all the pins were safely latched, but in practice, if enough force was applied to the handle, the internal mechanisms would bend out of shape without latching. So, the door could still appear to be closed and locked even when it wasn't. Thirdly, warning placards to inform the ground crew of the potential problem were installed, but they were only in English, which most ground crews around the world couldn't read. And finally, when the door blew out, the pressure change collapsed the cabin floor and severed all of the aircraft's control lines, including the redundant backups, rendering the pilots helpless. Airliners have floor vents to prevent such a catastrophic failure, and the DC-10 DID have floor vents, just not in the area of the cargo door, for some reason.
    • And the reason there were warning placards in the first place? In 1972, an American Airlines flight had the exact same problem, but because no one was sitting in the seats above the section of floor that collapsed, only some of the control cables were severed, leaving just enough control to land the plane with no loss of life. Because the only way the FAA could force McDonnell Douglas to fix the planes was to ground them all until the doors were repaired, the head of the FAA instead made a gentlemen's agreement with McDonnell Douglas to fix the doors without grounding the fleet.
  • Modern Formula One cars have anti-stall systems in the engine management computer. These are very useful as long as they don't go off accidentally on the starting grid and put the car into neutral when it ideally should be in first. This is more embarrassing than dangerous though.
  • HMS Ark Royal sank after being hit by a torpedo that, among other things, caused flooding that shut down the boiler which powered the emergency pumps and all the electrical generators, the ship having been built without dedicated emergency generators separate from the main system. Oops.
    • USS Enterprise (the first one, CV-6) had a steering engine breakdown in the middle of one of the carrier battles for Guadalcanal, jamming the rudder into a hard turn. Fortunately the crew had rigged an emergency steering engine in case this very thing happened. Unfortunately, everyone in the compartment with the backup was knocked out by toxic gas released from nearby fires. It took nearly thirty minutes for someone to reach the compartment, and before he could turn on the backup motor he passed out as well; fortunately he came to fifteen minutes later and managed to turn on the backup motor. While all this was going on, another Japanese air raid was detected approaching but turned away fifty miles out.
    • The Japanese carriers at Midway had the emergency generators for their firefighting systems just off the upper hangar deck, at about the midships line. This placed them on the same deck where any bomb that struck the carrier would probably explode, at about the spot enemy pilots would use as an aiming point. At least two of the carriers probably lost their backup generators to shrapnel from exploding bombs.
  • Inversion: Electrical codes require failsafe protection (fuses or circuit breakers, for example) to be on all circuits, to stop the current flow in the circuit when the wire gets hot enough to possibly catch on fire. Aspiring electricians will have the failsafe rules for preventing electrical fires hammered into their heads repeatedly (electrical fires being as much as if not more of a danger than electrical shock). So it is jarring at first, to learn that circuits for fire pumps MUST NOT have any fuses or circuit breakers of any kind. Why? If the fire pump is running, it is assumed there is already a fire, and a fuse or breaker breaking the circuit (and shutting off the pump) isn't going to improve the situation.
    • Strictly speaking, US electrical code requires a fire pump circuit's overcurrent devices to be sized so large (able to carry the motor's locked-rotor current indefinitely) that they will never trip on overload—the circuit is still protected against dead short circuits, but never at the expense of stopping the pump. Wiring codes elsewhere handle fire pump circuits somewhat differently, though the underlying logic is the same.
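The inversion above boils down to a single design question: which failure state is "safe" for this particular circuit? A toy sketch of the decision (the class and function names are invented for illustration, not drawn from any real electrical code):

```python
from dataclasses import dataclass

@dataclass
class Circuit:
    name: str
    is_fire_pump: bool
    rated_amps: float

def breaker_should_trip(circuit: Circuit, measured_amps: float) -> bool:
    """Ordinary circuits fail safe by opening on overcurrent: a dead
    circuit can't start a fire. A fire pump circuit is the opposite case:
    if the pump is running, there is presumably already a fire, and
    stopping the pump is strictly worse than an overheating wire, so the
    overload protection must never open it automatically."""
    if circuit.is_fire_pump:
        return False  # raise an alarm elsewhere, but keep pumping
    return measured_amps > circuit.rated_amps

lighting = Circuit("lighting", is_fire_pump=False, rated_amps=15.0)
pump = Circuit("fire pump", is_fire_pump=True, rated_amps=40.0)
# breaker_should_trip(lighting, 22.0) -> True  (trip: prevent a fire)
# breaker_should_trip(pump, 90.0)     -> False (keep fighting the fire)
```

The point is that "fail safe" is meaningless until you decide which outcome is the hazard you are protecting against.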
  • One of the main causes of the Three Mile Island nuclear accident was a pressure relief valve sticking open. At first the dangerous pressure is relieved—and then the coolant keeps escaping through the stuck-open valve.
    • Which the operators would've noticed, if the indicator light had been connected to the valve itself rather than the switch that controlled the valve.
    • Adding to the problems, the plant was being operated with several alarm lights permanently locked on due to some kind of fault that made them read false, and instead of fixing the issue, the operators had simply learned to ignore them. So when the things those alarms were supposed to be monitoring actually reached the alarm point... no one knew.
    • Ironically, the manager on duty at the time of the accident had gone to see a movie the night before... The China Syndrome, which is about safety coverups at a nuclear power plant, complete with a near-meltdown situation.
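The indicator-light flaw at Three Mile Island is now a standard cautionary tale in control-system design: display the measured state of the plant, never the commanded state. A minimal sketch of the difference (class and function names are invented for illustration):

```python
class ReliefValve:
    """Hypothetical model of a relief valve that can mechanically stick
    open regardless of what it is commanded to do."""
    def __init__(self):
        self.commanded_open = False
        self.stuck = False  # mechanical fault

    def command(self, want_open: bool):
        self.commanded_open = want_open

    @property
    def actually_open(self) -> bool:
        # A stuck valve stays open no matter what it was told.
        return self.commanded_open or self.stuck

def indicator_from_command(valve: ReliefValve) -> bool:
    """TMI-style: the light is wired to the control switch."""
    return valve.commanded_open

def indicator_from_position(valve: ReliefValve) -> bool:
    """The post-TMI lesson: the light reads a position sensor
    on the valve itself."""
    return valve.actually_open

v = ReliefValve()
v.command(True)   # relieve the pressure spike
v.stuck = True    # ...and the valve jams open
v.command(False)  # operators "close" it
# indicator_from_command(v)  -> False: "closed", which is what they saw
# indicator_from_position(v) -> True: coolant is still escaping
```

The same rule applies to software dashboards: a status page that echoes what the system was told to do is a failsafe indicator that fails exactly when you need it.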
  • The SL-1 reactor, site of the only fatal reactor accident in US history. It was built for and run by the US Army as a prototype for a small, semi-portable reactor to power remote military installations. A technician was performing a maintenance procedure on it while it was shut down; said procedure required him to manually raise the reactor's central control rod a few inches. He raised it almost two feet. The reactor instantly went prompt critical,[3] the sudden power spike superheated the water in the core and flashed it to steam, and the pressure surge ejected the control rod, which impaled the technician against the ceiling of the compartment. The failsafes that hadn't been violated or bypassed did kick in and shut down the reactor, but not before the other two men at the site were also killed; all three bodies were so radioactive they had to be buried in lead-lined coffins entombed in cement.
  • The Deepwater Horizon oil spill in the Gulf of Mexico occurred because the blowout preventer, a supposedly idiot-proof device that seals the pipe in the event of something like, say, a rig explosion, failed. It turns out that the device had been tampered with (one of the rams that would have sealed the pipe was taken out to make room for some kind of monitoring equipment, amongst other things) but it's still a great example.
    • It gets better. That was the backup device. Someone noticed a problem with the primary during some tests, hence Transocean fitted the monitoring kit and said "Oh, don't worry, the backup will take care of it." Predictably, when it was called upon, the backup failed.
      • It gets even better: reportedly, the rig's alarm system had been set to an inhibited mode so that false alarms wouldn't wake people up. No wonder 11 people died.
    • BP's safety record is among the worst in this regard; disabling failsafes and monitors to increase productivity seems to be standard operating procedure for the company. For example, the earlier Texas City refinery explosion occurred partly because someone had disabled an overflow alarm, and when another instrument failed, the result was a chain of failures that killed 15 people.
  • The Big Bayou Canot train wreck of 1993 happened because a barge struck a railroad bridge hard enough to kink the tracks, but not hard enough to actually break them, which would have set off warning signals and stopped the train.
  • Cancer is a failsafe failure of a failsafe failure of a failsafe failure of a failsafe failure. Precancerous cells are a normal occurrence in the human body, thanks to imperfections in DNA replication. Fortunately, human cells have proliferation control mechanisms; failures in those systems can cause inappropriate cell division. On top of this, cells have further control mechanisms—cell-contact signals that stop division, cell survival signals, immune detection of cancerous cells, telomeres limiting the number of times a cell can replicate. These other failsafes are overcome, through natural selection and the law of large numbers, by further compounding mutations during DNA replication in subsequent generations of the cell line.
    • Part of it is glycolipid monitoring. Normally glycolipids sit on cell membranes; cancers often make an affected cell overproduce them, so they start shedding away. Killer T cells watch for this and respond by locating the source and reducing it to goo. If things go far enough, however, glycolipids produced in more than "trail of scent" quantities confuse the killer T cells, so not only do the cancerous cells go unfound, but the immune system goes haywire (a common complication—in humans, that's when chemotherapy is administered and takes its toll, since most things that slow down cancer also kick the immune system while it's down).
  • The fantastically elaborate Stuxnet worm managed to override every safety system meant to ensure that the gas centrifuges at the Natanz nuclear facility couldn't malfunction. The whole chain, from the Windows operating system of the controlling workstation to the PLC governing the speed of the centrifuges, was essentially taken over by the worm. The worm even made the SCADA system "lie" to the computers monitoring it by playing back data recorded from a normal run, à la Speed, while the PLC drove the centrifuges out of control. This was done deliberately, of course, but it shows that human ingenuity, as well as human stupidity, can override failsafe systems.
    • Which is why a reactor is never connected to the outside world in any way, other than perhaps a simple telephone line (which is kept as a separate system as well, to be extra sure).
    • Thanks to USB drives—and the genius at Microsoft who thought "let's allow removable media to run programs the moment they're inserted, without the user knowing"—an air gap isn't the obstacle it should be.
      • While that was a dumb design flaw in Windows (since fixed), ask yourself why computers in that sensitive facility even had USB ports? The ports should have been plugged, or otherwise physically disconnected.
    • The Stuxnet worm is a bit of a special case, as it was not only designed to cause the system to fail but, well, All The Tropes is not necessarily saying it was designed with the help of the company who built the centrifuges...
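Whatever the worm's actual internals (which were far more involved), the replay trick it reportedly played on the monitoring side is easy to sketch: a monitor that trusts a single reported data stream can be fed a recording of a healthy run, while one that cross-checks an independent, out-of-band sensor cannot. All names and numbers below are invented for illustration:

```python
def naive_monitor(reported_rpm, limit=1200):
    """Trusts the (possibly replayed) telemetry stream alone."""
    return all(r <= limit for r in reported_rpm)

def crosschecked_monitor(reported_rpm, independent_rpm,
                         limit=1200, tolerance=50):
    """Also compares the reported values against a sensor the
    compromised controller cannot fake."""
    for reported, independent in zip(reported_rpm, independent_rpm):
        if independent > limit or abs(reported - independent) > tolerance:
            return False
    return True

replayed_normal = [1000, 1005, 998, 1002]   # recorded "all is well" data
actual_spinup = [1000, 1400, 1800, 2200]    # what the hardware really did
# naive_monitor(replayed_normal) -> True: the operators see nothing wrong
# crosschecked_monitor(replayed_normal, actual_spinup) -> False
```

This is why safety-critical plants favor independent, hard-wired instrumentation over readings routed through the very controller being protected against.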
  • The failsafes in the Fukushima #1 nuclear power plant worked as intended after the 11 March 2011 earthquake in Japan, safely stopping all three of its working reactors. But then the tsunami came and washed out all the emergency diesel generators that some genius placed right at the shore, and the plant's connection to the grid was severed by the quake. So the plant lost cooling at all of its six reactors, which led to the successive meltdowns of at least three of them.
    • Even that might have been dealt with because of another safety measure designed in: the reactor power system had the capability of getting electricity from truck-mounted mobile generators, which were, in fact, on scene within a few hours. Only problem was that no one had verified that the generators and the power system they were supposed to supply emergency power for had compatible attachments for the power cables.
  • The Russian submarine Kursk sank due to one such design flaw: a faulty torpedo was able to leak hydrogen peroxide, which proceeded to react with the torpedo casing, causing the first explosion which then set off the fuel and munitions in the torpedo bay. There are arguably several flaws in the design that let this happen (starting with explosions ideally not being the immediate consequence of a leaky pipe).
  • During the Cold War, the USSR employed what was called the "Dead Hand" system, in case the West were tempted to attempt to destroy Moscow and other Soviet military installations, leaving the USSR incapable of commanding the launch of its own nuclear weapons. Dead Hand was a network of radiation, seismic, and pressure sensors to detect a nuclear attack, combined with radio beacons at Soviet bases and cities (if the beacon stops transmitting, then it has presumably been destroyed). The system could, in theory, be placed in automatic mode, whereby it could, upon detecting an enemy attack, automatically launch the entire Soviet nuclear arsenal at pre-programmed targets without any human intervention.
  • As Pieter Hintjens summed it up in Confessions of a Necromancer,

The lesson here is that making systems more complex, to try to make them "reliable" will usually make them less reliable.

  • On a lighter note, the whole point of the commodities futures market (aside from risk management) is that brokers and traders deal only in contracts; by the time actual physical goods are packed, the middlemen have already sorted everything out on paper, so all that's left is to move containers from the producer more or less straight to the purchaser. Of course, there are several layers of safeguards to ensure all this goes smoothly—unless several people and automatic checks fail at once. But, as usual, having multiple safeguards lets some of them slide into bad practice or stop working entirely until the others fail, while automation greatly enhances the natural human ability to screw up. Thus, now and then, some unsuspecting office finds itself in "The Big Blunder and Egg Man" episode of The Many Loves of Dobie Gillis.
  1. VOY episode "Day of the Dead", and the 2009 preboot
  2. Once Escaton has begun the process of destroying a world - which he only does if it is infested by Kreegan and he deems it unable to remove that infestation even with his help - he cannot willingly stop, no matter what
  3. Despite what Hollywood Science says, a reactor Going Critical is NOT a bad thing. All it means is that the reaction is self-sustaining, i.e., it's turned on. Prompt critical on the other hand, means you're screwed before you even have time to say "Oh Crap."