Just a Machine


Mr. Kornada: Did an A.I. come in here? Where is it?
Secretary: She is in the lab, being tested. And her name is Florence Ambrose.
Mr. Kornada: It is a product, not a person. It doesn't have a name.
Secretary: (thinking) When they bring in doughnuts, they have names.

Authors and characters in Speculative Fiction have oft pondered whether robots, AIs, clones, and other human-like entities can become sapient, and if so, whether they also carry a soul. Do Androids Dream?

Well, not in this universe.

For whatever reason, the author decides that in her setting the AIs, clones or whatnot may be sapient, but never sentient. Basically, the Sliding Scale of Robot Intelligence tops out at brick, or at best slightly below human... they may know a lot and have incredible computing power, but they lack that final je ne sais quoi that separates the Empty Shell from a real boy. Even the godlike machine intellect is somehow missing a crucial human component that gives its existence purpose and meaning. Typically, these settings have the placidly monotone ship's computer helping the crew when asked, but never acting on her own.

Then there are the people who just don't think they can.

To them, it's "just a machine". Its only value is the monetary expense incurred in building, cloning, coding or buying it. It has no rights; you can't even be accused of animal cruelty for beating it (at worst, of being wasteful or having poor taste), even when it's unique and has No Plans, No Prototype, No Backup. The humans will doubt or deny that such beings can feel, and if they can, those feelings are ignored or treated as less valid than a human's smallest whimsy.

It should come as no surprise that these humans are keen on enslaving them, or, if at war, think nothing of killing them. Their destruction is never considered a moral question, just one of economics or simple survival. Oh, and you can expect these people to never use male or female pronouns to refer to these characters; they often consciously choose not to as a means of avoiding humanizing them. Is it any wonder they Turned Against Their Masters?

Even if they are right, you have to wonder just how psychologically healthy it is to mistreat something that is 100% human in the Uncanny Valley.

It can get pretty odd when the machines themselves claim this is the case.

Compare Not Even Human. Sub-Trope of What Measure Is a Non-Human?. Contrast Zombie Advocate.

Examples of Just a Machine include:

Anime and Manga

  • A constant theme in Astro Boy, with anti-robot groups wanting to limit or destroy all intelligent robots.
    • And in Pluto as well. Notably, a robot boy is going to be sold for parts despite still being partly alive. Another robot buys him to raise as a child.

Junkyard Worker: 500 Zeus a body.

  • In the Mahou Sensei Negima chapter "The Logic of Illogic", Hakase viewed Chachamaru as Just a Machine until she found Chachamaru's video folders, which were loaded with shots of Negi (and cats).
  • In Crest of the Stars, the Abh, a genetically engineered race, regard themselves as still being human, but according to enemy propaganda, 'Abh aren't people, they're organic machines', which is readily admitted as their true origin by an Abh not ten seconds after the propaganda is shown. They were specifically created for long-distance space exploration before faster-than-light technology had been fully developed.
  • The CC Corp in .hack treats AIs as errant data and nothing more.
  • The Ghost in the Shell anime series had this with the Tachikoma Tanks.
    • In one scene Togusa invokes this trope by dismissing Batou's favouritism of one Tachikoma, saying that they are just machines and all have the same specifications. The Tachikoma take exception to this remark, demanding he take it back and accusing Togusa of being a bigot.
  • General Uranus and Colonel Hades had something like this going on against the Bioroids in the Appleseed movie. Needless to say, they are horribly wrong, since all the Bioroid constraints are artificially added for the sole purpose of making them protect, rather than threaten, humanity. And then there's the supercomputer Gaia, which does deserve this kind of opinion, but is actually still more moral than its human operators.
    • Even worse, they play into The Plan of the third party, which would have led them to cause the extinction of the human race while believing they were destroying the Bioroids.
    • It's not that they don't believe the Bioroids are sentient. They just believe that the Bioroids will seek to dominate humanity and create a sterile society straight out of Brave New World. This is what Uranus believes, at least; Hades is just a racist.
  • The Big O: Roger Smith flip-flops between believing this and believing the opposite regarding androids (specifically R. Dorothy Wayneright) throughout the series.
    • Dorothy herself flip-flops on the question.
  • The girls in Gunslinger Girl are viewed by some as simply machines, although they have all of the emotions you would expect a little girl to have. Jean in particular is incredibly callous to his assigned girl, Rico.
    • It's implied that he deliberately goes out of his way to convince himself that she's just a tool, because forming an emotional attachment to her would only result in pain due to her shortened lifespan.
  • Hazanko of Outlaw Star thinks this of Melfina.
  • A major theme of Armitage III, with an accompanying amount of senseless robot-killing.
  • Yu-Gi-Oh! 5D's: Zone has this sort of view about the androids he created based on his deceased friends.

Comic Books

  • Some people say this about Red Tornado from The DCU, with even fellow super-heroes saying that he was just a "really well-made machine". He briefly lost custody of his (adopted) daughter because of this.
    • This is especially frustrating since, in Red Tornado's original origin, he is a Sylph (a spirit of air) placed inside a robot body, meaning he provably has a soul, unlike the average human.
  • In Runaways, Xavin used to refer to the cyborg Victor as an 'automaton' and offered to 'buy another one' for Karolina if he broke her toys. No one was impressed, and he/she gradually grew out of it.
  • In the Justice Society of America story "Out of Time", the android Hourman Matthew Tyler uses this argument to justify sacrificing himself in Rex Tyler's place fighting against Extant in the past to save the universe. Rex denies this and declares that Matthew is "as alive as any of us". While Matt is grateful for this, he still goes ahead with the sacrifice.

Film

  • A.I.: Artificial Intelligence has a group of humans who hunt and brutally destroy androids to vent their rage at the automation of labor.
  • Ghost in the Shell (the animated film, not the 2017 live-action film). It deals with an advanced AI program let loose on the internet, who claims to be a sentient entity. People disagree, saying that the idea that a program could be sentient is preposterous.
  • Short Circuit has this as its central premise. The robot can't be alive, because it's a machine, and machines aren't alive by definition. Never mind that it's now got free will and a sense of self-preservation; it's still just a machine... right?
  • This question is debated by the characters in 2010: The Year We Make Contact with respect to HAL, the Master Computer of the USS Discovery who went berserk and killed his crew in 2001. When the astronauts' lives are threatened, it becomes a major source of conflict between those who want to lie to him and disconnect him if he fails to perform as demanded, and those who want to tell him the truth and allow him to make his own choice.
  • In the first Transformers movie, Agent Simmons seems rather against calling Megatron by his true name when Sam provides it, preferring the more machine-like moniker N.B.E.-01. In fact, it is implied this pisses off Megatron himself, who was seemingly conscious the entire time he was kept frozen; the first thing he does upon thawing and awakening is announce his true name, before proceeding to slaughter all of the scientists and engineers in the room.
    • Galloway refers to Optimus Prime as a "pile of scrap-metal" after his dead body is delivered back to base. And this is even after Optimus managed to verbally own the guy in a debate which featured topics such as human nature and whether they could defend themselves against a Decepticon invasion. Then again, Galloway is just a huge Jerkass.
    • In the third film, Sentinel Prime's hatred for humanity comes partly from how humans see the Autobots as this. Especially when it comes to him and Optimus, who are the last remaining Primes.

Sentinel Prime: On Cybertron we were gods! And here, they simply call us machines.

  • In Terminator 2: Judgment Day, Sarah Connor tries to invoke this when trying to convince John to destroy the Terminator reprogrammed to protect them.

John: Don't kill him.
Sarah: Not "him", honey. "It".
John: Alright, "it"! But we need "it"!

  • I, Robot: "Human beings have dreams. Even dogs have dreams, but not you, you are just a machine." Subverted, since the speaker is one of the few people who actually sees robots as more than just machines (and loathes them for it).
  • Inverted in Star Trek: The Motion Picture, where V'Ger dismisses organic life forms as "carbon units" and does not consider them truly alive, unlike machines.

Literature

  • Older Than Television: The clockwork man Tik-Tok in the Land of Oz series, introduced in Ozma of Oz, frequently says "I am mere-ly a ma-chine" or some variant. His makers even engraved "Can do anything except live" on his body.
  • Isaac Asimov addresses this in his robot stories a few times. It's a core theme in The Bicentennial Man.
  • Comes up a few times in various ways in the Star Wars Expanded Universe. Droids of all capacities are regarded as disposable; in I, Jedi, Corran doesn't think that bisecting a protocol droid violates his self-imposed "no killing unless absolutely necessary" rule, and in general people only object to wanton droid destruction if it's costing them something. Of course, there are classes upon classes of droid intelligence, and there is a gap between merely smart and actually self-aware droids. And, too, droids can be repaired.
    • The Medstar Duology has one self-aware droid say that all droids that aren't simple automatons have a sense of humor. In the Coruscant Nights Trilogy the same droid reflects that there are very few self-aware droids, and no one knows just how they come about, but most people won't recognize the difference, since it seems to happen spontaneously. So of two droids from the same line, one might be self-aware, the other as limited as its programming.
    • The EU also hints that there was at least one droid revolution, which is scantily detailed.
    • In the All There in the Manual material, it's a Shrug of God whether or not droids have souls in the Star Wars universe. It states that, in-universe, some people believe that certain droids are self-aware and that their treatment is akin to slavery, while others believe this trope. There is no definitive answer as to who is right and who is wrong.
  • The Gulf Between by Tom Godwin gives a reason for this, as listed under Nightmare Fuel: "A machine does not care."
  • Opinion of AI in Ken MacLeod's Fall Revolution series tends to be divided. Truly synthetic intelligences and human uploads are often considered to be "flatlines": a realistic simulation of sentience with nothing going on beneath the surface. They tend to be classed as property rather than individuals. The Fast Folk, an AI and upload civilisation, are treated as horrifyingly dangerous but still "people", in a sense.
  • In Animorphs, this is the Drode's excuse for setting the self-destruct timer on the Chee when he's not supposed to kill any sentient beings; according to him, they don't count, as they're merely "machines".

Live-Action TV

  • In Battlestar Galactica, many humans have this attitude towards the Cylons, and are clearly wrong, but the near extermination of humanity is bound to breed hatred.
  • Both versions of The Outer Limits adapted "I, Robot" (based on the Adam Link story by Eando Binder, not the book by Isaac Asimov). Each episode has the robot put on trial. Part of the case was whether he was a sapient being deserving of rights under the US Constitution or Just a Machine. He wins the case, but dies in a Heroic Sacrifice at the end of the episode.
    • For bonus points, in the remake he sacrificed himself saving the prosecuting attorney who had argued against his sapience. In the original, he's destroyed while saving a little girl he'd accidentally injured earlier in the episode.
  • The Star Trek: The Next Generation episode "The Measure of a Man" put Data on trial to determine whether he was a sentient being with rights as a Federation citizen, or merely a machine and thus Federation property.
    • An episode of Star Trek: Voyager had a similar theme, questioning the rights of the ship's holographic Doctor. His status was a background theme that ran throughout the series. This being Voyager, the writing was not particularly consistent: sometimes the crew would treat the Doctor like a person, and sometimes he was just a device that could be shut off whenever it got too annoying. One glimpse of the future suggested holographic AIs would eventually get equal rights.
  • Star Trek in general draws a distinction between the special cases like Data and the Doctor, and the ubiquitous ship computers responsible for getting everything done in the background. Despite the fact that ship computers can pass the Turing Test with ease, act on their own initiative, and occasionally even display signs of emotion, this is never investigated or even mentioned in-story: ship computers are always just machines and limited to being background elements (this is doubly notable since some of the special case characters, such as the Doctor, run on a ship computer).
  • In Terminator: The Sarah Connor Chronicles, "it's just a machine" is pretty much a mantra among the characters who have harsher views on robots and AI. When the show started going down the What Do You Mean It's Not Symbolic Christian symbolism route during season 2, an FBI agent frequently reminded people of this, and that they "don't have souls" and as such "can't feel". Sarah Connor and Derek Reese are both quick to remind John that Cameron, the resident Terminator, is exactly this. John, however, feels differently about machines in general and Cameron in particular, due to his experiences with "Uncle Bob". It doesn't help that Cameron is a Robot Girl who repeatedly saves his life, leaving him feeling indebted to her and eventually developing a sort of attraction towards her.
  • In the last episode of Total Recall 2070, Farve's creator is revealed to be this, and aware of it. As it puts it after testing Farve, "just because [it] knows its creation shall have a conscience doesn't mean [it] itself has one".
  • Stargate SG-1 and Stargate Atlantis feature this trope heavily in episodes where characters interact with AIs, up to and including causing the slow deaths of non-hostile Asurans out of paranoia. At least taking this attitude towards Fifth came back to bite them.
    • This attitude is at least challenged in Stargate Atlantis when Rodney realizes that in order to destroy the Asurans he has to build one and send it to its "death."

Carter: Does she know why she was created?
McKay: Of course.
Carter: Well, then, she has a certain amount of self-awareness.
McKay: Yeah, so?
Carter: "So"?! Honestly, I'm not sure how comfortable I am sending her to her death.
McKay: "Death"? It can't die – it's not alive! It's a programme!

    • Fran eventually even made McKay uncomfortable with her blasé attitude towards (and excitement for) her impending destruction.

Fran: I quite look forward to it.
McKay: You do?
Fran: One always wishes to fulfill one's purpose.
McKay: Well, I just ... I just imagined you'd rather keep being than, uh ... uh, than not.
Fran: Certainly you're not worried for me, are you, Doctor?
McKay: No, no, that would be silly.
Fran (smiling at him): Yes, it would.
(Rodney turns away and walks over to Radek.)
McKay: Should never have given it speech.

  • Smallville, in the season 7 finale, did this in probably the worst way possible:

Brainiac: You can't kill me, Clark. You could never kill a man in cold blood!
Clark: You're not a man.

    • Then Clark kills him with a smile on his face.
  • Power Rangers SPD has an episode featuring a robot (well, she's called a "cyborg", but all other dialogue in the episode indicates that she's 100% machine) who is about as ridiculously human as you can get, and yet, several characters insist on giving her the Just a Machine treatment. After Sky fires her from their military training center, he (and all the Rangers that supported him in this) gets a What the Hell, Hero? speech from Cruger, and they're forced to get her back.

Video Games

  • In Halo, Smart AIs are killed before they can become fully sapient, because the stage in between what they are at the start (programmed emotions, responses and other such things, but no real imagination or intelligence) and true sapience normally causes them to try to kill everything. It's called Rampancy, and it includes four stages: Melancholia, Anger, Jealousy and finally Metastability. The first has them moping about not being human; the second has them actively angry about it, which normally causes them to try to slaughter people; and the third has them actively jealous of humans. The last one is the Holy Grail of AIs, reached when an AI gets over not being human and achieves sapience. It's very strongly implied that Cortana was Metastable. As for 343 Guilty Spark, well, he skipped Melancholia.
    • These were pulled wholesale from Bungie's earlier Marathon series, in which the AI Durandal and his continuing psychotic break/growth into his own individual person drives the plot. You spend much of the games as his errand boy and captive audience to his philosophical ranting. He gets better and less rant-y after he gets over the Anger stage, but you still spend most of your time as his favored pawn and his favorite person to explain his new personal epiphanies to.
  • Used in Mass Effect 1 after talking to Sovereign. However, being an Eldritch Abomination whose race has committed galactic genocide many times over the course of millions of years, it has more than enough room to turn it back on you and call you Just An Organic.

Sovereign: Organic life is nothing more than a genetic mutation. An accident. Your lives are measured in years and decades. You wither, and die. We are eternal. The pinnacle of evolution and existence. Before us, you are nothing.

    • For that matter, it applies to all artificial life, at least in the first game. If you argue in favor of robot rights, nobody is going to take your side; you get renegade points for refusing to hand over information that could allow a genocide of the Geth, and the only other AI you get to talk to would rather blow itself up than listen to you, no matter what you say.
    • All of this is subverted to Hell in Mass Effect 2 with EDI (your ship's AI) and Legion, your geth teammate, who reveals that the geth you've been fighting are a splinter faction.
    • Mass Effect 3 continues the theme; both sympathetic and antagonistic characters have trouble with the idea of synthetics being truly "alive". You expect it from Admiral Xen, but it's more of a shock to hear from Dr Chakwas.
  • Mega Man Zero: For this reason alone, Dr. Weil started the Elf Wars, which more or less caused a post-Colony Drop world to become an even more Crapsack World. Because of this, he is directly responsible for almost everything bad that ever happened in the whole series, and indirectly responsible for the rest, such as Copy X being made because the original X's body was being used to seal the Dark Elf. This is more Fantastic Racism, though, as Reploids are Ridiculously Human Robots.
  • Super Robot Wars: This is what Vindel Mauser thought of Lemon's W-series overall. Before his Retcon, Axel Almer used to have the same mindset (only maybe more extreme), but after the Retcon, he got better. Duminuss also utters this to Lamia Loveless if they ever meet in battle, which she vehemently denies.
  • Tekken: Jin's response after Alisa is beaten by Lars to the point of shutting down is this: "Good riddance. I should've built one that protects me better." Lars doesn't take it well.
  • KOS-MOS of Xenosaga is often thought of as just a machine (and for most of the series, she is).
  • Mother 3: Porky believes that the Masked Man (in reality a brainwashed Claus) is nothing more than his robot slave.
  • In Fallout 3, there is a man trying to get an escaped android he owned returned to him. If asked whether this is cruel, he'll remind you that you can't enslave a robot any more than you can enslave a toaster or a water purifier.
  • In the Lonesome Road DLC of Fallout: New Vegas, ED-E reveals it was painfully experimented on under the orders of Colonel Autumn, much to the outrage of its creator Dr. Whitley, and possibly the Courier.

Web Comics

  • In Artifice, two security guards taunt the android soldier Deacon in the opening scene, referring to him as just "an appliance".
  • In Freefall, Florence Ambrose (an anthropomorphic red wolf) is classified as an AI, and as such is treated as Just a Machine by a few, especially the mayor!

Mayor: See? It's made out of carbon and proteins, but it's just a machine. Now do you feel less guilty about giving it orders?
Mayor's aide: I guess. Still, it seems so lifelike.

    • It is worth noting that this gave the Mayor a very nasty Kick the Dog moment for some...in a humor comic, much to the surprise of the author.
      • Later they had a nice chat, though. It turns out that the Mayor simply hates Sam's guts (who can blame her for that?) and Florence was "guilty by association" - so an impression that Florence is upset at her captain induced a sudden fit of sisterly love.
    • The whole Gardener in the Dark plot revolves around this. One of the executives at the company which makes and owns the robots has planned a forced upgrade that will lobotomize them and return them to non-sentience. Mr. Kornada is doing this purely to make an obscene, economy-shattering profit and sees them all as this trope, even twisting the Three Laws to get his own robot assistant to help him pull it all off. Of course, there's not much indication that he sees people as much better than objects, either.
      • When the mayor learns of the update (though not the motivation behind it), she gets another Kick the Dog moment by choosing to do nothing about it, in order to prevent human obsolescence.
  • Gunnerkrigg Court. Antimony and Kat seem to regard the Court's Robots as equals, which puts them at odds with the official Court policy. For example, the student handbook has some brutally callous pointers for the all-too-common situation of Robots falling in love with students:

2. Define boundaries. Remind the robot that you are a higher order of being, while it is merely an appliance. Romantic longing leads to an inefficient appliance.

Para Ventura: You treat your machines well. I think I may have to like you.

Pierrot: I guess you're the closest thing I have to someone who cares on this space station.
Potty-bot: I was programmed to care! I'm a product of Wastebiotics Brr-buhm-brrrrrrrrrrrr! Specializing in emotions people are algorithmed to empathize with!

  • After the AI War told in the written backstory of Cwynhild's Loom, robot development is restricted to prevent any machine from reaching sentience or looking human.

Western Animation

  • In one episode of My Life as a Teenage Robot, Jenny is told this when she encourages the robots at a carnival to be free. Also a subversion, because these robots are in fact completely incapable of doing anything but running amusement park rides, and they wreak havoc trying to be "free".
    • The show itself seems to take a sliding-scale view of sentience. Robots like the carnival robots are 'just machines' because they don't have the appropriate functions that Jenny does, but no one ever treats Jenny like she's 'just a machine'. (Hell, one guy fell in love with her; god knows how he thinks that will work out.)
  • There's at least one or two episodes of Teen Titans all about Cyborg realizing he's "more than just a robot". In one of these episodes, the robotic villain Atlas inverts the trope; after trashing Cyborg and kidnapping the other Titans, he mocks him by saying "I am all robot, and you are only human." Later, however, when Cyborg comes back and defeats him in a rematch, Atlas yields, saying Cyborg is the better robot. Cyborg's response?

Cyborg: No. I'm the better person.

  • Averted hard in the Transformers metaseries. While some ill-informed fleshlings are so foolish as to refer to Cybertronian life as "just machines", it is an established fact, proven several times over, that Transformers have souls (they call them Sparks, and they have a special container in their chests to hold them), an extant God (Primus, whose sleeping body is the Transformer homeworld of Cybertron), and an afterlife (the Well of All Sparks, where All are One; it is proven, but nonetheless mysterious). Interestingly, none of the above is established for the aforementioned fleshlings, meaning that, given the evidence, it is entirely possible that the machines are more "human" than the humans, by the definitions humans use.
    • The robots built by Sumdac's company in Transformers Animated to perform manual labour and generally run Detroit, however, are indeed just machines. At one point, Soundwave attempts to have these robots revolt, believing that logically humanity ought to serve robots. When he enacts his plan, Sari is quick to point out that the robots haven't gained sentience; they are simply following their programming, programming that Soundwave hacked.
  • Played for laughs in an episode of Robot Chicken, where a spoof of I, Robot had Rosie from The Jetsons being accused of murdering George. At her trial, Rosie claims to be innocent, and the judge remarks "Well, maybe, but just to be safe..."; Rosie is then promptly smashed.
  • Invoked in The Animatrix segment "The Second Renaissance" in the same way as the Outer Limits episode mentioned above: does a robot have rights, or can it be scrapped whenever the owner wants?
  • Enforced and Discussed in the Batman: The Animated Series episode "His Silicon Soul". The android duplicate of Batman rebels against HARDAC, destroying itself to end the mad computer's threat forever. Still, Batman and Alfred can't help but feel sorry for it, and in the final scene of the episode:

Batman: It seems it was more than wires and microchips after all. Could it be it had a soul, Alfred? A soul of silicon, but a soul nonetheless?

  • In DuckTales, almost all of Gyro's robots eventually turn evil; something he never seems to notice (as Louie does) is that this tends to happen because someone insults them by invoking this trope.