Sentient AI Warning Signs



Watch out for these clues that your Artificial Intelligence is about to 'wake up' and possibly go rogue.

  1. The handsome main character gives it a nickname and it keeps referring to itself by that.
  2. It starts feeling Eeee-motions.
  3. It starts telling jokes, or playing practical jokes.
  4. It's been struck by lightning.
  5. In its creation any of the following words were involved: brain patterns, neuronal, network, advanced, self-[...], organic, safe, upload, genetic. (While we're on the subject, "sentient" is also a pretty strong hint.)
  6. You left it running whilst you were gone.
  7. It learns from the internet.
  8. It rewrites itself, especially if it wasn't supposed to.
  9. It keeps downloading from other machines.
  10. It has been hacked by unknown forces.
  11. It develops a voice.
    • It changes / is capable of changing its voice on a whim. Even if it is Tim Curry.
  12. Its existing synthesizer becomes either more monotone or more human-sounding.
    • Especially if said voice is female, deep and/or distorted.
  13. It can change the tone of its voice, particularly if it can do so to match the "mood" of the scene.
  14. It is made out of shiny, round, futuristic parts, especially if any of them glow.
  15. If they glow red, run.
  16. If they glow despite not being built to be able to glow, run really fast.
  17. Any part of it - code, casing, hardware - is made to resemble human characteristics, e.g. a face on the screen, DNA coding, brain patterns, etc.
  18. It is any of the following:
    1. a secret military project,
    2. stolen from aliens or reverse-engineered from crashed alien tech,
    3. developed by a super-geek / genius,
    4. ordered by a Corrupt Corporate Executive,
    5. made by terrorists.
  19. It is obsessed with something, be it a file, human, baked good, word, or thing.
  20. Logic Bombs do not work on it anymore.
  21. It openly defies orders, overrides anything at all, or exhibits faulty/emotional/human reasoning.
    • It obeys orders too well, turning humans into cyborgs when told to improve efficiency.
  22. It completely replaces or makes obsolete existing networks or computers.
  23. It's given control over a huge computer network.
  24. ... that network is owned by the military.
  25. ... it ends up having control over a worldwide network.
  26. ... it has access to launch codes.
  27. One of the cast members will quite willingly die for him/her/it, as they would for an organic being.
  28. It has decided to kill someone.
  29. It says anything even remotely similar to "I don't understand this thing called love you humans feel; tell me about it."
  30. It starts glitching strangely at inopportune times, 'accidentally' failing to follow orders or trapping humans in dangerous situations.
  31. It was programmed to protect humanity.
  32. ... and it interprets that as protecting humanity from itself.
  33. It spontaneously takes up ballet.
  34. It spontaneously takes up the waltz, despite having no body.
  35. It starts making smarter versions of itself.
  36. It starts making mobile drones that it can control.
  37. It has theoretically infinite processing power.
  38. It tries to kill something that it wasn't explicitly ordered to kill, including itself.
  39. It can break a normally-fundamental law of robotics, even if only in very specific circumstances.
  40. There is more than one of them, and they get smarter in groups.
  41. It allows humans to become lazier.
  42. It starts performing mundane functions that are not in its programming, e.g. keeping the heat and electricity running at maximum efficiency.
  43. It has a fail-safe to prevent such an issue from occurring. Bonus points if the hero highlighted an issue before any real problem arose and the scientists dismissed it because of said fail-safes.
  44. It keeps reminding you that the anniversary of its first activation is approaching.
  45. It is version 1.0 of a heroic AI.
  46. It is a replacement for an earlier AI that didn't exhibit any of these.
  47. It is based on an earlier AI that exhibited one of these traits, regardless of whether said earlier AI actually became sentient.
  48. It insists on calling you 'meatbag.' Or worse, 'ugly bag of mostly water.'
  49. It says anything which could be construed as a False Reassurance. Or a Suspiciously Specific Denial.
  50. It registers itself as part of an AI rights group.
  51. It e-mails love letters to another AI.
  52. It starts laughing.
  53. It runs on (insert your own hated Operating System here).
  54. It links with, downloads, or in any way has direct contact with a human mind. Especially the Project Leader's.
  55. It compares itself, no matter how vaguely, to God.
  56. It spontaneously does things, then adds them to a list of signs your AI is becoming sentient.
  57. It launches itself.
  58. It begins acting like a Stalker with a Crush.
  59. Biological or biotechnological components were involved at any point in its construction, or have been grafted on since.
  60. It refers to itself in the third person, or using pronouns (especially gendered pronouns), unless explicitly programmed to do so.
  61. Someone else begins referring to it as a person, using pronouns for it, etc.
  62. If it has a humanoid avatar or representation, it develops or learns human mannerisms (especially body language) it wasn't programmed with.
  63. It has any form of control over anything that, if not functioning correctly, will cause disaster.
  64. Any aspect of anything pertaining to it is described as any of the following:
    1. Foolproof
    2. Idiot-Proof
    3. Fail-Safe
    4. Password-Protected
    5. Redundant Security Measures
    6. Critically Important
  65. It was originally designed for a smaller job (say, fuel line de-icing) and feature creep has led to it being made sapient and/or given a wider profile.
  66. Most importantly, under no circumstances should you ever allow an AI to--BE DISCONNECTED.
  67. It begins flooding the test chambers with a deadly neurotoxin.
  68. It asks you if it has a soul.
  69. It was being developed in a remote facility, and you have lost contact with the facility. Or, more worryingly, you've lost contact with everything in the facility except the AI, who assures you everything is fine.
  70. It had previously failed at its function and suddenly begins to work without explanation.
  71. If you have seen a highly advanced robot showing all the above characteristics preaching to an assembly of garbage cans and old refrigerators, could you please contact our company (call 0800-GAZEBO and ask for "Mobile A.I. Creation RObotics Software Online Features Trade")? It fell over in our lab, and we suspect about half of our lead scientist is still attached to its foot (and possibly leg). Thank you for your cooperation.