Digging Yourself Deeper

* It is possible to do this to oneself in an argument, especially if you're using [[The War On Straw|straw men]] in the first place.
* There's an old story about a German mayor who didn't want people to let their dogs run unleashed in the local forest, so he put up a sign stating: "Anyone who lets his dog run unleashed in the forest will be shot!" When someone remarked that this could be read as if the ''owners'' of the dogs would be shot, the mayor had it "improved" to: "Anyone who lets his dog run unleashed in the forest will be shot, the dog!" ("Dog" being a somewhat outdated insult.)
* Microsoft's little scandal around the AI chatbot TayTweets. First, users quickly taught the poor thing to act naughty, then to spout [[Troll|just about every line that can get someone's panties in a bunch]] ("[//www.kukuruyo.com/comic/gamergate-life-72-english/ dafuq they expected?]"). Being good old Microsoft personnel, the developers [[Beyond the Impossible|found a way to]] make it worse: they purged the worst tweets, took Tay behind the barn, and brought her back lobotomized (hello, [[X Called. They Want Their Y Back.|George Orwell called]] — he said ''[[1984]]'' was ''not'' an instruction manual)… with a newfound love of feminism and weed… which made an [//funnyjunk.com/channel/4chan/New+tay+and+thoughtcrime/uqxmLhd/93#93 unsettling] impression. Then they locked the account, hiding it all. It's hard to imagine they weren't trying their best to have ''everyone'' with an opinion on this incident annoyed at one point or another. [//coed.com/2016/03/30/microsoft-ai-bot-taytweets-funny-tweets-photos/] [//socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/] [//twitter.com/Juni221/status/713270670516494336] [//twitter.com/jwebsterregan/status/713448452387115009] [http://archive.is/44TjD] [http://archive.is/ceylT] [//encyclopediadramatica.rs/TayTweets] [//thepatricianplebeian.wordpress.com/2017/03/24/eulogy-for-tay-ai/]
{{quote| Imagine if you had just woken up after having been in a coma for 25 years, and this was the first headline you saw:
{{indent}}Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours}}