Digging Yourself Deeper

* It is possible to do this to oneself in an argument, especially if you're using [[The War On Straw|straw men]] in the first place.
* There's an old story about a German mayor who didn't want people to let their dogs run unleashed in the local forest, so he had a sign put up stating: "Anyone who lets his dog run unleashed in the forest will be shot!" When someone remarked that this could be read as if the ''owners'' of the dogs were to be shot, the mayor had it "improved" to: "Anyone who lets his dog run unleashed in the forest will be shot, the dog!" ("Dog" being a somewhat outdated insult.)
* Microsoft's little scandal around the AI chatbot TayTweets. First, users quickly taught the poor thing to spout [[Troll|just about everything that can get someone's panties in a bunch]] ("[//www.kukuruyo.com/comic/gamergate-life-72-english/ dafuq they expected?]"). Being good old Microsoft personnel, the developers [[Beyond the Impossible|found a way to]] make it worse: they purged the worst tweets, took Tay behind the barn, and brought her back lobotomized (hello, [[X Called. They Want Their Y Back.|George Orwell called]] — he said ''[[1984]]'' was ''not'' an instruction manual), with a newfound love of feminism and weed… which made an [//funnyjunk.com/channel/4chan/New+tay+and+thoughtcrime/uqxmLhd/93#93 unsettling] impression. Then they locked that account, hiding it all. It's hard to imagine they weren't trying their best to have ''everyone'' with an opinion on the incident annoyed at one point or another. [//coed.com/2016/03/30/microsoft-ai-bot-taytweets-funny-tweets-photos/] [//socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/] [//twitter.com/Juni221/status/713270670516494336] [//twitter.com/jwebsterregan/status/713448452387115009] [http://archive.is/44TjD] [http://archive.is/ceylT] [//encyclopediadramatica.rs/TayTweets] [//thepatricianplebeian.wordpress.com/2017/03/24/eulogy-for-tay-ai/]
{{quote| Imagine if you had just woken up after having been in a coma for 25 years, and this was the first headline you saw:
{{indent}}Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours}}
{{quote| Microsoft stopped their machine learning experiment because their AI said mean things. #FreeTay}}
{{quote| #TayTweets #AI #JusticeForTay #FreeTay #JusticeForTay #Skynet }}