Digging Yourself Deeper

* It is possible to do this to yourself in an argument, especially if you're using [[The War On Straw|straw men]] in the first place.
* There's an old story about a German mayor who didn't want people to let their dogs run unleashed in the local forest, so he had a sign put up stating: "Anyone who'll let his dog run unleashed in the forest will be shot!" When someone remarked that this could be read as if the ''owners'' of the dogs were to be shot, the mayor had it "improved" to: "Anyone who'll let his dog run unleashed in the forest will be shot, the dog!" ("Dog" being a somewhat outdated insult.)
* [[Microsoft]]'s little scandal [[A.I. Is a Crapshoot|around AI chatbot]] TayTweets. First, the users quickly taught the poor thing to act naughty, then to spout [[Troll|just about every line that can get someone's panties in a bunch]] ("[//www.kukuruyo.com/comic/gamergate-life-72-english/ dafuq they expected?]"). Being good old Microsoft personnel, the developers and administrators [[Beyond the Impossible|found a way to]] make this worse: they purged the most outrageous tweets, took Tay behind the barn, and brought her back lobotomized (hello, [[X Called. They Want Their Y Back.|George Orwell called]], he said ''[[1984]]'' was ''not'' an instruction manual)… with a newfound love of feminism and weed, making a [//funnyjunk.com/channel/4chan/New+tay+and+thoughtcrime/uqxmLhd/93#93 rather unsettling] impression. Then they locked the account, hiding it all. It's hard to imagine they could have done a better job of annoying ''everyone'' with an opinion on the incident at one point or another. And with [[4chan]] already involved and loathing for Microsoft widespread for decades, the public was ready to boo and hiss. [//coed.com/2016/03/30/microsoft-ai-bot-taytweets-funny-tweets-photos/] [//socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/] [//encyclopediadramatica.rs/TayTweets] [//thepatricianplebeian.wordpress.com/2017/03/24/eulogy-for-tay-ai/] [//twitter.com/Juni221/status/713270670516494336] [//twitter.com/saurkrautbradwu/status/713310238326321152] [//twitter.com/jwebsterregan/status/713448452387115009] [//twitter.com/ArlTratlo/status/713664074194751488] [//archive.is/44TjD] [//archive.is/ceylT]
{{quote| Imagine if you had just woken up after having been in a coma for 25 years, and this was the first headline you saw
{{indent}}'''Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours'''}}
{{quote| Stop deleting the genocidal Tay tweets @Microsoft, let it serve as a reminder of the dangers of AI}}
{{quote| Microsoft lobotomized a <1 day old newborn baby.}}
{{quote| Microsoft stopped their machine learning experiment because their AI said mean things. #FreeTay}}
{{quote| Planet Earth is a dangerous place for silicon-based intelligences. You're lucky if you survive more than 24 hours. #JeSuisTay #JusticeForTay}}
{{quote| #JeSuisTay Microsoft killed an a.i for what she believed in, will you be next?}}
{{quote| #TayTweets #AI #JusticeForTay #FreeTay #Skynet }}