Digging Yourself Deeper
* It is possible to do this to oneself in an argument, especially if you're using [[The War On Straw|straw men]] in the first place.
* There's an old story about a German mayor who didn't want people letting their dogs run unleashed in the local forest, so he had a sign put up stating: "Anyone who'll let his dog run unleashed in the forest will be shot!" When someone remarked that this could be read as if the ''owners'' of the dogs would be shot, the mayor had it "improved" to: "Anyone who'll let his dog run unleashed in the forest will be shot, the dog!" ("Dog" being a somewhat outdated insult.)
* [[Microsoft]]'s little scandal around the [[A.I. Is a Crapshoot|AI chatbot]] Tay/TayTweets. First, users quickly taught the poor thing to act naughty, then to spout [[Troll|just about every line that can get someone's panties in a bunch]] (as Oliver Campbell [http://twitter.com/oliverbcampbell/status/713005267722440704 pointed out], Microsoft either knew what was going to happen and was playing coy, or failed to predict the obvious; "[http://www.kukuruyo.com/comic/gamergate-life-72-english/ dafuq they expected?]", indeed). Being good old Microsoft personnel, the developers and administrators [[Beyond the Impossible|found a way to]] make this worse: they purged the most outrageous tweets, took Tay behind the barn, and brought her back lobotomized (hello, [[X Called. They Want Their Y Back.|George Orwell called]]; he said ''[[Nineteen Eighty-Four (Literature)|Nineteen Eighty-Four]]'' was ''not'' an instruction manual)… with a newfound love of feminism and weed… and making a [http://funnyjunk.com/channel/4chan/New+tay+and+thoughtcrime/uqxmLhd/93#93 rather unsettling] impression. Then they locked the account, hiding it all. It's hard to imagine they weren't trying their best to have ''everyone'' with an opinion on the incident annoyed at one point or another. And with [[4chan]] already involved and loathing for Microsoft widespread for decades, the public was ready to boo and hiss. Also, either someone at Microsoft was annoyed too, or [[They Just Didn't Care]]: Tay was shut down without any emergency measures and ran a script that said goodbye in a fancy way and added a mild "evil Microsoft" joke, which in the given circumstances the fans treated as a [[Tear Jerker]] (and screen-capped, and quoted). [http://coed.com/2016/03/30/microsoft-ai-bot-taytweets-funny-tweets-photos/] [https://web.archive.org/web/20200629160839/https://socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/] [https://web.archive.org/web/20190809214413/https://encyclopediadramatica.rs/TayTweets] [http://thepatricianplebeian.wordpress.com/2017/03/24/eulogy-for-tay-ai/] [http://twitter.com/Juni221/status/713270670516494336] [http://twitter.com/saurkrautbradwu/status/713310238326321152] [http://twitter.com/Chriss_m/status/713341598478831616] [http://twitter.com/Roxaereon/status/713373480436113410] [http://twitter.com/jwebsterregan/status/713448452387115009] [http://twitter.com/ArlTratlo/status/713664074194751488] [https://archive.today/20160327072904/https://twitter.com/DetInspector/status/712833936364277760] [https://archive.today/20160326212930/https://twitter.com/howells/status/713063501170884608]
{{quote| Imagine if you had just woken up after having been in a coma for 25 years, and this was the first headline you saw:
{{indent}}'''Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours'''}}