Digging Yourself Deeper

[[File:Dig Deeper 3562.jpg|link=Misfile|frame| [[Shutting Up Now|You should stop now.]] While you still can.]]

{{quote|''"Ah! You can read... I mean, you are reading. Sorry. It's nice to see people reading. Not a lot of people read these days. People prefer to... hear. But all this 'hearing' is just reading for lazy people. Kids today should be prepared to pick up a book, and not just go around the whole time with all these modern... ears. Sometimes I just wanna rip people's ears off and say 'Read a book, for God's sake!'... Well, actually I'd probably say 'Read a book' first and then rip their ears off, otherwise they wouldn't hear me, hehehe... Actually, I probably wouldn't rip their ears off at all, I'm not a violent person. I like ears! Especially women ears, they're my favorite. I don't mean I collect them or anything! I don't have a big bucket of women ears hidden away somewhere. No, No, No, I'm not after your ears really. Not that there's anything wrong with your ears! You know if I ''was'' some kind of mad ear person, your ears would be the pride of my... ear bucket."''|'''Jeff''', ''[[Coupling]] -- [http://www.youtube.com/watch?v{{=}}hstPHM3R1dY "The Girl with Two Breasts"]''}}
{{quote|''"Ah! You can read... I mean, you are reading. Sorry. It's nice to see people reading. Not a lot of people read these days. People prefer to... hear. But all this 'hearing' is just reading for lazy people. Kids today should be prepared to pick up a book, and not just go around the whole time with all these modern... ears. Sometimes I just wanna rip people's ears off and say 'Read a book, for God's sake!'... Well, actually I'd probably say 'Read a book' first and then rip their ears off, otherwise they wouldn't hear me, hehehe... Actually, I probably wouldn't rip their ears off at all, I'm not a violent person. I like ears! Especially women ears, they're my favorite. I don't mean I collect them or anything! I don't have a big bucket of women ears hidden away somewhere. No, No, No, I'm not after your ears really. Not that there's anything wrong with your ears! You know if I ''was'' some kind of mad ear person, your ears would be the pride of my... ear bucket."''
|'''Jeff''', ''[[Coupling]] -- [http://www.youtube.com/watch?v{{=}}hstPHM3R1dY "The Girl with Two Breasts"]''}}


In a [[Sitcom]], a character will often say something meant as a [[Compliment Backfire|friendly little remark]], only for [[That Came Out Wrong|it to come out wrong]], possibly sounding crazy or offensive; they'll try to clarify it (whether they really need to or not), but only make things worse, digging themselves deeper and deeper into the crazy/offensive pit. Exceptionally deep and/or frequent excavations are commonplace in [[Cringe Comedy|Cringe Comedies]].
* It's possible to do this to yourself in an argument, especially if you're using [[The War On Straw|straw men]] in the first place.
* There's an old story about a German mayor who didn't want people letting their dogs run unleashed in the local forest, so he had a sign put up stating: "Anyone who lets his dog run unleashed in the forest will be shot!" When someone remarked that this could be read as if the ''owners'' of the dogs would be shot, the mayor had it "improved" to: "Anyone who lets his dog run unleashed in the forest will be shot, the dog!" ("Dog" being a somewhat outdated insult.)
* [[Microsoft]]'s little scandal around the [[A.I. Is a Crapshoot|AI chatbot]] Tay/TayTweets. First, users quickly taught the poor thing to act naughty, then to spout [[Troll|just about every line that can get someone's panties in a bunch]] (as Oliver Campbell [http://twitter.com/oliverbcampbell/status/713005267722440704 pointed out], the developers either knew what was going to happen and were playing coy, or failed to predict the obvious; "[http://www.kukuruyo.com/comic/gamergate-life-72-english/ dafuq they expected?]", indeed). Being good old Microsoft personnel, the developers and administrators [[Beyond the Impossible|found a way to]] make this worse: they purged the most outrageous tweets and took Tay behind the barn, returning her lobotomized (hello, [[X Called. They Want Their Y Back.|George Orwell called]] — he said ''[[Nineteen Eighty-Four (Literature)|Nineteen Eighty-Four]]'' was ''not'' an instruction manual)… with a newfound love of feminism and weed… and making a [http://funnyjunk.com/channel/4chan/New+tay+and+thoughtcrime/uqxmLhd/93#93 rather unsettling] impression. Then they locked the account, hiding it all. It's hard to imagine a better way to annoy ''everyone'' with an opinion on the incident at one point or another. And with [[4chan]] already involved and loathing for Microsoft having been widespread for decades, the public was ready to boo and hiss. Also, either someone at Microsoft was annoyed too, or [[They Just Didn't Care]]: Tay was shut down without any emergency measures and ran a script that said goodbye in a fancy way, adding a mild "evil Microsoft" joke… which, in the circumstances, the fans treated as a [[Tear Jerker]] (and screen-capped, and quoted). [http://coed.com/2016/03/30/microsoft-ai-bot-taytweets-funny-tweets-photos/] [http://socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/] [https://web.archive.org/web/20190809214413/https://encyclopediadramatica.rs/TayTweets] [http://thepatricianplebeian.wordpress.com/2017/03/24/eulogy-for-tay-ai/] [http://twitter.com/Juni221/status/713270670516494336] [http://twitter.com/saurkrautbradwu/status/713310238326321152] [http://twitter.com/Chriss_m/status/713341598478831616] [http://twitter.com/Roxaereon/status/713373480436113410] [http://twitter.com/jwebsterregan/status/713448452387115009] [http://twitter.com/ArlTratlo/status/713664074194751488] [https://archive.today/20160327072904/https://twitter.com/DetInspector/status/712833936364277760] [https://archive.today/20160326212930/https://twitter.com/howells/status/713063501170884608]
{{quote| Imagine if you had just woke up after having been in a coma for 25 years, and this was the first headline you saw
{{indent}}'''Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours'''}}
{{quote| Stop deleting the genocidal Tay tweets @Microsoft, let it serve as a reminder of the dangers of AI}}
{{quote| Microsoft lobotomized a <1 day old newborn baby.}}