Eliezer Yudkowsky
{{creator}}
[[File:Eliezer Yudkowsky, Stanford 2006 (square crop).jpg|thumb|300px|Eliezer Yudkowsky in 2006.]]
{{quote|''Let me say that Eliezer may have already done more to save the world than most people in history.''|From a [http://lesswrong.com//lw/2lr/the_importance_of_selfdoubt/2ia7?c{{=}}1 blog comment].}}
{{quote|Rationalism is the belief that Eliezer Yudkowsky is the rightful caliph.|Scott Alexander|[http://slatestarcodex.com/2016/04/04/the-ideology-is-not-the-movement/ The Ideology is Not the Movement]}}
Author of ''[[Three Worlds Collide]]'' and ''[[Harry Potter and the Methods of Rationality]]''.
His day job is with the [https://web.archive.org/web/20120616070928/http://singinst.org/ Singularity Institute for Artificial Intelligence], specifically working on how to make an [[AI]] that ''won't'' just [https://web.archive.org/web/20110621172641/http://singinst.org/riskintro/index.html kill us all accidentally].
Most of his fictional works are [[Author Tract]]s, albeit ones that [[Rule of Cautious Editing Judgment|many find]] good and entertaining.
* [[The End of the World as We Know It]]: Seeks to prevent it.
* [[Fan Boy]]: Isn't one, but has them despite trying to [http://lesswrong.com/lw/ln/resist_the_happy_death_spiral/ warn against being one].
* [[Genre Adultery]]: Almost all of his works, including ''[[Harry Potter and the Methods of Rationality]]''.
* [[Hannibal Lecture]]: In the [http://yudkowsky.net/singularity/aibox AI-Box] [http://lesswrong.com/lw/up/shut_up_and_do_the_impossible/ experiment], presumably.
* [[Human Popsicle]]: His membership in the [http://www.cryonics.org/ Cryonics Institute] is his backup plan if he dies before achieving proper immortality.