Eliezer Yudkowsky’s Twelve virtues of rationality

“It is especially important to eat math and science which impinges upon rationality: Evolutionary psychology, heuristics and biases, social psychology, probability theory, decision theory. But these cannot be the only fields you study. The Art must have a purpose other than itself, or it collapses into infinite recursion.”

— From Eliezer Yudkowsky’s twelve virtues of rationality (check out #12 — it makes no rational fucking sense!).

3 thoughts on “Eliezer Yudkowsky’s Twelve virtues of rationality”

  1. It makes a lot of sense in the context of the history of AI (combined with the subject matter of truth-seeking).

    Its message is that there are no simple one-off solutions for understanding intelligence. The solution to how intelligence works is distinctly new, like Einstein’s theory of relativity, even if it borrows from many disciplines. In the same vein, that which is nameless should be sought, rather than regurgitating the same methods and virtues. In essence, the 12th virtue is recognizing that there is more than just the prior eleven virtues, which are based upon the intelligence of a mere human being.

    THERE ARE HIGHER VIRTUES.

    This is especially apparent in the context that he wants to build an AI more intelligent than any human.

  2. Is the AI context something that Yudkowsky was intending, or are you bringing that to it?

    I guess to the extent that I understood it I got “be on the lookout for new methods to get at ultimate truth.” Is that what you’re saying?

  3. Well, I am pretty sure I got the details wrong but the general interpretation right. So yes, that’s what I believe he is saying.

    “Seek out truth yourself for I am only one man”.

    Yudkowsky is deeply motivated to build friendly artificial general intelligence. He gave a PowerPoint lecture with a slide explaining the differences between clusters of things we consider intelligent.

    On a line proceeding from lower to greater intelligence, he placed Einstein right next to the village idiot. I would keep that in mind in the context of the void. There are not simply pieces of knowledge missing in our understanding of reality (according to him).

    Consider the silly debates between two logical possibilities for the existence (or not) of God: either the universe is eternal or God is eternal. He is strongly anti-mysterian, so he doesn’t like, say, Robert Wright’s Nonzero.

    He believes AGI will be made (and explode) well before you and I die. I am 24, and I assume you are young too.

    With this knowledge (if he is right), we could build the last religion. Religion would be destroyed the moment the AGI awakens and decides what to do with humans. He doesn’t have to make it within a century, just someday. Also, there is cryonics. But that looks pretty bleak. Futurism is a bummer no matter how you parse it.

    btw, it would be cool if there were a feature for notifications when someone replies.
