Rationality Considering the End of the World
Eliezer Yudkowsky
The decision to think rationally is a preliminary challenge in any human endeavour. To land a spaceship on the Moon, you must solve problems of physics and engineering. But before you can even attempt such problems, you must think of the Moon as a goal to be attained, not a wondrous tale to be told; of the physics as a problem to be analyzed, not a mystery to be worshipped; and of the engineering as a challenge to creative ingenuity, rather than an invitation to trust to luck. The foundational requirement for humanity to successfully confront and resolve global catastrophic risks may simply be that we stay in a scientific/engineering frame of mind. I consider some common departures from this frame, and offer general principles for staying within it.
(Artificial) Intelligence: The Wild Card
Eliezer Yudkowsky
"Intelligence", being difficult to analyze, may be somewhat overlooked as both a positive and a negative force in the future of humankind. We should not forget that intelligence is responsible for human civilization to begin with; it may be hard to talk about, but that doesn't mean it's weak, or that we can afford to ignore it. Cognitive technologies play a hard-to-analyze role in global catastrophic risks and their management; I talk about this problem generally, and Artificial Intelligence in particular.