Nonzero Newsletter
Robert Wright's Nonzero
AI, Rationalism, and the “Great Filter” (Robert Wright & Robin Hanson)

This episode includes an Overtime segment that’s available to paid subscribers. If you’re not a paid subscriber, we of course encourage you to become one. If you are a paid subscriber, you can access the complete conversation either via the audio and video on this post or by setting up the paid-subscriber podcast feed, which includes all exclusive audio content. To set that up, simply grab the RSS feed from this page (by first clicking the “...” icon on the audio player) and paste it into your favorite podcast app.

If you have trouble completing the process, check out this super-simple how-to guide.

0:00 Robin’s history of thinking about the future
2:20 Seeing AI as offspring, not adversary
14:26 Robin’s famous answer to a great cosmic mystery
24:53 Would evidence of aliens be good news or bad?
33:55 Is rationalist culture cultist?
43:51 LLMs: revolution or hype cycle?
55:29 Heading to Overtime

Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Robin Hanson (George Mason University, The Elephant in the Brain, Overcoming Bias). Recorded April 02, 2024.

Robin's Substack:

Overtime titles:
0:00 What LLMs do and don’t tell us about the human mind (according to Bob).
10:19 Is regulating or not regulating AI more dangerous?
17:04 Robin demystifies the sacred.
25:57 What are humanity’s “future filters”?
28:46 Rationalism’s Yudkowskian fork: a history.
35:21 “Grey goo” and the trouble with techno-pessimism.

Conversations with a series of people who have nothing in common except that program host Robert Wright is curious about what they’re thinking.