#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization
from Lex Fridman Podcast
by Lex Fridman
Published: Thu Mar 30 2023
Show Notes
Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
– Linode: https://linode.com/lex to get $100 free credit
– House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
– InsideTracker: https://insidetracker.com/lex to get 20% off
EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa
PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips
SUPPORT & CONNECT:
– Check out the sponsors above; it’s the best way to support this podcast
– Support on Patreon: https://www.patreon.com/lexfridman
– Twitter: https://twitter.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Medium: https://medium.com/@lexfridman
OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that point in the episode.
– Introduction
– GPT-4
– Open sourcing GPT-4
– Defining AGI
– AGI alignment
– How AGI may kill us
– Superintelligence
– Evolution
– Consciousness
– Aliens
– AGI Timeline
– Ego
– Advice for young people
– Mortality
– Love