Living By the Books | AJ Jacobs on Writing Methods
The James Altucher Show | May 03, 2024 | 01:07:41 | 62.04 MB


Dive deep into the fascinating world of puzzles, creativity, and the future with James and AJ Jacobs in a riveting conversation that unpacks the power of thinking outside the box for shaping bestsellers and envisioning a transformative tomorrow.

A Note from James:

AJ Jacobs, who has an upcoming book, "The Year of Living Constitutionally," is a fascinating writer. Instead of just researching a topic ("Oh, I'm going to do research on the Constitution, I'm going to do research on living healthy, I'm going to do research on the Bible") and writing a book, which I consider to be boring, he has a completely different way of exploring a topic. And it turns out, not by accident, that this is the formula for writing a bestselling, millions-of-copies kind of book.

And it's an interesting way to live life. What I mean is, AJ really lives what he's writing about. When he wrote one of his early books, The Year of Living Biblically, he lived for a year, literally word by word, as the Bible would suggest he live. I'll let AJ describe it, and we talk about it in this episode, but in "The Year of Living Constitutionally," which is about to come out, he lives the life of someone around 1790, living the Constitution word for word, even applying to Congress to be a pirate, something which was in Article One of the Constitution, if you didn't happen to know.

And meanwhile, other books he's had include "My Life as an Experiment," where everything was an experiment.

For instance, he outsourced arguments with his wife to an outsourcing agency in India. So this turns out to be not only a fascinating way to live a very curious and adventurous life but also, as I said earlier, a great formula for writing a bestselling book, the kind of formula AI can't really compete with, I will add.

First, I want to hear about AJ's method, the AJ method of living a life of experience and using that to create stories, adventures, and, of course, bestselling books. And he's a very funny guy, so we'll get into how he incorporates humor into that. Then next week, we're going to do a whole episode, a fascinating episode, on "The Year of Living Constitutionally," because there are so many issues about the Constitution I didn't know about, and so much information about the Constitution I didn't even know. I used to think I'd read it, but it turns out I hadn't. So first off, though: the AJ method of writing his own unique brand of bestseller and living a life of adventure.

Episode Description:

In this episode of The James Altucher Show, James engages in a fascinating conversation with AJ Jacobs, an esteemed author known for his unique approach to writing and life. Jacobs discusses his upcoming book "The Puzzler," delves into his method of immersing himself in his subjects, and shares insights from "The Year of Living Constitutionally." Known for his adventurous lifestyle and humor, Jacobs describes past projects, including outsourcing arguments with his wife to India and living according to the Bible. The episode also covers his visit to a long-termism conference, discussing potential future challenges and the impact of AI. Additionally, AJ and James ponder the significance of incorporating puzzles into daily life and explore the concept of improving forecasting abilities through understanding probabilities.


Episode Summary:

00:00 Exploring AJ Jacobs' Unique Approach to Writing and Living

02:47 AJ Jacobs: A Deep Dive into His Life and Works

03:31 The Fascinating World of Puzzles with AJ Jacobs

03:56 AJ's Hermit Life and the Creative Process

04:47 Solving the World's Most Baffling Puzzles

17:07 The Art of Creativity and Intelligence in Puzzle Solving

27:24 Exploring Long-Termism and Future Challenges

31:57 The Unheard Stories of Nuclear Near Misses

32:19 Exploring the Shadows of the Cold War

33:05 Petrov Day: A Reminder of Nuclear Threat

34:29 Unaccounted Nuclear Weapons: A Lingering Threat

34:48 Envisioning a Utopian Future Amidst Doom

35:42 The Evolution of Technology vs. Environmental Challenges

36:54 The AI Dilemma: Potential and Perils

37:50 AI's Unintended Consequences: From Paperclips to Pandemics

41:19 The YouTube Algorithm: A Case Study in AI's Impact

48:32 Addressing the Threat of Authoritarianism

52:58 The Importance of Science and Statistics in Society

58:15 Rethinking Education: A Focus on Practical Knowledge

01:01:34 Long Termism: A New Perspective on Humanity's Future

------------

  • What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
  • Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!

------------


Thank you so much for listening! If you like this episode, please rate, review, and subscribe to "The James Altucher Show" wherever you get your podcasts:

Follow me on social media:

[00:00:00] Do you see the deals on these brands?

[00:00:02] Of course you can't, this is a podcast.

[00:00:04] But when you go to Ross this spring, you'll see brands you know at prices you love.

[00:00:08] We're talking all the shoes, hues and ooze at 20-60% off department store prices.

[00:00:13] Go see the savings at Ross.

[00:00:30] World-class drama, a new season of the Kardashians starring the Kardashians, of course.

[00:00:36] And Grand Cayman, Secrets in Paradise, the sizzling new reality show set in the tropical Caribbean.

[00:00:43] It's all new and it's streaming now on Hulu.

[00:01:08] I'm going to do research on the Bible. Instead of just researching something and writing a book, which I consider to be kind of boring,

[00:01:14] he has a completely different way of exploring a topic.

[00:01:19] And it turns out, and not by accident, it turns out that this is the formula for writing a best-selling millions of copies kind of book.

[00:01:28] And it's an interesting way to live life.

[00:01:30] And what I mean is, AJ really lives what he's writing about.

[00:01:34] So when he wrote one of his early books, The Year of Living Biblically,

[00:01:38] he lived for a year literally word by word as the Bible would suggest he lives.

[00:01:46] Well, I'll let AJ describe it and we talk about it in this episode.

[00:01:50] But in the Year of Living constitutionally, which is about to come out,

[00:01:53] he lives the life of someone around 1790, living the Constitution word for word.

[00:02:01] Like applying to be a pirate with the Congress,

[00:02:04] something which was in Article One of the Constitution, if you didn't happen to know.

[00:02:08] And meanwhile, other books he's had are like

[00:02:12] My Life as an Experiment, where everything was an experiment.

[00:02:15] So he, for instance, he outsourced arguments with his wife to an outsourcing agency in India.

[00:02:22] So this turns out to be not only a fascinating way to live a very curious and adventurous life,

[00:02:29] but again, as I said earlier, turns out to be a great formula for writing a bestselling book.

[00:02:35] A kind of formula AI can't really compete with.

[00:02:38] I will add.

[00:02:39] But first I want to hear about AJ's method, the AJ method of living a life of experience

[00:02:47] and using that to create stories, adventures, and of course, bestselling books.

[00:02:53] And he's a very funny guy.

[00:02:54] So how he incorporates humor into that.

[00:02:57] And then next week we're going to do a whole episode, fascinating episode,

[00:03:02] the year of living constitutionally.

[00:03:04] Because there's so many issues about the Constitution I didn't know about

[00:03:07] and so much information about the Constitution I didn't even know about.

[00:03:12] I used to think I'd read it, but it turns out I hadn't.

[00:03:15] So first off though, the AJ method on writing his own unique brand style of bestseller

[00:03:23] and living a life of adventure.

[00:03:31] AJ's podcast and he's not your average host.

[00:03:34] This is The James Altucher Show.

[00:03:45] AJ Jacobs once again back on the podcast.

[00:03:48] AJ, I'm so excited to have you here.

[00:03:50] I am so excited to be here.

[00:03:52] I was a hermit for like a year working on a book and I have emerged.

[00:03:56] You were a hermit.

[00:03:57] You literally didn't just not return my calls; like, your wife called me looking for you.

[00:04:04] You didn't return anyone's phone calls.

[00:04:06] It's true.

[00:04:07] It's true.

[00:04:08] I just saw her again after a year and she looks good.

[00:04:11] She looks good.

[00:04:12] Like much of that time when you're writing, is it because you're not writing

[00:04:16] and so you can't do anything in life until you write again?

[00:04:21] Because you got a kind of writer's block or is it just you're just head down,

[00:04:25] working nonstop, finishing the book?

[00:04:28] Well my projects, as you know, like I go all in.

[00:04:33] They are fully immersive.

[00:04:35] So it's like an 18 hour a day job.

[00:04:39] And this one is about puzzles, as you know.

[00:04:41] So I was, it's such a huge topic.

[00:04:44] I could spend 10 years doing it, but I just, I was able to cut it down to two years.

[00:04:48] But basically all I did was hang out with weird, delightful, eccentric

[00:04:55] puzzle people, some of the smartest people in the world, and just did puzzles

[00:05:00] and tried to hear their crazy stories.

[00:05:04] I visited the CIA to see one of the great unsolved puzzles.

[00:05:09] Wait, why did the CIA have one of the great unsolved puzzles?

[00:05:13] Well, they have, it's called Kryptos.

[00:05:15] Have you ever heard of Kryptos?

[00:05:17] It's this sculpture that was created 30 years ago by this eccentric sculptor.

[00:05:25] And he created it for the CIA.

[00:05:28] It's in the middle of their headquarters and it's got 2000 letters on it

[00:05:33] and it's got like characters.

[00:05:36] And they, they're a secret code.

[00:05:39] And no one has been able to fully solve the code.

[00:05:43] Even the CIA, which makes me a little nervous to be honest.

[00:05:46] So wait, who made this for them?

[00:05:48] It was a sculptor.

[00:05:50] He teamed up with an ex-CIA cryptographer.

[00:05:53] But it's this incredibly, it's one of the great unsolved puzzles of the world.

[00:05:58] And I was able to get access to the CIA headquarters.

[00:06:02] And I didn't sneak in that one.

[00:06:04] But so I got to see it up close.

[00:06:07] I don't want to spoil the ending, but I did not solve it.

[00:06:11] You know, there are thousands of people obsessively working on solving this code

[00:06:17] for the last 30 years.

[00:06:19] It's like the Zodiac killer code.

[00:06:21] That's another famous one.

[00:06:22] How do they know it's solvable?

[00:06:24] Well that's a debate.

[00:06:26] They have solved three of the four codes.

[00:06:29] So there are, there are four messages.

[00:06:32] And one of the messages basically says there's a treasure buried somewhere,

[00:06:39] possibly on the CIA grounds.

[00:06:41] And then another is like a mysterious poem about shadows and light.

[00:06:47] But there are some people who say this guy is just messing with us.

[00:06:51] Like he's a troll, and they're never going to solve it.

[00:06:55] So okay.

[00:06:56] So your book is going to be, it's coming out in April.

[00:06:58] We're not going to talk about the book so much today.

[00:07:00] We're going to talk about it in April, but the book is called The Puzzler.

[00:07:02] One Man's Quest to Solve the Most Baffling Puzzles Ever, from Crosswords to Jigsaws to the Meaning of Life.

[00:07:08] I also appreciate that you have a chapter on chess puzzles.

[00:07:12] The whole art of chess composing is a field unto itself.

[00:07:16] Like even, you know, people who never even play the game get obsessed with composing chess puzzles.

[00:07:22] Vladimir Nabokov, the famous author of Lolita was a master chess composer.

[00:07:27] You mentioned a mutual friend of ours who does chess puzzles.

[00:07:31] Cyrus Lakdawala.

[00:07:32] Lakdawala, awesome man, though he's more of a friend of yours than mine.

[00:07:38] But I know him a little. And Garry Kasparov came over, and he's an okay chess player.

[00:07:44] I've heard of him.

[00:07:46] He's not bad.

[00:07:48] Yeah, it was extraordinary.

[00:07:49] I mean he, well, he was actually he insulted my chess board.

[00:07:54] That was the first opening move.

[00:07:56] I think you had a cheap chess board.

[00:07:58] Yeah, he said, what is this cheap chess board? Because it was, you know, plastic. But then he's like, no, I grew up in the Soviet Union.

[00:08:06] I'm used to cheap chess board.

[00:08:08] So it's okay.

[00:08:09] Yeah, he gave me an out.

[00:08:11] Yeah, he gave me an out.

[00:08:13] Your CIA puzzle reminds me.

[00:08:15] Have you ever heard of the Sakata Reddit puzzle?

[00:08:17] Oh yeah.

[00:08:18] Yeah, I looked into that.

[00:08:19] I didn't do a chapter on it, but I love that.

[00:08:22] That's another one.

[00:08:23] I wonder if there's like another book here like mysterious huge unsolved puzzles and not just any unsolved puzzle.

[00:08:30] Like these are unsolved puzzles that millions of people have tried to solve.

[00:08:34] And it's also related to, you know, if you think about it all of mathematics is like giant unsolved puzzles.

[00:08:41] Like, is there a way to factor large numbers into primes quickly?

[00:08:45] That, for instance, is a giant unsolved puzzle.

[00:08:47] Right.

[00:08:48] Yeah, that's a great point.

[00:08:49] And I mean, like that's part of the book is that life is an unsolved puzzle and there are so many, so many challenges out there.

[00:08:58] But yet, with Kryptos, there are lists on the Internet of, like, the top 10 unsolved puzzles, and Kryptos is always on there.

[00:09:04] You know, Mark Zuckerberg has taken up millions of hours of our lives.

[00:09:12] But I think the Voynich manuscript has taken up about as many, because it's been called...

[00:09:18] What's the Voynich manuscript?

[00:09:20] It's this mysterious manuscript from the Middle Ages filled with all these mysterious symbols.

[00:09:25] And people have been puzzling over it for centuries and, you know, using quantum computers, using everything they can to throw at it.

[00:09:36] And no one has been able to crack it. Every, like, three months,

[00:09:39] there'll be an article, "I solved the Voynich manuscript," and then it turns out the guy is full of crap.

[00:09:46] But yeah, I mean, who doesn't love that? It's so tantalizing, this idea that you can be the one to finally solve it.

[00:09:57] And sometimes it happens like sometimes that's what's amazing is when you actually have someone who solves a puzzle like the Zodiac.

[00:10:05] The Zodiac ciphers were just solved last year.

[00:10:09] What are the Zodiac ciphers?

[00:10:11] I figure you've been immersed in, like, puzzle culture for the past couple of years.

[00:10:16] What's the Zodiac puzzle?

[00:10:18] The Zodiac was a serial killer in the 70s.

[00:10:21] The Zodiac serial killer, they made a movie about him a few years ago and he left a bunch of mysterious coded notes filled with symbols to the cops.

[00:10:32] He would send all these taunting letters and not all of them have been cracked.

[00:10:37] But just last year, using this new computer program, they cracked one where it was basically like a childish like, you know, I hope you're looking for me.

[00:10:47] You can't find me.

[00:10:49] So yeah, that is another big one that people love to devote their lives to trying to figure out.

[00:10:58] Why do you think I mean, OK, wait, I don't we'll get into all of this when we actually do a podcast about the book.

[00:11:06] Because the book's coming out in April.

[00:11:08] And but I do want to ask you kind of about some interesting experiences you've had while creating this book.

[00:11:14] And I also have an idea for you.

[00:11:17] You know, when I had Will Shortz on the podcast and Will Shortz, of course, is the famous New York Times crossword puzzle editor.

[00:11:23] And the New York Times is arguably the most famous crossword puzzle in the world.

[00:11:27] And he has hundreds of books that he's quote-unquote "written," where it's just crossword puzzle after crossword puzzle.

[00:11:36] And he also has Sudoku, he has KenKen, and you write about these things in your book.

[00:11:40] You should create your own line of puzzle books now.

[00:11:45] I love that idea.

[00:11:47] I mean, James, you always you're such a better businessman than I am.

[00:11:51] I don't know about that.

[00:11:52] Yeah, you've probably never gone broke.

[00:11:54] I've gone broke on business.

[00:11:55] I might go broke again.

[00:11:56] Who knows?

[00:11:57] So yeah, but you're good at making money and then going broke.

[00:12:02] You don't just go broke.

[00:12:04] First you make a few million, then you go broke, then you make it again.

[00:12:08] And you're always giving me business ideas of how to monetize my books.

[00:12:13] And I'm always too lazy or scared or confused to do it.

[00:12:20] But this time, I think I got to take you up on it.

[00:12:23] You're right.

[00:12:24] I'm gonna launch the puzzler franchise.

[00:12:27] How about I find someone to do the puzzle books for you and just have your name on it.

[00:12:34] And maybe you could write an intro to each one.

[00:12:36] You don't even... well, Shortz doesn't even do an intro to each one.

[00:12:38] So I love that.

[00:12:39] I will find someone to make the puzzle books for you and you just can self publish them

[00:12:44] or you can find a publisher, whatever you want.

[00:12:46] That's nice.

[00:12:47] You just split it 50-50 with this person or company or whatever.

[00:12:50] All right.

[00:12:51] I'm in.

[00:12:52] One of my best-known articles was called My Outsourced Life, which I wrote like 15 years ago,

[00:12:58] where I hired a team of people in India to do everything for me.

[00:13:02] Including arguing with your wife.

[00:13:04] Including arguing with my wife.

[00:13:05] So I should be an expert in outsourcing and delegating.

[00:13:09] But ironically after, you know, even though that article was an awesome experience,

[00:13:15] I'm just not good at it.

[00:13:17] So you're gonna, yeah, you're gonna teach me or delegate someone to teach me.

[00:13:21] Well, the thing is, the thing is like when if someone says to you, oh, you should do this

[00:13:26] and probably people say that to you all the time.

[00:13:28] Oh, AJ, you should write a book about blah, blah, blah.

[00:13:30] You probably get like tense, like, oh no, I have enough things to do.

[00:13:34] Now they're giving me more things to do.

[00:13:36] And when you did the outsourcing article, it was a little bit of a humorous article.

[00:13:40] Like you literally, when you started arguing with your wife,

[00:13:42] you would call up this outsourcing firm in India.

[00:13:44] It's funny.

[00:13:45] And so maybe it doesn't, maybe it hasn't occurred to you to relieve this business tension

[00:13:49] you sometimes have or business anxiety with outsourcing,

[00:13:53] particularly because outsourcing itself could often be work.

[00:13:56] Like you have to find someone, but I will take care of that for you.

[00:13:59] You can outsource immediately right now to me.

[00:14:02] All right, done.

[00:14:03] I will take care of it.

[00:14:04] You're gonna have a hundred books within the next two years.

[00:14:08] And you'll be generating income every single month from every single one of those books.

[00:14:14] I will do it for you.

[00:14:15] God bless you.

[00:14:16] All right, done.

[00:14:17] It is done.

[00:14:18] What's the book called?

[00:14:19] The puzzler, right?

[00:14:20] Right.

[00:14:21] Okay, we're gonna make a brand out of the puzzler.

[00:14:23] Oh, you're the best.

[00:14:24] Thank you.

[00:14:25] All right.

[00:14:26] I'm done.

[00:14:27] Part one.

[00:14:28] All right.

[00:14:29] Puzzler Sudoku.

[00:14:30] All right.

[00:14:31] Let me know when they're out.

[00:14:32] So, and you know, I'll tell you where to send the checks and that's great.

[00:14:36] All right, done.

[00:14:37] And then so moving on your story about the CIA puzzle reminds me,

[00:14:44] I had quite a few guests on this podcast that have all independently told me this story where,

[00:14:50] like let's say it'll be like a thriller writer or there's been some other examples,

[00:14:55] like a puzzle type of person.

[00:14:57] But this story has been told to me, and to AJ as well, by four or five guests.

[00:15:02] they've all, they're all part of some group organized by the government,

[00:15:06] like in the mid 00s, like 2007, 2008 to come up with scenarios for how the US could be

[00:15:13] attacked or destroyed or whatever.

[00:15:16] And so it's, it reminds me a little bit like that there's this group and again,

[00:15:20] some of them I've interviewed, but they won't, this group's been gathered to sort of solve

[00:15:26] the puzzle of how the US could be attacked.

[00:15:29] And I always ask them, of course, well, what did you come up with?

[00:15:31] But they're not allowed to tell me.

[00:15:33] Yeah.

[00:15:34] You don't want to give people the ideas that I've heard of that.

[00:15:37] I've never been to that.

[00:15:38] But yeah, I've heard like they get Hollywood producers to like come up with these

[00:15:43] crazy scenarios of, you know, what action movie, what sci-fi movie could actually

[00:15:49] happen and how can we prevent it?

[00:15:51] So, yeah.

[00:15:53] If the people who organize that are listening, I want an invite.

[00:15:57] I'm ready.

[00:15:58] I've got plenty of ideas, but I won't act on them.

[00:16:00] Not gonna act on them.

[00:16:01] By the way, you and I earlier were talking about, you know, fiction ideas.

[00:16:05] This is a fiction idea.

[00:16:06] You're in this group of the government's top secret, you know, puzzlers or whatever.

[00:16:12] And it's made up of thriller writers, Hollywood directors,

[00:16:16] crossword puzzle masters, chess masters, whatever.

[00:16:20] And they're getting killed off one by one.

[00:16:23] Oh, that's good.

[00:16:29] And then, because one of them has the... I don't know why, you'd have to figure out

[00:16:32] why, but the premise sounds interesting.

[00:16:32] So I like that.

[00:16:33] I like that.

[00:16:34] All right.

[00:16:35] So also, let me ask you this.

[00:16:37] Like, and this is unrelated to the topic of the book, but of the puzzle masters

[00:16:42] that you have met, you know, in the table of contents, you talk about

[00:16:45] Sudoku, you talk about chess, you talk about riddles and Japanese puzzle

[00:16:50] boxes, math puzzles.

[00:16:52] Were there common characteristics of people who are masters of solving

[00:16:56] puzzles?

[00:16:57] I think so.

[00:16:58] I mean, what I argue is that puzzles are an amazing way to train you

[00:17:05] how to think.

[00:17:06] And there are five or six characteristics that make people great thinkers.

[00:17:14] And you talked about one, well, you talked about many, but one in

[00:17:19] particular in your most recent book was the experimental

[00:17:23] mindset: that you've got to try all these hypotheses.

[00:17:27] You got to throw everything against the wall and see what sticks.

[00:17:30] So that is a lot about the puzzle mindset.

[00:17:34] So, you know, you're given a logic puzzle, turn it upside

[00:17:40] down, try everything. You know, if you're given like a wooden box

[00:17:46] puzzle, spin it, throw it against the wall.

[00:17:50] Sometimes that can break it, but try everything.

[00:17:58] Take a quick break.

[00:17:59] If you like this episode, I'd really, really appreciate it.

[00:18:02] It means so much to me.

[00:18:03] Please share it with your friends and subscribe to the podcast.

[00:18:06] Email me at altucher@gmail.com and tell me why you subscribed.


[00:19:11] I think it's more creative.

[00:19:15] You know, you got to know some facts, you know, you got to know

[00:19:18] like that gravity exists, and that if you throw a ball up, it's going

[00:19:22] to come down.

[00:19:23] But to me, the real creative geniuses are the ones who come

[00:19:27] up with novel ways to solve problems.

[00:19:30] And you don't need a huge amount of knowledge for that.

[00:19:34] I mean, I talked about this in my book on the encyclopedia,

[00:19:37] because I read the Encyclopedia Britannica from A to Z for my

[00:19:41] first book.

[00:19:42] That was my book The Know-It-All, actually.

[00:19:44] God bless you.

[00:19:45] And I knew every, like I knew way too much.

[00:19:48] You know, I would just spout random facts.

[00:19:50] My wife would charge me $1 for every irrelevant fact I inserted

[00:19:55] into conversation.

[00:19:56] But that doesn't necessarily make you smart.

[00:19:58] It's applying those facts in a creative way. I actually love this quote.

[00:20:02] I don't know who came up with it.

[00:20:04] Knowledge is knowing that a tomato is a fruit, not a vegetable.

[00:20:10] Wisdom is knowing not to put a tomato in a fruit salad.

[00:20:15] Now I just learned some wisdom.

[00:20:17] Now, you didn't know that before?

[00:20:19] Well, also, I'd like to attack

[00:20:24] that quote, because why shouldn't you put tomatoes

[00:20:27] in a fruit salad?

[00:20:28] That's another example of a rule.

[00:20:30] Like, why should we follow that rule?

[00:20:33] So yeah, I, I,

[00:20:34] Like, let's say you're playing a game of chess.

[00:20:38] You know, one of the keys to creativity in chess is to look

[00:20:42] at the move that normally everyone would have

[00:20:45] said is impossible.

[00:20:47] Right.

[00:20:48] And so starting with the impossible is often a good way

[00:20:52] to lead you into creativity.

[00:20:56] But related to this, are puzzles a good way to exercise

[00:21:00] creativity or do you need the creativity first?

[00:21:02] Like, can you get more creative through the process of

[00:21:05] solving puzzles?

[00:21:06] I definitely think so. I mean,

[00:21:08] I think it's helped my mind a ton.

[00:21:11] And one of the, you know,

[00:21:14] one of the things I think that makes a good thinker is

[00:21:18] not falling in love with your hypothesis.

[00:21:21] So being very flexible, being open,

[00:21:24] being able to change.

[00:21:26] And there are a bunch of new books that talk about this

[00:21:29] like The Scout Mindset by Julia Galef.

[00:21:32] She'd be a great guest, if you haven't had her on. Or Adam

[00:21:35] Grant's Think Again. But it's all,

[00:21:38] it's all about not having motivated reasoning.

[00:21:40] So being open and, and that you need.

[00:21:43] Like I know you don't love crossword puzzles,

[00:21:45] but this is, let me give you an example.

[00:21:48] The clue was for a nine-letter word, and

[00:21:52] it was "the result of a bad trip."

[00:21:57] And I got it in my mind: it's "flashback,"

[00:22:01] like an acid flashback.

[00:22:03] That's it.

[00:22:04] And I wouldn't let it go.

[00:22:05] I wouldn't let it go and it was screwing me up

[00:22:07] and it took me two hours to finally figure out,

[00:22:10] no, it's face plant.

[00:22:11] The result of a bad trip is face plant.

[00:22:14] And what do you need?

[00:22:15] I don't understand.

[00:22:16] Well, if you trip, you fall on your face.

[00:22:18] It's a face plant.

[00:22:19] Ah, ah.

[00:22:20] Not an LSD trip.

[00:22:21] So I fell in love with my hypothesis.

[00:22:25] I wouldn't let it go.

[00:22:26] I didn't have the flexibility of mind and that screwed me.

[00:22:30] And I think that's in life too.

[00:22:32] If you don't have the, if you are like,

[00:22:34] if you have a business and you say that my business is,

[00:22:37] you know, making, making lamps,

[00:22:40] but it's not going anywhere and you need to pivot.

[00:22:43] You have to have that flexibility of mind saying,

[00:22:46] oh, it's not lamp.

[00:22:47] It's sconces.

[00:22:48] That's what's going to make me millions.

[00:22:51] Yeah.

[00:22:52] Being able to be super flexible about pivoting in business is

[00:22:54] critical.

[00:22:55] Like I remember one time,

[00:22:56] I won't get into the weeds of the story,

[00:22:58] but I had, I had an opportunity.

[00:22:59] I thought to make millions of dollars,

[00:23:02] but I couldn't get this idea to work.

[00:23:05] So I said to myself, okay,

[00:23:06] just sell this asset that I have that I thought could

[00:23:09] make millions,

[00:23:10] but I could sell it for super cheap and make like $100,000.

[00:23:13] And I did that instead, and the next day

[00:23:15] I had $100,000, instead of spending the next few years

[00:23:18] trying to make millions.

[00:23:19] And I was able to reinvest that money and so on.

[00:23:22] Like you have to do the things that would never have occurred

[00:23:26] to you normally and are again completely the opposite of your

[00:23:29] original plan,

[00:23:30] but you always have to be willing to do the opposite of your

[00:23:32] plan.

[00:23:33] Yeah, I love that.

[00:23:34] Can I tell you my favorite pivot from history?

[00:23:37] Yes.

[00:23:38] I learned about it in the encyclopedia.

[00:23:39] It was this guy named Thomas Welch.

[00:23:42] And he was like in the early 1900s and he was,

[00:23:46] he was really into prohibition.

[00:23:48] Like he thought alcohol was evil.

[00:23:50] So he invented this non-alcoholic wine that they could serve

[00:23:55] at churches for communion.

[00:23:57] It turned out no one wanted non-alcoholic wine at

[00:24:01] churches,

[00:24:02] but his son took over and was like, well,

[00:24:04] what if we just rebranded it as grape juice

[00:24:09] and sold it to kids as, like, a little snack treat?

[00:24:16] And like that took off, and that was how Welch's came about.

[00:24:19] But if they had stuck to the original non-alcoholic wine,

[00:24:22] there would be no Welch's.

[00:24:24] This type of story is common in business.

[00:24:26] Like, for instance, Purell, which they hand you now every

[00:24:29] time you get on a plane to disinfect your hands.

[00:24:32] Well,

[00:24:33] that was originally created as a cleaner for semiconductor

[00:24:35] chips.

[00:24:36] And then just more and more people

[00:24:38] started stopping by the inventor's desk to say,

[00:24:41] hey, can I have some of that cleaner?

[00:24:42] I want to just like clean my hands.

[00:24:44] And so that became the product.

[00:24:46] Oh, that's good.

[00:24:47] Or what about Botox?

[00:24:49] You know, Botox was invented as a treatment for

[00:24:52] crossed eyes and lazy eyes.

[00:24:54] Really?

[00:24:55] Yeah.

[00:24:56] And it's still,

[00:24:57] it still can be used for that.

[00:24:59] But you know,

[00:25:00] obviously that's not the main use.

[00:25:03] Do you feel like you've become more creative through

[00:25:05] the process of making this book because you've seen so many

[00:25:07] puzzles?

[00:25:08] Absolutely.

[00:25:09] Yeah, I really,

[00:25:10] I think they are the opposite of a waste of time.

[00:25:13] They are.

[00:25:14] You're wasting your time if you're not doing puzzles.

[00:25:16] I'm glad to hear you say that because I always

[00:25:19] get worried like, oh, I've played so much chess lately.

[00:25:21] I'm worried that it's a waste of time as opposed to like,

[00:25:25] I don't know, writing a novel or something.

[00:25:27] But all right.

[00:25:29] Now I'm a little more confident in that.

[00:25:31] You went all over the

[00:25:33] world doing research for this book.

[00:25:35] Where were some of the places that you ended up?

[00:25:38] Yeah, I went to Spain and participated in the World

[00:25:43] Jigsaw Puzzle Championship.

[00:25:46] Did you win?

[00:25:47] Which was hilarious.

[00:25:48] We did not win.

[00:25:49] We represented the USA and we embarrassed our country.

[00:25:52] I'm afraid to say it was my family and I,

[00:25:54] my sons and my wife and I.

[00:25:57] So we were like trounced by Russia.

[00:26:00] Those people are unbelievable.

[00:26:03] Like how many pieces were in the puzzle?

[00:26:05] It was eight hours and you had four giant jigsaw puzzles

[00:26:09] of about 1,000 to 2,000 pieces each.

[00:26:11] And how quickly did they finish?

[00:26:14] They finished all four puzzles in three and a half

[00:26:18] hours, which is amazing.

[00:26:20] Like I could barely move my hands that quickly,

[00:26:23] even if I knew exactly where to place the pieces.

[00:26:26] But they are just, you know, they're remarkable.

[00:26:29] And they had strategy.

[00:26:31] You know, you think jigsaw puzzles, is there really a strategy?

[00:26:35] Yeah, yeah, there is.

[00:26:37] Like you've got to sometimes switch from the colors

[00:26:41] to the shapes.

[00:26:42] And they had one person who was a specialist in the

[00:26:46] colors of the sky, like the monochromatic sky and ocean.

[00:26:51] Like those can often mess you up.

[00:26:53] So what I love is that whatever the topic,

[00:26:57] you know, chess or mazes or jigsaws,

[00:27:01] like there are people who are so passionate and so knowledgeable

[00:27:05] and so just obsessed that it's just a joy.

[00:27:11] It's just a joy to see.

[00:27:13] And how long did it take you to solve?

[00:27:16] Did you solve any of the puzzles in the eight hours?

[00:27:19] We did. Thank you very much.

[00:27:21] We solved one of the four puzzles and it took us six

[00:27:24] hours. And then we started on the other and then it ended.

[00:27:27] So I remember as a kid, a basic strategy would be just

[00:27:30] separate the edges, the corners and the inside.

[00:27:33] And then the inside you could, you know,

[00:27:35] you could separate by the colors because you know,

[00:27:37] oh, this is a human.

[00:27:38] This is a house.

[00:27:39] This is the sky.

[00:27:40] Did you do any strategies like that?

[00:27:41] Like, you know, puzzles 101.

[00:27:44] Oh yeah.

[00:27:45] I mean, edges, edges usually are good,

[00:27:48] but it depends on the puzzle.

[00:27:50] Like the, you know, your professional puzzlers will tell

[00:27:52] you sometimes that's not good.

[00:27:55] I'm actually working on a puzzle right now where all of the

[00:27:59] edges are white, but all of the inside has these very

[00:28:03] striking colors.

[00:28:05] So in that case, you can separate them,

[00:28:07] but don't do the border first.

[00:28:09] Start with the colors.

[00:28:10] That's another big lesson is always like find the

[00:28:13] toehold, find what is the easiest way in.

[00:28:16] What's the weak spot and then attack it from there.

[00:28:20] So, what other places did you go to?

[00:28:24] You mentioned to me earlier something that sounded really

[00:28:28] interesting about longtermism.

[00:28:30] What does that mean and where did you find out about it?

[00:28:32] Well, that's not puzzle-related,

[00:28:35] which is good because my publisher will kill me for

[00:28:37] talking too much about puzzles.

[00:28:39] But this was the most fascinating conference I've ever

[00:28:43] been to, no exaggeration.

[00:28:46] And it was about a month ago.

[00:28:47] It was in England and it was with about 200 of the

[00:28:51] smartest people I've met.

[00:28:53] And I don't know why I was invited,

[00:28:57] but I was happy to be there.

[00:28:59] And it was called, the group is called Long View.

[00:29:02] And it's sort of sci-fi meets humanitarian work.

[00:29:09] So the idea is the future could be very,

[00:29:15] very good or it could be very, very bad.

[00:29:19] It's, you know, it could be a dystopia.

[00:29:22] Like the dystopias are a real possibility.

[00:29:25] So let me back up a second and ask you,

[00:29:27] like you were invited, were you invited to speak

[00:29:30] or just attend?

[00:29:32] There were no speakers.

[00:29:33] It was all just meetings.

[00:29:35] So you had experts in pandemics and biosecurity

[00:29:41] and nuclear war and AI.

[00:29:45] AI is something that they're very concerned about,

[00:29:48] like robots taking over, which sounds funny

[00:29:52] and like Hollywood, but, and it won't look

[00:29:56] like the Terminator if it happens,

[00:29:58] but it is a real threat.

[00:30:00] Like this is not something that we should take lightly.

[00:30:03] So the idea was how can we try to not just think

[00:30:09] about ourselves, but think about our 17th great-grandkids?

[00:30:13] And how can we convince other people that this is something

[00:30:17] that they should do because it's a hard sell.

[00:30:19] Like, you know, if you, if you see a commercial for,

[00:30:23] you know, a kid in Africa who doesn't have enough

[00:30:27] to eat or has a disease,

[00:30:30] that is visceral, of course you're going to give money.

[00:30:33] Of course you're going to like feel remorse

[00:30:37] and guilt and you're going to try to make their lives better.

[00:30:41] But how do you get people to care about someone

[00:30:45] who's not born yet, who you can't see,

[00:30:47] who won't be around for another 10,000 years?

[00:30:50] That is a huge challenge.

[00:30:53] Well, what are some of the issues that people

[00:30:55] are worried about for like 100 years from now?

[00:30:57] And by the way, was there ever a time

[00:30:59] when people were not both worried about dystopia

[00:31:05] and at the same time another group of people were also saying,

[00:31:09] no, it's going to be heaven on earth in 100 years?

[00:31:11] Well, that is a great point.

[00:31:13] That is a great point.

[00:31:15] The argument here is that we might actually be

[00:31:18] in what's called a hinge century,

[00:31:20] that we might be in the most important 100 years

[00:31:23] just because the rate of technological change

[00:31:29] has gotten faster and faster

[00:31:31] and it is just startlingly fast.

[00:31:33] And that AI, if it happens,

[00:31:35] if you get generalized AI

[00:31:38] where it's actually something that we create

[00:31:43] that can think like a human,

[00:31:45] that is going to be the biggest technological change in history,

[00:31:49] like bigger than cars, bigger than computers.

[00:31:53] So yeah, I guess there are a few things

[00:31:55] that they think about.

[00:31:57] One is nuclear war, which is to me so baffling.

[00:32:01] Like we were obsessed with it in the 50s and 60s.

[00:32:04] We had, you know, PSAs with a turtle saying duck

[00:32:09] and cover, get under your desk.

[00:32:13] And then in the 80s, we had this terrifying mini series,

[00:32:19] what was it called?

[00:32:21] About the end of the world.

[00:32:23] And now, we can still blow up the world

[00:32:28] a thousand times over and no one talks about it.

[00:32:31] It's so crazy to me.

[00:32:33] Maybe they've gotten a lot more serious about tracking

[00:32:37] the old Soviet Union's nuclear weapons

[00:32:40] and there's less nuclear material

[00:32:43] or like weapons grade radioactive material

[00:32:46] in order to make a nuclear bomb

[00:32:48] but arguably your point about technology is relevant here

[00:32:51] which is that it's probably easier to make a nuclear weapon now

[00:32:54] than it's ever been before.

[00:32:56] I mean, I'm not an expert but according to these people

[00:32:59] that I was with, yeah, we should not be letting up our guard

[00:33:03] about this.

[00:33:04] We should be instituting systems that make it really hard

[00:33:07] to obtain this material and to launch it.

[00:33:10] And what's crazy,

[00:33:12] and this one, James, blew me away, is all these stories

[00:33:16] of the near misses of how the world almost blew up

[00:33:21] and that are totally under publicized.

[00:33:24] Like, do you know the story of Petrov?

[00:33:26] This was in the 80s and he was a Soviet army officer

[00:33:31] who was in charge basically of greenlighting

[00:33:36] a nuclear retaliation on the United States.

[00:33:40] And one day on his radar it showed that there were

[00:33:44] five missiles from the United States coming.

[00:33:47] And if he had followed his instructions,

[00:33:50] if he had done what he was supposed to do,

[00:33:53] he would have told the head general to launch a retaliatory strike.

[00:33:59] But he just had this feeling like, well, maybe it's a mistake,

[00:34:03] maybe it's...

[00:34:04] And thank God he didn't do it and it turned out

[00:34:07] it was a mistake that there was some weird light reflected.

[00:34:11] But I...

[00:34:12] There's a movement to start something called Petrov Day

[00:34:15] where every...

[00:34:16] I think it's September 24th.

[00:34:18] We remember how close we came to ending the world

[00:34:22] because we just take it for granted that

[00:34:26] we haven't had a nuclear war and there's never going to be a nuclear war.

[00:34:30] But this is still around. This is a huge deal.

[00:34:33] So what's another near miss?

[00:34:35] Were there other ones?

[00:34:37] Oh yeah, there were a bunch.

[00:34:38] There's one with a submarine that...

[00:34:40] It just so happened that one of the Soviet...

[00:34:44] It always seems to be the Soviets, but I'm sure there were American problems too.

[00:34:48] We're better at keeping it secret.

[00:34:50] Exactly.

[00:34:52] So yeah, there are several near misses and it's terrifying.

[00:34:57] Actually, how do we know that Petrov story is true?

[00:35:00] Like, did they put out a press release?

[00:35:02] "Petrov saved the world by not launching a nuclear attack"?

[00:35:06] Well, we didn't know about it for years.

[00:35:08] It was just...

[00:35:09] Yeah, it was very recently, a little before he died,

[00:35:13] that it was released.

[00:35:16] I forget who uncovered it, but someone interviewed him and he told the story.


[00:36:24] So how many nuclear weapons from the old Soviet Union

[00:36:27] are not accounted for at the moment?

[00:36:29] Like, are they missing nuclear weapons?

[00:36:31] Great question. I wish I knew more about that.

[00:36:34] You know, that's why I went to the conference as someone who could get the word out

[00:36:40] about how we should be thinking about the future.

[00:36:43] And also, I just want to stress, it wasn't all doom and gloom.

[00:36:47] Like, the idea is the future could be the most amazing,

[00:36:51] it could be like heaven on earth.

[00:36:53] It could be, you know, no one has to work.

[00:36:56] You just do what you want, you know, you follow your creative bliss.

[00:37:00] You do puzzles all day. You play chess.

[00:37:03] You know, it could be... We can't even imagine how good life could be.

[00:37:08] It's almost like saying to, you know, an ape,

[00:37:12] imagine if like you had all the bananas you could eat.

[00:37:16] Yeah, but you couldn't explain to an ape

[00:37:19] imagine that there are movies and TV, you know,

[00:37:25] the ape just couldn't comprehend it.

[00:37:27] We cannot comprehend how good the future could be.

[00:37:31] So part of it is getting that point across

[00:37:34] and part of it is avoiding that we blow each other up.

[00:37:37] Now, but it does seem that technology evolves roughly at the same pace as the problems.

[00:37:44] So for instance, a classic example which I've mentioned on this podcast before,

[00:37:49] it's actually mentioned in the book, I think, Super Freakonomics,

[00:37:52] around the year 1900,

[00:37:55] horse manure was such a big problem in New York because everybody traveled with horses

[00:38:00] that they had to shovel out like, I don't know,

[00:38:03] 20 inches of manure that covered all the streets in the entire city every single day

[00:38:08] and dump it over to New Jersey or the ocean or whatever.

[00:38:11] And of course, everybody thought the environmental

[00:38:15] disaster of 1900 was that horse manure

[00:38:19] was going to bury the world.

[00:38:21] And suddenly though, cars were invented and that solved the problem.

[00:38:26] And so in general, it seems like technology has kept pace with problems in,

[00:38:32] you know, like overpopulation, oh, we created GMOs

[00:38:36] to feed the extra two or three billion people that exist.

[00:38:40] Even climate change, like solar power has been improving exponentially.

[00:38:44] Nuclear fusion, if we ever switch to that would provide clean energy

[00:38:48] for thousands of years easily.

[00:38:50] So coming out of this conference, what do you view as a real problem?

[00:38:54] And also talk about what the AI problem is.

[00:38:56] Sure.

[00:38:57] And I love your point.

[00:38:58] I love your point.

[00:38:59] I think that is generally true,

[00:39:02] but there is absolutely no guarantee that that will continue,

[00:39:05] that we'll have our solutions.

[00:39:07] And I would say that with the environmental crisis,

[00:39:12] I'm not sure that we have the technology yet

[00:39:17] or the political will to stop that from being a real catastrophe.

[00:39:23] I hope we do.

[00:39:24] But anyway, yeah, the AI, the idea is that it wouldn't be the Terminator,

[00:39:29] it wouldn't be robots, it wouldn't be like a Will Smith movie.

[00:39:33] But the fear is that we create these AIs that are so smart

[00:39:38] and that their goals are not aligned with human goals

[00:39:44] and that that could be a disaster.

[00:39:46] And let me give you one of the most famous fables in AI

[00:39:50] when people think about how AI could go wrong.

[00:39:54] Suppose you programmed an AI,

[00:39:58] and you gave it the goal of making paper clips

[00:40:02] and said, make as many paper clips as you can.

[00:40:06] It's self-learning, so it becomes smarter and smarter

[00:40:10] as it pursues this goal of making paper clips.

[00:40:13] Well, it's not going to just stop at making a paper clip factory.

[00:40:17] If it has unlimited resources, if it has this intelligence,

[00:40:21] it can figure out a way to hack into the grid,

[00:40:24] hack into the internet.

[00:40:26] It's going to start shipping all the metal in the world

[00:40:32] to itself to turn into paper clips.

[00:40:34] It's going to harvest humans to turn them into paper clips.

[00:40:38] It's going to send rocket ships off to other planets

[00:40:41] so that it can get materials to turn into paper clips.

[00:40:44] So the idea, the worry, is that we will not be able to stop it.

[00:40:50] Once we give it a goal,

[00:40:52] that goal has unintended consequences that we can't even imagine.
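The paper clip fable above is a thought experiment, but the failure mode it describes, a proxy objective pursued with no constraints, is easy to sketch. The toy below is my own illustration, not anything from the episode; every name and number in it is invented. The "agent" is told only to maximize paper clips, and nothing in its objective says any other resource matters, so it converts everything it can reach:

```python
# Toy illustration of objective misspecification (hypothetical example).
# The objective counts paper clips and nothing else, so every resource,
# however valuable, is fair game to convert.

def maximize_paperclips(resources):
    """Greedily convert every available resource into paper clips.

    resources: dict mapping resource name -> units of metal-equivalent.
    Returns (paperclip_count, leftover_resources).
    """
    clips = 0
    for name in list(resources):
        clips += resources.pop(name)  # nothing is off-limits to this objective
    return clips, resources

world = {"scrap_metal": 100, "power_grid": 50, "rocket_fleet": 30}
clips, leftover = maximize_paperclips(world)
print(clips, leftover)  # 180 {} -- goal met, everything else consumed
```

The point of the sketch is that nothing bad is programmed in; the damage falls out of what the objective fails to mention.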

[00:40:56] Yeah, so that's interesting.

[00:40:58] So obviously you could program something in

[00:41:01] that you could put boundaries or limits on it.

[00:41:04] And then there's, do you remember Isaac Asimov's

[00:41:07] I, Robot series?

[00:41:09] Didn't he have rules for robots

[00:41:12] that they could never break certain rules?

[00:41:14] So there were three laws of robotics.

[00:41:17] A robot may not injure a human being

[00:41:19] or through inaction allow a human being to come to harm.

[00:41:22] A robot must obey the orders given to it by human beings

[00:41:26] except where such orders would conflict with the first law

[00:41:29] and a robot must protect its own existence

[00:41:31] as long as it doesn't come into conflict

[00:41:34] with the first or second laws.

[00:41:36] So in your situation, the AI would break all three of those rules

[00:41:39] which seem like basic rules you would program

[00:41:41] into any super powerful AI.

[00:41:44] Right, and I do think, I mean that's the goal

[00:41:47] of these people who are in the longtermist movement.

[00:41:50] That's the name of it, longtermism: to figure out

[00:41:53] how to program these things in really safely.

[00:41:56] But it's not so easy.

[00:41:58] I mean imagine if you tell a five year old

[00:42:02] to stop adults from stealing his toys.

[00:42:08] So he sets up some blocks

[00:42:11] and says okay, no one can get over the blocks.

[00:42:16] So we're kind of like, if we get a computer,

[00:42:19] if we have this computer who gets smarter and smarter

[00:42:23] then they're gonna be like the adults.

[00:42:25] They're gonna figure out a way, whether it's bribing the child

[00:42:29] whether it's getting the, just stepping over the blocks

[00:42:32] or getting a bulldozer to mow down the blocks,

[00:42:35] whatever it is, they might be able to find a way around it.

[00:42:40] So I mean, Asimov is right,

[00:42:43] we've got to figure out a way to program this in.

[00:42:46] Because AI could be amazing, it could make the world,

[00:42:49] like I said, you would never have to have

[00:42:52] a crap job again,

[00:42:55] there would be no such thing as a maid or housekeeper

[00:42:59] because AI would do it, it would take care of it.

[00:43:02] Yeah, I mean what are the other bad possibilities from AI?

[00:43:07] And I agree there are many,

[00:43:08] I've talked about this with other guests before,

[00:43:11] but I'm curious what came up in this conference.

[00:43:14] Well one example that really resonated with me

[00:43:17] is because it's not sci-fi, it exists, it happened,

[00:43:20] which is YouTube.

[00:43:22] So YouTube has the algorithm that they want to recommend

[00:43:27] videos that you will click on.

[00:43:30] And so that's the only, that was the goal of this algorithm

[00:43:35] like present you with other videos

[00:43:37] that you're gonna click on and like.

[00:43:39] Unfortunately in reality what happened is

[00:43:42] if you watched a video about, I don't know,

[00:43:47] that maybe NASA didn't land on the moon,

[00:43:52] then it's gonna recommend, hey, if you think that's something,

[00:43:56] you should look at this: the earth is really flat.

[00:43:59] And that the government has been keeping this from you.

[00:44:03] And that has spread misinformation and disinformation

[00:44:09] everywhere.

[00:44:10] We have a thriving flat earth community

[00:44:13] that we didn't have 20 years ago because of this insanity.
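The dynamic described above, an engagement-only objective drifting toward fringe content, can be sketched as a toy greedy recommender. This is purely illustrative: the titles and scores are invented, and this is not YouTube's actual system.

```python
# Toy greedy recommender (hypothetical). The objective is predicted
# engagement only, with no notion of harm, so fringe content that
# scores highest on engagement gets surfaced first.

videos = {
    "cute_cats":           {"engagement": 0.30, "fringe": False},
    "moon_landing_doubts": {"engagement": 0.55, "fringe": True},
    "flat_earth_proof":    {"engagement": 0.70, "fringe": True},
}

def recommend(history, catalog):
    """Return the unwatched video with the highest predicted engagement."""
    unwatched = {k: v for k, v in catalog.items() if k not in history}
    return max(unwatched, key=lambda k: unwatched[k]["engagement"])

history = []
for _ in range(3):
    history.append(recommend(history, videos))

print(history)  # ['flat_earth_proof', 'moon_landing_doubts', 'cute_cats']
```

Because harm never appears in the objective, the highest-engagement items win regardless of what they are.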

[00:44:17] That's really interesting because obviously,

[00:44:22] presenting you with videos that are gonna be harmful to you

[00:44:26] goes against Asimov's first law of robotics,

[00:44:29] but how would you have programmed it in?

[00:44:32] There's no real way because the only way,

[00:44:35] okay, there's a sort of human way of programming

[00:44:38] and there's an AI way of programming it.

[00:44:40] So the AI way is called deep learning

[00:44:42] where the AI figures things out on its own

[00:44:45] but a human could decide what's harmful or not,

[00:44:48] but now we have that problem with censorship

[00:44:51] on all the social media platforms

[00:44:52] when the humans are deciding what to put into the AI

[00:44:56] that's considered harmful.

[00:44:58] But if the AI decides it, how would you do it?

[00:45:00] You'd have to see, okay,

[00:45:02] what videos make someone's blood pressure consistently rise?

[00:45:06] Let's avoid those videos.

[00:45:08] So like anything that makes them fuel this addiction

[00:45:12] to conspiracy theories is gonna make your blood pressure go up,

[00:45:15] your anxiety go up, your cortisol go up and so on.

[00:45:18] So maybe there's some way it could learn

[00:45:20] based on body states before and after watching a video.

[00:45:25] It could learn what's bad,

[00:45:26] what will increase your life and what will decrease your life.

[00:45:29] So too much cortisol will probably decrease your life.

[00:45:31] You need cortisol because you need to be able to react to danger

[00:45:35] but if it's too much, if your body feels like it's constantly in danger

[00:45:39] that'll lead to heart attacks and strokes and so on.

[00:45:42] So I wonder if there's an AI way to learn

[00:45:45] what videos are good or bad for you?
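James's proposal, to judge videos by measured body states before and after viewing, could be sketched like this. It is a hypothetical toy of my own; the stress signal, names, and numbers are all invented for illustration.

```python
# Hypothetical sketch: score each video by the average change in some
# measured stress signal (a cortisol proxy) before vs. after viewing,
# and filter out videos that consistently raise it.

def stress_delta(sessions):
    """Mean (after - before) stress reading across viewing sessions."""
    return sum(after - before for before, after in sessions) / len(sessions)

def safe_videos(measurements, threshold=0.0):
    """Keep videos whose average stress delta stays at or below threshold."""
    return [video for video, sessions in measurements.items()
            if stress_delta(sessions) <= threshold]

measurements = {
    "nature_doc":       [(0.5, 0.4), (0.6, 0.5)],  # stress goes down
    "conspiracy_binge": [(0.5, 0.9), (0.4, 0.8)],  # stress goes up
}

print(safe_videos(measurements))  # ['nature_doc']
```

A real system would need far more than one stress signal, of course; this just makes the proposed mechanism concrete.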

[00:45:47] Well yeah, it is hard.

[00:45:49] That's why these people are there

[00:45:51] to wrestle with these deep questions

[00:45:53] and I'm not an expert.

[00:45:55] I was brought in just to give my thoughts on communication.

[00:46:00] There's some amazing people that you should have on the podcast

[00:46:03] like Will MacAskill and Toby Ord are both philosophers at Oxford

[00:46:08] who think about this stuff all the time.

[00:46:11] But yeah, even what you presented is very hard.

[00:46:14] How do we define what's good and bad?

[00:46:17] Because you can imagine building an AI where you said,

[00:46:19] your job is to recommend videos that lower cortisol levels,

[00:46:24] but then it becomes through deep learning,

[00:46:28] it becomes smarter and smarter

[00:46:30] and eventually it's got you on an operating table

[00:46:33] taking out your amygdala to get rid of all your cortisol.

[00:46:37] So it's a deep problem.

[00:46:41] Luckily there are people thinking about it

[00:46:44] but we need tons more of creative, courageous people

[00:46:49] looking into these massive problems.

[00:46:52] And I'd recommend people watch or re-watch

[00:46:55] the podcast I did with Kai-Fu Lee who wrote a book called AI 2041

[00:46:59] that explores all of these issues in a fictional setting

[00:47:02] of what could occur 20 years from now

[00:47:05] because of AI, because of genomics

[00:47:07] and because of all sorts of things

[00:47:10] that he predicts in that book.

[00:47:12] It's really fascinating

[00:47:13] and it sounds like he would have been an excellent addition

[00:47:15] to this conference.

[00:47:17] What else did you guys talk about?

[00:47:20] Well, there are, you know, pandemics.

[00:47:28] How do we prevent another one,

[00:47:32] and how do we prevent human-made pandemics?

[00:47:32] You know, it's not clear what really happened

[00:47:35] with this pandemic but in the future

[00:47:38] it'll just get easier and easier to create horrible pandemics.

[00:47:41] I met one great guy who is working on...

[00:47:46] his idea is, you know, we spend billions of dollars

[00:47:50] fireproofing our buildings.

[00:47:52] Shouldn't we spend some money pandemic-proofing our buildings?

[00:47:57] Like if our buildings and houses had better air filtration

[00:48:02] this pandemic would have been so much milder.

[00:48:05] Like this pandemic does not spread outside.

[00:48:09] If we can get the same level of air being recycled

[00:48:14] inside as outside,

[00:48:16] we would not have had this,

[00:48:18] it would have been a minor problem,

[00:48:24] not this massive world-stopping problem.

[00:48:26] Is that true that pandemics don't spread outside?

[00:48:28] What if you're at like an outdoor wedding

[00:48:31] that... I mean weren't they worried initially

[00:48:33] that outdoor weddings were super spreader events?

[00:48:36] Yeah, you shouldn't take my word for it,

[00:48:38] definitely look at the statistics from the CDC

[00:48:42] I mean my understanding is that it is much, much harder.

[00:48:47] It still can spread outside but this...

[00:48:49] and it depends on the virus but this particular variant

[00:48:53] that we're dealing with now is much, much harder

[00:48:58] to spread outside than it is inside.

[00:49:01] Yeah, so okay, I'm not so much worried.

[00:49:03] Like obviously I'm worried about pandemics

[00:49:05] because they kill a lot of people.

[00:49:07] With COVID, I don't know how many deaths there are,

[00:49:09] but it's a huge number.

[00:49:11] But a pandemic probably can't destroy the world.

[00:49:14] It can't kill everybody because once it starts killing people

[00:49:18] the virus ends with the person who's dead.

[00:49:21] So if it kills very quickly,

[00:49:24] like Ebola kills people very quickly,

[00:49:26] Ebola doesn't really spread worldwide.

[00:49:29] It's a very localized virus when it comes out

[00:49:32] because everybody who can transmit it dies right away.

[00:49:35] I think the thing about COVID was that

[00:49:37] you don't exhibit symptoms for about 10 days

[00:49:40] so you're able to transmit it.

[00:49:42] So maybe, I guess maybe the perfect...

[00:49:45] and you know, the viruses that spread the most

[00:49:48] are the ones that have no symptoms at all.

[00:49:50] Like we all have thousands of viruses in our body right now

[00:49:53] but they just don't exhibit any symptoms

[00:49:55] and those probably spread to everybody in the world.

[00:49:57] So there's some spectrum there

[00:49:59] but I wonder maybe the perfect storm of viruses

[00:50:02] if you could create a virus where there's no symptoms

[00:50:05] for 200 days and then suddenly within one day you die.

[00:50:09] That is a terrifying thought

[00:50:11] and I will say, like you said

[00:50:13] it may not cause human extinction

[00:50:16] but it could do so much damage that it would be catastrophic

[00:50:20] and set us back thousands of years

[00:50:22] and we would be living in sort of a societal collapse

[00:50:26] like you see in the movies

[00:50:28] And that is one of the other big topics, this idea of lock-in.

[00:50:31] How do we avoid locking into a society

[00:50:34] that is really bad like an authoritarian lock-in

[00:50:37] where it becomes so hard for...

[00:50:41] you know, the American Revolution was good

[00:50:45] and you know, the fact that we beat the...

[00:50:48] Why?

[00:50:50] Why is it good?

[00:50:52] Yeah. I think this is one of those things we assume

[00:50:55] was a good move.

[00:50:55] Well we can talk about that on our upcoming podcast

[00:50:59] Good or Bad

[00:51:00] Oh yeah.

[00:51:02] And it is a fascinating issue whether

[00:51:04] like maybe we should have taken Canada's route

[00:51:07] but regardless of putting that aside

[00:51:09] the fact that we were able to wage a successful revolution

[00:51:13] against England

[00:51:16] that is not a given in the future

[00:51:18] when you've got an authoritarian government

[00:51:22] that controls all, you know, that has massive power

[00:51:26] so how do we avoid that?

[00:51:28] How do we keep democracy safe?

[00:51:30] Well let me ask you a question

[00:51:32] how do you even know what's insidious about authoritarian governments

[00:51:36] like whether it's Soviet Russia or Venezuela under Chávez

[00:51:41] or you know, there's hundreds of examples

[00:51:45] it's very... or Nazi Germany

[00:51:47] it's very insidious because at first

[00:51:50] everyone's excited for the new leadership

[00:51:53] the new leadership says everybody should have everything

[00:51:57] and we're going to bring us into a new dawn

[00:52:00] everyone's going to have everything they ever wanted

[00:52:02] and of course bad things happen

[00:52:06] you know, under Mao Tse-tung, who was very popular,

[00:52:11] popularly brought in, you know, became the leader of China

[00:52:14] and of course he killed 40 million people

[00:52:17] similar to Stalin, similar to what Hitler did

[00:52:21] and how do you know when authoritarianism is creeping up?

[00:52:26] So for instance in the US last year

[00:52:28] every single business in the country was shut down for months

[00:52:32] causing maybe a million small businesses to go bankrupt

[00:52:37] and many people to commit suicide, have depression

[00:52:41] and all of this was maybe a good thing

[00:52:43] because it prevented... maybe it saved millions of lives

[00:52:46] because COVID didn't spread

[00:52:48] so then what was authoritarian and what wasn't, even in that scenario?

[00:52:52] That's a big question and I don't have all the answers of course

[00:52:56] but I would say one safeguard is making sure

[00:53:00] that we limit misinformation and disinformation

[00:53:03] which is, you know, I thought when the internet came

[00:53:06] I was like oh my god this is it, this is the solution

[00:53:09] everyone is going to have access to all the correct knowledge

[00:53:14] but of course it went possibly the opposite way

[00:53:19] where we have access to all the bad knowledge

[00:53:22] all the incorrect knowledge

[00:53:24] and the incorrect knowledge is more exciting

[00:53:26] you know, it's much more fun to think that

[00:53:29] you know, a bunch of Mafiosi and Cubans united to kill JFK

[00:53:33] than that it was just one loser with a bad haircut

[00:53:36] so yeah I... but I think that what...

[00:53:40] that is one safeguard against authoritarianism

[00:53:43] is just trying to make sure that we have institutions

[00:53:48] that are self-correcting and that provide real information

[00:53:53] in context and that admit when they make mistakes

[00:53:58] because authoritarians never admit that they made any mistakes

[00:54:02] But I wonder if like when in US history has a president

[00:54:06] and I'm just... I'm playing the devil's advocate

[00:54:08] but like when in US history has a president ever admitted

[00:54:11] he made a mistake?

[00:54:13] That's a good question I gotta look back

[00:54:17] I don't have it off the top of my head

[00:54:19] my sense is that Obama sometimes

[00:54:22] and I think with the Bayes suit he said he made a mistake

[00:54:26] but I do think there were times when he admitted he made a mistake

[00:54:30] but it's a good question

[00:54:32] I think also that is...

[00:54:35] that's why I think science is the best route to knowledge

[00:54:39] because it is self-correcting

[00:54:41] a good scientist will change their mind

[00:54:45] they will look at the evidence

[00:54:47] and not all do... you know, scientists can be corrupt

[00:54:51] and refuse to admit they're wrong

[00:54:53] but good science is to me a whole different level of knowledge

[00:54:59] of epistemology

[00:55:01] That's a great point because if you look again at this past year

[00:55:04] because we didn't know that much about COVID in the beginning

[00:55:07] science did change its mind several times

[00:55:10] about what specifically is not that important

[00:55:13] but I noticed, A, people started mistrusting science

[00:55:19] when they would change their mind

[00:55:21] so people kind of need to be a little bit more knowledgeable

[00:55:23] about what it means to be a skeptic

[00:55:25] and how most good science comes out of being skeptical of bad science

[00:55:29] so science isn't always 100% correct

[00:55:32] like you say it's self-correcting

[00:55:34] and policy and public opinion have to acknowledge that

[00:55:38] you can't make a policy based on one set of scientific facts

[00:55:41] and then see those facts change as new discoveries develop

[00:55:45] for instance in the beginning of the pandemic

[00:55:47] we were told not to wear masks, that it wouldn't be a big deal

[00:55:50] now we were told that for many reasons

[00:55:52] maybe some not scientific, but then it became clear

[00:55:54] you have to wear a mask to avoid... to best avoid transmission

[00:55:58] so okay and then I was talking to one person

[00:56:01] who's an expert on the Constitution

[00:56:03] and he said the Constitution says we're all entitled to life, liberty

[00:56:07] and the pursuit of happiness

[00:56:09] or maybe it was even the pursuit of property

[00:56:13] I forget but...

[00:56:14] Well that's the original that's John Locke's version

[00:56:17] Yeah so but this person pointed out that life comes first

[00:56:20] and then things like property come after that

[00:56:24] so that's why the lockdowns could be constitutionally justified

[00:56:27] because it helped... the scientific argument was that it preserved life

[00:56:31] but then you have to really start to ask

[00:56:33] well how much life did it preserve

[00:56:35] and how much life did it destroy?

[00:56:38] like because in New York City right now as we're speaking

[00:56:41] you can't have elective surgeries

[00:56:43] because people are worried about the Omicron variant

[00:56:46] it's much more transmissible

[00:56:48] but I don't know if it causes death or even serious sickness

[00:56:51] no one's said

[00:56:53] so we kind of have to weigh things

[00:56:56] but now if you block elective surgeries

[00:56:59] well now you're blocking...

[00:57:02] what's it called when you open up the body

[00:57:05] to see if they have cancer? I forget

[00:57:08] Biopsies?

[00:57:09] Yeah yeah now you can't have biopsies

[00:57:11] because that's considered elective surgery

[00:57:13] so some people might die because of this policy

[00:57:15] even if nobody's dying from the Omicron variant

[00:57:18] I think somehow even that process

[00:57:21] has to be self-correcting

[00:57:23] now maybe you should ban elective surgeries now

[00:57:25] I'm not saying one way or the other

[00:57:26] maybe the Omicron variant is the worst thing

[00:57:28] that's ever happened in the world

[00:57:30] but everybody has to understand

[00:57:33] how to weigh the pros and cons

[00:57:34] and I'm afraid people just don't know what that means

[00:57:37] we've never been taught statistics in school

[00:57:40] we learn trigonometry or geometry

[00:57:43] or I don't know linear algebra

[00:57:45] which is useless for daily life

[00:57:47] but basic statistics is incredibly useful

[00:57:49] and we're never actually taught that in school

[00:57:51] most people aren't taught that

[00:57:53] I'm not going to weigh in on the COVID stuff

[00:57:58] because I haven't studied enough to know

[00:58:00] you know I am very pro-mask and pro-indoors anyway

[00:58:05] I am very pro-indoors just in life in general

[00:58:08] and I have no argument when someone wants me

[00:58:11] to wear a mask I'll wear a mask

[00:58:13] I have no... you see people on airplanes

[00:58:15] get into fights like just relax

[00:58:17] and fly in the air and get to your location

[00:58:20] but I do want to agree with you 100%

[00:58:23] about statistics that to me is so crucial

[00:58:27] we under... you know we focus on the one

[00:58:31] cute girl in the well which is terrible

[00:58:35] she fell down a well that's terrible

[00:58:37] but let's also keep in mind the hundreds of thousands

[00:58:40] of people whose lives are endangered by malaria

[00:58:43] and there's actually a quote from

[00:58:45] one of my favorite philosophers Bertrand Russell

[00:58:47] and he didn't actually say these exact words

[00:58:51] but I like it better the way that

[00:58:54] he supposedly said it which was

[00:58:56] he said the mark of a truly advanced person

[00:59:02] is that they can look at a column of numbers

[00:59:06] and weep which is very hard to do

[00:59:09] statistics do not resonate emotionally

[00:59:12] like I said you know the picture

[00:59:14] of the cute girl down the well

[00:59:16] that resonates emotionally

[00:59:18] but reading a statistic like you know

[00:59:21] 500,000 people per year die of malaria

[00:59:24] that does not resonate emotionally

[00:59:26] but we should fight that

[00:59:29] we should try to make it resonate emotionally

[00:59:31] and that's why I love what you said

[00:59:34] about statistics we've got to teach

[00:59:36] more statistics in schools

[00:59:38] Well you make a good point because if you look at evolution

[00:59:40] I mean for most of

[00:59:43] the existence of humans

[00:59:45] we didn't know how to count further than 10

[00:59:48] but we did communicate information

[00:59:52] generation after generation through storytelling

[00:59:55] so our brains are built for stories

[00:59:57] and not necessarily numbers

[00:59:59] but I just wonder why did they decide

[01:00:01] that the standard curriculum in schools

[01:00:03] would include trigonometry but not probability

[01:00:05] when probability is how we make

[01:00:07] almost every decision in our lives

[01:00:09] oh my goodness

[01:00:11] you should create like a curriculum

[01:00:13] so with math I would say

[01:00:15] you learn everything you need

[01:00:17] up until you get good at probability and statistics

[01:00:19] so that means no calculus

[01:00:21] no trigonometry

[01:00:23] no advanced algebra

[01:00:25] probably very little geometry actually

[01:00:27] unless you want to be like a construction

[01:00:29] you know builder or whatever

[01:00:31] or an architect

[01:00:33] but everybody needs to learn probability

[01:00:35] which is not taught in any grade school

[01:00:37] I love that idea

[01:00:39] What would you teach in science?

[01:00:41] Well I would teach

[01:00:43] like you said the scientific method

[01:00:45] just knowing that

[01:00:47] it's not perfect

[01:00:49] but it is the best way

[01:00:51] and

[01:00:53] I love it, but yeah

[01:00:55] the probability is huge

[01:00:57] let's teach our kids

[01:00:59] I want to pick it

[01:01:01] well think about it before the scientific method

[01:01:03] you need to learn probability

[01:01:05] you need to know what's significant

[01:01:07] when a control group acts one way

[01:01:09] but the test group acts another way

[01:01:11] is it meaningful? Well you only know that through statistics

[01:01:13] or what if you're not using a control group

[01:01:15] like with COVID now it's too early

[01:01:17] to have a real control group

[01:01:19] and test groups so you have to look at a population sample

[01:01:21] which is a different type of scientific method

[01:01:23] and you have to understand what statistically

[01:01:25] significant means there
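For readers who want to see what "statistically significant" means in practice, here is a minimal sketch of the control-group-versus-test-group comparison described above, as a two-proportion z-test in Python. The trial numbers and the helper function are invented for illustration, not from the episode.

```python
import math

def two_proportion_z_test(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test: do two group rates differ by more than chance
    (e.g. illness rates in a control group vs. a test group)?"""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    # Pool the groups to estimate the common rate under "no difference"
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal tail
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical trial: 30 of 200 controls fall ill vs. 15 of 200 treated
z, p = two_proportion_z_test(30, 200, 15, 200)
print(f"z = {z:.2f}, p = {p:.3f}, significant at 5%: {p < 0.05}")
```

With these made-up numbers the difference clears the conventional 5% threshold; halve the sample sizes and it no longer would, which is exactly the intuition the conversation is pointing at.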

[01:01:27] What would you teach in history?

[01:01:29] In history?

[01:01:31] Well I would teach

[01:01:33] world history through

[01:01:35] several different lenses

[01:01:37] so you can look at history

[01:01:41] through

[01:01:43] class warfare

[01:01:45] you can look at history through

[01:01:47] race but you can also look

[01:01:49] at history through

[01:01:51] medical advances or sickness

[01:01:53] or you can look at history

[01:01:55] through

[01:01:57] technological advances

[01:01:59] and lifespan

[01:02:01] and so there are many ways to look at history

[01:02:03] it should not be a

[01:02:05] single prism

[01:02:07] that we say. Yeah you're right

[01:02:09] for the past 20 or 30 years

[01:02:11] there's been a kind of popular type of

[01:02:13] book like you ever

[01:02:15] see the book Salt? Yeah I love that

[01:02:17] book yeah. So that's looking at

[01:02:19] world history for the past 5000 years

[01:02:21] through the use of salt

[01:02:25] and Steven Johnson

[01:02:27] has got a lot of books like this

[01:02:29] about air conditioning. If you look at the history of air conditioning

[01:02:31] you could describe the entire

[01:02:33] history of urbanization

[01:02:35] and technological

[01:02:37] advances in the US because

[01:02:39] you literally couldn't work in the South

[01:02:41] or have skyscrapers in the South

[01:02:43] until there was air conditioning

[01:02:45] Oh I love that. Well

[01:02:47] and then the other thing is one that you mentioned earlier

[01:02:49] like the horsemen or I

[01:02:51] would stress just how

[01:02:53] terrible the past used to be

[01:02:55] it was not you know the good old days

[01:02:57] they were not good they were

[01:02:59] dangerous, violent,

[01:03:01] sexist, smelly. So we've got a lot

[01:03:03] of problems now

[01:03:05] as I've said you know we're facing

[01:03:07] some serious problems

[01:03:09] but we should gain confidence

[01:03:11] from looking at history to see that

[01:03:13] we can make progress and

[01:03:15] progress can be a real thing

[01:03:17] we can stop suffering

[01:03:19] so that would be another big one

[01:03:21] because sometimes people read

[01:03:23] history and it's like

[01:03:25] oh humans are so terrible

[01:03:27] let's just give up

[01:03:29] So what really impressed you

[01:03:31] about this long-termism

[01:03:33] conference? What's your biggest takeaway

[01:03:35] how has your life improved

[01:03:37] by going to this conference?

[01:03:39] Well there's a phrase

[01:03:41] that was used for one

[01:03:43] of the meetings that I

[01:03:45] moderated. I wish I'd come up with

[01:03:47] the phrase but it was

[01:03:49] the phrase is the opposite

[01:03:51] of cold and calculating

[01:03:53] you know how people say

[01:03:55] so cold and calculating.

[01:03:57] Warm and calculating

[01:03:59] let's all be warm and calculating

[01:04:01] why should compassion

[01:04:03] and logic and statistics be at odds?

[01:04:05] No we can

[01:04:07] be compassionate

[01:04:09] and the fact is that if humans

[01:04:11] exist, if humans exist

[01:04:13] as long as the average mammalian

[01:04:15] species exists then we are

[01:04:17] at the very very

[01:04:19] start and we

[01:04:21] have had just a tiny

[01:04:23] percentage of all the humans there will ever be

[01:04:25] there are billions, trillions

[01:04:27] of humans in the future

[01:04:29] and it's our responsibility

[01:04:31] to try to look out

[01:04:33] for them you know it's

[01:04:35] you know the golden rule is

[01:04:37] to treat your neighbor

[01:04:39] as you would have your neighbor

[01:04:41] treat you but the titanium

[01:04:43] rule in my opinion

[01:04:45] because titanium is kind of futuristic

[01:04:47] is treat your future

[01:04:49] neighbors, your descendants

[01:04:51] like you would have them

[01:04:53] treat you and that can mean

[01:04:55] you know 100 years or

[01:04:57] 1000 years whatever so

[01:04:59] that's it to me it was just

[01:05:01] an amazing way of looking at the world

[01:05:03] and bringing it full

[01:05:05] circle obviously the question

[01:05:07] what could go wrong

[01:05:09] and if it could go wrong how to

[01:05:11] solve it and what could go right over the next

[01:05:13] 100 years these are huge puzzles

[01:05:15] and you need a puzzle-like mind

[01:05:17] and that kind of creativity

[01:05:19] like everyone says oh we're going to just

[01:05:21] continue to improve and AI is going to be

[01:05:23] great and quantum computing is going to be

[01:05:25] great and nanotechnology is going to be great

[01:05:27] if you just say that and don't look at

[01:05:29] the opposite and at least consider it

[01:05:31] you could be in trouble like nanotechnology

[01:05:33] could run amok and create

[01:05:35] you know, every atom-like

[01:05:37] computer could create another atom-like

[01:05:39] computer, or it could create two

[01:05:41] atom-like computers, which could create two more

[01:05:43] and it goes on until it covers the entire

[01:05:45] planet with atomic-size

[01:05:47] computers and we all die in seconds

[01:05:49] so even though nanotechnology

[01:05:51] is great for fashion and

[01:05:53] computing and all these other things

[01:05:55] it could cause problems
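The runaway-replication scenario above is ordinary exponential growth, and the arithmetic behind it is worth seeing: covering a planet takes surprisingly few doublings. A quick sketch, where the 1e50 figure for atoms in the Earth is a rough, commonly cited order of magnitude assumed for illustration:

```python
import math

EARTH_ATOMS = 1e50  # rough order of magnitude, assumed for illustration

# Start from one self-replicating machine that doubles each cycle:
# how many doubling cycles until the machines outnumber Earth's atoms?
doublings = math.ceil(math.log2(EARTH_ATOMS))
print(doublings)  # 167 -- under 3 hours at one doubling per minute
```

The point is not the exact figure but the shape of the curve: doubling turns "one machine" into "more machines than atoms in the planet" in well under two hundred steps.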

[01:05:57] excellent I love the full circle

[01:05:59] because I totally agree we got to

[01:06:01] approach it like

[01:06:03] the most important puzzle ever

[01:06:05] and do you look long term in your life

[01:06:07] or do you think about what's like right

[01:06:09] now you have a book coming out in April, The Puzzler

[01:06:11] or do you look beyond that

[01:06:13] in your life? That is an awesome question

[01:06:15] I don't really look

[01:06:17] long term in my life as much

[01:06:19] as I try to now

[01:06:21] when I'm thinking of

[01:06:23] doing good you know

[01:06:25] it's not just

[01:06:27] doing you know

[01:06:29] there are a lot of problems now

[01:06:31] but let's also give some

[01:06:33] thought to the problems yet to come

[01:06:35] so

[01:06:37] I will say it has changed

[01:06:39] the way I think about probability in

[01:06:41] my life because I do like

[01:06:43] you said I try to incorporate

[01:06:45] probability you know we often

[01:06:47] think something is going

[01:06:49] to happen or isn't, you know, it's very

[01:06:51] black and white but when my wife says

[01:06:53] you know what time will you be home I'll be

[01:06:55] like there's a 70% chance

[01:06:57] I'll be home at around 6:30

[01:06:59] and that you know she rolls her eyes

[01:07:01] there's a 99% chance she's

[01:07:03] annoyed by that but I think we've all

[01:07:05] got to start thinking in probabilities

[01:07:07] because that is

[01:07:09] going to make the world a lot better

[01:07:11] but the unfortunate thing about

[01:07:13] what you just did, although it's probably accurate, is

[01:07:17] that when humans

[01:07:19] assess probabilities without actual data

[01:07:21] we tend to be wrong most of the time

[01:07:23] so you ever read Philip Tetlock's work

[01:07:25] on forecasting? Oh yeah

[01:07:27] I love him, he is a hero of the

[01:07:29] long-termist movement

[01:07:31] yeah because basically he says almost everybody

[01:07:33] who makes forecasts

[01:07:35] for a living is wrong almost all the

[01:07:37] time. Right, but the

[01:07:39] upside is you can be trained

[01:07:41] to be a better forecaster and there are

[01:07:43] these super forecasters who use

[01:07:45] these various tricks

[01:07:47] to figure out

[01:07:49] probabilities that are better than the

[01:07:51] so-called professionals. Excellent

[01:07:53] topic for a podcast actually

[01:07:55] I mean, how to be a super forecaster
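As a concrete taste of how forecasting skill gets measured in Tetlock's line of work, here is a sketch of the Brier score, the standard accuracy metric for probabilistic forecasts; the example forecasts are invented, not from the episode.

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes:
    0.0 is perfect; 0.25 is what always answering '50%' earns."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# (probability assigned, what happened: 1 = it occurred, 0 = it didn't)
sharp = [(0.9, 1), (0.8, 1), (0.1, 0)]   # confident and mostly right
hedged = [(0.5, 1), (0.5, 1), (0.5, 0)]  # always 50-50
print(round(brier_score(sharp), 3))   # 0.02
print(round(brier_score(hedged), 3))  # 0.25
```

A "70% chance I'll be home at 6:30" forecast can be scored the same way over many dinners, which is exactly how superforecasters find out whether their stated probabilities mean anything.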

[01:07:57] well AJ

[01:07:59] there's some takeaways, one is

[01:08:01] we're going to get you

[01:08:03] 200 puzzle books by AJ

[01:08:05] Jacobs within the next year or so

[01:08:07] we need to develop a new academic

[01:08:09] curriculum that focuses on quality

[01:08:11] of life instead of just the same standard

[01:08:13] curriculum that we've had for the past 200 years

[01:08:15] and

[01:08:17] I think understanding skepticism

[01:08:19] vis-a-vis long-termism

[01:08:21] could help really solve

[01:08:23] problems in society like again

[01:08:25] you know and I think it was in 1972

[01:08:27] someone wrote a book that said

[01:08:29] that England won't exist anymore

[01:08:31] because of overpopulation

[01:08:33] you know

[01:08:35] by 1980, they said, and of course

[01:08:37] that didn't happen and

[01:08:39] people are making forecasts about the end of

[01:08:41] the world every decade

[01:08:43] since the 50s or 60s

[01:08:45] whether it's nuclear war or climate change or whatever

[01:08:47] and almost every forecast has been wrong

[01:08:49] although to your point which I think is correct

[01:08:51] the rate of technological

[01:08:53] change cannot

[01:08:55] be ignored

[01:08:57] that's what makes this time different

[01:08:59] because before 1900 there was

[01:09:01] not really

[01:09:03] technological change like for a thousand years

[01:09:05] things stayed the same

[01:09:07] but now everything

[01:09:09] is changing and the faster

[01:09:11] things change, the faster there's

[01:09:13] more technological change, because computers are getting

[01:09:15] faster and faster and they're able

[01:09:17] to sequence genomes faster

[01:09:19] which means we could start cloning which means

[01:09:21] we could who knows what else

[01:09:23] like everything is growing exponentially

[01:09:25] so

[01:09:27] it's all to say we've got to be careful

[01:09:29] and maybe

[01:09:31] spend time studying puzzles

[01:09:33] more definitely

[01:09:35] and buying my book when it comes out in April

[01:09:37] yeah well we'll have you on in April to talk

[01:09:39] about puzzles and maybe you'll give us some

[01:09:41] puzzles as well

[01:09:43] love it

[01:09:45] alright thanks so much AJ thanks for coming on again

[01:09:47] thanks James I loved it

