The Ultimate Guide to Risky Decisions: Risky Business with Maria Konnikova and Nate Silver
The James Altucher Show · October 09, 2024 · 01:13:36 · 67.39 MB

James Altucher brings together two brilliant minds: Nate Silver, known for his predictive prowess, and Maria Konnikova, a renowned psychologist and poker player. The trio delves into how they make calculated decisions when the stakes are high. This episode is a masterclass on using rational thinking to handle life's risks, straight from the experts who live and breathe it.

A Note from James:

"Are you a member of the river or the village? That’s the question we’re diving into today. Nate Silver—yes, the Nate Silver from 538—joins us with Maria Konnikova, a master of poker and decision-making. Members of the 'river,' as Nate describes, are rational thinkers. They make decisions based on probabilities and data, not emotions. So, are you in the river or the village? Because today, we’re talking about how to think differently about risk—whether it’s betting on an election, making an investment, or even figuring out how to navigate life. Here’s what you need to know."

Episode Description:

In this episode, James Altucher brings together two brilliant minds: Nate Silver, known for his predictive prowess, and Maria Konnikova, a renowned psychologist and poker player. The trio delves into how they make calculated decisions when the stakes are high. With examples from poker, elections, and everyday life, they discuss how we can all navigate a world full of uncertainty. What does it mean to be a rational thinker? And how can understanding probabilities make you a better decision-maker? Join them as they explore strategies for improving your risk assessment, leveraging data, and making choices that keep you in the game longer.

What You’ll Learn:

  1. Risk Assessment Tools: How to analyze risk effectively using concepts from poker and data science.
  2. The River vs. The Village: Are you making rational decisions, or are you just playing it safe? Find out how to challenge your instincts.
  3. Understanding Probabilities: How to apply probabilistic thinking to everyday situations, from career moves to investments.
  4. Avoiding Cognitive Traps: Learn about common mental biases that can lead to poor decisions and how to overcome them.
  5. Betting on Your Choices: Practical advice on evaluating your options to maximize the chances of success.

Timestamped Chapters:

  • [01:30] – Are You a Member of the River or the Village?
  • [03:21] – Meet the Guests: Nate Silver and Maria Konnikova
  • [10:09] – Maria’s Journey into Poker and Game Theory
  • [14:59] – Understanding Risk and Decision Making
  • [27:55] – The Challenge of Trust and Information in the Digital Age
  • [31:04] – Nate’s Transition from Poker to Election Forecasting
  • [42:37] – The Evolution of Poker Strategy
  • [54:15] – Betting Markets and Inefficiencies
  • [1:00:58] – Decision Making and Risk in Poker and Life

Additional Resources:

------------

  • What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
  • Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!

------------

Thank you so much for listening! If you like this episode, please rate, review, and subscribe to “The James Altucher Show” wherever you get your podcasts.

Follow me on social media:

[00:00:05] Are you a member of the river or the village? This is an important question for our upcoming episode here. Members of the river, as described by Nate Silver: you might know him from his just-out book, On The Edge, his podcast Risky Business, or his pioneering work on election statistics and probabilities. He used to run FiveThirtyEight.com; now he's doing the Silver Bulletin, his newsletter.

[00:00:36] And Maria Konnikova, who, you may remember, knew nothing about poker a few years ago, started taking poker lessons, and within a year had written a book about it, The Biggest Bluff. She appeared on this podcast, and she became a really great poker player. So members of the river, as defined by these two, are able to make rational decisions using probabilities and data, not just gut emotions.

[00:01:05] When you're willing to take risks by understanding all the probabilities, that's how, for instance, you become a great poker player, perhaps a better bettor on elections, and even perhaps a great investor. It changes the way you look at all the decisions in your life. And we're going to talk about that and many other things. We're going to talk about their podcast Risky Business. We're going to talk about Nate's book, On The Edge, and their newsletters, The Leap from Maria and The Silver Bulletin from Nate.

[00:01:35] And we're going to talk even more about elections and poker. So again, keep in mind, keep asking yourself the question, are you a member of the river or the village? Here's the podcast.

[00:01:50] This isn't your average business podcast, and he's not your average host. This is The James Altucher Show.

[00:02:07] Nate, it's nice to meet you. I'm a huge fan. We never met before. Maria, I've known for years and years and years. I think our first podcast together was 2016.

[00:02:14] Really love the book, by the way. I appreciate it. I've been reading it this past week nonstop.

[00:02:19] And just such fascinating stuff. In general, everything you guys do. I love talking about poker and elections, particularly from a game-theoretic point of view.

[00:02:36] Like, I almost don't care at all about the issues, but I care very much about all the strategies.

[00:02:44] Like, oh, here's how she's playing this debate. Here's how he's doing it.

[00:02:49] And all the game theory on the VP picks and all this stuff that you talk so much about on your podcast.

[00:02:57] And I want to stress how much I enjoy your podcast, but let me bring up the page for it.

[00:03:03] Because do you ever have this happen to you? Whenever I'm in a podcast, I forget actually everything I'm supposed to remember.

[00:03:11] Like, I would have to actually look. Even though I've spent the past week reading your book, I forgot the title and I just had to say it just now.

[00:03:18] No, I have this thing where like, I'm worried that I forget people's like names. And it's like very rude to like not use someone's name in conversation.

[00:03:26] But like, I'm paranoid of like making, I don't know if it's like type one or type two error, right?

[00:03:31] Oh, I've done that. In the middle of a live TV interview, I don't remember when, I actually called someone by their last name instead of their first name.

[00:03:40] Because their last name was like a first name as well.

[00:03:43] And the moment it came out of my mouth, I was like, fuck.

[00:03:47] Okay, well, wait until you do enough. Wait until you do enough podcasts where you have someone on your podcast.

[00:03:53] And mid pod, you're asking questions and mid podcast, they look at you sort of strangely.

[00:03:58] And only then you realize, oh, I've had them on my podcast before.

[00:04:04] That's happened to me. And then I wonder, like, am I an old man? Like, what's going on? Like, am I losing it?

[00:04:10] But anyway, your podcast, Risky Business. It's sort of the ideal podcast for me.

[00:04:19] It like talks about everything near and dear to my heart, which is, you know, making decisions,

[00:04:25] making as rational decisions as you can, using all the available data around you, whether it's betting on an election,

[00:04:32] betting on a poker hand, taking a job, making an investment, getting into a relationship,

[00:04:37] make, you know, all these decisions basically involve some form of mental modeling and data and statistics and probability.

[00:04:46] And then you still might get a bad outcome, because the rational decision is not always the right decision.

[00:04:54] And I wanted to start off by defining some terms, like you both talk about, and Nate, you mentioned this in your book,

[00:05:01] this kind of imaginary community called the river. And I like to think I am also part of that community,

[00:05:08] but my fear, and I was thinking about this while I was reading it, is that

[00:05:13] everybody thinks they're part of that community. So maybe describe it and we can decide.

[00:05:17] So I think of the river as the world of rational, which is a loaded term, but quantitative, analytical risk takers,

[00:05:26] people who are really into numbers and data and people who also are extremely competitive.

[00:05:32] So poker is like the easiest embodiment of this world, but, but also Wall Street, Silicon Valley, et cetera.

[00:05:40] And also people in the industry, not just people who are gambling, but like the casino industry itself is very Riverian now, where they're trying to manipulate you to make larger bets on slot machines

[00:05:51] and spend more time playing blackjack and getting their rewards program, et cetera.

[00:05:55] Even while the odds are getting worse against you, they're doing it very, very, you know,

[00:06:00] they're changing the blackjack odds at the tables. It's really interesting.

[00:06:04] And people don't really notice, right? A lot of times they do it in really subtle ways.

[00:06:09] By the way, Nate, I had a really funny conversation that I haven't told you about, and this goes back to what James was saying, that everyone would like to think that they're a member of the river community.

[00:06:21] So at your book launch party, I was talking with a friend of mine, Vanessa Selbst, who we all know as a brilliant, you know, brilliant mind, brilliant poker player,

[00:06:33] Yale Law School graduate, currently works at Jane Street Capital.

[00:06:39] And we were talking about your book and I said something about like, well, you know, river, blah, blah.

[00:06:46] And Vanessa looks at me. She's like, you're not from the river. You're from the village and you're just visiting.

[00:06:51] Oh no.

[00:06:52] And I was like, whoa. She's like, look at your background. She's like, you're, you're like, I was like, Vanessa, you could say the same of you.

[00:06:59] She's like, you're a villager who likes to visit the river.

[00:07:02] Oh my gosh.

[00:07:03] That seems like the worst insult, particularly at a Nate Silver book party.

[00:07:07] I was like, Vanessa.

[00:07:08] Oh, you're just a, you're just a tourist to the river. The rest of us live here.

[00:07:12] That's, that's exactly what she said.

[00:07:14] But it is kind of exciting though. Like, like the idea of like having the kind of mind where, okay, I'm going to learn and understand poker and I'm going to, and, and I could do it for a living.

[00:07:25] Or I'm going to bet on elections and I could do it. This is my life. Or the idea that you can use knowledge from this so-called community of the river to basically live your life.

[00:07:34] And everyone else is this just sort of like, I don't know, mediocre villager.

[00:07:40] Like there really does seem to be some sort of elite aspect of being kind of somebody who lives your life by the odds and understanding the risks and so on.

[00:07:48] No, it's like immigrants are the most patriotic Americans. It's like transplanted villagers.

[00:07:53] I mean, I agree that Maria was probably born in the village.

[00:07:57] But you know, the most patriotic Americans are people that move intentionally to the United States.

[00:08:02] Thank you, Nate. Thank you. I appreciate that. And I am an immigrant as well.

[00:08:07] So I've done this multiple times.

[00:08:10] I don't know if Maria was born in the village. I think the fact that she wanted to, that she had this idea that became The Biggest Bluff.

[00:08:17] Like, oh, I'll start from scratch, learn poker, and then write about the experiences.

[00:08:21] Because I think that idea really came from an itch of secretly being in the river and wanting to fully come out.

[00:08:30] Maria, how'd you, I'm not sure I've ever asked you this point-blank.

[00:08:33] Yeah.

[00:08:33] How'd you have the idea for The Biggest Bluff?

[00:08:36] So I was somebody who had never really been into the gaming world at all.

[00:08:42] And I wanted to write about the role that luck plays in life and how we can learn to maximize skill and kind of deal with the bad beats of life.

[00:08:53] And it really, it happened because I was going through a lot of bad beats in my own life.

[00:08:56] You know, I'd gotten really sick.

[00:08:58] My grandmother died in a freak accident.

[00:09:00] Like, a bunch of people in my life lost their jobs.

[00:09:04] And I wanted a way to metabolize it, to make sense of it, and also to figure out, okay, how do you still make good decisions, right?

[00:09:11] There's all this shit we can't control, right?

[00:09:14] Like, all of life, like, that's, I get very mad when people are like, oh, poker is just gambling.

[00:09:18] I was like, well, yeah, life is gambling, right?

[00:09:20] All, everything is probabilistic.

[00:09:23] And there's an element of chance to absolutely every single decision we make.

[00:09:28] But how do I, you know, how do I kind of put a frame on that?

[00:09:32] And I actually came to poker as the solution to that through the work of John von Neumann.

[00:09:38] So I was looking at game theory as kind of a way of looking at this.

[00:09:42] And I read Theory of Games and Economic Behavior.

[00:09:45] I read that doorstop of a book and did not understand a lot of it because math is not what I'm good at.

[00:09:52] And I have not done math since I was in high school.

[00:09:56] But I became fascinated by this, by the idea that poker was the birth of game theory.

[00:10:03] And that von Neumann thought that if you could actually solve poker, you'd solve, you know, you'd solve nuclear warfare.

[00:10:09] You'd prevent nuclear war.

[00:10:10] He, you know, that this was like the solution to everything.

[00:10:13] And so I was like, wait, what is this?

[00:10:16] So then I started looking more into poker and I was like, this is my book.

[00:10:19] I want to learn how to play.

[00:10:20] I want to kind of use this as a lens of understanding risk and looking at life in a way that will maximize my decision making.

[00:10:31] Now, of course, like my PhD was in risky decision making, right?

[00:10:34] So like, that's also why it kind of stung.

[00:10:36] She's like, well, you have a PhD.

[00:10:37] I was like, yeah, but my PhD was in risky decision making under conditions of uncertainty.

[00:10:42] Like this is what I was studying.

[00:10:44] And I actually wish that I could go back in time and do some of the studies because I used stock market games.

[00:10:49] I was like, we should have used poker, right?

[00:10:51] Like that would have been so beautiful.

[00:10:53] But I didn't know back then that that would have been such a great tool.

[00:10:56] But you know, the stock market too, though, is very, I mean, let's just say it for what it is.

[00:11:01] The stock market is a giant casino.

[00:11:03] I mean, there's really, you know, you buy a piece of Apple.

[00:11:08] Like if you buy shares of Apple, you're buying like supposedly a small percentage of the company.

[00:11:12] And then when you own a small percentage of a company, you all share in the income of the company.

[00:11:17] That never happens.

[00:11:18] Like you never really share in the income.

[00:11:21] You're just buying and selling, hoping it goes up for some reason.

[00:11:25] And every company is valued differently.

[00:11:27] It has nothing to do with the actual income.

[00:11:28] Very few companies actually pay you any money to own a piece of them.

[00:11:32] It's just betting.

[00:11:34] And a lot of people, a lot of hedge funds in particular use models.

[00:11:38] By the way, probably similar, Nate, to the models you use for predicting the election.

[00:11:42] A lot of stock market hedge funds use models to say, oh, Apple has this.

[00:11:46] It's an 80% probability of going up tomorrow.

[00:11:49] And then they make their bet accordingly.

[00:11:51] It's so funny, though.

[00:11:53] One of the reasons that I said we should have used poker instead of stock markets is when

[00:11:57] the late, great Danny Kahneman did some consulting for Wall Street, his conclusion from it, and

[00:12:03] I've quoted this before at presentations to members of Wall Street, they don't like it,

[00:12:08] is that trading is much more like gambling.

[00:12:11] It's like you're in a casino as opposed to poker.

[00:12:13] He's like, poker players actually make much more reasoned and correct decisions.

[00:12:20] You're not playing poker.

[00:12:22] You think you're playing poker.

[00:12:23] You're gambling.

[00:12:24] And he drew this distinction between the two, which I thought was very interesting.

[00:12:28] Well, traders have these physiological reactions that are very similar to poker players, in

[00:12:32] fact.

[00:12:32] Yeah.

[00:12:32] There's a bunch about this in the book.

[00:12:35] But they are quite weird, right?

[00:12:37] They have physiological responses to taking risk.

[00:12:40] It actually seems healthier, by the way.

[00:12:43] You're a better trader if you have a stronger response, ironically.

[00:12:46] But no, for sure.

[00:12:47] It's absolutely, I don't know if it's degenerate, but it's definitely the gambling gene.

[00:12:53] It's degenerate because I was a day trader for eight years, and it's completely a degenerative

[00:12:59] disease.

[00:13:00] And in terms of physiological reactions, I mean, I used to make a trade, and within a few seconds,

[00:13:07] the trade's either going for you or going against you.

[00:13:09] And my blood would start pumping like I would hear it in my ears.

[00:13:13] And I would go across the street.

[00:13:15] There was a church across the street.

[00:13:16] And I would just pray to Jesus, please let them...

[00:13:20] And I'm Jewish, so this was how desperate I was to make it work for me.

[00:13:26] And it was hard.

[00:13:28] I wish...

[00:13:28] And I had switched from poker to stock market trading, figuring I'd make more money.

[00:13:32] And it was just so miserable.

[00:13:33] I did make money, but it was a miserable experience for many years.

[00:13:37] But the only way I got better, and you guys discussed this, I mean, this is the heart

[00:13:41] of your podcast, the only way to get better is to really understand risk.

[00:13:46] Like, I could know what the right trade is, but if I don't understand risk, I am going to

[00:13:52] go broke.

[00:13:53] In any of these games, in any of these scenarios, the risk is 90% of the game.

[00:13:58] Yeah, no.

[00:13:59] To be process-oriented and not results-oriented is one of the hardest things, I think, in life.

[00:14:06] And it's the one thing that poker players are like, maybe not as good as they should

[00:14:10] be, but are relatively good at compared to most people.

[00:14:14] Not telling a narrative of, oh, of course, if I'd done this or that.

[00:14:17] I mean, understanding you have incomplete information that oftentimes your alpha in trading or poker

[00:14:24] or life really comes from making decisions quickly based on incomplete information.

[00:14:28] And that's a very helpful poker skill.

[00:14:31] Yeah, it really is.

[00:14:32] Because as you said, James, a lot of it is about risk.

[00:14:37] And it's about learning to parse risk and figure out, okay, what's the risk I'm comfortable

[00:14:44] with as well, right?

[00:14:45] Because one of the things poker teaches you is to be comfortable with uncertainty.

[00:14:49] Because there are two things, right?

[00:14:50] It's both incomplete information, but also uncertain information, which are two different things,

[00:14:55] right?

[00:14:55] So you don't know everything, but there's also an uncertainty parameter where you can say,

[00:15:01] well, I think I'm this confident.

[00:15:03] But it's a thought, right?

[00:15:04] It's not, you don't know because there is uncertainty about all of the different inputs.

[00:15:09] And so you have a model in your head.

[00:15:12] I call it my decision-making model.

[00:15:14] I'm not Nate.

[00:15:15] I'm not a statistician.

[00:15:16] I don't actually have a model where I model my decisions.

[00:15:19] But in my head, I have different parameters.

[00:15:22] And I write them down, actually, when I'm making consequential decisions, like what are

[00:15:26] the things that are important?

[00:15:27] How confident am I of all of these different things?

[00:15:29] And then you tweak the model, right?

[00:15:31] If you actually look to see what happens and you say, okay, how good was my process?

[00:15:37] Did I actually account for all of these things?

[00:15:39] How good, you know, was my certainty warranted here?

[00:15:42] Was I too uncertain?

[00:15:43] You know, and then you actually tweak the model so that next time you can make a better decision.

[00:15:48] And this happens kind of in your head.

[00:15:51] I'm wondering though, Maria, if there's a danger of curve fitting if every time things

[00:15:55] go wrong, you say what happened and then you start tweaking and you feel the need to tweak

[00:16:00] the model.

[00:16:01] Right.

[00:16:01] No, that's why you have to be process and not results oriented.

[00:16:03] That's why what Nate said is so ridiculously important and it's kind of the heart of what

[00:16:09] poker teaches you about decision-making because you have to try to figure out, is the outcome

[00:16:14] because I did something wrong or because I got unlucky, right?

[00:16:18] Like why, why is the outcome this way?

[00:16:20] Because you can make the right decision.

[00:16:22] It can go against you.

[00:16:23] You can make the wrong decision.

[00:16:24] It can go in your favor.

[00:16:26] And in the first case, you don't tweak your model, even though the outcome wasn't good.

[00:16:31] In the second case, you absolutely have to tweak your model because, you know, you fucked

[00:16:36] up and then you just got lucky, right?

[00:16:38] Like you might have, you might have gotten a 10% outcome and it doesn't make you a genius.

[00:16:43] It makes you an idiot who got lucky.

[00:16:45] And so then you need to figure it out.

[00:16:46] The problem is, and this is why, you know, real life is so hard.

[00:16:51] In poker, it's pretty easy to figure out, you know, did I get unlucky or did I make the

[00:16:55] wrong decision?

[00:16:56] In real life, like life's noisy.

[00:16:58] There are so many different inputs.

[00:17:00] There are so many different things.

[00:17:02] And our human, our very fallible human brains are very good at curve fitting and hindsight

[00:17:07] bias and all sorts of things where they try to make excuses.

[00:17:12] And so you try to take that skill from poker and to the best of your ability, figure out,

[00:17:17] you know, was the decision process correct or not and only tweak it when it needs to be

[00:17:21] tweaked.

[00:17:22] But this is something where you could do it wrong, right?

[00:17:24] Like you could actually tweak something and then realize, oh shit, like I was curve fitting.

[00:17:28] I have to go back and figure it out.
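
To make Maria's written-down version concrete, here is a minimal sketch of such a decision record; the job-offer question, the parameters, the weights, and the confidence numbers are all invented for illustration, not hers:

```python
# Hypothetical decision record in the spirit of what Maria describes:
# write down the things that matter and how confident you are in each,
# then review the *process* afterward, separately from the outcome.

decision = {
    "question": "Take the new job offer?",   # invented example
    "parameters": {
        "growth":       {"confidence": 0.7, "weight": 0.4},
        "compensation": {"confidence": 0.9, "weight": 0.3},
        "team_quality": {"confidence": 0.5, "weight": 0.3},
    },
    "overall_confidence": 0.65,   # how sure you are this works out
    "choice": "accept",
}

def review(outcome_good: bool, process_flaw_found: bool) -> str:
    """Only 'tweak the model' when the process was flawed,
    not merely because the outcome went against you."""
    if process_flaw_found:
        return "adjust the parameters or weights for next time"
    if outcome_good:
        return "sound process, good outcome; don't over-credit the result"
    return "bad outcome but sound process; likely variance, leave it alone"

# Later, when the outcome is known:
print(decision["question"], "->", review(outcome_good=False, process_flaw_found=False))
```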

[00:17:31] Yeah.

[00:17:31] And people have limited bandwidth.

[00:17:33] I think one thing poker teaches you as well is that like, yeah, in a perfect world, you

[00:17:37] are giving full attention to every possible problem and every angle of a poker hand.

[00:17:41] In real life, there are opportunity costs and trade-offs.

[00:17:45] And like, let's say you have some problems, some accounting problem, or you paid your rent

[00:17:49] late or whatever else.

[00:17:51] I don't know.

[00:17:52] It's not necessarily optimal to be on top of all these minor things if you need time and

[00:17:58] space to work hard and be creative and have some semblance of a life, et cetera.

[00:18:05] Yeah.

[00:18:07] Go ahead, Maria.

[00:18:07] No, I was just going to say that actually one of the, I think, most important things about

[00:18:13] what Nate is saying is also a concept that people have been studying and it's understudied,

[00:18:18] which is time poverty, right?

[00:18:21] Having those mental constraints, being poor on time, is one of the reasons that

[00:18:27] it can be very difficult for people to get ahead, right?

[00:18:30] Because when you have all of this overhead cost of just trying to figure out like, where's

[00:18:36] my rent coming from?

[00:18:37] Where's this coming from?

[00:18:37] Where's that coming from?

[00:18:39] You can no longer allocate the resources that need to be allocated to make the quote unquote

[00:18:44] optimal decision through no fault of your own, right?

[00:18:47] And that can be an insurmountable barrier at times.

[00:18:51] Well, and this is an important part of risk-taking because you could have a model for poker and

[00:18:56] then play poker and try to win.

[00:18:58] You could have a model for elections and you're betting on elections and you could try to win.

[00:19:03] But like you were saying just now, Nate, and Maria, you've mentioned this in your newsletter

[00:19:08] on Substack, the leap.

[00:19:10] What if you need $2,000 tomorrow to pay your rent and you have a chance to win a million dollars

[00:19:17] in a bet or lose everything?

[00:19:19] And it doesn't matter what your odds are on the million dollars, 1% or 10% or whatever,

[00:19:26] you still got to make sure you have the $2,000 to pay rent.

[00:19:31] So it's almost like decision-making in life has to include so many different variables.

[00:19:35] You kind of just have to have a feel for risk.

[00:19:38] You're not going to calculate the odds of everything.

[00:19:41] Yeah, look, I feel like this is becoming more and more true as well.

[00:19:46] It's harder to just kind of blindly trust experts and institutions, or at least if you look at

[00:19:51] the polls, trust is going down in every major U.S. institution except for the military, ironically.

[00:19:58] Why is that?

[00:20:00] Good question.

[00:20:01] I think it's kind of halfway deserved.

[00:20:06] You know, look, the biggest political trend of the past 20 years is that increasingly people

[00:20:11] who are college-educated and have high social trust vote Democratic.

[00:20:18] One consequence of that is that, you know, 98% of Harvard professors are Democrats or 98% of public health

[00:20:27] epidemiologists are Democrats.

[00:20:28] And so like, and people are very tempted by like partisan reasoning.

[00:20:34] Partisanship is a hell of a drug.

[00:20:35] And so I do think that like institutions are becoming less trustworthy for that reason.

[00:20:40] But also you have like campaigns.

[00:20:42] I mean, you know, Trump is running to discredit American institutions as well.

[00:20:46] And so it's become politicized.

[00:20:47] But like, but that's kind of a, it's kind of a self-reinforcing downward spiral.

[00:20:53] And also I think there's more asymmetries.

[00:20:55] I mean, you know, people produce this data trail everywhere they go.

[00:20:58] And so therefore organizations that collect this data, companies, et cetera, are now like,

[00:21:03] I think more clever at finding ways to manipulate people.

[00:21:07] You see it in the casino industry, getting them to spend more.

[00:21:09] And so if you're not like, if you're not sharp about this stuff, then I feel like you get

[00:21:13] taken advantage of more and more.

[00:21:19] Take a quick break.

[00:21:21] If you like this episode, I'd really, really appreciate it.

[00:21:24] It means so much to me.

[00:21:25] Please share it with your friends and subscribe to the podcast.

[00:21:28] Email me at altucher@gmail.com and tell me why you subscribed.

[00:21:32] Thanks.

[00:21:43] And I think also maybe the use of AI now.

[00:21:46] Because AI can basically create any kind of content,

[00:21:50] We're kind of on a trend where if you see any piece of data, news, information, whatever,

[00:21:56] you should probably just assume it's not real.

[00:21:59] Eventually all data will have equal probability of not being real.

[00:22:04] Yeah, it's actually, it's quite scary because if you look at the psychology of how the brain

[00:22:09] processes information, the model that kind of is the established one in the sense of this

[00:22:18] is probably true, is that our brain has a default true switch, right?

[00:22:23] So we assume everything is true when we first encounter it because that's the way we understand

[00:22:30] reality, right?

[00:22:31] Like I can't question all the time, like, is this really here?

[00:22:34] So like you see it and you're like, okay, water bottle, we're true.

[00:22:39] And then that's effortless, right?

[00:22:41] So like everything that comes in is effortlessly tagged as true.

[00:22:45] And then there's always a second step where we say, okay, is it actually true, right?

[00:22:49] Like where you say, wait, no, like false, right?

[00:22:53] Like that's not, that's not actually something that exists.

[00:22:56] This is false.

[00:22:57] However, when there's cognitive load, when there's anything going on, that second step sometimes

[00:23:02] doesn't happen.

[00:23:03] And so we incorrectly remember things as true, even when they're false.

[00:23:08] This is why like, you know, when you're on a jury, if some piece of evidence is introduced

[00:23:14] and then the judge says, ignore that, like, do not use it.

[00:23:17] I mean, that's not going to work, right?

[00:23:19] Like, and it never works.

[00:23:20] Which is why the lawyers always do it; they know that it's going to be

[00:23:24] thrown out of the court.

[00:23:25] But you do it anyway.

[00:23:27] And that initial thing will then just stay in the back of your mind and cloud the information

[00:23:33] later on.

[00:23:34] And before, like, before AI and all these things, like, I think in our world, it used

[00:23:38] to be easier to realize, okay, because it is effortful.

[00:23:42] And so we have limited bandwidth, limited cognitive resources.

[00:23:46] So it used to be easier to figure out, okay, like, what are the things that I need to kind

[00:23:51] of verify?

[00:23:51] And some things like, obviously, like if I said, you know, Nate, yesterday I saw a pink elephant

[00:23:56] walking down Park Avenue.

[00:23:58] You'd be like, okay, no, like for one second, your brain had to believe it to process what

[00:24:03] I was saying.

[00:24:03] But then you're like, okay, pink elephants don't exist, right?

[00:24:06] And even if someone dyed an elephant pink, it's not going to be walking in the middle

[00:24:11] of Manhattan right now.

[00:24:12] And so false.

[00:24:13] But when the internet is full of statements that look like, you know, there's a gray elephant

[00:24:19] that just escaped from the zoo, which seems plausible enough, right?

[00:24:25] We're not, we're not like flagging it as pink elephant walking down the street, but it might

[00:24:31] be.

[00:24:32] And now we can't distinguish between the two.

[00:24:34] And so the load on our brains is such that I think a lot of times shit's just going to

[00:24:39] get tagged as true because you can't verify every single time.

[00:24:45] And actually, it's interesting because when I was writing about con artists, I got exhausted

[00:24:49] at some point because you realize that you have to fact check literally every single sentence

[00:24:53] out of their mouth.

[00:24:54] And so you start having to pick and choose.

[00:24:56] You're like, okay, I don't have time to fact check a five hour long interview.

[00:25:00] So I'm going to fact check like the most important things.

[00:25:03] And then like, I ended up figuring out that like this guy made up that he had a sister.

[00:25:08] And it went into my book because it didn't even occur to me that he would make up having

[00:25:12] a sister.

[00:25:13] And like, I didn't fact check it.

[00:25:15] And Nate, I'm guessing that you probably encountered this when, you know, you interviewed SBF.

[00:25:19] Like you, this is, it's exhausting.

[00:25:22] And like when we have to do it literally all the time, I don't think our brains can

[00:25:27] do it.

[00:25:27] What's the axiom where like it takes 10 times more energy to like disprove bullshit than

[00:25:33] to create bullshit?

[00:25:34] Bullshit.

[00:25:35] I mean, it's like 50x more, right?

[00:25:36] Yeah, yeah.

[00:25:37] This is probably why I feel sympathetic to people that I have differences of political

[00:25:42] opinion with, right?

[00:25:44] Like if it's my job to follow politics and policy, then I can make these like fine distinctions

[00:25:50] about, yeah, this thing isn't precisely true, but it's directionally right or this and that.

[00:25:54] You know, if you have another job apart from politics, then, you know, it's hard to distinguish

[00:26:00] these things sometimes.

[00:26:01] And even things like, you know, talking about like, what if you saw like a pink elephant?

[00:26:04] You know, the other day I saw a Clydesdale horse on like Fifth Avenue in Manhattan, which

[00:26:10] makes sense because for some reason they still allow like horse-drawn carriage rides, right?

[00:26:14] But it's utterly bizarre in some sense that you have like a Cybertruck and then a horse-drawn

[00:26:20] carriage and then regular taxis, you know, Honda Accords.

[00:26:23] It doesn't really make a lot of sense.

[00:26:25] Yeah.

[00:26:26] Then you have stories of like different farm animals getting onto the subway tracks in New

[00:26:30] York.

[00:26:31] And so, yeah, so like weird shit does happen.

[00:26:36] And I don't think our brains are built for this, we're literally not built

[00:26:40] for this, for AI and all of these reality manipulations, because at least before you only had

[00:26:47] to parse political statements and that kind of stuff.

[00:26:50] Now we have to parse like what we're hearing.

[00:26:54] Wait, that's, it sounds like Nate's voice, right?

[00:26:57] But what if it's a deep fake?

[00:26:59] What we're seeing, like, wait, it looks like Nate actually was here and was involved in

[00:27:03] this bar fight, but that doesn't seem like Nate.

[00:27:06] So like, let me, is it a deep fake, right?

[00:27:08] So like all of our senses are constantly being manipulated and it's happening all the time

[00:27:14] and the technology is getting so damn good.

[00:27:16] And then there's the next level of this, which is that, okay, we're all talking right now

[00:27:21] with devices near us.

[00:27:23] So presumably one of these devices is listening and selling the data.

[00:27:26] So the next time we all go on YouTube, we're just in this echo chamber where every video

[00:27:30] is pink elephants going down Main Street in different cities.

[00:27:33] And those are made-up videos, but now we're seeing them over and over.

[00:27:37] Oh, I guess really, Maria really did see it.

[00:27:40] There's so many people who are seeing pink elephants these days.

[00:27:42] I guess this is a thing now.

[00:27:43] Like you start to believe it because of the repetition.

[00:27:46] Yeah.

[00:27:47] No, the number of times when there's like a suggestion of who to follow on Twitter in

[00:27:52] ways that I believe originates from some data trail I leave apart from Twitter.

[00:27:58] I don't think it's a coincidence.

[00:28:00] No, it's not a coincidence.

[00:28:02] And...

[00:28:03] But what device is actually listening to us and selling data?

[00:28:05] I can't figure that out yet, but I know it's happening.

[00:28:07] Like I talked to my daughter the other day about, I don't know how the universe began.

[00:28:11] And then suddenly on my newsfeed, it's all these articles.

[00:28:13] Well, they discovered another way that the universe began.

[00:28:16] Like who was listening to me?

[00:28:18] Yep.

[00:28:19] No.

[00:28:19] And I mean, this is paranoia, but like at this point, because we've had so many scandals

[00:28:26] where companies have said, no, absolutely not.

[00:28:29] You know, Alexa is not listening if you don't use the wake up word.

[00:28:34] And then it turns out that, oh yes, it absolutely fucking is.

[00:28:37] And we have these and you have whistleblowers come forward with all these transcripts of

[00:28:42] a device that was asleep and was never listening, but actually was listening all the time.

[00:28:47] So yeah, someone is listening.

[00:28:49] I don't know who.

[00:28:51] I don't have any smart devices in my house, but people still listen.

[00:28:55] Well, I have a phone, right?

[00:28:56] And I have a computer.

[00:28:57] You're screwed.

[00:28:58] I'm screwed.

[00:29:00] That's why I don't want to give any company my DNA.

[00:29:03] Yeah.

[00:29:04] That just seems like a bad idea.

[00:29:05] I don't know.

[00:29:05] No, I have never gone on 23andMe, Ancestry.com, any of these things.

[00:29:10] And to our listeners, my PSA is don't do it because you don't want your DNA data out there

[00:29:16] in these databases.

[00:29:18] I mean, how many times has like your data been stolen?

[00:29:22] So I will not give blood or body samples unless I have to for medical reasons.

[00:29:27] My DNA is all over the place on every one of these sites.

[00:29:32] I remember when I first got my DNA results, it said I had the APOE4 gene, which is the

[00:29:39] one where there's a slight chance of early onset Alzheimer's.

[00:29:43] And so I wrote to the CEO of 23andMe and I said, I'm going to make a cookbook, the APOE4

[00:29:53] diet.

[00:29:54] So all the foods that supposedly help cognition and so you don't get Alzheimer's.

[00:29:58] And she wrote back actually and said, oh, that'll be a bestseller.

[00:30:01] And I wrote and I asked, well, why do you say so?

[00:30:03] And she said, because everyone's going to forget if they bought it and then they'll buy it again.

[00:30:08] That's a great joke.

[00:30:11] That was the CEO's response.

[00:30:13] So now, Nate, you went from poker to election probabilities and you've done extremely well

[00:30:20] with it.

[00:30:22] What are the similarities?

[00:30:22] I feel like there's more, it's a little harder to model.

[00:30:26] Like poker, there are rules and there's kind of like a closed world.

[00:30:30] So you could come up with game theory solutions, but there's a lot more unknown data with elections.

[00:30:36] I mean, look, in poker, even playing a lifetime of live poker tournaments, you never really

[00:30:42] reach the long run.

[00:30:42] But in elections, it's crazy, right?

[00:30:45] I mean, you're not going to learn very much from a sample of getting one trial every

[00:30:48] four years.

[00:30:50] Look, we try to turn this open-ended problem into a closed problem by just focusing on the

[00:30:56] polls, more or less.

[00:30:58] Making a really good analysis of how the polls in different states are correlated and empirically

[00:31:04] how polls really are accurate or inaccurate.

[00:31:06] And the answer is they're not always very accurate.

[00:31:08] And so people are like, what factors are you accounting for?

[00:31:12] I'm like, oh, just the polls, right?

[00:31:13] It's a really good way to account for the polls, but there is a prior based on the economy.

[00:31:18] But it's basically just do a really good job of accounting for the polls and not try to

[00:31:22] have it be too open-ended because politics does change a lot.

[00:31:27] But with that said, I mean, you're always thinking about the structure of the model and how elections

[00:31:32] change.

[00:31:33] Things are much more polarized now than they once were.

[00:31:35] That has relevance to how you might design a polling-based forecast.

[00:31:40] But yeah, I don't know.

[00:31:41] It's kind of inherently crazy to like...

[00:31:43] I try not to stake my reputation on it.

[00:31:47] It's just weird when you have one thing that's 10x more popular than the other things you

[00:31:51] do, even if the other things are better.

[00:31:53] It's kind of weird.

[00:31:55] When Nate and I, a few years back, first started talking about doing a podcast together, we

[00:32:02] were kind of figuring out what direction we wanted to go in.

[00:32:04] And Nate was like, you know, I really just...

[00:32:07] I don't know.

[00:32:07] I'm tired of politics.

[00:32:09] I think I might want to stay away from politics.

[00:32:12] And I was like, Nate, I love you and I respect you and I respect your mental sanity.

[00:32:18] However, let me tell you, you're not stepping away from politics, at least for the next year.

[00:32:25] Well, you got a lot of kind of backlash when people didn't really understand...

[00:32:30] People didn't really understand the difference between a poll, which is somewhat predictive,

[00:32:35] and the way you do your models, which is more probabilistic.

[00:32:39] So that, okay, in 2016, if there's...

[00:32:43] And you talk about this in the book.

[00:32:44] If there's a 70% chance Hillary's going to win and a 30% chance Trump's going to win,

[00:32:49] like you were sort of predicting in 2016, the 30% is a pretty high probability that someone

[00:32:56] could win.

[00:32:57] It's 30%.

[00:32:58] It's three out of 10 times.

[00:32:59] Well, I'd go even further.

[00:33:00] I'm like, the market's 15%, right?

[00:33:02] So we told you to be long Trump, right?

[00:33:06] Therefore, it's a great bet; the EV is very positive from the model.

[00:33:09] So I'm...

[00:33:10] Right.

[00:33:10] From an arbitrage point of view...

[00:33:12] It was great.

[00:33:12] Someone could have made a lot of money following your prediction models in 2016, probably more

[00:33:19] than in 2008 when you predicted every single state correctly, or whatever election it was.

[00:33:25] No, I think it's like the highest ROI...

[00:33:27] The one that people hate me for is like the highest ROI forecast relative to the markets.

[00:33:31] Oh, for sure.
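
For readers who want the arbitrage point spelled out, here is a rough expected-value sketch, treating the roughly 29% model figure and roughly 15% market price mentioned here as assumptions:

```python
# Rough expected-value sketch of the 2016 example above. The numbers are
# assumptions taken from the conversation: the model put Trump around 29%,
# while betting markets implied roughly 15%.

p_model = 0.29    # model's probability of the event
p_market = 0.15   # market-implied probability (the price)

# A $1 bet priced at a 15% probability returns about 1/0.15 = $6.67 total,
# i.e. roughly $5.67 of profit if it wins, and loses the $1 stake otherwise.
profit_if_win = 1 / p_market - 1
ev = p_model * profit_if_win - (1 - p_model) * 1

print(f"Expected value per $1 staked: {ev:+.2f}")  # about +0.93
```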

[00:33:32] And as I always say when these numbers come up, and as I wrote in The Biggest Bluff, because

[00:33:37] I was so like...

[00:33:38] It was so revelatory to me.

[00:33:41] Nate's numbers for the election are the exact chance...

[00:33:44] Because it was like 29% or 28%.

[00:33:46] It's the exact chance of hitting a pair on the flop in No Limit Hold'em.

[00:33:51] And for anyone who plays poker, how often have you hit a pair?

[00:33:55] Is it like never?

[00:33:56] Oh my God, I never make a pair.

[00:33:58] No, like all the time, you are always hitting pairs because 28%, 29%, 30% is a hell of a

[00:34:06] lot of time.

[00:34:08] And it makes it very...

[00:34:10] I think that's why poker players actually understand probabilities better in a lot of respects

[00:34:14] because it's visceral.

[00:34:16] You know that.

[00:34:17] You know what it feels like.

[00:34:18] If I tell you that the chance of something is the chance of making a pair on the flop,

[00:34:23] you get it, right?

[00:34:24] Because you've played thousands of hands.

[00:34:27] You understand what that feels like.

[00:34:29] So it means something to you.

[00:34:31] Whereas to some random person who doesn't kind of understand probability that much,

[00:34:37] 70 gets rounded up to 100, right?

[00:34:40] And even 30 gets rounded down to zero because our brains love to round...

[00:34:45] We love those absolutes.
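
For anyone who doesn't play, the pair-on-the-flop number is easy to sanity-check. A back-of-the-envelope version (counting only whether the flop pairs one of two unpaired hole cards, and glossing over two pair, trips, and paired boards) lands in the same ballpark as the 28 to 29 percent Maria cites:

```python
from math import comb

# Back-of-the-envelope: chance that the flop pairs at least one of your two
# unpaired hole cards. After your 2 cards, 50 remain; 44 of them match
# neither of your ranks, so the flop misses you only when all 3 flop cards
# come from those 44. Exact figures shift a bit depending on how you count
# two pair, trips, and paired boards.
p_miss = comb(44, 3) / comb(50, 3)
p_pair = 1 - p_miss

print(f"P(flop pairs a hole card): {p_pair:.1%}")  # roughly 32%, the same ballpark
```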

[00:34:47] And I even saw someone was criticizing...

[00:34:49] Nate, I can't believe this.

[00:34:50] I'm so sorry to break the news to you.

[00:34:53] Someone was criticizing you on Twitter.

[00:34:54] I know that this must come as a huge shock because normally you don't get any criticism

[00:35:00] at all.

[00:35:00] But I saw this back and forth on Nate's Twitter feed about someone being like, I don't want...

[00:35:08] This is...

[00:35:09] Probabilities are bad.

[00:35:11] Like, forecasts should not be...

[00:35:12] I'm like, wait, what?

[00:35:13] Like, what in the world?

[00:35:15] You know what I'm talking about, Nate?

[00:35:16] Well, because people get so biased based on their opinions too.

[00:35:19] Like, if you're trying to make money, it doesn't matter who's going to win the election.

[00:35:25] You want to just make the right bet.

[00:35:28] And you see people get upset at...

[00:35:31] I'm sure you get this all the time, Nate.

[00:35:32] Like, if you predict, let's say Kamala is going to win, all the Trump people are going

[00:35:37] to hate you.

[00:35:37] If you predict Trump's going to win, all the Kamala people are going to hate you because

[00:35:41] you're not saying what they want to hear.

[00:35:49] People get mad at you depending on what your data is saying at that exact day.

[00:35:54] Please, I didn't mean to interrupt you.

[00:35:56] No, it's just so...

[00:35:56] It's so linear as a function of who is ahead in the forecast and who gets mad.

[00:36:00] Like, no one even like...

[00:36:01] No one has a good bluff.

[00:36:03] No one ever bluffs and gets mad at me when like Harris...

[00:36:06] No Democrat gets mad when Harris is 60-40, right?

[00:36:08] No Republican...

[00:36:09] It's...

[00:36:09] I don't know.

[00:36:10] People are very transparent in their motivations.

[00:36:12] Yeah.

[00:36:12] Now, Nate has people like Donald Trump saying he's a very smart guy when he has Trump

[00:36:17] ahead and then, you know, the Kamala Harris camp loves him when Kamala's ahead.

[00:36:23] And like I said, it's a very fun parlor game to play.

[00:36:25] You're like, oh, if I haven't looked at Nate's model yet today, let me just quickly look at

[00:36:29] a few Twitter replies.

[00:36:30] And right away, I know exactly who's ahead in his polls.

[00:36:34] Well, like you had this...

[00:36:35] You're having this argument, I guess, recently or right now with Alan Lichtman, who...

[00:36:40] Yeah.

[00:36:40] He has this model based on who's the incumbent, what's the economy doing.

[00:36:43] And it's like a 13-factor model of questions that then says, okay, either the incumbent's

[00:36:49] going to win or the opponent's going to win.

[00:36:51] And then he trashed you because your model is different from his.

[00:36:56] And his model doesn't even apply this year because we had this weird stuff with Harris

[00:37:02] replacing Biden.

[00:37:03] But then he takes this total, like, ad hominem attack on you.

[00:37:09] Oh, you don't have a PhD, so shut up.

[00:37:12] And like, what the heck?

[00:37:14] Like, is that what is happening in academia now?

[00:37:17] Like, if you're not an academic, you're not allowed to comment on anything?

[00:37:21] Look, I mean, this is part of the expert class kind of weaponizing their expertise for

[00:37:27] like partisan reasons.

[00:37:28] I mean, look, the Lichtman model, the 13 keys, I could point out some things that I like about

[00:37:34] it.

[00:37:34] I mean, for one thing, he is like actually putting himself out there and making predictions.

[00:37:39] You know, in some ways to have a multi-factor model, you know, I'd rather have 13 keys than

[00:37:43] like, than one magic key.

[00:37:45] I think that's at least directionally right.

[00:37:47] The problem is like about eight or nine of the 13 keys are quite arbitrary.

[00:37:52] When I go through it, I actually think, you know, using his keys the way he's applied them

[00:37:57] in the past,

[00:37:57] I think it actually predicts a Trump win and not a Harris win.

[00:38:03] But yeah, that's a, that's a fun, it's a fun rivalry because like, you don't have the

[00:38:07] power to turn the keys, Nate, you know?

[00:38:09] And it's like, I just, I like the, I like to create a little mythology around it.

[00:38:14] Well, also, again, I think he never has made a model for the case where the presidential

[00:38:21] candidate shifts mid-election.

[00:38:22] So his whole system gets thrown out the window.

[00:38:26] Whereas, let me ask you this: how different would a system be from your system

[00:38:32] if all I did was take, let's say, the top 15 most-used polls and just average them in

[00:38:41] each state?

[00:38:41] So that's a simpler version of your model.

[00:38:44] How different would that be from, from your actual model?

[00:38:46] So to just have a polling average is not that hard.

[00:38:52] But, per the Charles Barkley quote, you know, it takes brains to rebound.

[00:38:55] The rebounding part is like actually building the correlation matrix, right?

[00:39:00] Figuring that out, because you've got the electoral college; if it was just a popular vote, then making election

[00:39:04] models would be relatively straightforward.

[00:39:05] But like, what's the correlation of the vote in Michigan and Wisconsin, and of Michigan and

[00:39:10] Wisconsin to Arizona and North Carolina and so forth?

[00:39:14] Like, that's the tricky part that requires some horse sense and expertise.

[00:39:19] Even though like, like, even though let's say I'm just taking all the polls in Michigan,

[00:39:24] I'm averaging them together.

[00:39:25] All the polls in Wisconsin, I'm averaging together.

[00:39:26] So I don't really need to figure out how they're correlated or do I?

[00:39:30] You do, because of the electoral college.

[00:39:32] So people, you know, some of the reason there were models in 2016 that had Clinton with

[00:39:38] a 99 or 99.9% chance of victory is that they treated each state as independent and the

[00:39:44] errors as independent.

[00:39:45] When in fact, if Trump beats his polls in Wisconsin, he's very likely to also do so in Michigan,

[00:39:50] Pennsylvania and so forth.

[00:39:51] And that's exactly what happened.

[00:39:52] He had an overperformance systematically in the Midwest, and therefore this lead that other

[00:39:57] models misdescribed as robust for Clinton proved to be fragile.
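
A toy Monte Carlo of the point Nate is making, with invented leads and error sizes rather than anything from his actual model: when polling errors share a national component, the favorite's chances come out far less certain than when each state's error is treated as independent.

```python
import random

# Toy illustration only (made-up numbers, not Nate's model). A candidate
# leads by 2.5 points in six swing states and needs at least three of them.
# Compare per-state polling errors drawn independently with errors that
# share a national component; the total per-state error size (sd of 4
# points) is the same in both scenarios.

LEAD, N_STATES, NEED, TRIALS = 2.5, 6, 3, 50_000

def win_probability(shared_sd: float, state_sd: float) -> float:
    wins = 0
    for _ in range(TRIALS):
        national = random.gauss(0, shared_sd)  # common polling miss
        carried = sum(
            LEAD + national + random.gauss(0, state_sd) > 0
            for _ in range(N_STATES)
        )
        wins += carried >= NEED
    return wins / TRIALS

print(f"independent errors: {win_probability(0.0, 4.0):.0%}")  # looks overconfident
print(f"correlated errors:  {win_probability(3.2, 2.4):.0%}")  # fatter downside tail
```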

[00:40:02] And, you know, now I know you do take into account, for instance, the

[00:40:07] bias of different polls.

[00:40:08] So like if Rasmussen Reports tends to, you know, overstate the Republican numbers

[00:40:15] by 3% in every single poll, you adjust accordingly, from what I understand.

[00:40:19] Am I sort of correct?

[00:40:20] That's right.

[00:40:20] I mean, yeah.

[00:40:21] Polls that consistently lean, um, you can call it a bias or whatever you want to call it.

[00:40:25] Polls that consistently lean toward Republicans or Democrats, the model detects that and adjusts

[00:40:29] those polls.
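
A minimal sketch of that kind of house-effect adjustment; the pollster names, margins, and lean estimates below are placeholders, not FiveThirtyEight's actual estimates:

```python
# Hypothetical house-effect adjustment in the spirit of what's described
# above. Margins are Democrat minus Republican, in points; the pollsters,
# margins, and lean estimates are all made up.

polls = [
    {"pollster": "Pollster A", "margin": +1.0},
    {"pollster": "Pollster B", "margin": +4.0},
    {"pollster": "Pollster C", "margin": -2.0},
]

# Estimated house effects: how far each pollster historically runs from the
# consensus (positive = leans Democratic, negative = leans Republican).
house_effect = {"Pollster A": 0.0, "Pollster B": +1.5, "Pollster C": -3.0}

adjusted = [p["margin"] - house_effect[p["pollster"]] for p in polls]

raw_avg = sum(p["margin"] for p in polls) / len(polls)
adj_avg = sum(adjusted) / len(adjusted)
print(f"raw average: {raw_avg:+.1f}  adjusted average: {adj_avg:+.1f}")
```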

[00:40:29] But, but like you were saying earlier though, we've only had like two other elections with

[00:40:33] Trump.

[00:40:34] So it's hard to really know.

[00:40:35] Like for instance, right now we have this election where some Democrats like Tulsi Gabbard or RFK

[00:40:40] Jr.

[00:40:41] have endorsed Trump.

[00:40:43] Maybe that makes people more willing to say it. You know, I think one of the reasons polls

[00:40:47] are sometimes biased is people are afraid to say who they're going to vote for.

[00:40:50] And maybe people are less likely to be afraid.

[00:40:52] Now you just won't know that until after the election.

[00:40:55] I mean, look, uh, in 2016, Hillary Clinton won the popular vote by 2.1 or something points

[00:41:02] and lost the electoral college. In 2020,

[00:41:05] Biden won the popular vote by four and a half points and won the electoral college narrowly.

[00:41:10] Um, polls this time are saying Harris is going to win by 3.3 points in the popular vote or

[00:41:15] whatever.

[00:41:16] And the electoral college is a toss up.

[00:41:18] So they're like not exactly going out on a limb here.

[00:41:20] They're making a very safe prediction for the time being.

[00:41:24] It's very close to being a toss up right now.

[00:41:26] And by the way, pollsters are aware of the fact that everyone

[00:41:31] shat on them after 2016 and 2020.

[00:41:33] So they've changed their methods, whether they've corrected adequately or overcorrected.

[00:41:38] I don't know, but like, but you know, they have incentives to get the answer right.

[00:41:57] You guys talk a lot about poker, obviously, and Nate, in your book, On the Edge,

[00:42:02] you talk about sports betting a lot. But the one thing I worry about with poker is, and

[00:42:07] I'm curious, Maria, you're in there every day playing poker,

[00:42:11] and you secretly just want to be a professional poker player and nothing

[00:42:15] else.

[00:42:15] I know.

[00:42:16] But, uh, uh, poker is a crowded sport now.

[00:42:20] I mean, uh, you know, back in like 1999, it wasn't as popular.

[00:42:25] It hadn't yet had the TV phase.

[00:42:26] It hadn't had Chris Moneymaker bring a surge in popularity.

[00:42:29] And so maybe it was a little bit more inefficient where you could win without knowing everything.

[00:42:34] But now everybody, like every kid is memorizing all the statistics, everything.

[00:42:39] Has it become harder?

[00:42:41] Obviously it has.

[00:42:42] Like, how do you deal with the fact that it's like a hard game now?

[00:42:44] It's so funny.

[00:42:45] You're not the first person to ask me this.

[00:42:48] And I was also about to say, like, you...

[00:42:49] I'm so unoriginal.

[00:42:50] I can't.

[00:42:50] No, no, no, no.

[00:42:51] Okay, forget it.

[00:42:51] I'm sorry.

[00:42:53] And a different version is like, you know, you picked a really bad time to get into

[00:42:57] poker.

[00:42:58] So my answer is like, I didn't know, you know, poker pre 99.

[00:43:03] Like, first of all, you know, like I, I wouldn't have been old enough to play, but even like,

[00:43:08] I didn't start, I didn't even play after the Moneymaker boom where I would actually argue

[00:43:14] based on what I know it got easier because so many people got into it thinking that they

[00:43:18] could win.

[00:43:19] And so there was like this period, like this golden age of online poker where everyone

[00:43:23] thought they could be the next Chris Moneymaker.

[00:43:27] And people are like, but it used to be so much easier.

[00:43:29] Like you're, you're getting in at such a hard time.

[00:43:31] It's like, I don't know.

[00:43:32] Like when I started, it was the dawn of the solver era, right?

[00:43:36] Like my first year playing poker was the first year that people were using PioSOLVER, like

[00:43:41] in any sort of a concentrated way.

[00:43:44] And so this is all I know, like for me, the game has always been like this and I love it.

[00:43:50] Like, I actually love the fact that you have to think through the game theory, that you have

[00:43:55] to think through all of these different things.

[00:43:57] And, like, I actually don't know, obviously it's gotten harder,

[00:44:01] but so many people don't know how to use solvers correctly, right?

[00:44:05] It's also giving this false air of precision where it shouldn't exist.

[00:44:10] And people think that they know the solution, even though, first of all, poker has not

[00:44:15] been solved.

[00:44:16] GTO means game theory optimal.

[00:44:17] It's not actually game theory optimal because the game hasn't been solved.

[00:44:21] It all depends on the inputs.

[00:44:23] It depends on how you build the model.

[00:44:25] It depends on the assumptions.

[00:44:26] It depends on so many different things.

[00:44:28] And because I play live poker, there's like this other host of variables that enter into it.

[00:44:36] And if you have a false sense of precision, then it often comes with hubris and with, you

[00:44:45] know, not realizing what a lot of the other things that you need to

[00:44:51] be paying attention to are.

[00:44:53] And of course, the best players in the world understand all of this, right?

[00:44:57] And they're able to use the models correctly and to kind of extrapolate from them and adjust,

[00:45:02] et cetera, et cetera.

[00:45:03] But on the whole, like, sure, maybe it's harder than 99 or 2009.

[00:45:10] I don't know.

[00:45:12] But like, I love all of the challenges that this is bringing.

[00:45:18] And I don't think we're in any danger of solvers ruining the game.

[00:45:22] If we're in danger of anything ruining the game, it's AI and bots and online cheating.

[00:45:29] But...

[00:45:30] Yeah, online is messy, to say the least.

[00:45:33] Online is messy, yeah.

[00:45:34] Yeah, because you can use the solvers sitting right next to you.

[00:45:37] And the physical stuff then of, you know, being in person of like the various tells or whatever

[00:45:44] you use, you know, in the physical games, you know, it doesn't come into play.

[00:45:49] But Nate, you're someone who...

[00:45:53] You started playing poker kind of before this sort of computer solver, AI type of era.

[00:45:59] And then you kind of lost your way for a while to this election stuff.

[00:46:02] And now you're playing poker more.

[00:46:05] And have you noticed the difference in your own results and play as a result?

[00:46:11] No, look, I think on net, it probably helps somebody like me because you can essentially substitute a lot of...

[00:46:19] I think Maria, you said this before.

[00:46:21] You can substitute a lot of hours on the felt, on the poker felt, by studying solvers a little bit more.

[00:46:27] Yeah.

[00:46:29] You know, on the flip side, like I kind of acknowledge that like I'm never going to be one of these 20-year-olds

[00:46:37] who, like, studies PioSolver sims or GeoWizard.

[00:46:41] What's it called?

[00:46:42] GTO Wizard.

[00:46:43] GTO Wizard every day.

[00:46:44] Geo Wizard.

[00:46:46] So I kind of am becoming more of like a feel player, right?

[00:46:49] Like I understand what the theory says pretty well and when to deviate from it.

[00:46:56] I think people in general, in general, I think they fuss a little bit too much about these things

[00:47:01] that win you, like, one one-hundredth of a big blind or something,

[00:47:04] where if you can like read people well and play exploitatively effectively,

[00:47:08] then that can be a very big advantage.

[00:47:11] So look, I think it raises the floor a lot.

[00:47:14] There are clearly people that like are casual and just through osmosis pick up better GTO play, right?

[00:47:22] There are fewer truly soft spots at the poker table than there once were.

[00:47:25] I'm not sure it raises the ceiling as much.

[00:47:28] I mean, I...

[00:47:29] Yeah.

[00:47:30] And there was a...

[00:47:31] I actually listened to a really, I thought, interesting interview a month or so ago with Ike Haxton,

[00:47:38] who's I think one of the best poker players in the world.

[00:47:41] And he said something that made me go, whew.

[00:47:45] He's like, oh, like people...

[00:47:46] He's like, honestly, bet sizing really doesn't matter that much.

[00:47:49] He's like, you shouldn't really be worried about like if you're, you know, betting a third pot or a quarter pot or...

[00:47:55] Or less.

[00:47:56] He's like, you just need to like...

[00:47:58] You need to understand, like, what you're doing and why you're doing it.

[00:48:01] And he's like, I don't think bet sizes...

[00:48:02] He's like, what are you leaving, you know, what are you leaving on the table if you get the bet sizing a tiny bit wrong?

[00:48:09] And people will pounce on you and be like, oh, this is such a fish because you're never supposed to bet, you know, 60% on a board texture like this.

[00:48:17] Everyone knows that you only bet X percent.

[00:48:19] And that's just bullshit.

[00:48:22] And there might be like an EV difference of 0.4 big blinds.

[00:48:27] You know, it was just funny to hear Ike say that.

[00:48:31] And I was like, oh, that makes me feel better.

[00:48:33] Let's take a specific example from poker.

[00:48:34] So like, let's say you get your hand and it's before the flop and you're the last one to bet and you make a raise and everyone calls.

[00:48:43] And now the flop comes out, everybody checks to you and you're supposed to make this so-called continuation bet, like bet out there.

[00:48:50] And it's considered bad form not to, somehow.

[00:48:54] Like it's almost like a rule, like you have to do this.

[00:48:56] But clearly there's exceptions to every rule.

[00:49:00] So for instance, if you have a decent hand, you could not make the bet and everyone thinks you have a bad hand because you broke the rules.

[00:49:07] I mean, that's just a simple example.

[00:49:09] But like, you know, it seems like for every rule, anytime someone says you have to do this, that's a perfect opportunity for great players to bend the rules.

[00:49:17] Absolutely.

[00:49:17] And what solvers are actually teaching us is that most things are at a mixed frequency, right?

[00:49:22] Like most of the times there's not 100%.

[00:49:24] Like, and that, I think that's really, that's really important to know.

[00:49:29] That's really interesting.

[00:49:30] And some of the strategies that used to be seen as bad, solvers use them.

[00:49:35] And it turns out they're not bad.

[00:49:37] So like in the situation that you described, James, you, if someone actually breaks that and decides to lead into the preflop raiser, right?

[00:49:47] It's called a donk bet, donk from donkey because it's bad.

[00:49:51] And anyone doing it is a donkey.

[00:49:52] And if you are someone who doesn't check, doesn't respect the flow of things and bets into the preflop raiser, you must be a donkey.

[00:50:00] Well, it turns out that that's absolutely not true.

[00:50:02] And there are absolutely positions and boards where you're supposed to bet.

[00:50:05] And donk bets are actually quite good.

[00:50:08] And you're supposed to do it, you know, a good percentage of the time.

[00:50:11] And so solvers are showing us strategies that people used to think were really dumb can be actually quite smart.

[00:50:18] So I think that they're much more interesting than people realize.
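
To make the mixed-frequency point concrete, here is a minimal sketch of the textbook toy-game arithmetic behind bluffing and defense frequencies. The setup (a polarized bettor against a single bluff-catcher on the river) and the numbers are illustrative assumptions, not anything from the book or from a real solver, but they show why the answers come out as percentages rather than always-or-never rules.

```python
# Toy-game arithmetic behind "mixed frequency" strategies.
# Assumes a simplified river spot: the bettor is polarized (nuts or air)
# and the caller holds a bluff-catcher. Real solvers go far beyond this.

def bluff_frequency(bet: float, pot: float) -> float:
    """Share of the betting range that can be bluffs so the caller is
    indifferent between calling and folding: bet / (pot + 2 * bet)."""
    return bet / (pot + 2 * bet)

def defense_frequency(bet: float, pot: float) -> float:
    """Share of the time the caller must continue so the bettor's pure
    bluffs break even: pot / (pot + bet)."""
    return pot / (pot + bet)

for size in (0.33, 0.5, 1.0, 2.0):   # bet expressed as a fraction of the pot
    print(f"bet {size:.2f}x pot -> bluff {bluff_frequency(size, 1.0):.0%}, "
          f"defend {defense_frequency(size, 1.0):.0%}")
```

Both frequencies move smoothly with the bet size, which is the sense in which a spot calls for doing something a certain percentage of the time rather than following an absolute rule.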

[00:50:22] And it's the same thing now happening on Polymarket, which Nate, you're an advisor to Polymarket.

[00:50:27] It's the biggest betting market in the world right now.

[00:50:30] It's amazing.

[00:50:31] I mean, a billion dollars is bet on the one question of who's going to win this election.

[00:50:35] But still, hundreds of thousands or millions of dollars were bet the other day on whether or not Tim Walz was going to say tampon in the debate.

[00:50:43] So like it's fascinating in these betting markets.

[00:50:46] There's hundreds of possible things to bet on.

[00:50:48] And because these betting markets are so new, I imagine as opposed to sports betting, for instance, there must be inefficiencies that one could take advantage of.

[00:50:57] I mean, there are probably inefficiencies in sports betting too.

[00:51:01] It's kind of a barbell distribution where, you know, if you're betting on Kazakhstani ping pong, you can probably have an edge.

[00:51:09] And the things that everybody bets on, like the Super Bowl, there isn't necessarily enough sharp money to clear out all the dumb money.

[00:51:16] The in-between, the day-to-day NFL regular season game or NBA regular season game is much tougher.

[00:51:23] Yeah, there are probably edges to be had.

[00:51:25] Although in politics or in sports, you get banned or restricted if you're any good.

[00:51:29] So that's the problem, right?

[00:51:31] Yeah, there are probably edges to be.

[00:51:32] But Polymarket, you don't.

[00:51:33] No, you don't.

[00:51:35] Because it's peer-to-peer ultimately.

[00:51:36] So yeah, if you really want to model the likelihood of Tim Walz saying tampon, you probably have an edge there.

[00:51:44] I think I'm not sure that's the best use of your talents, but that's probably a beatable market, I would think.

[00:51:49] Like right now, I mean, right now, looking at Polymarket this second and looking at your model, what are some decent bets?

[00:51:56] And I'm not saying people who are listening to this should go out and make those bets because this podcast will come out a few hours or days or whatever after we have this interview.

[00:52:03] So all the odds are going to change and all the bets are going to change.

[00:52:06] But like looking at Polymarket right now, what inefficiencies do you see compared with what your election models are saying?

[00:52:12] Look, in principle, right now they have Trump-Harris 50-50 and we have Harris with a 56% chance, right?

[00:52:20] So in principle, you'd bet on Harris, but remember, like, prediction markets are more flexible and can account for things that, like, my model can't account for, right?

[00:52:29] So for example, right now you have a port strike where American or not American, I guess they are American, where workers, longshoremen are striking on the East Coast and the Gulf Coast, which is expected to lower GDP if the strike continues for more than a week or two by 0.5 percentage points.

[00:52:47] You can put that in the model and figure out what's the effect of a 0.5 percentage point decline in GDP on the election odds.

[00:52:54] And also there's this, you know, war in the Middle East and everything.

[00:52:57] So I think it's probably pretty reasonable, but in principle, you would bet the Harris side of the line.

[00:53:03] I do think that because a lot of the prediction markets are crypto adjacent and Trump has been more crypto friendly,

[00:53:12] there might be like a slight pro-Trump bias just based on the demographics of the people that are making the trade.

[00:53:23] But yeah, look, for the most part, we're quite close to Polymarket, our forecast at Silver Bulletin.

[00:53:28] And so it seems like a good thing to me.

[00:53:31] What about the popular vote?

[00:53:33] Like, so Polymarket has, it's like 75 cents to do, you know, by the way, so the way Polymarket works is you bet 75 cents.

[00:53:40] And if you're right, you win 100 cents.

[00:53:42] So, and so on.

[00:53:45] So right now, Kamala is at 75 cents to win the popular vote.

[00:53:49] What's your model saying about her winning the popular vote?

[00:53:52] Yeah, we have her at 77.

[00:53:54] So in the grand scheme of things, and also, you know, our model, I mean, it's behind a paywall technically,

[00:53:57] but like the types of people who were trading on Polymarket are probably, you know, paying for the newsletter.

[00:54:03] We have the popular vote at 77%.

[00:54:05] So really we're very, we're very tightly aligned with the market, in part because I'm sure the market, like, looks at our numbers, or the markets look at our model, and that affects perception of where the numbers should be.
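
As a back-of-the-envelope way to read those prices, here is a minimal sketch of how a 75-cent contract compares to a 77% model probability. The two numbers are simply the ones quoted in the conversation, and the sketch ignores fees, spread, and the uncertainty in the model itself.

```python
# Back-of-the-envelope edge calculation for a yes/no prediction-market
# contract that pays $1 if the event happens. Ignores fees and spread.

def ev_per_share(model_prob: float, price: float) -> float:
    """Expected value of buying one 'yes' share at `price` when your own
    estimate of the event's probability is `model_prob`."""
    win = model_prob * (1.0 - price)    # profit when the event happens
    lose = (1.0 - model_prob) * price   # stake lost when it does not
    return win - lose                   # simplifies to model_prob - price

price = 0.75        # market price quoted in the conversation
model_prob = 0.77   # model probability quoted in the conversation

edge = ev_per_share(model_prob, price)
print(f"implied market probability: {price:.0%}")
print(f"EV per share bought at {price:.2f}: {edge:+.3f} (ROI {edge / price:.1%})")
```

A two-point gap is a thin edge, which is consistent with the point that the market and the forecast are tightly aligned.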

[00:54:18] And you've said in the past and you've, and you know, Pennsylvania is, is the state.

[00:54:24] And it reminds me of an, there was an Isaac Asimov story like 40 or 50 years ago.

[00:54:29] I forgot what it's called, but you know, at first everyone was saying, okay, you can forget about the rest of the country.

[00:54:33] It all boils down to this one state.

[00:54:35] Then it all boils down to this one city or then this one neighborhood.

[00:54:38] And he finally had one person that it all boils down to.

[00:54:43] And so this whole story was about this person who decides the election.

[00:54:46] And, uh, uh, but given that it's highly likely Pennsylvania or Michigan say will be the state to influence the entire election.

[00:54:56] And you give the odds.

[00:54:57] If, if Trump wins Pennsylvania, he has 92% chance of winning the election.

[00:55:02] If Kamala wins Pennsylvania, she has 90% chance of winning the election.

[00:55:07] And given that, do you think she should have chosen Josh Shapiro to be her vice presidential candidate?

[00:55:12] Like, should they be looking at your models more to make their big decisions about who should be the second in command of the country?
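
Conditional figures like the 92% and 90% James just cited are typically read off a forecast's simulation output by filtering the draws. Here is a minimal sketch of that conditioning step. The toy model below, one shared national swing plus state-level noise with made-up variances, is purely illustrative and is not the Silver Bulletin model, so the number it prints will not match the quoted ones.

```python
# How a conditional claim like "if she wins Pennsylvania, she wins the
# election X% of the time" falls out of simulation draws.
import random

random.seed(0)
n_sims = 200_000
won_pa = won_both = 0
for _ in range(n_sims):
    national = random.gauss(0.0, 3.0)        # shared national swing, in points
    pa = national + random.gauss(0.0, 2.0)   # toy Pennsylvania margin
    ec = national + random.gauss(0.0, 1.5)   # toy proxy for the tipping-point margin
    if pa > 0:
        won_pa += 1
        if ec > 0:
            won_both += 1

print(f"P(win election | win Pennsylvania) is roughly {won_both / won_pa:.0%}")
```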

[00:55:18] Nate and I have talked about this so much on the pod and we were both.

[00:55:21] I know you guys have talked about this on Risky Business, the podcast, which is, I wanted to mention that.

[00:55:27] We, I think we both, well, uh, I don't want to put words into Nate's mouth, but I was definitely very much team Shapiro.

[00:55:34] And I think Nate was as well.

[00:55:35] Um, and right, Nate?

[00:55:38] Yes.

[00:55:38] Yeah.

[00:55:38] Look, and then Walz was kind of mediocre in that debate.

[00:55:41] If we're being honest, uh, he was nervous, which I understand, you know, I get nervous sometimes in public appearances, but, um, but yeah, no, I think for the Pennsylvania factor alone, um, you know, and the fact that like, I think she seemed to take it as a negative that like Shapiro was seen as more moderate, but for the most part, undecided voters want reassurance that, um, the candidates are more moderate.

[00:56:03] So I thought, I thought that was a mistake.

[00:56:05] And, and if, if Tim Walz, I think totally normal VP pick, but if he was going to prove me wrong, then that debate where he was, you know, kind of mediocre would have been the opportunity to do it.

[00:56:18] Yeah.

[00:56:18] I think that, um, Shapiro seemed, I think a lot of things that were seen as negatives for Shapiro are actually positives.

[00:56:25] Cause the other thing that I think was a negative was that he was more ambitious, right?

[00:56:29] He's someone who has kind of more political aspirations and Walz was more of a quote-unquote team player.

[00:56:35] It's fucking politics.

[00:56:37] You want the ambitious guy.

[00:56:39] Like you want the guy with the killer instinct.

[00:56:42] You want the guy.

[00:56:42] Do you think there was a factor that he was Jewish?

[00:56:45] Because at that time, particularly convention time, there were probably more anti-Israel protests then than there are now.

[00:56:53] And that probably, I'm not saying anybody's anti-Israel or anti-Jewish, but all of these things she probably had to take into account.

[00:57:01] Yeah.

[00:57:01] Um, I think, I mean, I think so.

[00:57:04] Yeah.

[00:57:04] I don't know.

[00:57:05] I try to studiously avoid commenting on some of this stuff, but yeah, look, I mean, there are obviously blurred lines between the way certain people's, 'cause he had a mainstream Democratic stance on the war in Gaza.

[00:57:20] Right.

[00:57:20] Um, although he had anyway, I, yeah, I, it may have been a factor.

[00:57:24] I don't know.

[00:57:25] I think on some level it was a factor.

[00:57:28] And I think that everyone will vehemently deny that it was a factor.

[00:57:32] Even he denied it, obviously, cause he has to.

[00:57:34] Um, but, um, I, I, I will, I think I've said this before, but like, yes, I absolutely think it was a factor.

[00:57:42] You can tell he's probably plotting his revenge though.

[00:57:46] Well, that's the thing.

[00:57:47] Like Pennsylvanians might've been a little pissed off.

[00:57:50] Like that, that might be a factor in what happens.

[00:57:52] You know that like, if, uh, I think he's rooting for Harris to win, right?

[00:57:57] Um.

[00:57:57] Of course.

[00:57:59] But if she loses by like eight electoral votes and loses Pennsylvania, you know, he might, he might crack open.

[00:58:05] I don't know if he's a drinker or not.

[00:58:07] Might crack open a beer and have a little laugh with himself, I would think.

[00:58:10] Yeah.

[00:58:11] I really hope that doesn't happen.

[00:58:12] But as I think we've talked about, like if Harris loses Pennsylvania and that's kind of the, the tipping point of the election, I think they will look back at this and be like, oh fuck.

[00:58:22] Um, you know, we, we messed up, but it's too late at that point.

[00:58:26] Um, but I also think that the debate would have been very different with Shapiro debating Vance, um, to be perfectly honest.

[00:58:32] Yeah.

[00:58:32] Cause I think, I think Shapiro was a lawyer.

[00:58:34] It's like two lawyers.

[00:58:34] They had a similar kind of training, but like, but this is what I'm wondering is, you know, all this ultimately is about decision making.

[00:58:40] And if people look at, at good models, like whether it's poker, a game theory optimal model for poker or, or Nate, your models for elections.

[00:58:50] Like if I was the presidential candidate and I'm not, maybe this is why I'm not, I would have looked at your models and said, okay, I need to really focus on Pennsylvania.

[00:58:57] So I'm going to pick a Pennsylvanian to be my VP candidate.

[00:59:00] And I'm going to put a hundred million dollars in ads in Pennsylvania.

[00:59:03] And I'm probably going to ignore pretty much every other state.

[00:59:06] Yeah.

[00:59:07] Look in, in, um, you know, to be fair, a vice president usually doesn't matter that much.

[00:59:12] So you're probably only getting like an extra point or so out of Pennsylvania, but I think

[00:59:16] they overthought it.

[00:59:18] Look, after Harris replaced Biden, Democrats were so thrilled.

[00:59:23] It's what I call the New York Knicks phenomenon, right?

[00:59:26] Where the Knicks were so bad for so long that, I mean, the Knicks are pretty good now, but like,

[00:59:30] you know, when they merely made the playoffs a couple of years ago, people were like, oh my God,

[00:59:33] this is amazing.

[00:59:34] Um, so they were kind of riding a little high off the fumes of the vibes.

[00:59:38] And I think they, I think they wanted to keep the good vibes going in the short run, you know,

[00:59:43] have America's hockey dad or football dad or whatever he is.

[00:59:46] And not thinking about like, we would really like to have an extra 50,000 votes in Pennsylvania

[00:59:51] in November.

[00:59:52] One of the things that we always talk about, Nate, is that, you know, it might be just

[00:59:57] one point, but in an election this close, every single point matters.

[01:00:02] And every single one of those tiny edges matters because they end up adding up.

[01:00:07] And when you're, when you're basically, when you're neck and neck, like you need to be thinking

[01:00:12] about this differently than if you're an overwhelming favorite.

[01:00:17] Yeah, I, I agree.

[01:00:18] And, uh, uh, you know, as you guys have pointed out, I mean, there's so many different factors

[01:00:24] in these elections that who knows really what will be the critical factor, but you've identified

[01:00:29] a couple of critical factors, um, that people should pay attention to.

[01:00:34] But again, in terms of like, let's say I want to just make money, you know, it, it does

[01:00:39] seem like there's so many bets out there that are, you know, like you were saying, like

[01:00:44] Kazakhstani ping pong or whatever, like, ones that are inefficient, potentially inefficient.

[01:00:48] That's going to be now my, my metaphorical, the Kazakhstani ping pong team, if they exist

[01:00:53] forever now is in my head as like the inefficient market.

[01:00:57] Nate, do they exist or did you just plant false information into our heads?

[01:01:02] No, but I mean, during the pandemic, right there, people were desperate to have a sweat

[01:01:06] on things and it was like, yeah, Filipino ping pong.

[01:01:08] I mean, literally people got into really obscure, obscure stuff.

[01:01:11] Degens will find a way to degen in some way.

[01:01:16] Do either of you bet on polymarket or any of these prediction markets?

[01:01:20] Can't in the US.

[01:01:20] Polymarket is not open to Americans.

[01:01:23] Really?

[01:01:24] Yeah.

[01:01:24] So you have to use PredictIt.

[01:01:25] Is PredictIt still open to Americans?

[01:01:29] You need, you need to use Kalshi.

[01:01:31] Yeah.

[01:01:32] Kalshi.

[01:01:33] I don't think I know.

[01:01:34] You mentioned that one in a podcast, but I've never been there.

[01:01:37] There's been news since our podcast.

[01:01:38] So Kalshi is open again, so their election betting is open.

[01:01:43] You can bet at Manif...

[01:01:44] The other prediction market I like is Manifest or Manifold rather, which is free.

[01:01:51] But so it's play money, but has a lot of really smart people trading there and a good community

[01:01:56] around it.

[01:01:56] So if you're in the US, bet, play money on Manifold.

[01:02:01] If you're anywhere else, then bet Polymarket, I say as a paid advisor to Polymarket.

[01:02:07] And as a non-paid advisor, I will say if you're in the US, bet real money on Kalshi.

[01:02:11] Okay.

[01:02:12] I'm going to check it out.

[01:02:14] So, okay.

[01:02:16] Finally, again, we've talked about how everyone kind of secretly wants to be in the river.

[01:02:21] Nobody wants to be in the village, i.e.,

[01:02:22] the people making dumb decisions as opposed to well-thought-out, rational, risk-based decisions.

[01:02:28] How does one...

[01:02:29] There's really two questions.

[01:02:31] One is, how do you recognize whether you're in the village or the river?

[01:02:35] Because Dunning-Kruger, almost everyone I ever speak to about poker says they're the

[01:02:40] best poker player in the world.

[01:02:43] And number two is, a lot of people probably, if they're risk-averse, they don't really know

[01:02:50] how to be...

[01:02:51] Or not necessarily risk-averse, but I don't know.

[01:02:55] If they're the sort of person who doesn't make these sort of probabilistic expected value

[01:02:59] bets, it's hard to change your personality to do that.

[01:03:04] Well, I'll say a few things as someone who apparently is still in the village, but takes

[01:03:12] forays.

[01:03:12] You're accused of still being in the village.

[01:03:13] Yes, but takes forays into the river, which is that learning poker has definitely

[01:03:19] helped me embrace risk a lot more in all areas of my life because I'm someone for whom it

[01:03:28] wasn't natural.

[01:03:29] And the funny thing is, I've taken huge risks in personal decisions throughout my career,

[01:03:35] quitting jobs without having another job or any money.

[01:03:38] Doing things and making very big changes where I have actually taken pretty substantial risks.

[01:03:45] But when it came down to it, I started off poker being way too cautious.

[01:03:52] There are usually two different types of poker players at the beginning.

[01:03:56] And most tend to err on the side of over-aggression, which is good, right?

[01:04:02] If you have to pick one, like be a nit or be too aggressive, it's better to be too aggressive.

[01:04:06] And I was definitely a nit at the beginning where it was really difficult for me to kind of take

[01:04:12] gambles even when I started learning more and figuring out, oh, I should really be bluffing here.

[01:04:18] I couldn't pull the trigger.

[01:04:20] I had, you know, there was mental blocks and I think a lot of it comes from socialization,

[01:04:26] being female.

[01:04:27] Like there are lots of social factors that play into this.

[01:04:31] But once I unlocked that, like my poker playing really changed and my decision making in real life

[01:04:37] really changed.

[01:04:38] And so I don't think my personality, like who I am hasn't changed, but I've been able to

[01:04:44] think about decisions differently, which was part of the reason I got into poker.

[01:04:48] And it worked, right?

[01:04:49] It actually did what I thought it was going to do and made me much more tolerant of risk,

[01:04:54] of uncertainty, of all of these things.

[01:04:56] And so that, you know, one of the reasons why I think everyone should learn to play poker,

[01:05:01] but learn to play poker correctly is for this because I actually think it's really good to

[01:05:07] be able to think like this in day-to-day life and to be able to kind of have that gear where

[01:05:12] you can take the gambles you need to take, when you need to take them, and when you can

[01:05:16] kind of understand those types of calculations much more easily.

[01:05:21] And James, I am definitely not the best poker player in the world.

[01:05:24] I am probably, I would say I'm not the best poker player on this podcast.

[01:05:29] I don't know.

[01:05:30] I don't know.

[01:05:30] It's probably pretty close.

[01:05:33] I'm higher.

[01:05:34] I think I'm a lower floor.

[01:05:37] I don't know.

[01:05:38] I'm more gambler than Maria, even though she's trying to be more gambler.

[01:05:41] I'm more gambler than Maria.

[01:05:43] Yeah, look, I even, people who have read the book, I hear people who are very much

[01:05:47] in the village who say, okay, I at least am now starting to think about more things in

[01:05:52] expected value terms.

[01:05:54] Understanding that life is uncertain and that sometimes the riskier course of action where

[01:06:00] there's more uncertainty nevertheless has the higher expected payoff.

[01:06:04] And, you know, we all hopefully live 80, 90 years.

[01:06:07] We get a lot of decisions that we get to make.

[01:06:09] You know, the more advanced stuff, the game theory stuff where, okay, what happens if

[01:06:12] I'm trying to maximize my EV and so is everyone else?

[01:06:15] I mean, that's the 201 class.

[01:06:17] But like, to think in terms of like, what's the expected outcome and not being inherently

[01:06:22] afraid of uncertainty when the rewards outweigh the risks, I think is something that people

[01:06:26] definitely can learn.
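
To put that in plain arithmetic, here is a minimal sketch with invented numbers: a sure thing next to a riskier option that usually pays nothing but still has the higher expected value.

```python
# Toy expected-value comparison with invented numbers: a guaranteed payoff
# versus a riskier option whose average payoff is higher.
safe = 50_000                                   # guaranteed outcome

risky_outcomes = [(0.60, 0), (0.40, 200_000)]   # (probability, payoff)
risky_ev = sum(p * payoff for p, payoff in risky_outcomes)

print(f"safe EV:  {safe:,}")
print(f"risky EV: {risky_ev:,.0f}")  # 80,000, despite a 60% chance of nothing
```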

[01:06:28] Mm-hmm.

[01:06:29] Absolutely.

[01:06:29] Yeah, and I guess so it's a matter of being comfortable with, like you were saying, with

[01:06:34] risk.

[01:06:35] And not only with risk, but with risk size.

[01:06:37] Like, even if you understand expected value, you could still go broke not quite understanding

[01:06:43] risk size.

[01:06:44] Like, how much you should bet.

[01:06:45] And that's very, you have to be a killer also to understand risk size very well.
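
One way to see James's point that sizing can break a positive-EV bet is to look at the expected log growth of the bankroll. This is a minimal sketch with invented numbers (a 60/40 win rate on an even-money bet), not bankroll advice.

```python
# Why bet size matters even when every bet is +EV: expected log growth of
# the bankroll for a 60/40 even-money bet at different stake fractions.
import math

p_win = 0.60
for stake in (0.05, 0.20, 0.40, 0.80):   # fraction of bankroll risked per bet
    growth = p_win * math.log(1 + stake) + (1 - p_win) * math.log(1 - stake)
    print(f"stake {stake:.0%}: expected log growth per bet {growth:+.4f}")
```

With these numbers the growth rate peaks near a 20% stake and turns negative by roughly 40%, even though every individual bet has a positive expected value.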

[01:06:51] You know, bet sizing is an underrated topic in general, which I think does analogize in some

[01:06:57] way.

[01:06:57] I mean, look, having optionality in life is really important.

[01:07:03] Making choices where you are able to make more choices down the road.

[01:07:06] Some people are phobic of optionality and phobic of having too many choices available.

[01:07:10] But like, you know, being in a position where you can take advantage of opportunities, I think

[01:07:14] is kind of like the secret behind a lot of people's success.

[01:07:20] Well, oh, go ahead, Maria.

[01:07:22] No, I was going to say, yeah, you need to keep being able to put yourself in a position

[01:07:26] to get lucky, right?

[01:07:27] Because everything in life is uncertain.

[01:07:30] And so how do you maximize your decision making, maximize your skill so that you keep putting

[01:07:34] yourself on kind of the percentage side, right?

[01:07:37] You keep taking the bets that will put you in a position to get lucky.

[01:07:40] But if you go broke, you can't do that again.

[01:07:43] So that's kind of the optionality.

[01:07:45] And I think that's a really important thing to keep in mind.

[01:07:47] Right?

[01:07:47] Like, it's almost like given luck and given the fact that people can go broke, it's almost

[01:07:53] like you have to make sure you diversify as many ways as possible your opportunities

[01:08:00] so that you have survivorship bias, right?

[01:08:04] Because all success, for all we know, is survivorship bias.

[01:08:07] Like, oh, if you're the type of person who pursues your passions, you're going to have success

[01:08:13] in life.

[01:08:13] Well, we don't know all the people who pursued their passions who are just homeless right

[01:08:17] now.

[01:08:18] But if you kind of diversify all the strategies you use so that you can have survivorship

[01:08:23] bias in at least one of them, you're doing well.

[01:08:26] And I guess you choose based on which ones have the highest expected value.
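
Since survivorship bias is doing a lot of work here, a minimal sketch with invented numbers of how it fools us: if yearly results were pure coin flips, a handful of long winning streaks would still show up, and those are the stories we tend to hear.

```python
# Toy survivorship-bias illustration with invented numbers: 10,000 "managers"
# whose yearly results are pure coin flips. A few will still look brilliant.
import random

random.seed(3)
n_managers, n_years = 10_000, 10
lucky_streaks = sum(
    all(random.random() < 0.5 for _ in range(n_years))
    for _ in range(n_managers)
)
print(f"{lucky_streaks} of {n_managers:,} pure-luck managers won "
      f"{n_years} years straight (about {n_managers / 2 ** n_years:.0f} expected)")
```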

[01:08:31] Yep.

[01:08:31] Yeah.

[01:08:32] I mean, if you want a good critique of my book, it probably suffers from survivorship

[01:08:38] bias in a lot of ways where you're profiling the successful people.

[01:09:40] Although there are like kind of anti-heroes in the book, like Sam Bankman-Fried.

[01:08:45] Although his mistake was that he felt like you shouldn't hedge your risks at all.

[01:08:51] And you should go all in on whatever crazy strategy you have to maximize EV.

[01:08:55] And even if it means, like literally if it means ruining your life.

[01:08:57] So there's a lesson there as well, I think.

[01:09:00] Yeah.

[01:09:02] He sized that bet too large.

[01:09:05] Yeah, I know.

[01:09:06] I mean, basically if you bet like, there's a simulation in the book about what if you

[01:09:11] bet an NFL season while ignoring the Kelly criterion, which is the criterion that most gamblers use

[01:09:18] to make bets and is actually too aggressive, most people think.

[01:09:21] And basically like 99 times out of 100, you go completely broke.

[01:09:26] But one time out of 100, you wind up with like more money than Elon Musk.

[01:09:30] So from an expected value standpoint, it seems good.

[01:09:33] But like most people are sane enough to understand that 99% happens a lot and outweighs a 1% chance

[01:09:41] of being Elon Musk.
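
This is not the book's actual simulation, just a minimal sketch with invented parameters (a 55% win rate on even-money bets, 256 bets, bankroll normalized to 1) of the shape being described: the oversized stake has the same per-bet edge, yet the typical run ends near zero while a rare run explodes.

```python
# Minimal sketch of why over-sizing a positive edge ruins you most of the
# time even though every bet is +EV. Parameters are invented.
import random
import statistics

def run_season(stake_fraction: float, p_win: float = 0.55, n_bets: int = 256) -> float:
    bankroll = 1.0
    for _ in range(n_bets):
        stake = bankroll * stake_fraction
        bankroll += stake if random.random() < p_win else -stake
    return bankroll

random.seed(7)
full_kelly = 2 * 0.55 - 1   # Kelly fraction for an even-money bet: 10% of bankroll
for frac, label in ((full_kelly, "full Kelly (10%)"), (0.50, "5x Kelly (50%)")):
    finals = [run_season(frac) for _ in range(10_000)]
    down_99 = sum(f < 0.01 for f in finals) / len(finals)   # lost 99%+ of the roll
    print(f"{label}: median {statistics.median(finals):.2f}x, "
          f"best {max(finals):,.0f}x, down 99%+ in {down_99:.0%} of runs")
```

With the stake at five times Kelly, most runs end with pennies on the dollar while a small minority compound into huge numbers, which is roughly the trade-off being described; the book's own 99-to-1 numbers come from its specific setup, which this sketch does not try to reproduce.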

[01:09:42] And SBF willingly did not care or understand that.

[01:09:46] And I think it's a combination of that plus hubris and overconfidence.

[01:09:50] Because a lot of the things that SBF was making bets on, you know, what if like in something like Nate's model

[01:09:59] that he ran out, like there's less uncertainty, right?

[01:10:03] Like you know that these, that you're actually doing this simulation.

[01:10:06] But what if like, what if your percentages are off?

[01:10:09] Like what if any of your calculations are off and you're betting on the fate of the world, right?

[01:10:14] And you're betting on outcomes like that.

[01:10:16] Then if you're overconfident and your certainties are wrong, if your confidence intervals are wrong,

[01:10:21] like if anything is off by a little bit, then you're fucked.

[01:10:25] If you're trying to go for that, like, well, it's actually plus EV.

[01:10:28] No, it actually might not be plus EV because you don't know that the math is correct.

[01:10:32] Well, and also you're making decisions in a world, like Sam Bankman-Fried wasn't just

[01:10:36] making decisions about investments.

[01:10:38] He was also gambling his life.

[01:10:39] Exactly.

[01:10:40] And, you know, he could have incorporated that into his personal model.

[01:10:44] Oh, the expected value has to include the fact that there's an infinitely bad outcome in some cases.

[01:10:50] So, or an almost infinitely bad outcome.

[01:10:53] And, you know, life decisions have to include the factors of life in your model.

[01:10:58] Yeah.

[01:10:59] I mean, I guess to his credit, you know, I assumed he had a billion dollars sitting in a Swiss bank account

[01:11:05] somewhere or something, which is probably smart.

[01:11:06] And he didn't, he really was all in on this, you know, scheme to steal, although they were paid back,

[01:11:13] the scheme to steal funds and reroute them to Alameda and make a bunch of degenerate bets on shit coins.

[01:11:20] He was truly all in on it, which I think probably he, he probably found honorable.

[01:11:25] I mean, he said that people who, if you're not willing to ruin your life,

[01:11:29] he thinks you're a wimp is what he told me basically.

[01:11:31] And, uh, there are a lot of negative adjectives you can use for SBF, but I wouldn't call him a wimp.

[01:11:38] He authentically enjoys degenerate levels of gambling.

[01:11:43] I'm a wimp.

[01:11:46] We're probably all wimps.

[01:11:47] It's probably healthy to be a wimp in that sense.

[01:11:50] Well, I can't recommend all of your products enough.

[01:11:53] I mean, we talked, I mean, definitely Risky Business, the podcast, or is it?

[01:11:58] Again, I have to look up the name for the second time in this podcast.

[01:12:00] Correct.

[01:12:01] Risky Business, the podcast, which I listen to despite me having to look up the name every time

[01:12:04] because I get nervous.

[01:12:06] Risky Business, the podcast is a must listen.

[01:12:08] I'm going to describe it more, or I did describe it more in the intro.

[01:12:13] And definitely I would read, you know, your, your, both of your latest books: On the Edge by

[01:12:18] Nate Silver, which just came out, and, and The Biggest Bluff by Maria Konnikova.

[01:12:22] Maria, when's your next book coming out?

[01:12:23] Um, probably in, I would say like, oh, I don't want my editor to, uh, to hear this conversation

[01:12:31] and hold me to it, but I'm working on it.

[01:12:33] She's not going to listen to this.

[01:12:33] I'm working on it.

[01:12:34] Um, uh, I would, I would say probably early 2026.

[01:12:39] Okay.

[01:12:40] And, and you both have, uh, newsletters, The Leap by Maria Konnikova, Silver Bulletin

[01:12:47] by Nate Silver.

[01:12:48] Highly encourage.

[01:12:50] I get these.

[01:12:50] As soon as I get these, I stop everything I'm doing and read them.

[01:12:53] No, no joke.

[01:12:54] Like they're one of the few newsletters I read.

[01:12:56] And I really appreciate you guys coming on the podcast.

[01:13:00] Maria's been on many times.

[01:13:01] Nate, it's a pleasure to meet you for the first time.

[01:13:03] Thanks again.

[01:13:04] It's always such a pleasure.

[01:13:05] Absolutely, James.

[01:13:06] Great to chat.

manifold,risk management,deep fakes,expected value,game theory,nate silver,survivorship bias,kelly criterion,poly market,maria konnikova,crowded poker field,rational decision making,cognitive load,ai,election probabilities,probabilities,podcast.,decision making,poker,calshi,predicted,trust in institutions,betting markets,