A Note from James:
Today, we've got Andrew Gold back on the podcast. He's an investigative journalist and the author of "The Psychology of Secrets: My Adventures with Murderers, Cults, and Influencers." This book is riveting; every page dives into the intriguing world of secrets. What are secrets? What are the big secrets we know of? How can you tell when someone is keeping one? From murderers to cults to influencers, Andrew's insights are fascinating. He also runs the YouTube channel Heretics, which I highly recommend. But for now, let's get into our conversation about the nature of secrets. Here's Andrew Gold.
Episode Description:
In this episode, James welcomes back investigative journalist and author Andrew Gold to discuss his latest book, "The Psychology of Secrets: My Adventures with Murderers, Cults, and Influencers." Andrew shares his deep insights into the nature of secrets and their impact on our lives. From the hidden doctrines of cults like Scientology to the subtle ways secrets influence our personal relationships, Andrew reveals the psychology behind why we keep secrets and how they shape our behavior. This conversation delves into the interplay between secrecy and storytelling, the social dynamics of secret-keeping, and the ethical dilemmas of radical honesty.
What You’ll Learn:
- The psychological mechanisms behind why people keep secrets and how they affect our mental health.
- The role of secrets in cult dynamics, including how leaders manipulate followers by controlling access to secret knowledge.
- How secrets can both strengthen and destroy personal relationships.
- The ethical considerations and potential consequences of radical honesty.
- The impact of AI and data on uncovering and preserving secrets in the digital age.
Chapters:
01:52 The Fascinating Dynamics of Cults and Influence
02:57 Navigating the Complexities of Public Appearances and Media
05:51 The Ethical Dilemmas and Impact of AI on Truth
25:40 Unveiling the Power of Secrets in Cult Dynamics
39:15 The Enigmatic Influence of Secrets on Society and Individual Psyche
42:18 The Intricacies of Secrecy and Society
43:03 The Evolution of Secrets and Their Societal Impact
45:08 The Power of Secrets in Personal and Social Dynamics
54:56 Exploring the Psychology Behind Sharing Secrets
58:23 The Complex Relationship Between Secrecy, Honesty, and Society
01:06:42 Navigating the Challenges of Content Creation and Audience Expectations
01:17:43 Reflections on the Journey of a Content Creator
Additional Resources:
- Andrew Gold’s YouTube Channel: Heretics
- The Psychology of Secrets on Amazon
- Sam Harris' Book: Lying
- Peter Boghossian’s Work
- Trigonometry Podcast
------------
- What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
- Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!
------------
- Visit Notepd.com to read our idea lists & sign up to create your own!
- My new book, Skip the Line, is out! Make sure you get a copy wherever books are sold!
- Join the You Should Run for President 2.0 Facebook Group, where we discuss why you should run for President.
- I write about all my podcasts! Check out the full post and learn what I learned at jamesaltuchershow.com
------------
Thank you so much for listening! If you like this episode, please rate, review, and subscribe to “The James Altucher Show” wherever you get your podcasts:
Follow me on social media:
[00:01:10] The Psychology of Secrets,
[00:01:12] My Adventures with Murderers, Cults and Influencers.
[00:01:17] This was just a great book.
[00:01:19] Like every page was riveting.
[00:01:22] It's so interesting how many secrets,
[00:01:25] like just everything about secrecy,
[00:01:27] like what are secrets?
[00:01:29] What are the big secrets that we know of?
[00:01:32] How do you tell if someone's keeping a secret?
[00:01:36] And then on top of it,
[00:01:37] all this stuff about cults, influencers, murderers.
[00:01:41] Andrew Gold, once again, back on the podcast.
[00:01:44] He has the YouTube channel, Heretics.
[00:01:46] I highly recommend it,
[00:01:47] but I really recommend this book,
[00:01:49] The Psychology of Secrets.
[00:01:50] You can order it on amazon.co.uk.
[00:01:53] And our conversation has really given me a lot of things
[00:01:58] to think about, about the whole nature of secrets.
[00:02:01] I hope you enjoy it.
[00:02:03] Here's Andrew Gold.
[00:02:08] This isn't your average business podcast,
[00:02:11] and he's not your average host.
[00:02:14] This is The James Altucher Show.
[00:02:25] It was all right.
[00:02:25] Like it's a stressful thing.
[00:02:27] It's sort of a conveyor belt kind of system.
[00:02:29] It's like, you know,
[00:02:30] you're backstage with a couple of people
[00:02:32] and you start to realise, right,
[00:02:33] they've got certain views,
[00:02:35] so I'm probably supposed to be against them.
[00:02:37] So I need to have the views that are not the ones they have.
[00:02:40] And they were a bit cliquey or clicky,
[00:02:42] I think Americans say,
[00:02:44] the two of them, because they've been together,
[00:02:45] they're quite woke and they're doing their thing.
[00:02:47] And I was like, oh, am I the bad guy here?
[00:02:49] I'm supposed to come in and defend Kevin Spacey
[00:02:52] and all those kinds of things.
[00:02:54] And so I felt a bit, oh gosh, this is awkward.
[00:02:57] And then you go in and he's sort of doing his thing,
[00:02:59] like looking down, he's not really, you know,
[00:03:01] and I was like, oh, hi, Piers.
[00:03:02] And he's like, hi, hello.
[00:03:04] And then suddenly you're going live.
[00:03:06] And then when you're finished,
[00:03:07] you're sort of shuffled out of there.
[00:03:09] So I didn't really get to meet the man so much,
[00:03:12] but it was, you know, it's good.
[00:03:13] It gives you some street cred or something.
[00:03:15] Yeah, no, I think that's the goal really,
[00:03:18] is that it gives you that authority
[00:03:20] that you've been on that show
[00:03:22] and your opinions are recognized enough
[00:03:25] and your knowledge is recognized enough
[00:03:26] that you're chosen to represent that point of view.
[00:03:29] So I think that's really great.
[00:03:30] I always find though, whenever I've been on like talk shows
[00:03:33] or television like that,
[00:03:36] I feel like I have to like take five showers
[00:03:38] to clean myself off afterwards.
[00:03:40] Just because it just, for me, it feels a little icky.
[00:03:44] And I feel like I'm not really,
[00:03:47] and this is nothing, this is only my own experience.
[00:03:49] This has nothing to do with your experience
[00:03:50] with Piers Morgan.
[00:03:51] I just feel like I haven't really contributed anything
[00:03:52] to the world, even though I was on this, you know,
[00:03:55] what's usually considered a very important platform.
[00:03:58] And anyway, that's my thing.
[00:04:01] I know exactly what you mean.
[00:04:02] I know exactly what you mean.
[00:04:02] And because I live a couple of hours away from London,
[00:04:06] I have to shower because physically I also feel icky.
[00:04:09] So physically and psychologically,
[00:04:11] I get home and there's just this absolutely disgusting headspace.
[00:04:14] And again, you know, what the hell was this whole thing?
[00:04:16] And what do I think?
[00:04:18] You know, because I hadn't actually, you know,
[00:04:20] they called me, they messaged me when I was on the plane back.
[00:04:22] I'd been in New York, New Yoik, in your country.
[00:04:25] That's my impression.
[00:04:26] And they said, can you come on Piers tomorrow?
[00:04:28] I was on the plane like, oh God, I've got to go on the bloody,
[00:04:31] I've got to go on Piers.
[00:04:32] And they were like,
[00:04:32] you've watched the Kevin Spacey documentary, haven't you?
[00:04:34] And I was like, yeah, yeah, yeah, yeah, yeah.
[00:04:36] Because I don't know, I thought maybe it was years ago,
[00:04:37] but it turns out it's a new one.
[00:04:38] I didn't know that.
[00:04:40] So I go on and I was just, I was playing my role.
[00:04:42] And I'm going, come on, like, what did he do wrong?
[00:04:45] Someone gave him a blowjob
[00:04:46] because they wanted to get ahead in their career.
[00:04:47] Because I'm just saying, do I think this?
[00:04:49] I don't know what I think.
[00:04:50] And the woman next to me was like, wait,
[00:04:52] you think it's fine for him to be grabbing people
[00:04:53] in corridors and things, do you?
[00:04:55] You think all this is fine?
[00:04:56] And I was like, oh, I didn't know that was,
[00:04:58] she was like, you haven't even watched this, have you?
[00:05:00] And I was like, yes, yes, yes, no, I did.
[00:05:01] Of course I did.
[00:05:02] But I hadn't.
[00:05:05] Oh my God.
[00:05:05] So you had a secret.
[00:05:07] Yeah, yeah, yeah, fuck.
[00:05:10] I shouldn't have secrets anymore
[00:05:11] because they're not good for you.
[00:05:13] But I would have been socially ostracized to such an extent
[00:05:16] had I revealed said secret on the show
[00:05:19] that I would never have been invited back.
[00:05:21] Well, okay, we're obviously gonna talk about secrets
[00:05:25] and whether they're good for you or not.
[00:05:27] But what was the Kevin Spacey, there's a new documentary.
[00:05:29] So I always thought the whole thing with Kevin Spacey
[00:05:31] is that 30 years ago at a party,
[00:05:33] he groped somebody inappropriately
[00:05:35] and he got canceled.
[00:05:36] That really was when he got canceled.
[00:05:38] And then I guess new things came to light.
[00:05:40] But what's the new documentary about?
[00:05:42] It's Channel 4 in the UK that's released it.
[00:05:44] And it's just one of those ones,
[00:05:46] you've seen these before,
[00:05:47] where it's just like person after person
[00:05:49] in a beautifully shot thing.
[00:05:50] It's like a guy sitting there and it's like,
[00:05:52] who's this guy?
[00:05:53] This is gonna be another victim.
[00:05:54] And by the sort of fourth or fifth,
[00:05:57] you're like, okay, I don't need to watch this anymore.
[00:05:59] So I didn't watch the whole thing
[00:06:00] even when I did watch it after getting home from Piers.
[00:06:03] But yeah, people saying, he told me that
[00:06:06] if I wanted to get ahead in my career,
[00:06:08] I had to give him a blow job.
[00:06:09] And he lied to me about this or that.
[00:06:12] And he kept telling me he was gonna take my script
[00:06:14] to this place.
[00:06:15] But every time it was like,
[00:06:16] but first you have to do this.
[00:06:18] Those kinds of things,
[00:06:19] which I think we now see as abuses of power.
[00:06:22] But 15, 20 years ago.
[00:06:24] Let me ask you this,
[00:06:25] like you talk in the book about AI
[00:06:27] that can roughly recognize things in faces,
[00:06:32] not necessarily if someone's lying,
[00:06:35] but you started talking about this.
[00:06:36] But like, the example was parole,
[00:06:41] predicting which people are likely to violate it,
[00:06:44] to act badly when they're on parole.
[00:06:47] And the AI was good at identifying that.
[00:06:49] Do you think in the future,
[00:06:50] AI will be good at telling?
[00:06:52] These people said this in the documentary
[00:06:54] and we always like to believe
[00:06:55] that they're telling the truth.
[00:06:56] We wanna assume the best in people
[00:06:58] and particularly when someone's claiming
[00:06:59] that they're being abused
[00:07:00] because we don't want abused people to be ignored.
[00:07:02] But do you think, clearly sometimes people do lie.
[00:07:06] Do you think AI will be able to tell
[00:07:08] in documentaries like this if someone's lying or not?
[00:07:10] Yeah, I mean, you raised such a good point
[00:07:12] because I'm watching these guys
[00:07:13] and when I'm watching it,
[00:07:14] I'm believing them,
[00:07:15] but that's not enough to put someone in prison, right?
[00:07:17] And then, okay, is the line,
[00:07:20] is the threshold about where we judge someone
[00:07:23] to go to prison different to the one of,
[00:07:25] I should just, I don't wanna watch his movies anymore.
[00:07:27] Like those lines are probably a little bit different, right?
[00:07:29] The burden of proof changes.
[00:07:32] It's a really hard one to know
[00:07:33] because that's an unknowable, firstly.
[00:07:35] Secondly, I think that, yeah,
[00:07:36] there's a big part in the book
[00:07:37] about we put too much emphasis on body language and things
[00:07:41] when we think people are lying or telling the truth.
[00:07:44] It's actually very hard to know.
[00:07:45] However, lie detecting,
[00:07:48] well, lie detectors, polygraph tests,
[00:07:50] they are better than they should be
[00:07:52] because we talk about how bad they are.
[00:07:53] They still get between 60 and 80% right.
[00:07:55] That's insane.
[00:07:57] So...
[00:07:58] And is that better than humans?
[00:08:00] That is better than humans.
[00:08:02] Yeah, that is better than humans.
[00:08:03] Humans, I think, I need to check the book again,
[00:08:06] but we get like 50% right.
[00:08:09] It's like a guess.
[00:08:10] We're so bad.
[00:08:11] We have no idea if someone's lying or telling the truth
[00:08:14] and officers and people,
[00:08:17] it depends if someone's like a particularly manipulative liar
[00:08:20] and if their faces match or if they're mismatched too.
[00:08:23] So when they're matched, meaning someone's lying
[00:08:26] and they look like a shifty person in a cartoon,
[00:08:29] their eyes are darting back and forward,
[00:08:31] humans are like 70, 80% accurate
[00:08:33] and investigators and detectives are even better.
[00:08:36] When it's someone who's mismatched,
[00:08:37] all of us are equally absolutely terrible,
[00:08:40] but the lie detector remains 60 to 80% accurate.
[00:08:43] So it's pretty good,
[00:08:46] but not enough to put someone in prison
[00:08:48] for the rest of their life
[00:08:48] because every hundred people you do,
[00:08:50] you'll get 20 to 40 wrong.
[00:08:52] And what's interesting though,
[00:08:53] you said psychopaths can get away with lying
[00:08:56] with a lie detector 88% of the time.
[00:08:58] So lie detectors are not good with psychopaths.
[00:09:01] Well, actually, psychopaths can't fool lie detectors
[00:09:04] and that was one of the most fascinating things I found.
[00:09:05] Oh, they can't fool lie detectors?
[00:09:07] That's really interesting.
[00:09:09] They don't fool them at a particularly
[00:09:10] higher rate than we do.
[00:09:12] I mean, the thing with the statistics around lie detectors
[00:09:14] is it changes depending on the lie detector machine
[00:09:17] or who's using it, who's doing it.
[00:09:19] So that's why they vary so much.
[00:09:21] But the point there was that the psychopaths
[00:09:23] are actually not better.
[00:09:25] And that shocked me
[00:09:26] because I thought that was the whole thing
[00:09:27] someone who's neurotic like me would give myself away
[00:09:30] and these psychopaths,
[00:09:31] but basically it's done on your base rates.
[00:09:33] They have to really warm you up for about an hour
[00:09:35] just having a casual conversation.
[00:09:37] So a psychopath's base rate or their body movements,
[00:09:40] their skin, their heart,
[00:09:42] everything is going to be really low anyway.
[00:09:44] So that just means that the machine
[00:09:46] has to be really, really sensitive
[00:09:48] to any slight changes, which it will then pick up on.
[00:09:51] There's also something called duper's delight.
[00:09:54] Psychopaths get a bit of a kick out of lying to you.
[00:09:56] So that kick is going to be registered anyway.
[00:09:58] So the psychopaths can't fool it any better than we can.
[00:10:02] So let me ask this because I always feel,
[00:10:06] I used to think I was a good judge of people,
[00:10:08] but then I realized everyone I spoke to
[00:10:11] always said that they were good judges of people.
[00:10:13] Like everybody thinks they're a good judge of people.
[00:10:15] And I realized kind of just looking back over time
[00:10:18] that I'm actually not really that great a judge of people.
[00:10:20] Often I'm very wrong and I'm probably worse than average
[00:10:24] because I think I kind of tend to be very trusting.
[00:10:26] And is anybody an above average judge of people?
[00:10:32] I don't know, you know,
[00:10:33] because there's a great book called
[00:10:35] The Intelligence Trap by David Robson
[00:10:37] about really smart people and it shows how smart,
[00:10:39] and you're clearly a very smart person,
[00:10:41] but you're not a good judge of character.
[00:10:43] You had me stay at your house.
[00:10:44] That was a very big, it could have been a big mistake,
[00:10:47] which was lovely by the way.
[00:10:48] Just watching you now has brought back all the memories.
[00:10:50] I wish I were there just hanging around.
[00:10:51] Come on over again anytime.
[00:10:53] It's a long way, isn't it? Expensive travel and everything.
[00:10:56] But I will, I will. I'd love to.
[00:10:58] But yeah, more intelligent people make more mistakes,
[00:11:02] which is again remarkable.
[00:11:04] Arthur Conan Doyle is a great example of that.
[00:11:07] He believed in fairies.
[00:11:08] He's the writer for Sherlock Holmes.
[00:11:11] You know, this is the master of deduction
[00:11:12] and he thought that he saw a photo and it was a fairy
[00:11:15] and he was positing that fairies,
[00:11:18] they seem to have a belly button or a navel
[00:11:21] or whatever Americans say for belly button,
[00:11:22] which must sound really ridiculous to you.
[00:11:24] What's a belly button in America?
[00:11:26] A belly button.
[00:11:26] Oh, right, okay.
[00:11:27] That just sounded really English when I said it.
[00:11:29] So yeah, he believed in it, had a whole theory about it.
[00:11:34] The thing is you're super intelligent
[00:11:35] and you're able to convince yourself of all of these things
[00:11:38] because you're very good at doing that.
[00:11:40] So I don't know if anyone's good
[00:11:43] at being a judge of character.
[00:11:45] Certainly we found that statistically detectives
[00:11:48] and inspectors and things are really not much better
[00:11:51] than the rest of us.
[00:11:52] They're trained to look out for lies.
[00:11:54] They're better at detecting incoherent parts
[00:11:57] of people's excuses for why they weren't in certain places
[00:12:01] or why they were there.
[00:12:03] They've been trained to do that,
[00:12:04] but I don't think they're particularly better.
[00:12:06] And I would say that anyone who thinks they're really good
[00:12:09] probably isn't, just for a bit of Dunning-Kruger
[00:12:11] kind of thing, you know?
[00:12:12] Yeah, and I kind of think everybody thinks
[00:12:15] they're pretty good.
[00:12:16] Like there's several skills out there.
[00:12:18] Like, you know, it's that whole thing
[00:12:19] where nine out of 10 people think
[00:12:21] they're an above average driver
[00:12:24] or most people I talk to who play poker
[00:12:27] think they're an above average poker player.
[00:12:29] Most people think they're above average judge of people.
[00:12:31] So I think there's a kind of category of skills
[00:12:34] where people in general think they're better than average
[00:12:37] when you can't have everybody better than average.
[00:12:39] Some people have to be below average.
[00:12:41] So I'm willing to take the hit on driving
[00:12:44] and judge of character and even like lying.
[00:12:47] It's very hard for me to tell someone's lying.
[00:12:50] But I would think with AI,
[00:12:52] let's say you feed it 10,000 facial expressions
[00:12:55] of someone who's lying and 10,000 facial expressions
[00:12:57] of someone who's telling the truth.
[00:12:58] And then you make glasses with this lie detector
[00:13:02] in the glasses and I could look at someone
[00:13:04] and then use AI to determine what the chances are
[00:13:07] that they're lying or not.
[00:13:08] I bet you that's possible.
[00:13:09] I bet that is.
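[Editor's note: the approach James sketches here is, in machine learning terms, a supervised binary classifier. Below is a minimal, hypothetical sketch of the idea in Python with scikit-learn; the random arrays stand in for the labeled "lying" and "truthful" face images he imagines, and none of this comes from the book itself.

    # Sketch only: train a classifier on labeled face images, then estimate
    # the probability that a new face belongs to the "lying" class.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Stand-ins for the 10,000 images per class James mentions (scaled down):
    # 1,000 flattened 32x32 grayscale faces per class; 1 = lying, 0 = truthful.
    X = rng.random((2000, 32 * 32))
    y = np.array([1] * 1000 + [0] * 1000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    new_face = rng.random((1, 32 * 32))  # one face seen "through the glasses"
    print("P(lying) =", clf.predict_proba(new_face)[0, 1])

On random stand-in data a model like this scores near 50%, the chance level Andrew cites for humans; whether real facial data carries enough signal to do better is exactly the open question they go on to discuss.]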
[00:13:11] Stuff like that will surely come around
[00:13:13] and that's going to bring up
[00:13:13] all sorts of philosophical issues.
[00:13:15] If the machine says someone did it,
[00:13:18] even if we don't have the evidence,
[00:13:19] just because it's showing they lied,
[00:13:20] is that enough to put someone away?
[00:13:22] There was a scene in the book I did with Dr. Marvin Chun
[00:13:26] who is involved in brain reading,
[00:13:28] which is, you know, it's now been around for years and years,
[00:13:31] brain reading where you put someone in an fMRI scanner
[00:13:36] and you can actually,
[00:13:37] you can draw what they are thinking of
[00:13:40] and you can read what they are,
[00:13:42] what the words they're thinking
[00:13:44] or what they're listening to.
[00:13:45] And that's actually real
[00:13:47] and that's absolutely insane.
[00:13:49] At the moment, it's not, it's still a bit blurry,
[00:13:52] it's still a bit vague,
[00:13:52] it doesn't get it quite right
[00:13:54] because the fMRI scanners are only really good enough
[00:13:56] to get to the nearest like thousand neurons at a time
[00:14:00] to know which ones are firing up and which ones aren't.
[00:14:02] But it stands to reason that within,
[00:14:05] it might be 10 years, it might be 100,
[00:14:06] it might be 1,000,
[00:14:07] we will be able to locate individual neurons
[00:14:10] and see if they're firing up
[00:14:11] and we'll get a really good idea.
[00:14:13] And I bet someone will be able to do that
[00:14:14] with just a phone, you know,
[00:14:15] you just hold out to someone's face
[00:14:17] and you'll know what they're thinking
[00:14:18] and if they're lying.
[00:14:20] But it's interesting what we do know,
[00:14:23] like you mentioned this one statistic in the book,
[00:14:25] which is that the longer,
[00:14:27] and this is counterintuitive,
[00:14:29] the longer people are married,
[00:14:30] the less they actually can, you know,
[00:14:34] guess what the other is thinking.
[00:14:36] Yes.
[00:14:37] Which surprised me.
[00:14:38] Yeah, we get,
[00:14:39] and also that was from someone's book
[00:14:42] and I wish I could remember his name to give him a shout out
[00:14:44] but I can't think of who it was,
[00:14:45] but it was a book and it was good.
[00:14:47] And the point here was that
[00:14:50] couples that have explosive arguments
[00:14:53] tend to fare better long-term
[00:14:55] because they're constantly resetting
[00:14:57] their opinions of one another
[00:14:58] and they're resetting who they are.
[00:15:00] You have an explosive argument,
[00:15:01] it's like, oh, that's who she is,
[00:15:03] like she's, you know,
[00:15:04] and that's constantly happening.
[00:15:06] Whereas if you're one of those couples
[00:15:07] who never argues, never has explosive arguments,
[00:15:10] 30 years later, you suddenly find like,
[00:15:12] who is this person I'm living with?
[00:15:14] I haven't recalibrated anything
[00:15:16] to understand who the person,
[00:15:17] the 45-year-old is compared to the 20-year-old
[00:15:19] that I first met, you know?
[00:16:52] It's so interesting, all this stuff,
[00:16:53] like, do you know,
[00:16:55] you ever read the book about data
[00:16:57] by Seth Stephens-Davidowitz?
[00:16:59] No.
[00:17:00] I forgot the name of this book.
[00:17:01] He's been on the podcast twice about,
[00:17:03] so he's a data analyst and he used to work at Google
[00:17:06] and he had an interesting anecdote,
[00:17:08] which is that in India, for some reason,
[00:17:14] there's a lot of search queries about
[00:17:18] the fetishizing of, I think it was,
[00:17:22] you know, feeding off of your wife's breast milk
[00:17:27] during sex.
[00:17:29] So I think that was the actual fetish.
[00:17:33] I'm not entirely sure.
[00:17:34] But you wouldn't have known about this,
[00:17:37] and it was only from this one country.
[00:17:39] And you wouldn't have known this otherwise,
[00:17:41] like, it's not like people talk about it or write about it,
[00:17:44] but it's just like all these people
[00:17:45] in the privacy of their homes Google it.
[00:17:47] And so that-
[00:17:48] But why?
[00:17:49] Why is it?
[00:17:51] Well, I don't know why it is,
[00:17:52] but it's interesting how data allows us to kind of,
[00:17:56] you know, in all of its different forms,
[00:17:58] like in this case, it's Google queries.
[00:17:59] Data is sort of unraveling the secrets
[00:18:02] of society pretty quickly.
[00:18:04] Yeah, that's smart.
[00:18:05] I'm just thinking of two possible reasons.
[00:18:08] Okay, so one is, and this is, I'm just half joking,
[00:18:11] but just they don't have any food, they're hungry.
[00:18:15] And so breast milk, because that is a way, right?
[00:18:18] If you're on an island,
[00:18:20] you can drink your own wee once.
[00:18:22] I'm not talking about if you're just on holiday
[00:18:24] on an island somewhere, you shouldn't do that.
[00:18:26] You should go like swimming or something
[00:18:27] and have a nice time and go for dinner.
[00:18:29] But if you're like on a desert island, drink your own,
[00:18:32] you know, I don't know what you can do with your sperm.
[00:18:34] I haven't looked into that.
[00:18:35] But presumably breast milk can,
[00:18:37] but you have to, the woman has to have just had a baby.
[00:18:39] I forgot that they don't just constantly make breast milk.
[00:18:42] Okay, forget that for a second, but that might be.
[00:18:44] And then cows are sacred, aren't they?
[00:18:47] What's sacred?
[00:18:48] Oh yeah, cows are sacred, that's right.
[00:18:50] That's where it comes from.
[00:18:51] And it's interesting that, again, like,
[00:18:56] with the fact that there's so much data in the world now,
[00:18:59] this is what's, you know, you mentioned how,
[00:19:03] you know, different sex toys were keeping track of data
[00:19:06] inappropriately, like one company lost
[00:19:09] a class action suit because it turned out
[00:19:11] they were keeping the data on people's usage
[00:19:13] of their Bluetooth enabled sex toys.
[00:19:17] But also I think it was Garry Kasparov
[00:19:21] was telling me about this.
[00:19:21] He is doing some consulting for a cybersecurity firm.
[00:19:24] You gotta be careful even of your Nest,
[00:19:27] you know, your internet thermostat,
[00:19:29] because what if, you know, from two to 3 p.m. suddenly
[00:19:33] one day it's hotter than usual in the bedroom
[00:19:37] and the person traveling for work sees why,
[00:19:40] why was all of a sudden the bedroom like,
[00:19:42] you know, 90 degrees?
[00:19:43] Like there's a lot of data that we never would have had
[00:19:47] before in the last million years,
[00:19:49] but now it's telling us all these things
[00:19:52] about our lives and our personalities
[00:19:54] and society's lives that were secret before.
[00:19:58] And all this is gonna be more and more revealed.
[00:20:00] Yeah, we had, it seems like we may have had
[00:20:03] a very small window in human history
[00:20:05] of the last 300,000 years where privacy was a thing.
[00:20:10] And, you know, centuries ago,
[00:20:13] they didn't have corridors or hallways
[00:20:15] or anything in a house.
[00:20:16] That's a new concept.
[00:20:17] So it was just like a room
[00:20:19] and everyone sleeps in that room.
[00:20:20] So the parents having sex,
[00:20:22] that's just in front of the kids.
[00:20:24] The kids go into the toilet,
[00:20:25] everyone's hearing and smelling that.
[00:20:27] Like privacy just wasn't a thing.
[00:20:29] That's a relatively new concept for us.
[00:20:32] And we've managed, it's brilliant and wonderful,
[00:20:35] although maybe you could argue it makes us weaker
[00:20:37] because we're just,
[00:20:38] we're so used to having all of our space.
[00:20:40] And then as soon as that's encroached upon, we're upset.
[00:20:43] But yeah, that seems to be dying quite quickly
[00:20:46] because of all of these things.
[00:20:47] The death of privacy, the sex toys are everywhere.
[00:20:50] The Alexas are everywhere.
[00:20:52] And one of the segments in the book was about this guy,
[00:20:55] Babis, I'm not gonna guess his very long Greek name.
[00:20:58] I don't know how to pronounce it.
[00:21:00] But he had this relationship
[00:21:03] with a British woman called Caroline, I believe it was,
[00:21:07] in Greece.
[00:21:08] And one day, police get called.
[00:21:11] And it turns out Babis is like,
[00:21:13] he's got like a thing over his mouth,
[00:21:15] he's on the floor all tied up.
[00:21:16] And she, the wife, and the dog are dead.
[00:21:21] And there's the baby, I think the baby survived.
[00:21:24] And it took like, you know, there were months of Babis,
[00:21:27] this Greek guy in front of the press,
[00:21:29] just crying, going to the grave every day, really sad,
[00:21:33] working with the authorities to find
[00:21:35] what he said were Albanian murderers
[00:21:37] who had come over and stolen their money
[00:21:39] and tied them up and all of this stuff.
[00:21:41] And I'm sure most people,
[00:21:42] when I started telling the story have already guessed,
[00:21:44] I mean, Babis was the killer.
[00:21:45] Or maybe they wouldn't have guessed that.
[00:21:46] But Babis was the killer himself, right?
[00:21:48] And the way they got him was like,
[00:21:53] after the fact, so he had tied himself up,
[00:21:55] he'd killed their dog as well
[00:21:57] because he wanted to make it look like
[00:21:58] these Albanian gangsters had come and killed his wife,
[00:22:00] but he killed his wife.
[00:22:02] They got him, you know, he had like disabled his CCTV,
[00:22:05] he'd done loads of clever different things
[00:22:07] with the technology to make sure that it wasn't shown.
[00:22:09] And she was wearing like a heart thing on her watch
[00:22:13] that he had just neglected to think about.
[00:22:15] And that stopped beating much earlier
[00:22:19] than he said it had happened.
[00:22:21] So that started to cast doubts on some of the story.
[00:22:24] And then they looked at his phone on his health settings
[00:22:26] and it showed that he was doing lots of steps
[00:22:28] after a certain hour,
[00:22:30] because he was moving up and down
[00:22:31] when he said he had been tied up.
[00:22:32] So that's how they got him.
[00:22:33] Like there are so many things
[00:22:34] you think you've disabled all of it,
[00:22:36] and there's more that knows
[00:22:38] and can tell everyone what you're doing all the time.
[00:22:41] It's sort of like the other aspect of AI,
[00:22:43] which is, you know, you see all these videos now
[00:22:46] where Joe Biden says something
[00:22:48] or Donald Trump says anything
[00:22:49] or LeBron James says something,
[00:22:51] and it's not what they said,
[00:22:53] but AI can make the face look like they said something
[00:22:55] and it can mimic the voice.
[00:22:57] And so now clearly there's no way to really know
[00:23:00] or soon it'll be the case that there's no way to know
[00:23:02] what's real, what's not.
[00:23:04] I wonder if we have to just take a different view of,
[00:23:07] we just have to say, look, we don't really know what's,
[00:23:09] we're never again gonna know what's real and what's not.
[00:23:11] Like we don't know what's a secret,
[00:23:13] what's the truth, what's a lie.
[00:23:15] We kind of just have to give up on that fact.
[00:23:18] You would, I feel like so far,
[00:23:21] and I don't know if this will always happen,
[00:23:22] capitalism, I'm not some like mad pro-capitalist guy,
[00:23:25] but so far it seems to always find a solution
[00:23:28] for the problems.
[00:23:29] It causes its own problem and then somehow solves it.
[00:23:32] So I wonder if, you know, it might,
[00:23:37] it's caused this problem that now we're not gonna be able
[00:23:39] to know what's real and what isn't.
[00:23:41] Just from like science and technology,
[00:23:42] everything's growing and we're building these mad things
[00:23:44] and it's gotten out of control.
[00:23:46] Well, those same clever people, I'm hoping,
[00:23:49] will find ways to show, hey, this is genuine
[00:23:52] and this is not.
[00:23:53] And who knows if they're gonna be,
[00:23:54] I mean, I've heard murmurings of that,
[00:23:56] ways to sort of imprint something into the,
[00:23:59] maybe the DNA of the video to say,
[00:24:01] this is a real video that hasn't been touched by deepfake.
[00:24:03] I mean, I'm extremely worried.
[00:24:04] I mean, everyone is worried about losing their jobs to AI.
[00:24:07] And I don't know if you've seen,
[00:24:08] have you seen those like 24-hour news stations
[00:24:10] full of just fake people?
[00:24:12] No.
[00:24:13] Oh man, because I, you know, every single person,
[00:24:16] I'm sure you've found this as well,
[00:24:17] is like, it's mad, like AI is gonna take everyone's jobs,
[00:24:20] not mine though, because my job is so important
[00:24:22] and special and different.
[00:24:23] And so I, you know, and I suppose I felt like that
[00:24:25] as a YouTuber, because I've now, since I last,
[00:24:28] I guess I was a podcaster and I think I'm now a YouTuber.
[00:24:31] I thought, well, how can they do that?
[00:24:32] And now I've seen these news channels
[00:24:34] and it's people that look and sound just like me and you.
[00:24:37] They look like they're in a real studio
[00:24:39] and they're doing 24-hour news.
[00:24:40] They've got inflection in their voice.
[00:24:42] They're going like, hey, over to you, John.
[00:24:44] What do you think is going on?
[00:24:45] Well, it's a bit crazy what's happening up here.
[00:24:47] Oh, you're, you know, you're American, John.
[00:24:48] Yeah, I am.
[00:24:49] And that's going on.
[00:24:51] And why couldn't they just have,
[00:24:53] and once you can have one or two like AI YouTubers,
[00:24:56] you can have an infinite number.
[00:24:57] So there could be millions of people all putting out
[00:25:01] their own AIs.
[00:25:02] Well, how the hell am I going to compete with this like AI
[00:25:05] that is tuned to say the most perfect things
[00:25:08] at the most perfect times
[00:25:10] to get as much interaction as possible?
[00:25:12] The only thing I can think is
[00:25:13] if there is something in it to show,
[00:25:15] no, no, but this is an AI
[00:25:17] and you hope that people still want to watch real people,
[00:25:19] but they might be able to get away with it
[00:25:21] without us knowing who's AI and who's not.
[00:25:24] We know people don't really care that much
[00:25:25] about watching real people because for instance,
[00:25:28] they watch cartoons and anime and so on.
[00:25:31] And they're just as happy doing that for entertainment
[00:25:34] as they are watching real people.
[00:25:35] But you're a writer and a journalist
[00:25:38] and like an investigative type of journalist.
[00:25:40] So like, let's say I was building a media organization
[00:25:44] and let's say it was a financial news service.
[00:25:46] I would fire all the financial reporters
[00:25:48] and replace them with AI
[00:25:49] because AI could do probably a better job
[00:25:51] like of reporting Microsoft's earnings,
[00:25:53] probably a better job
[00:25:54] because they know past earnings of Microsoft.
[00:25:57] They can say, oh, it's similar to how Oracle had a dip
[00:25:59] because of this, this, this.
[00:26:01] So AI is going to be a better basic financial journalist.
[00:26:04] And for celebrity gossip too, I would fire everybody.
[00:26:08] Oh, Britney Spears had a weird video the other day,
[00:26:11] the same day her, some legal case was coming up.
[00:26:14] Like AI could do that better than humans,
[00:26:16] but look at what you do.
[00:26:18] In this book, for instance,
[00:26:19] you have spent time building up a relationship
[00:26:22] with Amanda Knox, who famously, or infamously,
[00:26:26] had this case in Italy, this murder case in Italy
[00:26:29] that had the whole world gripped, you know,
[00:26:31] it was a gripping story and you interviewed her
[00:26:34] and you have your own unique take on interviews.
[00:26:36] I think for true creatives, AI will be like a collaborator
[00:26:41] but you're always going to be able to enhance
[00:26:43] what the AI does.
[00:26:45] Even if the AI is great, even if the AI is great,
[00:26:48] the AI is not going to interview Amanda Knox
[00:26:51] and build a relationship with her
[00:26:52] and, you know, get the answers you got.
[00:26:55] And I'm just using that as an example.
[00:26:57] The whole book is filled with stories and examples
[00:26:59] and people who opened up to you about secrets,
[00:27:03] now maybe they would open up the same way to an AI,
[00:27:06] but you're going to have your personal interpretation
[00:27:08] and you're always going to be able to take
[00:27:12] what an AI produces and enhance it.
[00:27:14] But for a video, you can't enhance it, it's already done.
[00:27:18] Like it's just, here's the video, you know,
[00:27:22] let's put me in there now to make it better.
[00:27:24] That's not going to be a role that happens.
[00:27:27] So I think for some things,
[00:27:29] AI is going to be a great collaborator
[00:27:30] and make your job better.
[00:27:32] But for other things, yeah, the job's going to be fired.
[00:27:36] But, you know, I do think, you know,
[00:27:40] AI and data are going to play a big role,
[00:27:42] have a big effect on the role of secrets.
[00:27:45] Just like, you talk a lot about cults in the book.
[00:27:49] I think in general, the internet
[00:27:51] and the more widespread spread of knowledge
[00:27:55] has probably reduced, I would think,
[00:27:56] has reduced the influence of cults on people.
[00:27:59] But maybe I'm wrong.
[00:28:00] You talk a lot about cults in this book.
[00:28:02] Yeah, yeah, well, I used to do a lot of YouTubing on cults
[00:28:06] and now I sort of do more culture war stuff
[00:28:09] because it courts controversy, I suppose.
[00:28:11] I suppose cults have become less controversial
[00:28:13] just because everyone's on board with that
[00:28:15] except for Tom Cruise and John Travolta,
[00:28:17] like everyone's anti-cult except for those two guys.
[00:28:20] And even John Travolta, I think secretly,
[00:28:22] he's like, let me out, you know?
[00:28:24] Yeah.
[00:28:25] So the cult stuff I find fascinating
[00:28:28] and I think you're right.
[00:28:30] There were two big, I mean, Scientology is the one
[00:28:31] everyone likes to know about
[00:28:32] because Scientology is the celebrity one
[00:28:35] and it was really big at one point
[00:28:37] and they still claim to have millions
[00:28:39] and millions of members.
[00:28:41] It's thought that they actually have 20 to 30,000 now
[00:28:44] which is smaller than like our numbers of listeners.
[00:28:47] Like we have larger cults now in that sense
[00:28:49] than Scientology.
[00:28:52] So they had a lot of trouble.
[00:28:54] I mean, the first thing that happened was WikiLeaks.
[00:28:55] They got loads of stuff about the origin story
[00:28:58] of Scientology out there
[00:28:59] and then South Park famously put out this episode
[00:29:02] with Tom Cruise and John Travolta
[00:29:06] all about the secrets of Scientology
[00:29:07] and what they actually believe.
[00:29:09] And that's one of the things I think people were surprised by
[00:29:11] is if you look at cults,
[00:29:13] people think that cults only tell their truth
[00:29:15] to their members and they keep other truths out.
[00:29:18] But not only do they actually,
[00:29:20] I mean, they do keep other truths out
[00:29:21] but they also don't reveal their own doctrine
[00:29:24] to their members.
[00:29:26] And that's something that religions used to do.
[00:29:28] You go back to Christianity years ago,
[00:29:29] it was only available in Latin or Greek or whatever.
[00:29:33] They didn't want the populace
[00:29:35] to actually be able to understand what they were worshipping.
[00:29:37] That's a way to keep people,
[00:29:39] if you keep them not knowing all the time,
[00:29:40] they're just gonna be obedient
[00:29:41] because they don't know what's in there.
[00:29:42] That's a real way to use secrets to keep power over people.
[00:29:46] Scientology, again, people think that Scientologists,
[00:29:49] you know, they're like,
[00:29:50] how can you believe in this stuff?
[00:29:52] Galactic warlord Xenu has taken, I don't know,
[00:29:55] like hundreds of billions of people from his planet
[00:29:58] and put them onto Earth,
[00:30:00] which was known as Teegeeack back then,
[00:30:02] and in Scientology speak,
[00:30:04] and put them into volcanoes to kill them.
[00:30:07] And their souls came out,
[00:30:08] they're called operating thetans.
[00:30:10] And the operating thetans were put into cinemas.
[00:30:12] This is actually what they believe.
[00:30:14] They put into cinemas,
[00:30:15] and the cinemas taught them fake things
[00:30:17] called Christianity and Judaism and Islam.
[00:30:20] And the real thing is Scientology.
[00:30:22] So who could possibly believe that?
[00:30:24] But most Scientologists don't even know about that.
[00:30:27] You don't learn about that until you go up these levels
[00:30:29] called operating thetans.
[00:30:31] One, two, and operating thetan three,
[00:30:34] or operating thetan level three
[00:30:35] is when you learn about Lord Xenu.
[00:30:38] By that point,
[00:30:39] you've already spent hundreds and hundreds
[00:30:40] of thousands of dollars.
[00:30:42] You're in so deep.
[00:30:43] You've cut off all of your family and friends.
[00:30:45] So there's the boiling frog concept
[00:30:48] where it's happened so slowly
[00:30:49] that you just have to sort of go along
[00:30:51] with whatever they're telling you by this point.
[00:30:53] And the sunk cost fallacy as well.
[00:30:55] You've put so much in,
[00:30:56] you've got no choice.
[00:30:57] You've lost all your friends and family.
[00:30:59] Just keep on going.
[00:31:01] So it's amazing how,
[00:31:02] and even then,
[00:31:03] every level up you go,
[00:31:04] the more money you give,
[00:31:05] the more secrets you are told,
[00:31:07] the more special you feel as well,
[00:31:09] but also the more in their control you are.
[00:31:12] Yeah, it's interesting.
[00:31:16] Yeah, you're right.
[00:31:17] All these cognitive biases you mentioned,
[00:31:19] that you just said,
[00:31:19] like sunk cost fallacy and so on.
[00:31:21] And when I was reading that in your book,
[00:31:23] I was thinking there were two other things at play
[00:31:26] which remind me of Robert Cialdini's book, Influence.
[00:31:29] One is consistency.
[00:31:30] So with Scientology,
[00:31:32] or NXIVM is a good example.
[00:31:34] NXIVM was a fascinating example
[00:31:35] because as you point out in the book,
[00:31:37] there's not the whole kind of science fiction backstory.
[00:31:40] And it was sort of unique in famous cults because of this,
[00:31:44] but it starts off just as a self-help thing.
[00:31:47] Like, hey, come to this meeting
[00:31:49] and we're going to talk about
[00:31:51] how you could feel better in life
[00:31:52] and perform better and be more productive.
[00:31:55] And so it's almost like,
[00:31:58] there's like this consistency thing happening
[00:32:00] which Robert Cialdini talks about.
[00:32:02] Get people to say yes.
[00:32:03] Like, hey, don't you want to know
[00:32:04] how to live a better life?
[00:32:05] Oh, yes, of course you do.
[00:32:07] So you kind of bring them in with consistency.
[00:32:09] And then as they trust you
[00:32:10] and as they get used to saying yes to you,
[00:32:12] they stay consistent with that.
[00:32:14] And now they say yes to more and more extreme things like,
[00:32:17] hey, do you want to go to this retreat?
[00:32:19] It's only $10,000.
[00:32:21] And oh yes, of course I do.
[00:32:24] And I remember one time going to,
[00:32:27] what was the thing that came after Est?
[00:32:31] It was cult-like.
[00:32:33] And I remember going to the first meeting.
[00:32:34] I went because there was this girl I liked
[00:32:36] who said, oh, we should go to this.
[00:32:38] And then afterwards, two people like double teamed me
[00:32:42] and said, hey, we're going to have a retreat next weekend.
[00:32:46] Do you want to go?
[00:32:47] And I'm like, well, I have to check my schedule.
[00:32:51] And then they really started pouring it on.
[00:32:53] Like, oh, are you not ready?
[00:32:55] Are you not a committed kind of person?
[00:32:59] Like, do you always doubt yourself and blah, blah, blah.
[00:33:03] So there was like all these tricks they were using
[00:33:05] to try to get me to say yes, because I was already there.
[00:33:09] Interesting.
[00:33:10] NXIVM is interesting because I knew someone
[00:33:13] who was very high up in NXIVM,
[00:33:16] and I knew her when she was still in NXIVM
[00:33:18] and she would talk to me.
[00:33:20] This is when NXIVM wasn't like a common word.
[00:33:24] Keith Raniere was still out there doing his thing.
[00:33:26] And she would tell me stuff like,
[00:33:29] why are people persecuting us?
[00:33:30] We're just trying to help people.
[00:33:32] And, but of course that was the first step
[00:33:35] towards a much more, you know, a darker side to it.
[00:33:38] And so I wonder if that's like
[00:33:40] how people get sucked into these things.
[00:33:43] The consistency aspects.
[00:33:44] Yeah, no, absolutely.
[00:33:46] There's probably so many aspects to it.
[00:33:47] Yeah, you start saying yes,
[00:33:49] and if they've just gotten you.
[00:33:50] It's interesting you went with a girl as well,
[00:33:52] because that's, a lot of people get into cults
[00:33:54] because of that as well.
[00:33:55] I mean, Steve Hassan is known
[00:33:56] as sort of the cult godfather or something
[00:33:59] for writing about the dangers of cults.
[00:34:03] And he wrote this thing called the BITE model, B-I-T-E.
[00:34:06] And it's like one of those things
[00:34:07] where it's a list of this is what a cult is kind of thing.
[00:34:10] But he, when he was younger,
[00:34:12] he was like, I don't know, in his 20s.
[00:34:13] He was at university.
[00:34:14] He'd just been broken up with by his girlfriend
[00:34:16] and he was in the library
[00:34:17] and three beautiful women approached him.
[00:34:20] And he was just like, at that point,
[00:34:21] I just lost hold of all of my senses
[00:34:24] and was very vulnerable.
[00:34:25] And these beautiful women were saying,
[00:34:26] hey, you got to come along
[00:34:27] to this thing called the Moonies,
[00:34:29] which was Reverend Moon in South Korea
[00:34:32] who was thought to be the second coming of Christ.
[00:34:34] I don't think that everyone is,
[00:34:37] I mean, there's a line in the sort of anti-cult community
[00:34:40] on YouTube and all these things.
[00:34:42] They will sort of, I think they're scared to victim blame.
[00:34:45] And I understand that urge
[00:34:47] that you don't want to blame
[00:34:48] the people who've fallen into cults.
[00:34:50] And I think we shouldn't blame them.
[00:34:51] We should just be honest about what's going on.
[00:34:54] And most of us, I think most people listening,
[00:34:56] if they were told,
[00:34:57] no matter how beautiful these three women were,
[00:35:00] they were told the second coming of Christ
[00:35:01] is a South Korean guy
[00:35:03] and you should come and join this thing.
[00:35:05] And most of us would say,
[00:35:07] no, you're alright, thank you.
[00:35:09] So I don't know.
[00:35:11] But yeah, I agree with what you're saying
[00:35:13] with the consistency aspect.
[00:35:14] Once he said yes once or twice,
[00:35:16] the guy's in.
[00:36:35] Think about the guys you wrote about with Scientology.
[00:36:37] It's the one who like joined twice even.
[00:36:39] And you know, and again, most people, like you say,
[00:36:43] don't know about the Lord Xenu stuff and all that
[00:36:46] until they're level five or whatever in Scientology.
[00:36:49] And so people, and it's like you said too,
[00:36:53] the smarter you are, the more likely it is
[00:36:55] you could believe in these things.
[00:36:57] You know, I think it's true too.
[00:36:59] The higher IQ you are,
[00:37:01] the more likely it is you could be hypnotized.
[00:37:03] Like you start to tell a story of this guy,
[00:37:05] I think his name's Paul Robeson,
[00:37:07] wins a Nobel Prize and is somehow convinced
[00:37:10] to bring cocaine over the borders of Argentina.
[00:37:13] Like how did that guy get convinced to do that?
[00:37:16] Nobel Prize winner.
[00:37:17] It's so, it's amazing.
[00:37:18] And we all think we're not gonna get duped by stuff.
[00:37:21] And I had, I nearly got duped just the other day.
[00:37:24] I'm trying to make myself a Wikipedia page,
[00:37:26] which is an embarrassing thing to have to admit,
[00:37:28] but I know that it will help me for my career.
[00:37:30] So then screw it, we'll try and get ahead.
[00:37:32] And I was like, why don't I have a Wikipedia?
[00:37:33] So I tried to make it.
[00:37:34] And anyone who's tried to do that kind of thing
[00:37:36] on Wikipedia will know it's like impossible
[00:37:38] because there's a million rules to follow.
[00:37:40] There's a million rules and there's nasty people.
[00:37:44] Not all of them, I wanna add,
[00:37:46] but some who are just in control of these things
[00:38:49] and have an undue amount of power
[00:37:51] because it's this decentralized thing.
[00:37:53] Yes, 100%.
[00:37:54] And so I'm sitting there going, what's, you know,
[00:37:56] anyway, I put a draft in.
[00:37:57] So now there's a draft.
[00:37:58] It says it could take up to three months.
[00:38:00] Someone on LinkedIn sent me a message
[00:38:02] and it's a LinkedIn person, works for Wikipedia,
[00:38:05] does all this stuff, blah, blah, blah, blah, blah.
[00:38:08] I've seen you've got a draft in.
[00:38:09] There's just a few little things you need to change
[00:38:11] and then we can put it up.
[00:38:12] Saying we, as if he's at Wikipedia, all this stuff.
[00:38:14] And that's really tempting.
[00:38:17] And he said, can you just, here's my email address.
[00:38:19] And his email address was Jonathan at wikiadmin.org
[00:38:23] or whatever it was.
[00:38:24] Oh my God.
[00:38:25] Yeah, like that's really compelling, isn't it?
[00:38:26] If you're me and you want to...
[00:38:28] So I'm going, oh, thank you.
[00:38:29] So I sent the email and I'm just,
[00:38:32] because I thought, you know, I don't know.
[00:38:34] It didn't even cross my mind this was fake
[00:38:35] because it was so obviously real.
[00:38:37] Although, you know, one of the things that actually got me
[00:38:39] because I did look again.
[00:38:40] So I emailed saying, hi, it's me.
[00:38:42] That'd be great, thank you so much.
[00:38:43] And then he replied saying, yeah, yeah,
[00:38:45] just give me a phone number
[00:38:47] and I'll just give you a quick call.
[00:38:48] And then I thought, okay,
[00:38:49] like you could have just messaged me
[00:38:51] what the problems were on LinkedIn.
[00:38:53] You could have messaged me in the email
[00:38:55] but now you're asking for more information.
[00:38:57] You want my phone number as well.
[00:38:58] What do you want my phone number for?
[00:39:00] And then I had another look at his LinkedIn
[00:39:02] and it said he speaks Urdu.
[00:39:04] And I know that's, I don't,
[00:39:05] there's nothing against people who speak Urdu
[00:39:06] which is one of the many languages of India.
[00:39:09] But his name was like Jonathan.
[00:39:11] And I thought, hang on,
[00:39:12] there are very few people called Jonathan who speak Urdu.
[00:39:15] Something's not right here.
[00:39:16] So I didn't sort of follow up with any of that.
[00:39:20] But they've just done a good job.
[00:39:21] So we could all be gotten, but very few of us.
[00:39:25] Okay, here's the thing with the cult people.
[00:39:27] When you join a cult,
[00:39:28] I think, and I get a lot of flack for saying this,
[00:39:31] I think that part of you wants to be above other people.
[00:39:35] And I think that's with the victim blaming thing,
[00:39:37] no one wants to admit that bit.
[00:39:39] They just say, hey, clever people can be susceptible.
[00:39:41] And I think, yeah, clever people
[00:39:43] who want to climb a hierarchy
[00:39:45] and to be able to be in control of other people
[00:39:47] and to be special.
[00:39:48] So that's what I think.
[00:39:49] I think that's very true.
[00:39:51] I think the hierarchy thing is critical.
[00:39:53] Like when you look at cults that have,
[00:39:55] let's say spun out of Scientology,
[00:39:57] meaning high level people who are at Scientology
[00:39:59] who ended up starting their own things,
[00:40:01] it always has that, hey, there's 12 levels.
[00:40:04] Here's the initiation for the first level,
[00:40:06] then second and third.
[00:40:07] And it's always more and more money
[00:40:09] to get to like higher and higher levels.
[00:40:11] And that works.
[00:40:13] You look at like, you know, I don't want to,
[00:40:16] not everything with a hierarchy is bad,
[00:40:18] but you look at like sports.
[00:40:19] Okay, you're either in the minor leagues or the major leagues
[00:40:23] or you're in this division or the higher division.
[00:40:26] And people judge their self-worth
[00:40:31] by where they are in the hierarchy that they choose.
[00:40:33] You can't be in every hierarchy,
[00:40:35] but oh, let's say you're a standup comedian.
[00:40:37] I saw this in the standup comedy world.
[00:40:39] Are you performing at small clubs, bigger clubs,
[00:40:43] you know, theaters, arenas, stadiums?
[00:40:47] Like there's a whole hierarchy there.
[00:40:49] And I think that's how society is more or less
[00:40:52] formed that way.
[00:40:53] So it makes sense that cults, you know,
[00:40:56] oh, here's a hierarchy that I actually could rise in
[00:40:58] and that they love me.
[00:40:59] They're love bombing me
[00:41:00] and they want me to rise in this hierarchy
[00:41:02] as opposed to most other hierarchies.
[00:41:04] That's it, the status game.
[00:41:07] Not only that, like you want to have power over others,
[00:41:10] not just people in the hierarchy,
[00:41:11] but hey, if you join this cult,
[00:41:14] you're going to have powers or some extra knowledge of life
[00:41:17] that you could use to, you know, make money or whatever.
[00:41:22] Yeah, you know what's proof of that?
[00:41:24] I just want to say about what you were saying
[00:41:25] about that need to climb ranks and be special
[00:41:28] and all of that stuff.
[00:41:29] Because I did a lot of YouTube
[00:41:31] with all the ex-Scientologists,
[00:41:33] I'm part of their sort of ex-Scientology community.
[00:41:34] And as you can probably guess,
[00:41:36] it became quite culty itself and loads of infighting,
[00:41:39] loads of, they'd be telling these stories
[00:41:41] about their time in Scientology
[00:41:42] and acting a little bit as though like,
[00:41:44] God, it was so bad.
[00:41:45] But you could tell there was a little bit of pride
[00:41:47] about one of them was higher up than the other
[00:41:49] while they were in the cult.
[00:41:50] And they've all fallen out
[00:41:52] and it's been an absolute disaster
[00:41:53] all the ex-Scientologists.
[00:41:54] And I think it's just personality types.
[00:41:56] You've just got the personality types.
[00:41:58] I mean, to be fair, many of them were born into Scientology,
[00:42:00] but the ones who joined,
[00:42:02] these are people who want to climb up
[00:42:04] and be ahead of people.
[00:42:06] Yeah.
[00:42:07] And so I'm just curious,
[00:42:08] like you include the stuff about cults
[00:42:11] in a book called The Psychology of Secrets.
[00:42:15] I guess what's interesting there is that
[00:42:17] even the members of the cult don't know all the secrets
[00:42:19] and revealing the secrets
[00:42:22] is a way of luring people more and more in.
[00:42:24] So secrets have this hold on us.
[00:42:26] They're like a magnet.
[00:42:27] Like if you think about every movie
[00:42:29] and novel in the world,
[00:42:31] okay, sometimes people structurally describe
[00:42:34] a movie or a novel as
[00:42:36] the arc of the hero,
[00:42:38] Joseph Campbell's hero's journey.
[00:42:39] But another way to describe the structure of a story
[00:42:42] is that there's a giant secret in the middle
[00:42:46] and it's gift wrapped with many layers of wrapping
[00:42:49] and each wrapping gets you further and further
[00:42:53] into the story until finally the secret is revealed.
[00:42:56] Like that's all a story is.
[00:42:58] You delay it as much as possible.
[00:42:59] But if Scientology gave away too much of their secrets
[00:43:02] too early on, it's like, well, what do I do now?
[00:43:04] There's no point.
[00:43:05] So yeah, that's absolutely true.
[00:43:07] And it's obviously not just Scientology.
[00:43:08] This is something that authoritarian cults
[00:43:11] have done for a long time
[00:43:12] is to use secrets against one another.
[00:43:14] If you were high up in the Stasi,
[00:43:16] you might get a little bit of extra information
[00:43:18] about your neighbours,
[00:43:19] and who wouldn't want that,
[00:43:20] rather than be the neighbour who isn't getting information
[00:43:23] while information about them
[00:43:24] is going to everybody else.
[00:43:26] So this use of secrets is a way to suppress people.
[00:43:31] I mean, it really is.
[00:43:33] And I think if you...
[00:43:36] Sorry, I lost my train of thought.
[00:43:38] Shit, what was I saying?
[00:43:40] So the secrets is like kind of withholding secrets
[00:43:43] and then giving secrets is almost like a currency of power,
[00:43:46] like in the authoritarian society sense.
[00:43:49] Yeah, and to speak to your movie point,
[00:43:50] this is what I was going to say
[00:43:51] to your point about movies
[00:43:52] and the stories that we tell one another,
[00:43:55] it's definitely true
[00:43:56] that if something is supposed to be a secret
[00:43:57] or if part of it is kept secret,
[00:43:59] it really elevates that to the sublime,
[00:44:01] the mundane to the sublime.
[00:44:03] I give an example of my dad.
[00:44:06] One of the earliest memories I have as a child
[00:44:08] walking in on my dad
[00:44:09] and he's got like silver foil in his hair
[00:44:12] and he's dyeing his hair basically,
[00:44:14] getting highlights in there.
[00:44:15] You know, he was quite vain.
[00:44:16] He had this long hair.
[00:44:18] And nobody would remember that
[00:44:20] like as an early childhood memory
[00:44:22] except that he went,
[00:44:24] no, don't come in, like no, no, no, close the door.
[00:44:27] And I was like, oh shit,
[00:44:29] I've like stumbled in on something secret here
[00:44:31] and that has just like imprinted itself
[00:44:33] on my psyche forever
[00:44:36] and it's just such a boring, stupid thing.
[00:44:38] It's just him getting hair dye
[00:44:39] but you make something a secret,
[00:44:41] it just elevates it to this incredible level
[00:44:43] and I think good storytellers,
[00:44:46] good authoritarian leaders and cult leaders,
[00:44:48] they know exactly how to use those secrets
[00:44:50] to sort of ensnare you.
[00:44:53] Yeah, and like, you know, think about it.
[00:44:55] If he hadn't yelled out like,
[00:44:57] you know, get out of here or whatever,
[00:45:00] would you have remembered it?
[00:45:01] No, not in a million years.
[00:45:03] So it's interesting too how
[00:45:07] it's not the secret itself,
[00:45:08] sometimes it's the wrapping around the secret.
[00:45:12] Like that's why the unraveling, done artfully,
[00:45:15] is what makes a story, an authoritarian regime,
[00:45:18] or a cult.
[00:45:19] And you know, you mentioned early on in the book,
[00:45:21] something like 97% of people in a survey
[00:45:23] say they have some deep, dark secret
[00:45:26] and people communicate through secrets.
[00:45:30] If I tell you a secret,
[00:45:31] let's say we're hanging out,
[00:45:32] I tell you a secret,
[00:45:33] that's almost a way of bringing two people closer.
[00:45:36] Oh, this person's sharing a secret with me,
[00:45:39] now I'm going to share a secret with them
[00:45:40] and it's sort of,
[00:45:42] that's often a way of two people becoming closer
[00:45:46] and that could be used in a manipulative way too
[00:45:48] if you know that's why you're revealing the secret.
[00:45:50] And so it's interesting how fundamental
[00:45:55] the psychology of secrecy is to all communication really.
[00:46:00] Absolutely.
[00:46:01] And I think also,
[00:46:02] I try to think about what a secret actually is
[00:46:05] and how it differs from something that's just private
[00:46:08] or from a lie.
[00:46:10] Is a lie a secret?
[00:46:11] Is it not?
[00:46:12] Where do these things overlap?
[00:46:13] And it's interesting,
[00:46:14] I love thinking about the fact that,
[00:46:17] okay, so a secret it seems
[00:46:19] is actually a societal transgression.
[00:46:22] It's you've done something that you want to keep quiet
[00:46:26] that goes against the norms
[00:46:28] of the particular time and place you live in.
[00:46:30] So I don't know,
[00:46:32] if you kept a slave in the 18th century,
[00:46:35] then that might be private,
[00:46:37] but it's no big secret,
[00:46:39] it's just everyone knows that's what you're doing.
[00:46:41] If I had a slave in my house now,
[00:46:44] that would be a huge secret, you know?
[00:46:46] And the only thing that's changed
[00:46:48] is the time and society and whatever,
[00:46:49] like human beings are still human beings.
[00:46:52] And the value of that information has risen, right?
[00:46:55] Because fewer people have slaves,
[00:46:57] one would think as a percentage of society,
[00:46:59] fewer people have slaves living in their basement
[00:47:02] now than in the 1800s.
[00:47:04] And so the value of that piece of information has risen
[00:47:08] because supply has gone down
[00:47:09] and demand has probably gone up for that knowledge.
[00:47:13] The value of that information has gone up to the level
[00:47:15] where you could say this is a secret.
[00:47:17] No, absolutely.
[00:47:18] And then another example is like,
[00:47:21] you might think that how often your parents have sex,
[00:47:23] that's a secret or whatever.
[00:47:26] And it's not really because...
[00:47:28] But that information has no demand.
[00:47:30] There's no demand for that information.
[00:47:32] So what's the difference between that secret
[00:47:33] and the slave in the basement?
[00:47:37] Well, that secret goes along with societal expectations,
[00:47:40] that's all it is.
[00:47:41] And funnily enough, if a couple doesn't have sex,
[00:47:45] so the less salacious option is just,
[00:47:47] a couple doesn't have sex, right?
[00:47:48] Boring. What a boring piece of information.
[00:47:51] But that is a secret.
[00:47:52] That is a secret because society expects
[00:47:55] that a couple has sex.
[00:47:55] When the sex dries up, that becomes a secret.
[00:47:58] Them having sex is just private.
[00:48:00] So anything that's private,
[00:48:02] if you make that something that would cause ostracization
[00:48:06] were it aired, if anyone knew about it,
[00:48:09] or if it was something that goes against societal expectations,
[00:48:12] that's when it becomes a secret.
[00:48:14] And, you know, that has really important...
[00:48:20] I think it's important because the way
[00:48:23] that we portray ourselves on Instagram,
[00:48:25] for example, social media,
[00:48:26] and I know this is an overspoken about topic,
[00:48:28] but we're these perfect pristine people
[00:48:31] and all of these kinds of things.
[00:48:33] It means that when you do something
[00:48:34] that doesn't go along with those ridiculous expectations,
[00:48:37] you feel a lot of shame
[00:48:38] because those are the societal expectations.
[00:48:41] They don't align with what we're all actually doing.
[00:48:43] And that's why in my survey,
[00:48:45] I did a survey to find out how much people pick their noses
[00:48:47] because I just find that such a ridiculous, silly thing
[00:48:51] that, you know, I told a story about a friend of mine
[00:48:54] who always remembers, as one of the first memories he has,
[00:48:57] when he was like 10, his mom's taking him to school
[00:48:59] and he's got his finger right up his nose,
[00:49:02] like really having a pick.
[00:49:03] And then he suddenly notices
[00:49:05] that outside the window, they're in traffic.
[00:49:08] There's these three girls about his age
[00:49:09] who've been walking alongside the car,
[00:49:11] looking at him the whole time and laughing with each other
[00:49:14] and pointing and stuff.
[00:49:15] And it's just what a traumatic thing.
[00:49:17] And it's so bizarre because it's something that everyone,
[00:49:20] according to my survey, does.
[00:49:21] Most people feel they do it way too much.
[00:49:24] And yet we have to pretend it doesn't happen.
[00:49:26] And that makes it into a secret
[00:49:27] rather than just a private matter.
[00:49:30] There's one theory which we just talked about,
[00:49:31] which is what's the value of this information?
[00:49:34] The other is what's the penalty for this information?
[00:49:37] So, and that's something that's a little bit personal.
[00:49:39] For him, the penalty of these girls
[00:49:42] seeing him pick his nose was pretty high.
[00:49:45] Even though it's girls he's never gonna see again,
[00:49:47] he never has to think about this in the future.
[00:49:49] But somehow in his head right that moment,
[00:49:50] the penalty was so high,
[00:49:52] he remembers it for the rest of his life.
[00:49:54] And it's the same thing
[00:49:58] for your dad: his self-conception of the penalty
[00:50:02] of his secret was so high
[00:50:04] that that's what made it kind of bigger than it was.
[00:50:08] So that's like a self-perception kind of thing.
[00:50:11] And also things like insider information in the stock market,
[00:50:15] there's a legal penalty.
[00:50:17] So that's why some things are a secret.
[00:50:20] If you have information that can't get out there,
[00:50:22] there's a penalty for that information
[00:50:24] getting out there, a legal penalty.
[00:50:26] As opposed to other things
[00:50:29] where there's no penalty
[00:50:31] and no value, and then it's just private.
[00:50:35] If someone said, James, how often do you have sex?
[00:50:39] Okay, I'm not gonna reply, that's secret.
[00:50:44] I think it's private.
[00:50:44] I think that's private.
[00:50:46] Unless the number is either like zero or like a thousand.
[00:50:51] And then it's like-
[00:50:52] Right, because then there's a penalty.
[00:50:53] There's some ostracism, there's some judgment.
[00:50:55] Yeah, yeah.
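A toy way to make concrete the two-factor model being assembled in this exchange, with the value of a piece of information on one axis and the penalty for its exposure on the other, is a small classifier. This is an illustrative sketch only, not anything from the book; every name, number, and threshold in it is invented.

```python
# Toy model (illustrative assumptions throughout): information reads as a
# "secret" rather than merely "private" when either its value (demand for
# it, scaled by scarcity) or the penalty for exposure (ostracism, legal
# risk) crosses a threshold. None of these numbers come from the book.

from dataclasses import dataclass

@dataclass
class Info:
    description: str
    demand: float   # how much others want to know it (0..1)
    supply: float   # how common/expected the fact is (0..1)
    penalty: float  # social or legal cost if it gets out (0..1)

def classify(info: Info, value_gate: float = 0.5, penalty_gate: float = 0.5) -> str:
    value = info.demand * (1.0 - info.supply)  # scarce and wanted => valuable
    return "secret" if value >= value_gate or info.penalty >= penalty_gate else "private"

for info in [
    Info("couple has a normal sex life", demand=0.1, supply=0.9, penalty=0.1),
    Info("couple has stopped having sex", demand=0.3, supply=0.3, penalty=0.6),
    Info("keeping a slave today", demand=0.9, supply=0.01, penalty=1.0),
]:
    print(f"{info.description}: {classify(info)}")
```

On this sketch, the mundane flips to secret the moment the penalty term rises, which is exactly what the hair-dye and nose-picking stories above illustrate.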
[00:50:56] I think a lot of this is an evolutionary hangover.
[00:50:59] If you look at tribes,
[00:51:00] imagine tribes going about their business,
[00:51:02] doing their thing back in the day,
[00:51:05] and then you did something that you knew
[00:51:07] went against societal expectations.
[00:51:09] So in my friend's case, it's had his finger up his nose.
[00:51:12] That went against society.
[00:51:14] So he suddenly had this pang of utter humiliation.
[00:51:19] And that's a way for your body to tell you,
[00:51:21] you're walking around in these tribes like,
[00:51:22] hey man, you just did something
[00:51:24] that the rest of the tribe don't like,
[00:51:26] and I'm gonna make it feel really bad for you in here
[00:51:29] so that you don't ever do that again
[00:51:31] and don't get kicked out the tribe.
[00:51:33] Right, so that's really interesting because,
[00:51:36] okay, so Darwin had his theory of evolution,
[00:51:39] but then he had this other theory,
[00:51:41] which he wrote about,
[00:51:42] I think it was his theory of sexual selection,
[00:51:44] in which he asked the question, why are artists attractive?
[00:51:49] Why are rock stars attractive?
[00:51:51] And it's because everyone loves a rock star.
[00:51:56] And his theory was that they sort of rose
[00:52:01] beyond evolution in the sense that
[00:52:04] an artist had no real value in society.
[00:52:07] If you painted on the cave walls,
[00:52:09] you're not like hunting and getting food or anything.
[00:52:12] So this idea that you've risen beyond
[00:52:16] the normal expectations of evolution
[00:52:18] and that becomes value.
[00:52:19] Like how could this person be so special
[00:52:21] that he's even beyond evolution?
[00:52:23] And so Darwin's theory was that
[00:52:25] that is sexually rewarded more than even
[00:52:28] being the best hunter of bears.
[00:52:31] And in this case, if you don't care that someone
[00:52:36] saw you picking your nose,
[00:52:39] you're cooler than the people who do care.
[00:52:42] The people who do care are like more nerdy,
[00:52:43] like, oh my God, I picked my nose and they saw,
[00:52:45] but the people who are cool are like,
[00:52:47] screw you, I can do whatever I want.
[00:52:49] They get the girls or the guys or whatever.
[00:52:52] And I noticed, and the reason I bring this up
[00:52:56] is I noticed when I was writing,
[00:52:58] when I switched around 2010,
[00:53:01] I used to write about mostly just stocks and financial stuff.
[00:53:03] And then I started writing about how times
[00:53:05] that I've gone broke and other kind of shameful things
[00:53:09] where things I used to be ashamed about,
[00:53:10] but then I would just write about them openly.
[00:53:12] My audience probably, my writing audience probably 10X
[00:53:16] over what it was before because I was owning,
[00:53:20] I was rising beyond the shame I was supposed to feel
[00:53:24] about going broke.
[00:53:26] I think that's, there's definitely something to that.
[00:53:29] I'm not, the only thing I think is that,
[00:53:32] of course you could do something really embarrassing
[00:53:34] and it would get millions of views,
[00:53:35] but it might not make people think you're cool.
[00:53:39] I think people like to think you're cool
[00:53:41] because you're a great writer
[00:53:42] and because the things that are embarrassing
[00:53:44] are things that happen to everyone all the time
[00:53:45] and people can relate.
[00:53:47] There's also a difference between,
[00:53:48] as you say, you're owning it like that.
[00:53:49] There wasn't much that my friend could have done,
[00:53:51] I don't think, when he had his finger up his nose like that.
[00:53:53] So,
[00:53:55] I think there are some things that cause such revulsion
[00:53:57] in a particular society.
[00:53:59] So it just so happens in Western modern society,
[00:54:02] a finger up a nose is just revulsion.
[00:54:05] And if you try and explain your way out of that,
[00:54:07] the groupthink is so strong,
[00:54:09] you'll see people jump on you.
[00:54:10] We can take a much stronger, more disgusting example.
[00:54:14] There was recently this guy in Spain who's a politician,
[00:54:16] I don't know if the news reached the States,
[00:54:19] and he was found to have been eating shit, basically.
[00:54:25] And he'd put it on some website or I don't know what
[00:54:27] and people had found out that it was him.
[00:54:29] And obviously that's disgusting
[00:54:32] and he quit his job and all of these kinds of things.
[00:54:34] And then a friend of mine, do you know Peter Boghossian?
[00:54:38] Yeah, yeah, he's been on the podcast a bunch of times.
[00:54:43] Oh, great, okay, I love Peter.
[00:54:46] And so Peter tweeted,
[00:54:49] because he's always just going to be that guy.
[00:54:51] He just doesn't care and he owns it.
[00:54:53] And he goes, I'm going to be that guy.
[00:54:54] And he said, okay, the guy did a horrible thing,
[00:54:56] but here's something, why don't we just leave him alone?
[00:54:59] Right?
[00:55:00] And you would think that if it were possible
[00:55:02] to just own stuff that people would have gone,
[00:55:03] good point, Peter. Instead, they absolutely hounded him.
[00:55:07] And I spoke to Peter, he said, I just don't care.
[00:55:09] I don't care.
[00:55:10] Like, fine.
[00:55:11] But I was stupid enough to then tweet like, come on,
[00:55:13] because people were putting under Peter's thing,
[00:55:15] saying obviously you like eating shit as well.
[00:55:18] That's how the groupthink works.
[00:55:19] You can imagine a tribe.
[00:55:20] I mean, we know about witch hunting.
[00:55:22] You can imagine a tribe back in the day.
[00:55:23] You do one thing that you're not supposed to do,
[00:55:26] everyone's going to jump on you
[00:55:27] because they're scared that they're going to get accused
[00:55:29] or whatever it might be.
[00:55:30] I find it so cowardly and contemptible, but it's just life.
[00:55:33] And so loads of people saying that.
[00:55:34] So I just replied saying, guys, like, you know,
[00:55:36] you're missing the point here,
[00:55:39] he's just saying a guy did a disgusting thing.
[00:55:41] He's lost his job.
[00:55:42] He's been embarrassed.
[00:55:44] Job done.
[00:55:44] Let's move on.
[00:55:45] And as you can imagine, I got loads of tweets going,
[00:55:47] oh, obviously Andrew and Peter both like eating shit
[00:55:50] and all this stuff.
[00:55:51] So I think that idea of owning it is a nice idea
[00:55:53] and can work in certain circumstances.
[00:55:55] But if you sort of screw up in any way,
[00:55:57] that tribe, that cultish mindset of the human being
[00:56:00] is there to punish you quickly.
[00:56:03] I think you're right.
[00:56:04] I think because like in my,
[00:56:05] in the situation I described about myself,
[00:56:08] it was okay.
[00:56:11] I still got all the trolls and the haters.
[00:56:13] That was interesting to me.
[00:56:14] But the reason why I 10xed the audience
[00:56:17] and got many more people who liked my stuff
[00:56:20] was because this has happened to many people.
[00:56:22] They've all kept it secret
[00:56:24] and they all related to what I was writing.
[00:56:27] Whereas not everyone's going to relate to like eating shit
[00:56:30] or being supportive about eating shit.
[00:56:32] So that, you didn't really have a natural audience
[00:56:35] of people to support you.
[00:56:37] Yeah.
[00:56:37] Just to reiterate to anyone listening,
[00:56:39] I wasn't supporting the act of eating shit.
[00:56:41] Neither was Peter.
[00:56:42] It was just a mind,
[00:56:43] I mean, Peter does that, anyone who knows him,
[00:56:45] he likes to, he's very much a contrarian.
[00:56:48] I mean, as are you and as am I,
[00:56:50] and I imagine many people
[00:56:51] who listen to your podcast are as well.
[00:56:53] And you just want to sort of throw something out there
[00:56:55] and go, well, you know,
[00:56:56] maybe we shouldn't all jump on this guy.
[00:56:57] I'm very, I hate any kind of jumping,
[00:56:59] everyone jumping on one guy.
[00:57:01] And I always feel bad for that guy.
[00:57:03] And I don't know.
[00:57:05] But again, that's another dimension of secrecy:
[00:57:08] how relatable the secret is.
[00:57:14] And now one level further,
[00:57:16] supporting that secret will subject you
[00:57:19] to the same type of judgment.
[00:57:22] And we see this all the time.
[00:57:26] People will judge you based on who you hang out with.
[00:57:28] And sometimes that's appropriate,
[00:57:29] but sometimes like in this case,
[00:57:31] just a tweet supporting Peter who made a tweet saying,
[00:57:34] hey, leave this guy alone.
[00:57:35] That got you the same,
[00:57:38] you inherited suddenly the same hostility
[00:57:41] that the Spanish politician got
[00:57:44] when his secret was revealed.
[00:57:45] So there's this magnetic power to secrecy
[00:57:49] that is so immense,
[00:57:51] which is why I think approaching this
[00:57:53] as the psychology of secrets is so compelling.
[00:57:58] Secrets are really powerful.
[00:58:01] And that's why as well,
[00:58:03] I started looking at what kinds of people
[00:58:05] do we reveal our secrets to?
[00:58:06] And it turns out that most people,
[00:58:08] when you ask them, who do you think you reveal secrets to?
[00:58:10] We tend to say we reveal our secrets to polite people.
[00:58:13] Like, oh, a nice polite person, I'd probably tell them.
[00:58:16] And in actual fact, it's the complete opposite.
[00:58:17] We don't.
[00:58:18] And that's because on some level
[00:58:20] we know they would be the worst people to tell.
[00:58:23] Because what does polite mean?
[00:58:24] Polite means you adhere to societal norms
[00:58:26] at a particular time and place.
[00:58:28] Again, what is polite in Japan
[00:58:29] might not be what's polite in America.
[00:58:31] We know what's polite in America
[00:58:32] and the UK is even different.
[00:58:33] It's different all over.
[00:58:35] It's a totally subjective social construct.
[00:58:38] It doesn't mean anything.
[00:58:40] And if you happen to be harboring a Jew
[00:58:43] in the time of the Nazis,
[00:58:45] well, the polite thing to do
[00:58:46] is to report that somebody's doing that.
[00:58:48] So you would not tell a polite person
[00:58:50] or during times of the Stasi or whatever.
[00:58:52] You wouldn't tell them, by the way,
[00:58:54] I went and watched pornography or whatever
[00:58:56] because they hated pornography in the Stasi.
[00:58:58] So we never tell polite people.
[00:59:00] Polite is not...
[00:59:00] And if you want to get secrets out of other people,
[00:59:02] don't sort of stress how polite you are.
[00:59:05] The best thing to be is assertive.
[00:59:07] And I think someone like Peter,
[00:59:08] in pushing back there, he was assertive, right?
[00:59:11] That Spanish guy doing his disgusting habits
[00:59:14] would have been much better placed to tell just Peter
[00:59:17] than to tell all of these people
[00:59:18] who would consider themselves to be polite,
[00:59:19] but really are just bullying this guy
[00:59:21] waiting for him to kill himself, you know?
[00:59:23] Well, assertive but not judgmental.
[00:59:25] Like you'd have to get a sense
[00:59:26] that they're not gonna judge you.
[00:59:28] That's true.
[00:59:29] That's true.
[00:59:30] I mean, one of the examples I looked at
[00:59:32] there's a journalist here called Louis Theroux,
[00:59:35] Americans sometimes pronounce it differently,
[00:59:36] the son of the travel journalist Paul Theroux.
[00:59:40] He's like the guy in the UK that people will think of
[00:59:42] when it's like, oh, everyone reveals their secrets to him.
[00:59:44] How does he do it?
[00:59:45] How does he get people to trust?
[00:59:46] And everyone just thinks,
[00:59:47] oh, he's really polite and whatever.
[00:59:49] And I went and watched back
[00:59:50] and he was actually pretty judgmental.
[00:59:52] He was like, that's bullshit what you're saying.
[00:59:53] That's not true to Nazis when he was interviewing them,
[00:59:56] stuff like that.
[00:59:57] But he's definitely assertive
[00:59:59] and he definitely does show compassion.
[01:00:01] So those were the two biggest indicators of traits
[01:00:04] in the people that we reveal secrets to.
[01:00:06] So what, in what sense was he compassionate?
[01:00:08] So he's saying that's bullshit
[01:00:09] but maybe he wasn't being judgmental.
[01:00:11] He's just saying, I don't really believe that that is,
[01:00:15] like he's not judging the secret.
[01:00:16] He's just saying, I don't believe that that's true.
[01:00:18] So maybe then-
[01:00:19] That's true.
[01:00:20] That's true.
[01:00:21] There is
[01:00:22] an element of judgment there.
[01:00:23] I just think it's like, they're seeing this guy,
[01:00:25] they're going, hey, and I'm speaking as the Nazi here,
[01:00:28] and I hope no one cuts that bit out of context.
[01:00:31] I'm a Nazi and no one wants to talk to me ever.
[01:00:34] Well, here's this journalist and he's hanging out with me.
[01:00:36] We're going on the school run to take my kids to school.
[01:00:39] We're like having lunch together.
[01:00:41] And yes, he keeps, as you say, having a go at me
[01:00:43] and he is a bit judgmental.
[01:00:45] He's judging some of the stuff
[01:00:45] but he sees me as an equal.
[01:00:47] He doesn't think he's morally superior to me.
[01:00:48] He's not polite.
[01:00:50] So something about me trusts him
[01:00:52] because I guess evolutionarily we know a polite person
[01:00:54] is someone who sticks too rigidly to social mores.
[01:00:59] And he's compassionate.
[01:01:01] I see that how he's interacting with his crew
[01:01:04] or how he's talking to my daughter right now.
[01:01:06] He's interested and he wants to get to the human in people.
[01:01:10] I think that's what,
[01:01:11] that's how we get people to tell us their secrets.
[01:01:12] So those are the traits we have to sort of emphasize.
[01:01:15] And there's always this relationship
[01:01:17] between secrecy and lying
[01:01:19] because if you don't want to tell a secret
[01:01:20] and someone directly asks you,
[01:01:21] you have to lie to cover up the secret.
[01:01:24] And you discuss in the book,
[01:01:26] and you brought up Anne Frank earlier,
[01:01:29] but you discuss Sam Harris, who wrote a book called Lying.
[01:01:33] Sam Harris has been a frequent guest on this podcast.
[01:01:37] You mentioned how he had a professor in college
[01:01:40] who was able to convince him
[01:01:41] even in the Anne Frank situation,
[01:01:43] you shouldn't lie to the storm troopers
[01:01:45] who come to the door asking if Anne Frank's there.
[01:01:48] What's the argument for not lying?
[01:01:51] I mean, 100% I would lie if Anne Frank was in my attic.
[01:01:55] I would hope that I would lie.
[01:01:58] You never really know how you react in a certain situation,
[01:02:01] but I hope I would lie.
[01:02:03] So what's the argument though for not lying?
[01:02:06] I think I'd do that thing where I'm like,
[01:02:08] no, she's not here,
[01:02:09] but like with my eyes,
[01:02:10] like looking like she's up there kind of thing,
[01:02:12] don't kill me.
[01:02:14] As long as you don't kill me, I'm fine.
[01:02:16] Now, hopefully I would also be brave, but who knows.
[01:02:18] So Sam didn't really give an answer,
[01:02:21] I mean, it was quite muddy, what he seems to say.
[01:02:24] He's not entirely sure of what he's saying.
[01:02:26] He doesn't give examples of what the teacher
[01:02:28] used to say in that Anne Frank situation.
[01:02:32] And he even, after sort of spending a long time saying,
[01:02:35] you know, we must just always tell the truth.
[01:02:37] It's always best to tell the truth.
[01:02:38] He then says only a psychopath
[01:02:40] would actually endorse this kind of behavior.
[01:02:43] Like I must tell the truth in that circumstance.
[01:02:46] I mean, let me just interrupt for one second.
[01:02:49] Like I guess there's a macro argument
[01:02:52] where is why you'd have to be a psychopath.
[01:02:55] The macro argument might be,
[01:02:57] I'm going to tell the truth here
[01:02:58] and Anne Frank is gonna die.
[01:03:00] But because I told the truth,
[01:03:02] they're gonna be more likely to listen to
[01:03:04] and believe all my neighbors
[01:03:06] who will lie about the girls they have in their attics.
[01:03:09] And all of those girls will be saved
[01:03:12] because the people will believe them,
[01:03:14] because they knew I was telling the truth.
[01:03:16] Yeah, I mean, I think that's maybe,
[01:03:19] it's a bit specious, isn't it?
[01:03:20] But you're hopeful in that situation,
[01:03:22] you'd be panicking, wouldn't you?
[01:03:23] I'd be like, yeah, okay, maybe I'm helping.
[01:03:25] I'm helping other people.
[01:03:26] I'm helping all the others to survive.
[01:03:28] I mean, Immanuel Kant gives this answer
[01:03:30] because he's a truth purist
[01:03:31] and he was a philosopher, a German guy.
[01:03:35] And so he said,
[01:03:37] the issue here is that you lie
[01:03:40] and say no, Anne Frank's not up there.
[01:03:41] Now this was before Anne Frank.
[01:03:44] So it was just a random person that you're harboring.
[01:03:47] But you say Anne Frank's not up there
[01:03:49] and then you don't know this
[01:03:50] but Anne Frank climbs out the window
[01:03:52] and escapes.
[01:03:54] And then the guard, the Nazi,
[01:03:57] he's left because you said no, Anne Frank's not up there.
[01:04:00] And the Nazi then comes across Anne Frank in the road
[01:04:03] and kills her.
[01:04:04] And that's your fault
[01:04:05] because your lies led to that situation.
[01:04:09] Yeah, I read that argument in the book
[01:04:11] and I was thinking that seems really convoluted.
[01:04:14] Anne Frank has to be shopping in the grocery store now
[01:04:18] and then the Nazis just randomly run into her
[01:04:20] and now she definitely dies
[01:04:21] as opposed to maybe she would have died.
[01:04:23] And so, by the way, it does make me think
[01:04:27] that there were probably too few
[01:04:30] or too many philosophers back then.
[01:04:32] Now just everybody's a philosopher
[01:04:33] because you have Twitter.
[01:04:34] So the value of being a philosopher has gone to zero.
[01:04:38] The supply is now infinite of philosophers.
[01:04:42] Poor philosophers.
[01:04:44] Hmm?
[01:04:44] Well, the philosophers,
[01:04:45] I mean, they weren't exactly swimming in money before,
[01:04:47] were they, the philosophers?
[01:04:48] Right.
[01:04:48] Now their economy,
[01:04:49] yeah, it's just sunk.
[01:04:50] It's just crashed.
[01:04:52] Yeah.
[01:04:52] So, I mean, what about your own relationship to secrecy,
[01:04:57] like, okay, you know,
[01:04:59] and I'm pretty sure we've talked about this before,
[01:05:01] but where do you stand on radical honesty?
[01:05:03] Because that's related to all this secrecy
[01:05:06] and the Anne Frank stuff and so on.
[01:05:08] It's just certain kinds of people.
[01:05:11] I was on this podcast called Trigonometry.
[01:05:13] Do you know that?
[01:05:14] Yeah.
[01:05:15] So Constantine Kissin and Francis Foster.
[01:05:18] And I had a great time on there.
[01:05:20] And when we finished, Constantine said,
[01:05:23] well done, that was great.
[01:05:25] That was really, really, really good.
[01:05:27] That's going to do well, you know, that kind of thing.
[01:05:29] And I said, yeah, but I mean,
[01:05:32] I say that to everyone when they come on my podcast.
[01:05:34] Like if someone, you know,
[01:05:35] I just always tell them, you were brilliant.
[01:05:37] You were wonderful.
[01:05:38] That was so good.
[01:05:39] I mean, I've even said that in times where I think,
[01:05:40] God, I can't even put that one out.
[01:05:42] That's so bad.
[01:05:43] But I'll still just be like, fantastic.
[01:05:44] Thank you so much.
[01:05:45] Yeah, you're really good.
[01:05:46] And he just went, I thought he would sort of agree.
[01:05:49] I don't know.
[01:05:50] And he's like, really?
[01:05:51] Why?
[01:05:53] I went, what?
[01:05:53] Why?
[01:05:54] Because it's easier
[01:05:56] or because this is a person who's given up their day
[01:05:59] to come and whatever.
[01:06:01] And if they are going to realise later down the line
[01:06:03] that, oh God, my video wasn't actually very good
[01:06:06] or whatever, at least they've realised it.
[01:06:08] You know, they're at home,
[01:06:09] they're with their family,
[01:06:09] they're alone or whatever.
[01:06:11] Instead of like face-to-face with the guy
[01:06:13] who's like looks really disappointed in them.
[01:06:15] That's such an awkward thing
[01:06:16] and so sad for them to have to go through.
[01:06:18] And he said, well, you know, you've just got to be honest.
[01:06:21] You should always be honest.
[01:06:23] I don't know what's right there.
[01:06:25] I just know it's not natural for me to do it.
[01:06:27] And it appears that it is natural for him to do it.
[01:06:30] And I think that helps him out a lot
[01:06:32] because he has no issues with just being honest
[01:06:35] and putting everything on the table
[01:06:36] and it makes him a wonderful arguer.
[01:06:38] But I couldn't do it.
[01:06:39] I think you wouldn't be able to do it either.
[01:06:41] I definitely would not be able to.
[01:06:43] No, you can't.
[01:06:44] You can't even reply to a text, you know?
[01:06:46] I know.
[01:06:47] Oh, wait, have you texted me and I've not replied?
[01:06:50] No, I didn't bother.
[01:06:50] Which is by the way...
[01:06:52] No, but it's true.
[01:06:53] This is a huge problem for me
[01:06:55] that's getting bigger every single day
[01:06:57] that I literally cannot respond to people.
[01:06:59] And then everyone who I like,
[01:07:02] because those are the people who texted me
[01:07:03] are people who I like,
[01:07:04] everybody who I like starts to hate me.
[01:07:06] Yeah.
[01:07:07] And I don't know what I can do about this problem.
[01:07:09] It's some kind of issue I have.
[01:07:10] I'm fortunate that with you,
[01:07:12] I went into this with my eyes open
[01:07:13] because when I met you a year and a half ago, I think,
[01:07:17] I'd already read your articles.
[01:07:18] You don't even reply to your own family.
[01:07:20] So I gave you a bit of shit about it,
[01:07:21] but I just thought, okay, this is someone I'm meeting
[01:07:23] who's very nice, very hospitable
[01:07:25] and what a lovely experience it was.
[01:07:27] And it really, really was.
[01:07:28] I mean, you know,
[01:07:29] and I'm not going to have an ongoing relationship
[01:07:33] with this man until Jay maybe messages me or something
[01:07:36] because he can't reply to texts.
[01:07:38] It's just as if phones weren't a thing
[01:07:40] and I've accepted that.
[01:07:42] I can't respond to phone calls either.
[01:07:44] Oh, God.
[01:07:45] So I don't, it kind of makes me isolated,
[01:07:48] but I'm trying, I'm trying really hard.
[01:07:52] It's, I need like some, I don't know how to do it,
[01:07:55] but you know, it's so, by the way, just in general,
[01:08:02] all the topics that you write about
[01:08:03] and that we talk about, they're so interesting.
[01:08:06] Each one deserves its own podcast.
[01:08:08] Like I love reading about NXIVM
[01:08:10] because I knew personally someone who was like neck deep
[01:08:13] in that cult while they were still in it.
[01:08:15] I love reading about Scientology
[01:08:16] just because I didn't really know
[01:08:18] all that stuff about Lord Xenu.
[01:08:19] I hadn't seen the South Park episode
[01:08:21] and it was interesting how even people
[01:08:25] very deep into Scientology still weren't high enough
[01:08:28] to know all of these secrets.
[01:08:29] So it was fascinating to see how they unraveled their secrets
[01:08:32] in order to basically extract more money from people.
[01:08:35] And, you know, just in general,
[01:08:37] the psychology of secrecy made me think
[01:08:38] of the connection between secrecy and storytelling.
[01:08:42] And it builds a whole new model of storytelling
[01:08:44] if you think about stories in terms of secrets.
[01:08:48] And it's just always fascinating stuff
[01:08:50] that you're working on.
[01:08:50] Like what else are you working on?
[01:08:52] What are you working on now?
[01:08:53] Thank you for the kind words, by the way.
[01:08:55] I do really appreciate that.
[01:08:56] And it reminds me of the storytelling.
[01:09:00] I mean, that's such a big part
[01:09:01] of what we've actually lost,
[01:09:03] because I try to look at the good sides as well.
[01:09:05] And I did have a go at Hasidic Judaism in the book,
[01:09:08] but they would say to us like,
[01:09:10] okay, you've got your individualism
[01:09:12] and you've got your American dream and all of this stuff
[01:09:14] and you can do what you want to do.
[01:09:15] Okay, but you've lost what we have, what they have,
[01:09:18] which is they gather around a fire or whatever it might be
[01:09:22] and they tell one another stories
[01:09:23] and they can't wait for the next story.
[01:09:25] And there's something nice about that.
[01:09:27] Now I wouldn't change it for what we have,
[01:09:28] not in a million years,
[01:09:30] but I also want to be honest that I think,
[01:09:32] oh, it's sad that we don't really have that.
[01:09:36] I mean, I started this new channel that's going to probably,
[01:09:38] I mean, it might piss off half your listeners,
[01:09:40] but the other half will absolutely love it.
[01:09:42] Because I was doing the YouTube channel
[01:09:44] about cults and ideologies.
[01:09:45] And every now and then I went into controversial terrain.
[01:09:49] Like people don't mind, like I said,
[01:09:51] everyone's on board with you about Scientology.
[01:09:53] But when you go into like,
[01:09:56] well, either side of the political spectrum,
[01:09:57] like is Trump a cult or whatever?
[01:10:00] People are really angry.
[01:10:01] And if you go on the other side,
[01:10:03] is trans ideology a cult or something like that?
[01:10:06] Everyone goes bananas.
[01:10:07] And so I thought, I've just got to start a new channel
[01:10:09] and just do that stuff.
[01:10:10] I'm really interested in the culture wars.
[01:10:12] Obviously both sides,
[01:10:14] when you go too far either way are awful,
[01:10:16] but I don't find the right as interesting
[01:10:18] as a storyteller, as a YouTuber or whatever,
[01:10:20] because I don't know any right-wing people.
[01:10:22] I don't know guys running around saying,
[01:10:24] Jews will replace us.
[01:10:26] And just, oh, I hate black people
[01:10:30] and I should have a slave.
[01:10:31] I don't know anyone.
[01:10:32] That's not to say they're not out there,
[01:10:33] but no one I grew up with and no one I know
[01:10:36] is espousing this stuff on Instagram.
[01:10:38] Whereas loads of people I know
[01:10:40] who I went to university with
[01:10:41] and all these kinds of things
[01:10:43] are shouting all sorts of woke stuff,
[01:10:47] trans women are women, that kind of thing.
[01:10:50] So I just thought, okay,
[01:10:50] you can't have a YouTube unfortunately that does both
[01:10:54] because you get an audience
[01:10:55] and your audience are gonna go one way, unfortunately.
[01:10:59] That's a really sad aspect of...
[01:11:03] I think Peter Boghossian told me this.
[01:11:04] I think basically he said,
[01:11:06] you can't be in the middle if you want an audience
[01:11:09] because then both sides...
[01:11:11] I think it was either Peter Boghossian or Scott Adams
[01:11:12] who told me, if you present the view of both sides fairly,
[01:11:18] then everyone will hate you.
[01:11:19] Yeah, 100%.
[01:11:20] That's what he did with the shitting Spanish man.
[01:11:23] The way YouTube works
[01:11:24] and I feel like I've got the algorithm sort of down,
[01:11:26] not that there actually is an algorithm,
[01:11:28] it's just basic psychology.
[01:11:29] It's basic human psychology.
[01:11:30] And so if I water things down and make them nuanced,
[01:11:33] it doesn't get as many clicks.
[01:11:34] That's just life, it's just reality.
[01:11:36] It means that I lose a lot of friends
[01:11:38] who think I'm more extreme than I am.
[01:11:39] The interviews are very nuanced.
[01:11:41] The titles and thumbnails on YouTube are pretty extreme.
[01:11:44] But also I've now got,
[01:11:46] after six months of this new channel,
[01:11:48] it's already got 125,000 subscribers.
[01:11:50] Most videos get 100,000 to 500,000 views.
[01:11:53] So it's doing well.
[01:11:55] And I know that what it does is
[01:11:58] every time I put out a new video,
[01:11:59] if I were going to put out something like,
[01:12:01] oh, this is the problem with Trump.
[01:12:02] You know, he's not very statesman-like
[01:12:04] and he's just a bit of a buffoon or whatever,
[01:12:07] or things I would like to say,
[01:12:10] you know, I'm still probably pro-choice
[01:12:13] and I'm a vegan, I'm not very passionate about it,
[01:12:16] but I don't eat meat and things like that.
[01:12:19] If I suddenly put out something like,
[01:12:20] why I don't eat meat?
[01:12:22] Well, what YouTube will do
[01:12:23] is it will show those 125,000 subscribers first.
[01:12:26] These are the guys that signed up because I said
[01:12:28] the left has a worrying relationship with Islam.
[01:12:31] You know, they don't want to, they're not going to click.
[01:12:33] And if they don't click,
[01:12:34] YouTube won't show the actual audience
[01:12:36] who would want to see that stuff about why I don't eat meat.
[01:12:38] So you can't win.
[01:12:39] I tried and tried and tried
[01:12:41] and the videos that were a bit more worthy
[01:12:43] and that other side just failed every time.
[01:12:46] And then you can't pay to run your channel.
[01:12:50] But anyway, that's what-
[01:12:52] No, that's right.
[01:12:52] That's very interesting.
[01:12:53] That's similar.
[01:12:54] Someone told us the LinkedIn algorithm
[01:12:58] takes anything you write
[01:13:00] and shows your first 30,000 subscribers
[01:13:05] the article or the content.
[01:13:07] And then if they like it,
[01:13:08] it shows it to the rest and then to the outside world.
[01:13:11] And if they don't like it within the first 60 seconds
[01:13:14] or 10 minutes or whatever, then it's gone.
[01:13:18] And I'm thinking, boy,
[01:13:19] my first 30,000 subscribers were 18 years ago.
[01:13:22] So I don't even know who, you know, they're all dead.
[01:13:25] I don't even know who they are.
[01:13:27] That's a problem.
[01:13:28] That is a problem on like YouTube in particular.
[01:13:30] It's not so much of a problem on like an audio platform
[01:13:32] because an audio platform doesn't have discoverability
[01:13:35] in the same way.
[01:13:36] So people subscribe to it.
[01:13:37] If they like what you've put out,
[01:13:39] they'll always get a notification and they'll watch it.
[01:13:41] I listened to it, sorry.
[01:13:42] On YouTube, you only get big views from discoverability
[01:13:46] which means YouTube saying,
[01:13:47] hey, this video is doing really well
[01:13:48] with his normal subscribers.
[01:13:49] Let's push it to new groups.
[01:13:51] And that's how you grow.
[01:13:52] And that's how a video does well.
[01:13:53] So you are constantly incentivized,
[01:13:56] not just financially,
[01:13:56] but also you want your channel to do well, you know,
[01:14:00] to get more and more extreme guests
[01:14:02] who are all one way of thinking.
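What is being described here for YouTube, and a moment earlier for LinkedIn, is essentially an engagement-gated rollout: show a new post to existing subscribers first, and push it to non-subscribers only if the subscribers engage. Neither platform documents its real thresholds, so the sketch below is only a guess at the shape of the mechanism; the function, the 10% gate, and all the numbers are invented for illustration.

```python
# Minimal sketch of an engagement-gated rollout (all numbers invented):
# stage 1 shows the post to existing subscribers; only if their
# click-through clears the gate does stage 2 push it to a wider pool.

import random

def simulate_rollout(subscribers: int, subscriber_ctr: float,
                     wider_pool: int, wider_ctr: float,
                     gate: float = 0.10) -> int:
    """Return total views under a two-stage, engagement-gated rollout."""
    # Stage 1: existing subscribers see it first.
    subscriber_views = sum(random.random() < subscriber_ctr
                           for _ in range(subscribers))
    if subscriber_views / subscribers < gate:
        return subscriber_views  # gate not met: never shown more widely
    # Stage 2: pushed to people who never subscribed.
    return subscriber_views + sum(random.random() < wider_ctr
                                  for _ in range(wider_pool))

random.seed(0)
# Aligned audience: subscribers click, so the video escapes the gate.
print(simulate_rollout(125_000, subscriber_ctr=0.20,
                       wider_pool=1_000_000, wider_ctr=0.05))
# Stale subscribers from years ago: the gate never opens.
print(simulate_rollout(30_000, subscriber_ctr=0.02,
                       wider_pool=1_000_000, wider_ctr=0.05))
```

On this model a stale or mismatched subscriber base never clears the gate, so the wider audience never sees the video at all, which is the "start a new channel" problem discussed below.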
[01:14:05] And even like I had a guy on the other day
[01:14:07] who was really nuanced.
[01:14:08] He's the guy who wrote The Status Game, Will Storr.
[01:14:10] And early on, he made a sort of joke
[01:14:13] about anti-vaxxers because, you know,
[01:14:16] it just seemed obvious to him that the vax is fine
[01:14:18] or whatever.
[01:14:19] And I didn't even say, I don't have an opinion.
[01:14:20] I don't, you know, it's not my area of expertise.
[01:14:23] But I saw a huge drop off
[01:14:24] because somebody had said a thing
[01:14:26] that went against what their tribe thinks
[01:14:28] for like an anti-woke or non-culture war channel
[01:14:33] and they couldn't handle it.
[01:14:35] So anyway, I try, I'm just trying as much as I can
[01:14:39] to keep it balanced.
[01:14:40] And I think the way to do that now
[01:14:42] is have extreme guests on,
[01:14:44] because people click on that,
[01:14:46] but halfway, ask them about what they're talking about.
[01:14:48] And then halfway through say,
[01:14:50] okay, let's have an argument, you know?
[01:14:53] And so, and then I still get to feel like,
[01:14:56] hey, I've pushed back against some of the views,
[01:14:59] but some of them I agree with.
[01:15:00] So it's a lot on the-
[01:15:02] How fast do you do the argument?
[01:15:03] Like how fast do you have to bring the argument in
[01:15:06] to get the controversy in?
[01:15:07] Well, the argument doesn't necessarily get more views
[01:15:10] because the argument is,
[01:15:12] if you imagine all my followers and subscribers are going,
[01:15:15] yes, this is a video about why the trans thing
[01:15:18] is no different to exorcism in another part of the world,
[01:15:22] because it's just like teenage girls
[01:15:23] who don't have control of their lives,
[01:15:25] trying to exert control over something
[01:15:27] and it's their body or whatever.
[01:15:28] You want to watch that video,
[01:15:30] then that's what you're on.
[01:15:32] You're like, I'm on this video.
[01:15:33] So we've got a guest,
[01:15:34] the guest is telling me all that stuff.
[01:15:35] And if I suddenly go, well, hang on, I don't agree.
[01:15:38] That's already putting a bit of, they're getting annoyed.
[01:15:40] They're writing comments going like,
[01:15:41] hey, well, Andrew's not on board with this.
[01:15:43] Well, I am.
[01:15:44] And people are pissed off and they click off.
[01:15:45] There's too many other videos for them to watch
[01:15:47] with people who do say the things they want to hear.
[01:15:50] So that's the reason that I actually leave the argument
[01:15:53] for like half an hour in.
[01:15:54] And it's for me really,
[01:15:56] but it's also good for trailers on Twitter
[01:15:58] and things like that.
[01:15:59] So that is good.
[01:16:00] I did a good one the other day.
[01:16:01] It was a really woke woman called Eni Aluko,
[01:16:05] who is a black English ex-soccer player
[01:16:09] who is a pundit, like a commentator on TV
[01:16:11] all the time with the football.
[01:16:12] And I think you know from my story and everything,
[01:16:15] I used to work at the BBC and made documentaries.
[01:16:18] And then they said, because you're a white man,
[01:16:19] we can't have you on screen.
[01:16:20] You have to be like behind the camera
[01:16:22] and we'll put a minority on screen.
[01:16:23] And I was like, but it's my work,
[01:16:24] the journalism. I found the stories
[01:16:26] and that's what they wanted.
[01:16:27] They didn't want to hear that I was Jewish.
[01:16:28] That didn't help.
[01:16:29] They would just laugh at me when I said that and whatever.
[01:16:33] And so we had this debate, her and I,
[01:16:35] where she was saying, hey, DEI is a good thing.
[01:16:38] And I was saying, no, it's not.
[01:16:40] And that did like really well, not on YouTube.
[01:16:42] It did okay, got 200,000 views,
[01:16:43] but on Twitter, it like blew up,
[01:16:45] got over 2 million views because it was this big clash
[01:16:49] and also because unfortunately
[01:16:50] she really didn't come across that well.
[01:16:52] And it was one of those kind of disaster ones, but you know.
[01:16:56] So that's what I'm doing.
[01:16:57] It's a channel called Heretics.
[01:16:59] That's so interesting.
[01:17:01] And the stuff about the algorithm is interesting
[01:17:03] because I feel like,
[01:17:04] I don't know what I did early on in YouTube,
[01:17:06] but I must've quickly built up to about 50,000 subs.
[01:17:10] But that was like, again, years and years ago,
[01:17:12] more than 10 years ago.
[01:17:14] And now I post something
[01:17:17] and I tend to be pretty neutral about a lot of issues.
[01:17:19] I just want to kind of learn and get information
[01:17:21] and maybe story tell, but I can't get views
[01:17:26] because I'm just like not triggering the algorithm at all.
[01:17:28] I'll get literally like 12 views on something.
[01:17:30] You're completely fucked, man.
[01:17:32] You're fucked.
[01:17:33] You got to start a new channel.
[01:17:34] I think that's right.
[01:17:35] I think I do have to start a new channel,
[01:17:36] but your channel is like related to like your email
[01:17:39] and like your Google account and all that.
[01:17:41] I guess I have to start a branded channel.
[01:17:42] Like you started a channel with a name like Heretics.
[01:17:44] It's not like the Andrew Gold channel.
[01:17:47] It's actually got a purpose that's outside of you.
[01:17:50] So you could start a new one.
[01:17:51] Your problem isn't that you're James Altucher.
[01:17:54] That's a good thing.
[01:17:54] People will sign up,
[01:17:55] but it is a problem what you've encountered.
[01:17:57] You've got a lot of people at the beginning.
[01:17:58] A lot of celebrities get a lot of subscribers
[01:18:01] at the beginning.
[01:18:02] They start a YouTube channel
[01:18:03] and then their videos don't do well.
[01:18:04] And the reason for that is that subscribers,
[01:18:06] those people are not necessarily people
[01:18:08] who want to watch an hour long interview.
[01:18:10] They're people who like the guy, say it's Ricky Gervais.
[01:18:12] They're like, I like this guy, Ricky Gervais.
[01:18:13] I'm going to just subscribe.
[01:18:15] So suddenly he's got a million followers,
[01:18:16] but none of them are actually people who'd watch his kind
[01:18:20] of content, if he was doing podcasts, let's say.
[01:18:21] None of those people will watch it.
[01:18:23] So what happens every time he puts out a new video,
[01:18:25] or let's say you, every time you put out a new video,
[01:18:27] those people followed you
[01:18:28] because they liked the stuff about like the articles
[01:18:30] that were controversial,
[01:18:31] things you've said that are controversial in the past.
[01:18:33] But maybe these are not podcast listeners
[01:18:35] or podcast viewers on YouTube.
[01:18:37] They've just seen your name and they go,
[01:18:39] oh, I like him, click subscribe,
[01:18:41] but they'll never watch it.
[01:18:42] So you're just screwed from the beginning
[01:18:44] because every video you put out goes to them first.
[01:18:46] They don't watch it, no one else gets to see it.
[01:18:48] So yeah, you've got to start again.
[01:18:51] And that's why I started again with Heretics.
[01:18:52] It just wasn't going to work on the old channel.
[01:18:54] I think that's really smart.
[01:18:56] I'm going to experiment with this.
[01:18:58] So, and then when's the book on Heretics coming out?
[01:19:02] Oh man, I might, I started it.
[01:19:04] It's called, what was it?
[01:19:07] The Unintentional, or the Unintentional.
[01:19:13] Oh man, hang on, I want to get this right
[01:19:15] because I had this great name
[01:19:17] and I've probably got it here.
[01:19:19] The Unintended.
[01:19:20] The Unintended Bigot: Chronicles of a Hellbound Heretic.
[01:19:26] And it's about how,
[01:19:28] because I haven't mentioned this,
[01:19:29] but this book, I was invited to like loads
[01:19:31] and loads of museums, the Tate Museum,
[01:19:33] all of these kinds of things to speak about it.
[01:19:35] And every single one of them got in touch
[01:19:37] after inviting me, with Pan Macmillan,
[01:19:39] and disinvited me
[01:19:41] because they say I'm a controversial figure.
[01:19:43] I think it's mostly because of the gender stuff.
[01:19:46] So I've not been able to do any of that
[01:19:47] like traditional publicity.
[01:19:49] And I'm like, this is weird.
[01:19:50] That is fascinating.
[01:19:51] Yeah, so I'm sitting there thinking,
[01:19:52] it's interesting because the book, as you know,
[01:19:53] doesn't even touch on culture wars.
[01:19:55] It's all about psychology and secrets
[01:19:57] and those kinds of cults and things.
[01:19:58] So I just thought, how has it gotten to this point
[01:20:01] where, I mean, one very quite famous journalist,
[01:20:04] his name was going to be on the back
[01:20:05] and he also called up the day before
[01:20:07] and said, I've seen what you've been doing
[01:20:09] and I don't actually disagree with you
[01:20:11] but I can't have my name on the back of the book.
[01:20:13] I was like, okay, thanks for that, coward.
[01:20:16] But you know, how is it,
[01:20:18] I'm interested in investigating that,
[01:20:19] sort of telling the story
[01:20:20] of how I started the YouTube channel
[01:20:22] and all of these things
[01:20:22] and trying to analyze myself and go like,
[01:20:25] am I a bad guy?
[01:20:26] What's going on?
[01:20:27] And have I let myself be captured by audience and ideology?
[01:20:30] So that's what I'm interested in now.
[01:20:33] Can we do another podcast about that
[01:20:35] like in a couple of weeks?
[01:20:37] Yeah, of course.
[01:20:38] We could do it right now and after you write the book.
[01:20:40] We can do that.
[01:20:41] And we can do that in a couple of weeks if you want.
[01:20:43] And then I'll just push the psychology of secrets at the end.
[01:20:46] That's what I want people to get
[01:20:47] because you know what?
[01:20:49] They did a weird thing.
[01:20:50] It's not even out in America till the 1st of September
[01:20:53] on Amazon, you can pre-order.
[01:20:55] Oh, psychology, this is not out till 1st of September?
[01:20:58] Yeah, it's mad but it is out in the UK.
[01:21:01] So what all my sort of subscribers in America are doing
[01:21:04] is you go to amazon.co.uk,
[01:21:06] you've got to go to the English Amazon,
[01:21:08] the UK Amazon and order it there.
[01:21:11] And then they ship it over.
[01:21:14] It's like, hopefully it's only like $5 or $6
[01:21:16] or something to ship it over from the UK.
[01:21:19] Well, people should absolutely get it.
[01:21:21] Like they should get it now because I really,
[01:21:24] I don't always enjoy, look,
[01:21:27] when I have a guest on, I usually love what they do.
[01:21:30] But sometimes like maybe they ghost write the book
[01:21:33] or maybe they're not really a writer.
[01:21:34] So it's a little bit, I like the ideas
[01:21:36] but it's a little harder to get through the book.
[01:21:39] You are an excellent storyteller,
[01:21:40] you're an excellent writer.
[01:21:41] This book is so fascinating.
[01:21:42] Like the concepts around secrecy,
[01:21:45] the psychology of secrecy, I just loved it.
[01:21:48] So it's a great book and you're always a great guest,
[01:21:52] Andrew, you're always welcome on the podcast.
[01:21:55] You should come on all the time.
[01:21:57] And thanks once again for coming on this side.
[01:22:00] And by the way, here's what I'll do.
[01:22:02] I'm gonna release the podcast now.
[01:22:03] So people can order off of amazon.co.uk
[01:22:06] but also I'll release it in the first week of September
[01:22:09] when it comes out.
[01:22:10] Mate, you're amazing.
[01:22:11] You're an absolute legend.
[01:22:13] Double hit.
[01:22:13] So thanks once again, Andrew.
[01:22:15] I really appreciate it.
[01:22:17] It's such interesting stuff.
[01:22:18] Oh, thank you, James.
[01:22:20] Anytime.