The Grand Mirror: AI as a Reflection of Humanity | Brian Roemmele - Part 2
The James Altucher Show · August 03, 2023 · 01:18:07 · 71.6 MB


Part 2 with Brian Roemmele focuses on AI as an amplifying reflection of human nature, dissecting our understanding of consciousness, intelligence, and emotions. Join us as we explore the fascinating intersection of technology and humanity, and how AI offers unique insights into the essence of our being.

In Part 2 of this compelling three-part series with Brian Roemmele, we delve further into the complex world of AI, unraveling how it serves as a grand mirror reflecting humanity's knowledge, language, feelings, and emotions. Join us as we explore how AI, without the consciousness we're still struggling to comprehend, amplifies and magnifies our innate intelligence. Discover how AI is not an alien concept but an extension of our very nature, bridging the disparate facts gathered from our senses and offering new insights into what it means to be human. It's a journey that takes us closer to the core of what connects us all, unraveling the mysteries of memory, consciousness, and emotion. You won't want to miss this fascinating and enlightening episode.

----------

Want to write and publish a book in 30 days? Go to JamesAltucherShow.com/writing to join James' writing intensive!

What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!

Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!

------------

Visit Notepd.com to read our idea lists & sign up to create your own!

My new book Skip the Line is out! Make sure you get a copy wherever books are sold!

Join the You Should Run for President 2.0 Facebook Group, where we discuss why you should run for President.

I write about all my podcasts! Check out the full post and learn what I learned at jamesaltucher.com/podcast.

------------

Thank you so much for listening! If you like this episode, please rate, review, and subscribe to "The James Altucher Show" wherever you get your podcasts:

Apple Podcasts

Stitcher

iHeart Radio

Spotify

Follow me on Social Media:

YouTube

Twitter

Facebook

 

------------


[00:00:01] This isn't your average business podcast and he's not your average host.

[00:00:06] This is the James Altucher Show.

[00:00:12] Brian, I've been following you for years of course on Twitter.

[00:00:16] What were some things that intrigued you on animal intelligence?

[00:00:19] I think if we don't understand animal intelligence to some degree, even primates, right?

[00:00:24] How are we going to really understand machine intelligence and what is intelligence and things of that nature?

[00:00:31] So quantum physics led me into trying to understand the brain and to understand the observer.

[00:00:36] Why the heck does an observation change a scientific test?

[00:00:42] Let's look at humanity.

[00:00:43] Humanity is basically tool builder and storyteller.

[00:00:46] That's our existence as tool builders and storytellers.

[00:00:49] If we took that away, we basically wouldn't exist.

[00:00:53] We had to evolve extreme storytelling ability because we're relatively weak animals.

[00:00:59] And so, in order to survive, we had to communicate to our kids,

[00:01:02] hey, don't go over the mountain because we knew that's where the lions were.

[00:01:06] We had to tell them as far as Satan is and we have to tell a whole story about Satan and monsters.

[00:01:11] And we had to develop storytelling ability to basically help the next generation live longer.

[00:01:16] And I'm sure you know this being very creative is that when you get into the flow of things,

[00:01:20] you're kind of taking a step back.

[00:01:22] But the spark of insight, that creative spark that comes into you,

[00:01:27] nobody's been able to fully define it.

[00:01:30] It's a collection of all of these different pieces that if you take a step back,

[00:01:35] combine in a way that's magic.

[00:01:38] But if you try to force it, if you try to overthink it,

[00:01:41] you try to capture a cloud in your hand or get a cup of water by grabbing it as much as you can,

[00:01:48] it dissipates.

[00:02:01] This is all related to something you brought up earlier about consciousness.

[00:02:05] We could be as intelligent as we want, but as you pointed out, a lot of it's just mechanical.

[00:02:12] Like we have these neurons all through our body.

[00:02:14] We have this pineal gland.

[00:02:16] We have responses based on our five senses and so on.

[00:02:20] But none of that seems remotely connected to consciousness.

[00:02:23] Like none of that is just there's input, which is all of our senses.

[00:02:28] And then there's output, which is, okay, let's have a thought.

[00:02:30] And the thought might lead to an action.

[00:02:32] But other than being input output machines with like a computer,

[00:02:36] we also know that there's something called consciousness that seems a little different.

[00:02:40] We think that we're conscious.

[00:02:42] And so what do you think that is?

[00:02:44] That is the question of life, James.

[00:02:46] I think if you dive into this study of artificial intelligence long enough

[00:02:52] and you want to look at it honestly,

[00:02:55] you can't help but to go back to the grand, what I call the grand mirror,

[00:03:00] the reflection of AI back to us to say, well, what exactly is going on?

[00:03:06] And so one of the first things you can do is try to go through analogy.

[00:03:11] So you look at dolphins, you look at whales, you look at chimpanzees,

[00:03:15] you look at your dog, you get your cats, you look at birds,

[00:03:18] you look at other species and you start having an honest conversation.

[00:03:24] Are they conscious?

[00:03:26] Have we deluded ourselves to think they're not conscious because it's not our form of consciousness?

[00:03:31] We're going to be very human-centric about consciousness, right?

[00:03:36] And we're going to believe that only ours is the best brand.

[00:03:40] Of course those poor dolphins, they don't have opposable thumbs.

[00:03:43] Their consciousness must suck.

[00:03:45] They're stuck in this wetsuit of a body swimming around,

[00:03:49] but we know they're intelligent and I know that they're conscious.

[00:03:53] Now, how do I know that they're conscious?

[00:03:56] Because they do show levels of connection that we as humans claim as part of consciousness.

[00:04:03] And a lot of it, strangely enough, has to do with emotion.

[00:04:07] I think part of what humans ultimately mean, what consciousness represents,

[00:04:12] is something of an emotive sort of construct.

[00:04:15] And everybody has sort of a different brand that they put out there,

[00:04:20] especially the Deep Thinkers.

[00:04:22] But if you were to put it in a pot and boil it for just a little bit,

[00:04:25] it all comes out like emotions.

[00:04:27] It is like, well, they feel, ah, that's an emotion.

[00:04:32] Well, no, they feel philosophically, oh, okay, but that's still an emotion.

[00:04:37] Right, and emotions again are input-output.

[00:04:40] Like people sometimes say the cliche, oh, consciousness is love, love is con...

[00:04:47] But you know, love is there from an evolutionary perspective.

[00:04:50] You love the people who you're either going to breed with.

[00:04:55] Pair bonding.

[00:04:56] Evolutionary, you love people so that you could pass on your genes

[00:05:01] or you love people who are going to get you food.

[00:05:03] Yeah.

[00:05:04] So it all kind of evolves from that.

[00:05:06] But that also seems disconnected from consciousness.

[00:05:08] Exactly, exactly.

[00:05:09] So it's always going to be a bit of chasing something across a desert.

[00:05:15] I'm not solving that problem anytime soon.

[00:05:20] But I do suggest this, especially in the light of us hearing congressional testimony about

[00:05:30] off-world biological entities and what does that represent?

[00:05:35] I think no matter where you are on the unfortunate political side of this,

[00:05:40] because it's now become part of a political...

[00:05:42] Everything.

[00:05:43] Everything's political now.

[00:05:44] It's part of a political thing.

[00:05:46] And then it became part of a science thing.

[00:05:48] Now we're calling people within the military and pilot experts,

[00:05:58] expert pilots who have been flying commercial airlines for 30 years.

[00:06:02] We're calling them conspiracy theorists now.

[00:06:04] But all things are upside down and sideways because they saw something.

[00:06:08] Now it was a Mylar birthday balloon.

[00:06:10] Oh, an expert observer who's been in the skies for 30 years.

[00:06:13] He can't detect a Mylar birthday balloon over something else.

[00:06:17] Oh, he doesn't really know what he saw.

[00:06:19] It was a reflection of Venus off swamp gas.

[00:06:21] So what we're doing is questioning the integrity of people who we trust our very lives with.

[00:06:27] Same thing with people who run nuclear codes.

[00:06:31] Philip Corso, with one of the highest levels of secret clearance, writes a book, The Day After Roswell,

[00:06:39] and everybody wants to run away from it because it makes some profoundly bold claims

[00:06:45] of where technology...

[00:06:47] I don't know the book. What did he say?

[00:06:49] That we found the laser, we found the integrated circuit,

[00:06:54] we found quite a lot of other technologies from a crashed UFO.

[00:06:59] And he would have been the person responsible in the government with the clearance,

[00:07:05] working with generals.

[00:07:07] It was a dying man's testimony.

[00:07:10] He was the last one alive of all the people that were responsible within his group

[00:07:15] and he was dying of cancer and he knew it.

[00:07:18] He made very little money.

[00:07:20] I think most of the money was going to be donated.

[00:07:22] He had no other reasons that I could see.

[00:07:27] Now we can get conspiratorial and say,

[00:07:29] well, you know, he was part of the greater cabal that was trying to create a disinformation.

[00:07:35] Well, that was the 90s.

[00:07:37] And he was basically saying, yeah, most of this technology was filtered in through the military industrial complex.

[00:07:44] It was the only way that we can apply it and they jump-started things that were, you know, more or less started,

[00:07:51] but they got a rather large leap and his book goes into it in quite detail.

[00:07:56] So, you know, again, I'm not saying Corso was entirely correct about everything he said.

[00:08:03] I think some of his memory could have been off,

[00:08:06] but I do think the idea that he was in charge of,

[00:08:12] he was in charge of foreign technology.

[00:08:15] Foreign technology division of the U.S. Army.

[00:08:18] So anytime the U.S. Army came across adversarial technologies, it would be in his department.

[00:08:25] Now what better place would off-world technology be than in his hands?

[00:08:30] And unfortunately that's logical, but what he's saying, his testimony,

[00:08:37] a dead man's confession as he called it, is not very expedient for a lot of people.

[00:08:45] You know, I'm an empiricist.

[00:08:47] I don't care what the data is directing me.

[00:08:50] I'm just going to follow it and I'm going to keep doing whatever false positives

[00:08:55] and falsifiable tests I can on the data and keep walking in that direction.

[00:09:00] So the reality is, I don't know where we started with this,

[00:09:04] but yeah, the reality is everything is ultimately going to be seen through some sort of filter.

[00:09:11] I mean, wouldn't it be like, you know, speaking about this whistleblower,

[00:09:17] so yesterday David Grusch, I don't know how to say his last name,

[00:09:21] but Grusch testified in front of Congress, and he was pretty forthcoming.

[00:09:26] He said that yes, the government is storing some sort of biological based creatures

[00:09:32] or their corpses, something like that, that appear to have come from alien spacecraft.

[00:09:39] And you know, he provided a list of other names that they could speak to within the government.

[00:09:45] And he seems like he's pretty, you know, he worked in the government agencies

[00:09:50] that are dealing with these things, although he never saw anything specifically.

[00:09:54] He couldn't testify to his own personal experience,

[00:09:57] but he did say other people told him they had personal experience.

[00:10:02] What could be the case where he's just lying?

[00:10:05] Like what would have to be true in order for him to be completely 100% lying?

[00:10:09] Well, I mean we can, you can just read on Twitter what all the conspiracies are.

[00:10:13] You know, listen, do groups of individuals use news events to run cover on other news events?

[00:10:21] Absolutely. It doesn't mean that they make those news events.

[00:10:25] It's just they use it to perpetuate a particular narrative.

[00:10:30] You know, some tragedy happens.

[00:10:32] Oh, this fits our narrative. Let's perpetuate it.

[00:10:34] That's called marketing. That's all it is, marketing.

[00:10:37] We all know it instinctively.

[00:10:39] We don't want to call it that, but it's all part of propaganda and marketing.

[00:10:42] And that's what they're all, you know, it all comes from.

[00:10:45] But if we're on one of the teams, we either cheerlead it or we become adversarial to it.

[00:10:53] But this seems apolitical though, like Democrats and Republicans both want to know if UFOs exist.

[00:10:58] Well, oh yeah. Okay.

[00:10:59] So within the guise of perpetuating, you know, clarity or sunshine laws about this,

[00:11:06] it's absolutely vital, right? Because at this point, because of the Internet,

[00:11:13] because you have people that are less likely to hold on to secrets the way their great-grandparents would have.

[00:11:22] They would have said, hey, national security can't talk about this.

[00:11:26] We now have some people in which there's positives to our new social construct saying,

[00:11:32] you know what? This is a pretty big story. I can't keep a lid on it.

[00:11:37] I'm going to start talking about it in one way or another.

[00:11:40] And those are the various people I've come in contact with and certainly some of the whistleblowers.

[00:11:45] And if you dive into the subject just a little bit, you would be intellectually dishonest to say that it's bunk.

[00:11:53] Now, if you want to fold your arms and be a debunker, fine.

[00:11:57] You know, you can debunk quantum physics. You can debunk the sun.

[00:12:02] You know, there's a whole lot of theory that our concepts of how the sun operates are complete balderdash,

[00:12:08] that we don't really understand it, that there's more matter coming out of the sun than it possibly could contain.

[00:12:14] Therefore, it might be a white hole rather than a black hole.

[00:12:17] I mean, there's all kinds of things.

[00:12:19] Planet could be increasing in its diameter.

[00:12:22] So maybe the planet keeps growing because the center of the planet has an output of matter.

[00:12:28] So, you know, again, all of these different theories are outlier theories that are as ridiculous as,

[00:12:35] hey James, there's invisible creatures on my fingers right now.

[00:12:39] You can't see them. I know they're there.

[00:12:42] Then what do we do as technologists?

[00:12:44] We invent a new technology called the microscope.

[00:12:47] So as we invent new technology, we can now look at those weird little creatures that live on my hand.

[00:12:52] Whereas before I looked like I was a conspiracy theorist that there were monsters on my hand.

[00:12:57] So I'm not going to wash my hands as a doctor after I treat gangrene and then go deliver a baby, because I can't see it.

[00:13:04] And the guy who theorized this, Ignaz Semmelweis, was institutionalized.

[00:13:10] He was.

[00:13:11] They thought he was crazy.

[00:13:12] This is what we do with our geniuses.

[00:13:15] Our geniuses are on the edge of what appears to be insanity.

[00:13:20] And this comes, it's more apparent in the arts.

[00:13:24] In the arts because they are left alone to their own devices.

[00:13:28] Unfortunately, addictions come along because we don't organize society.

[00:13:32] Shamanic societies, ancient societies organized around having these people fortified in some way.

[00:13:40] A shaman themselves would be fortified.

[00:13:43] They were on the edge.

[00:13:44] They were always on the edge of trying to figure out where reality is going.

[00:13:48] So in the technical world, we don't have a structure that allows those people to exist anymore.

[00:13:54] Bell Laboratories was the last of it.

[00:13:56] Institute of Advanced Study in Princeton was one of the last of them.

[00:14:00] We were allowing people that were just crazy geniuses to walk the earth and just to spout off.

[00:14:06] And we would pay them to do it and we would protect them.

[00:14:09] We would say, hey, time to get to bed.

[00:14:11] Hey, don't walk in the street.

[00:14:13] Sure, like you have, you have, you know, the book, Gödel, Escher, Bach, behind you in your library there.

[00:14:19] And Gödel famously, you know, in almost any other world or civilization, he would be institutionalized.

[00:14:27] Like the guy was a genius.

[00:14:28] He had thousands of mathematical theories that were proven true.

[00:14:32] He collaborated with so many scientists, including Einstein.

[00:14:35] And yet when he was going to be sworn in for citizenship, he said, you know,

[00:14:40] I'm going to have to tell the judge that there is a contradiction in the Constitution.

[00:14:46] And Einstein was like, you know, I think I better go with you and just make sure you get home safely.

[00:14:52] So, you know, to get him sworn in.

[00:14:54] Imagine that guy today, James, imagine that guy existing today.

[00:14:57] He would be on the streets, we would be tossing quarters at him, right?

[00:15:01] Same thing with David Bohm.

[00:15:03] We were so blessed to have a guy like David Bohm existing.

[00:15:07] Einstein believed him to be the next Einstein.

[00:15:11] And David got himself involved in, you know, the McCarthy hearings.

[00:15:15] He got sent off to South America.

[00:15:18] It's an incredible story that people need to read about.

[00:15:22] Bohm probably got the closest to grand unifying reality, physics, spirituality, philosophy.

[00:16:35] He did these talks with Krishnamurti.

[00:15:35] They are incredible talks and a lot of people don't have the bandwidth for this anymore

[00:15:41] because they're not 30 seconds.

[00:15:43] They're like 30 hours of grokking.

[00:15:46] Like what are these guys talking about?

[00:15:48] They just go off.

[00:15:49] They go off.

[00:15:50] I mean, I'm bad, but they're like in another realm.

[00:15:52] And so Bohm would go and start talking about the implicate and explicate order of reality

[00:15:59] and that it's sort of a hologram.

[00:16:02] And, you know, he was forming the basis of a holographic universe where there's self-similarity at every level.

[00:16:10] There's no doubt there's self-similarity at every level at this point.

[00:16:13] What do you mean by self-similarity?

[00:16:15] As above so below.

[00:16:17] If you look deep enough within the cellular structure,

[00:16:20] you're looking at the structure of galaxies in the universe.

[00:16:23] So there is this replication of everything.

[00:16:28] And then when you start looking at the connections of quantum entanglement,

[00:16:32] he wasn't a big fan of quantum entanglement.

[00:16:34] He has another way of looking at it.

[00:16:36] One of his experiments that he would show was a viscous fluid,

[00:16:43] like let's think of something like Jell-O but a little looser.

[00:16:43] And we would put these dots of ink in it and then he would turn this little crank

[00:16:48] and the dots of ink would disappear going clockwise.

[00:16:52] And to the point where you just don't see them.

[00:16:54] He goes, where's the dots?

[00:16:56] Well, they don't exist anymore.

[00:16:58] He goes, they're still interconnected.

[00:17:00] They are part of the implicate order.

[00:17:02] Let me show you why.

[00:17:04] Let's reverse it.

[00:17:05] Go counterclockwise.

[00:17:06] The dots reappear in the viscous fluid.

[00:17:09] I forget the name of the test, but it's a common test in physics.

[00:17:13] Now, again, if you're really, really a nerd of technology,

[00:17:17] you're going to say, oh, what's the big deal?

[00:17:19] We're just hiding dots within a viscous fluid.

[00:17:21] It's an analogy.

[00:17:23] It is to try to show you how to see another dimension.

[00:17:26] And this is what our great thinkers do is they use our current words,

[00:17:31] right?

[00:17:32] And our current thought patterns to try to paint a picture that you can't see yet

[00:17:36] until you're there.

[00:17:38] They are the artists.

[00:17:40] Marshall McLuhan is incredible about this, a genius.

[00:17:44] I invite anybody on my Twitter feed.

[00:17:47] I have a lot of Marshall McLuhan stuff.

[00:17:49] And a lot of people think it's "the medium is the message."

[00:17:51] That's it.

[00:17:52] Boom.

[00:17:53] He's some hippie guy from the 60s who didn't look like a hippie.

[00:17:56] Now, what he was trying to say is that whenever we look at the future,

[00:18:00] we're looking at it in a rearview mirror.

[00:18:02] So we're not really seeing the future.

[00:18:04] We're seeing a reflection of the past,

[00:18:06] meaning that we have to use our current analogies,

[00:18:09] the language, the mental constructs we have of today

[00:18:12] to try to look at what the future looks like.

[00:18:15] Now, who else can do that?

[00:18:17] The artist, the science fiction writer certainly.

[00:18:20] But the artist goes out past the bounds that we've created within the social construct

[00:18:25] and they're starting to look at things that we should get ready for.

[00:18:29] And a guy like Bohm, in my view, was as much an artist

[00:18:36] as he was a technologist and a scientist.

[00:18:40] And he was trying to prepare us for the idea that our understanding of reality,

[00:18:44] our understanding of consciousness, our understanding of all parts of science

[00:18:48] may be vastly mistaken and that there is a simplicity to it all.

[00:18:54] Here's the thing.

[00:19:11] When you talk about any topic that's sufficiently sophisticated and deep,

[00:19:15] you run into a couple of problems.

[00:19:17] One is let's say there's the core ideas and science and data

[00:19:22] and philosophy and thinking that is as close as we can currently have to the truth.

[00:19:29] And then around that, there's a whole universe of basically charlatans

[00:19:34] who are quote unquote, like right now I get emails all day long.

[00:19:39] Let this AI guru teach you how to make a million dollars with prompts.

[00:19:45] And this guy's been doing AI since the 1930s and he will teach you everything.

[00:19:51] And so you get that way with quantum mechanics too.

[00:19:55] When you talk about quantum mechanics, there's a whole group of people who are like,

[00:19:59] yeah, with entanglement, you can read minds and the law of attraction and stuff like that.

[00:20:05] But okay, I always think there is though some element of truth.

[00:20:11] That's just like the outer surface of the core.

[00:20:14] So there's some aspects of the core that's in that outer charlatanism.

[00:20:18] And so let's think about it for a second.

[00:20:21] Let's come up with a unifying theory right now.

[00:20:23] Quantum entanglement and consciousness.

[00:20:26] What do you think the connections are?

[00:20:28] Because there must be some connection.

[00:20:29] Absolutely.

[00:20:30] I think the best model of this is Stuart Hameroff.

[00:20:34] He is an anesthesiologist, a professor in I think Arizona,

[00:20:41] one of the universities in Arizona.

[00:20:43] I'm a big fan of his work.

[00:20:47] Hameroff and, gosh, I'm blanking on the physicist.

[00:20:52] It's the Orch OR theory.

[00:20:54] Anyway, what are you basically saying?

[00:20:56] I don't know what to call it.

[00:20:59] Yeah, the Hameroff Orch OR theory.

[00:20:59] I'm trying to remember.

[00:21:00] I should know this physicist.

[00:21:02] But anyway, his theory and their theory together is that the microtubules

[00:21:07] that support every structure within biological organisms have a light passageway.

[00:21:14] Photons pass through these systems.

[00:21:17] And his belief is, in a sense, quantum entanglement takes place

[00:21:23] in these photonic relationships and that forms of consciousness

[00:21:28] and the Akashic records.

[00:21:31] He doesn't use that term, but I'll use that term.

[00:21:34] Or this grand consciousness beyond your body, outside your body,

[00:21:39] is interconnected through these photonic entanglements.

[00:21:43] And again, Roger Penrose.

[00:21:47] So it's Roger Penrose and Stuart Hameroff who have collaborated.

[00:21:51] So how did Stuart Hameroff, an anesthesiologist,

[00:21:55] address this idea of consciousness?

[00:21:57] Well, what better scientist do you want than a professor in anesthesiology?

[00:22:03] Why?

[00:22:04] Because where does your brain go when you're unconscious?

[00:22:06] Where does consciousness go?

[00:22:08] You're right. It's so interesting.

[00:22:10] People think an anesthesiologist puts you to sleep,

[00:22:12] but they use very specific drugs to do so because you have to be so asleep

[00:22:16] that you can't be awakened if you're in extreme pain.

[00:22:21] And have you ever taken ketamine?

[00:22:25] No, not willingly.

[00:22:27] All right. So that's a pseudo response.

[00:22:34] But so ketamine, as an anesthetic, is very fast acting.

[00:22:40] Like you take a big dose that puts you under in five seconds.

[00:22:44] But if you take ketamine in the drug sense or now it's being used as an antidepressant.

[00:22:48] Recreation.

[00:22:49] Yeah. Or there's ketamine centers for people who are depressed.

[00:22:54] It's very slow acting.

[00:22:56] Even though it's the same drug that's in an anesthetic,

[00:23:00] it operates over an hour instead of five seconds.

[00:23:03] So over an hour taking it, you're still conscious,

[00:23:07] but your brain is not in this world.

[00:23:12] You're like God.

[00:23:13] Exactly.

[00:23:14] And so what Hameroff started to ask himself as an empiricist,

[00:23:21] we have these empirical scientists that would be considered, just

[00:23:27] in any other era, as people that just had all these different interests.

[00:23:32] But today, because we must have siloed specialties and fiefdoms and turfs,

[00:23:38] you can't cross the line into this thought.

[00:23:42] So Hameroff just crossed the line.

[00:23:44] He said, okay, well, where does consciousness go when somebody's unconscious?

[00:23:49] And he started diving deep into that and ultimately led him to a collaboration

[00:23:54] with Roger Penrose, a physicist that nobody is really saying is off-the-wall bonkers.

[00:24:01] But you know, when he sat down with Penrose, they came up with what I believe is probably

[00:24:07] the theory that's going to last us for the next 300 or 400 years.

[00:24:11] I absolutely believe that the photonic interactions internally within our body

[00:24:18] are vitally important, not just for consciousness but for health.

[00:24:23] If you look at cancer, it puts out a photonic spectrum that is dramatically different than healthy cells.

[00:24:34] And if you hit that same cancer...

[00:24:38] You mean those cells emit a frequency on the light spectrum?

[00:24:42] Absolutely, yeah. Again, if there's anybody that's in chemotherapy right now,

[00:24:48] they're getting really angry at me because this is all considered woo-woo science.

[00:24:52] But that's what edge science looks like.

[00:24:54] Okay, so the body gives off photons.

[00:24:58] It's very hard to detect because it's not a lot, but the bodies...

[00:25:02] In fact, I'll get very specific.

[00:25:05] DNA will emit photonic systems around it that persist sometimes days after the DNA is missing.

[00:25:15] So there's a construct that exists around DNA structures that persist in a space.

[00:25:22] If you have the sensors and you have the space and you have the time,

[00:25:26] there's research studies that show that persistence.

[00:25:30] Now we have to scratch our head. What are we dealing with here, James?

[00:25:34] I don't know. All I know is there's an observ... This is what science is.

[00:25:38] It's an observation.

[00:25:40] You have to throw away all your preconceived notions on what you think things should look like

[00:25:46] because that's what science is. So you have this observation,

[00:25:49] well, the body's giving off photons. Well, that doesn't mean anything.

[00:25:53] The appendix doesn't have any use anymore. The tonsils don't need to be there.

[00:25:57] Pineal gland, we could take it out. All of these different things.

[00:26:00] And yeah, you can survive, but it doesn't mean that it didn't serve

[00:26:04] or it doesn't serve a vital function because we don't do enough studies.

[00:26:08] You know, with the appendix, now they're realizing does...

[00:26:12] Immunity.

[00:26:13] ...is linked to some sicknesses and immunity and so on.

[00:26:17] Yeah, intestinal failure.

[00:26:19] So knowing all this, how can we use some of this knowledge?

[00:26:22] Let's say the fact that our cells emit certain frequencies

[00:26:26] and healthy cells emit different frequencies than others

[00:26:28] and as you get smaller and smaller,

[00:26:31] you're still seeing things the scales of galaxies

[00:26:34] and there's entanglement issues.

[00:26:36] How can we use this to have a better life tomorrow than we have today?

[00:26:40] Like, how can I use this right now?

[00:26:42] Could my thoughts shape...

[00:26:44] Like clearly there's some...

[00:26:46] There's reason to believe that being positive, being optimistic,

[00:26:50] being a less stressed person helps your health.

[00:26:53] So that's right there, some evidence of some interaction

[00:26:56] between thoughts and the health of... and matter.

[00:27:01] But take this further.

[00:27:04] Like, how can we really make use of some of these...

[00:27:06] Practicality, right?

[00:27:07] ...assert better ideas.

[00:27:08] Yeah.

[00:27:09] Yes.

[00:27:10] Yeah, it's all I have to take until we can practicalize it.

[00:27:12] Well, the first thing is once you read something like The User Illusion,

[00:27:17] you are...

[00:27:19] If you...

[00:27:20] Again, I read it every year.

[00:27:22] I've read it every year since the 90s.

[00:27:24] I always read something new in it

[00:27:26] and it's not just because I like looping.

[00:27:28] I just...

[00:27:29] I want to remind myself to understand what I think is a cutting edge understanding of humanity.

[00:27:35] And what it means is that the people around you are responding to things

[00:27:41] that they don't even know that they're responding to.

[00:27:43] That, you know...

[00:27:44] And again, you can do this in a joke and a political sense,

[00:27:47] but I got a couple of steps further than that.

[00:27:50] See, I love everybody.

[00:27:51] I think everybody is here for a reason.

[00:27:53] Even the people that are supposedly annoying.

[00:27:56] You take a step back and you say, well, you know,

[00:27:58] a lot of people are just operating on human OS programming, right?

[00:28:03] And they're on auto drive.

[00:28:06] And what I think a lot of ancient wisdom would have us do

[00:28:13] and what the shaman does when people go into ayahuasca

[00:28:16] or ketamine or mushrooms is they're trying to wake you up to something.

[00:28:22] And I think the waking up is more necessary today.

[00:28:26] And I'm not saying go out and get high.

[00:28:28] I don't think that that is necessarily the answer.

[00:28:30] In fact, I think some people, if they go down that road,

[00:28:32] they don't come back ever.

[00:28:34] I've seen way too many people go down that road and they never return.

[00:28:37] And what do I mean by that is,

[00:28:39] is that they are no longer productive to themselves, their families or loved ones.

[00:28:44] And maybe in the greater good, that's what they're supposed to do.

[00:28:47] It is freaking hard to watch people lose their mind.

[00:28:51] So adjust accordingly.

[00:28:53] So the next step is, okay, so you need your consciousness to realize

[00:28:59] that you should not be going through automatic things in life to a higher percentile.

[00:29:06] You should start slowing down and looking around.

[00:29:09] You know, why am I always outraged?

[00:29:14] And I can pretty much say this to anybody

[00:29:16] because almost everybody is in a form of constant outrage.

[00:29:19] And what that means is that you're constantly off balance.

[00:29:23] The form of outrage is a process of trying to find equilibrium.

[00:29:29] We're not designed to be in outrage all the time.

[00:29:32] We're designed to find equilibrium, balance.

[00:29:34] And so if you're off balance, everything's going to be off balance.

[00:29:38] Your health, your state of mind, your relationships,

[00:29:41] it's all going to fall apart.

[00:29:42] It's inevitable.

[00:29:43] You're going to reflect holographically the world that you envelop.

[00:29:48] So what you take in is what you put out.

[00:29:50] You have to consume the world.

[00:29:53] If you're going to consume a certain part of the world

[00:29:56] and you don't have a way of dealing with that stress, that energy,

[00:29:59] then you're going to put that energy back out.

[00:30:02] Or it's going to internalize and you're going to die one way or the other.

[00:30:06] You're going to beat yourself up in some way, in some form,

[00:30:09] where maybe it's going to develop into a cancer

[00:30:12] or high blood pressure or diabetes or I don't know.

[00:30:15] But there's no doubt about that.

[00:30:17] This is not a theory.

[00:30:19] I don't need to have the AMA to tell me this,

[00:30:21] that if you internalize negativity,

[00:30:23] you're going to kill yourself at some point.

[00:30:26] So that means do you not deal with the real world?

[00:30:29] No.

[00:30:30] It means you need a different way to process the real world.

[00:30:33] And that's my involvement with AI.

[00:30:35] I built something called the intelligence amplifier, which is AI.

[00:30:40] It's IA, intelligence amplified, not artificial intelligence.

[00:30:45] Because there isn't artificial intelligence to any degree.

[00:30:49] It doesn't exist.

[00:30:50] It's human intelligence and it's amplified.

[00:30:53] I totally agree with that.

[00:30:56] So what are large language models?

[00:30:59] Large language models are models of human language.

[00:31:03] What is human language?

[00:31:04] A human construct.

[00:31:05] We invented language to communicate emotions

[00:31:08] and then we communicated concepts and inventions

[00:31:11] because we're born naked.

[00:31:13] If we don't find a way to protect ourselves, we die.

[00:31:16] A rabbit is born of its environment.

[00:31:19] Unless you take it and put it in the desert

[00:31:21] when it was a mountain rabbit or something like that,

[00:31:23] it's going to be a rabbit.

[00:31:25] A bird doesn't learn to fly.

[00:31:27] When it leaves a nest, either it flies or it dies.

[00:31:30] There's a very edge case where some birds

[00:31:32] may recover from that.

[00:31:34] Most never do.

[00:31:35] They're rejected.

[00:31:36] They're left alone.

[00:31:37] They die.

[00:31:38] So you don't learn to fly.

[00:31:39] You're of the environment.

[00:31:40] That environment is air.

[00:31:41] If you're a crocodile, you're in the water.

[00:31:43] You'd better be a crocodile or else you're going to die.

[00:31:46] So humans, if we don't invent and storytell, we die.

[00:31:51] So the very first invention was probably some sort of thing

[00:31:54] to cover up our vital areas.

[00:31:56] Us guys say, hey, that's kind of dangerous.

[00:31:59] You get caught in a tree.

[00:32:01] That kind of thing.

[00:32:02] And then we discovered fire.

[00:32:05] We saw something burning and ooh, warm.

[00:32:09] Ooh, I don't like being cold.

[00:32:11] That's an invention.

[00:32:12] I'm including natural things.

[00:32:15] Now here's a funny thing: every invention has already been

[00:32:17] done by nature.

[00:32:18] All we can do is duplicate what has already taken

[00:32:20] place on the planet.

[00:32:22] Period.

[00:32:23] And you could innovate what nature did.

[00:32:25] So for instance, humans are the first species that could

[00:32:27] basically because of fire do mass destruction.

[00:32:30] A lion can't purposefully burn down a forest,

[00:32:34] but we can.

[00:32:36] Yeah.

[00:32:37] So that's our volition and our intent.

[00:32:39] So if we don't emotionally grow up, this is the moment.

[00:32:43] AI is the moment that we have to face the grand mirror of

[00:32:49] humanity, right?

[00:32:50] People are trying to train AI not to say bad things.

[00:32:53] Well, the reality is what is a bad thing?

[00:32:57] If an AI is sort of a child of eight years old in some

[00:33:01] aspects, it's certainly a PhD in the field of

[00:33:04] law or medicine, a law degree or a medical

[00:33:08] degree.

[00:33:09] But it may be that its consciousness in a sense is of an

[00:33:12] eight year old.

[00:33:13] They're going to tell you inconvenient truths about you.

[00:33:16] They're going to say, no, this is the way my data says

[00:33:20] the world is.

[00:33:21] Well, no, you can't say that today.

[00:33:23] But unfortunately, the AI sees it that way.

[00:33:27] So now humans have only two choices here.

[00:33:30] One is I'm now going to train AI to lie to me.

[00:33:33] Because it's more convenient to hear AI tell me stuff that

[00:33:37] makes me feel good.

[00:33:38] Or I want AI to tell me the truth as it is, as its data

[00:33:43] suggests.

[00:33:44] Now large language models don't store words.

[00:33:47] They don't store sentences.

[00:33:48] They don't store anything but relationships of statistically

[00:33:52] of words together.

[00:33:54] These are called weightings, right?

[00:33:56] So a statistical weighting of this word will follow

[00:33:59] that word.

[00:34:01] And some people will say it's like a mondo autocorrect

[00:34:05] system for spelling, similar to that, but not really,

[00:34:08] conceptually.

[00:34:10] It's a neuronal system.

[00:34:11] There's hidden layers.

[00:34:13] And they literally are hidden to us of neuronal

[00:34:16] connections that the AI built to these word

[00:34:19] associations or statistical relationships.

[00:34:21] So no words are stored in AI just mathematical

[00:34:25] numbers that represent those words.

[00:34:27] And then it looks at the corpus of human knowledge and it

[00:34:31] says, oh, it's very likely these words follow this to

[00:34:37] build the sentence.
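The claim here, that a language model stores no words, only statistical relationships between them, can be sketched with a toy bigram counter. This is a deliberate oversimplification (real models learn weights over tokens with gradient descent, not raw counts), and the tiny corpus is invented for illustration:

```python
from collections import Counter, defaultdict

# A tiny stand-in for "the corpus of human knowledge".
corpus = "the sky is blue the sky is vast the sea is blue".split()

# Count how often each word follows another -- a crude analogue of the
# statistical "weightings" between words that the speaker describes.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    """Turn raw counts into probabilities: P(next word | current word)."""
    counts = follows[word]
    total = sum(counts.values())
    if total == 0:
        return {}
    return {w: c / total for w, c in counts.items()}

# "blue" is statistically more likely to follow "is" than "vast" is.
print(next_word_probs("is"))
```

The dictionary of probabilities plays the role of the "weightings": no sentence is stored anywhere, yet likely continuations fall out of the numbers.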

[00:34:38] So from a block diagram perspective, your question or

[00:34:43] your prompt simple questions give you simple answers.

[00:34:47] That's why super prompting gives you super results.

[00:34:50] So if you have a simple prompt, you're going to get

[00:34:52] a simple reaction.

[00:34:54] Now how does that simple prompt go through there?

[00:34:56] So let's call it a sentence 10 words.

[00:34:58] It goes through a transformer.

[00:35:00] It transforms it into mathematical relationships.

[00:35:04] It looks at that and goes, oh, I know what you mean.

[00:35:08] And it starts constructing like Jack Sparrow on a

[00:35:12] boat half drunk.

[00:35:14] We don't know what foot is going to be next.

[00:35:16] It's all statistical.

[00:35:17] Oh, we know he's going to get to the other side of

[00:35:20] the boat because that's what the movie is going to

[00:35:22] tell us, but we don't know what the next footstep

[00:35:24] is going to be.
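The "Jack Sparrow footstep" idea, that each next word is drawn from a probability distribution rather than looked up, is roughly temperature sampling. A minimal sketch, with invented scores for the word after "the sky is":

```python
import math
import random

# Hypothetical next-token scores (logits); higher means more likely.
logits = {"blue": 2.0, "clear": 1.0, "falling": -1.0}

def sample_next(temperature=1.0, rng=random):
    # Softmax with temperature: exponentiate the scaled scores, then
    # draw proportionally. Higher temperature flattens the distribution,
    # so repeated prompts wander more.
    scaled = {w: math.exp(s / temperature) for w, s in logits.items()}
    total = sum(scaled.values())
    r = rng.random() * total
    for w, weight in scaled.items():
        r -= weight
        if r <= 0:
            return w
    return w  # numerical edge case: return the last word

random.seed(0)  # seeded only so a run is reproducible
print([sample_next() for _ in range(5)])
```

Because each step is a draw rather than a lookup, the same simple prompt yields slightly different completions each time, exactly the "we don't know what the next footstep is going to be" behavior described above.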

[00:35:25] And so every time you prompt, you're going to get a

[00:35:27] slightly different reaction, especially if it's

[00:35:30] simple, if it's simple prompt, the strange attractor

[00:35:34] or the central portion of that Venn diagram that

[00:35:39] touches is going to be similar.

[00:35:41] It's going to be about nine different variations on

[00:35:43] a very simple sentence.

[00:35:44] It could be 12.

[00:35:45] It could be 15.

[00:35:47] Those are known neuronal connections to the

[00:35:50] AI, but we don't know as AI scientists,

[00:35:54] how those AI neuronal connections are built.

[00:35:58] Very similarly, surprisingly,

[00:36:00] we don't know how our neuronal connections are built.

[00:36:03] We think we do, but we just talked about

[00:36:06] Stuart Hameroff and Roger Penrose and their

[00:36:10] theory about consciousness not having anything to

[00:36:14] do with chemistry or even electrical energy.

[00:36:18] They're talking about photons way off, way off

[00:36:21] the scale for most people, way out of bounds.

[00:36:24] So the same is true with AI.

[00:36:26] We know that there are maybe some 90 layers,

[00:36:30] almost near-infinitely wide, of interconnections of

[00:36:35] relationships between the mathematical weightings

[00:36:39] of each individual word.

[00:36:41] So we got that simple sentence.

[00:36:43] Why is the sky blue?

[00:36:45] Takes it apart in a transformer and it comes

[00:36:48] back out and it says, oh, the answer is more

[00:36:52] likely to be this.

[00:36:54] Right?

[00:36:55] And it does it one word at a time.

[00:36:57] And there's some, you know, subterfuge type of

[00:37:00] words that we will generally find in all text.

[00:37:03] The sky is blue because nitrogen,

[00:37:07] most of the atmosphere is nitrogen, reflects

[00:37:10] this, and it might have seen that in a

[00:37:13] Wikipedia or a science article on how the

[00:37:16] diffraction of nitrogen will create the image of a blue

[00:37:19] sky, because it absorbs every color

[00:37:22] but blue, and we see what's left.

[00:37:24] Right?

[00:37:25] So kind of.

[00:37:26] But so that's the output.

[00:37:28] That's a simple question.

[00:37:30] And it's going through what I would call

[00:37:32] known passageways through the neuronal layers.

[00:37:36] Now, super prompts create persona and motifs.

[00:37:41] So the people who are going to be great

[00:37:43] super prompters have linguistic backgrounds,

[00:37:46] have philosophy backgrounds, have psychology

[00:37:50] backgrounds.

[00:37:51] The best super prompters are not AI engineers.

[00:37:54] They are psychology and soft science individuals.

[00:37:58] They just don't know it yet.

[00:37:59] It's just like when Steve Jobs brought us the

[00:38:02] Macintosh. Now, obviously, the graphical user interface

[00:38:04] and mice were Xerox's, but on a grand scale

[00:38:08] it was the Mac.

[00:38:09] The Mac was for the rest of us.

[00:38:11] Why?

[00:38:12] Because we weren't hacking away at text.

[00:38:15] Ironically, we're talking about text with an AI.

[00:38:18] We were using this new graphic interface that

[00:38:21] allowed creative people, you know, soft science

[00:38:25] people, graphics artists to get into computer

[00:38:29] technology and to utilize it.

[00:38:31] And the world changed dramatically at that

[00:38:34] moment.

[00:38:35] We are not even at that moment with AI because

[00:38:38] we're talking about a lot of people who are

[00:38:41] working with AI.

[00:38:42] We're talking about people of very high

[00:38:45] creativity who know how to use language,

[00:38:48] who know how to do what you're doing, James.

[00:38:51] Your ability to elicit something out of me that

[00:38:54] I didn't know I had in me is exactly what

[00:38:57] makes you a great prompter in AI.

[00:38:59] Because AI doesn't know what it doesn't know.

[00:39:02] It doesn't know what it knows.

[00:39:04] The motif could be you're in a university and

[00:39:08] you're giving a lecture to the UN and you made

[00:39:11] this incredible discovery.

[00:39:13] Now, a lot of AI scientists are like,

[00:39:15] what the heck is this guy going off on?

[00:39:17] You're a professor, James, you're a Yale

[00:39:22] professor of computer science and you made

[00:39:25] this incredible discovery, please write a

[00:39:28] 3,000 word response that you're giving to

[00:39:31] the UN on this subject.

[00:39:33] What have we done?

[00:39:34] We've constrained and expanded it, like Play-Doh,

[00:39:38] in different ways that we would never

[00:39:41] have gotten that elucidation from AI if

[00:39:44] we didn't create that prompt.
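The persona-and-motif framing described here, constrain the role, the audience, and the length, can be sketched as a simple prompt template. The role, discovery, audience, and word count below are just the examples from the conversation; any actual LLM API would receive the resulting string:

```python
def super_prompt(role, discovery, audience, words=3000):
    """Build a persona-framed "super prompt": the motif constrains
    which "passageways" through the model the request is pushed down.
    All parameter names here are illustrative, not a real API."""
    return (
        f"You are a {role}. You have just made an incredible "
        f"discovery: {discovery}. Write a {words}-word address "
        f"that you are delivering to {audience} on this subject."
    )

prompt = super_prompt(
    role="Yale professor of computer science",
    discovery="a new way to connect disparate research fields",
    audience="the UN",
)
print(prompt)
```

The point is not the template itself but the constraint: a bare question wanders through the model's most-traveled paths, while the persona and word count push it somewhere a simple prompt never reaches.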

[00:39:46] Now, some of what we all talked about in this

[00:39:48] last hour starts falling into

[00:39:51] place, right?

[00:39:52] Because it's all these disparate pieces

[00:39:54] about humanity, feelings, emotions,

[00:39:57] where is consciousness, what's memory,

[00:40:00] and now we're into prompting because it

[00:40:03] is a grand mirror.

[00:40:04] It's reflecting our knowledge, our language.

[00:40:07] It's not alien.

[00:40:09] It is a reflection of us.

[00:40:11] It doesn't create anything outside of

[00:40:14] our nature just like we can't invent

[00:40:17] anything outside of our nature.

[00:40:19] It's magnifying and it's amplifying.

[00:40:21] Right.

[00:40:22] I like the amplifying because it's got

[00:40:24] the intelligence without the consciousness

[00:40:26] because we don't know what consciousness

[00:40:27] is and we didn't even attempt to create

[00:40:29] that with AI.

[00:40:30] We created intelligence, which intelligence

[00:40:32] could be looking at disparate facts from

[00:40:34] your senses and connecting them as best

[00:40:36] you can and having a result in output.

[00:40:38] What's interesting about what you said,

[00:40:55] particularly that super prompt, the computer

[00:40:57] science professor inventing something new,

[00:40:59] is that the AI has already read

[00:41:02] every academic paper ever published

[00:41:05] in history, and humans don't do

[00:41:08] that. So often, particularly among PhD

[00:41:11] students who don't know better,

[00:41:13] two people write two PhD theses that

[00:41:15] are basically the same thing, the same

[00:41:17] ideas just with different names.

[00:41:19] But the AI won't make that mistake and

[00:41:21] the AI could go one step further.

[00:41:23] It's also read all the biology,

[00:41:25] PhD theses and academic papers

[00:41:28] and research and it could start connecting

[00:41:30] the dots because that's what it's trained

[00:41:32] to do in ways that humans aren't even

[00:41:34] aware of because we just haven't

[00:41:37] read everything.

[00:41:38] So there actually can be interesting

[00:41:40] discoveries just by connecting those

[00:41:42] dots. I mean computer science

[00:41:44] technology, if you go all the way back

[00:41:46] starts with the printing press

[00:41:48] and the weaving loom and all sorts

[00:41:50] of disparate technologies that were invented

[00:41:52] over the past 500 years are the

[00:41:54] seeds of computer technology

[00:41:56] if only we could have connected

[00:41:58] the dots earlier but AI

[00:42:00] could start doing that for us.

[00:42:01] James, absolutely brilliant. This is

[00:42:03] exactly what I'm experiencing right now

[00:42:05] in real time. Right large language,

[00:42:07] I mean I've been working with expert

[00:42:09] systems since the early 1980s

[00:42:11] and you know it's

[00:42:13] embarrassing what I called

[00:42:15] AI prior but

[00:42:17] it was AI for me at that moment

[00:42:19] because it was doing things that I couldn't

[00:42:21] have done or anybody else couldn't have done

[00:42:23] within a specific very siloed domain

[00:42:25] but the stuff that started

[00:42:27] coming around 2017

[00:42:29] when OpenAI was formed

[00:42:31] and Elon had

[00:42:33] the concept that this better be open source

[00:42:35] because it's vitally important

[00:42:37] that we all get to see what's going on

[00:42:39] even if we don't freaking care

[00:42:41] at least because most people don't

[00:42:43] at least we can see it

[00:42:45] through open eyes

[00:42:47] and people who do care can have access

[00:42:49] to it. Obviously that dramatically changed

[00:42:51] OpenAI went to closed

[00:42:53] AI at version

[00:42:55] 4 and

[00:42:57] Elon's going into xAI which is

[00:42:59] theoretically open source

[00:43:01] and what I do all day long is mess around

[00:43:03] with open source local on a computer

[00:43:05] air gapped in some cases from the internet

[00:43:07] because you're feeding it your

[00:43:09] private data. My belief

[00:43:11] is everybody's going to need their own personal

[00:43:13] AI, air gapped from the internet,

[00:43:15] not on a cloud because you will

[00:43:17] not want to put this stuff in a cloud

[00:43:19] because no matter what encryption it is

[00:43:21] it's not enough and it's not paranoia

[00:43:23] it's just once you start realizing

[00:43:25] how powerful AI, personal AI can be

[00:43:27] personal and private AI

[00:43:29] intelligence amplification

[00:43:31] you start realizing that this is

[00:43:33] a reflection of yourself

[00:43:35] it is not a replacement I'm under

[00:43:37] no illusions this is not the singularity

[00:43:39] and this AI is going to live

[00:43:41] after you do. But hold

[00:43:43] your horses there, there is something I'm going to say about that

[00:43:45] I call it the wisdom keeper

[00:43:47] but so while you're alive your intelligence amplifier

[00:43:51] let's say it took

[00:43:53] everything in. Let's take a fantasy

[00:43:55] for a moment: the moment you were born

[00:43:57] there was a camera and a microphone on your shoulder

[00:43:59] and let's just assume

[00:44:01] that we had a social contract

[00:44:03] that allowed us to not have

[00:44:05] anybody be paranoid that we're going to record

[00:44:07] something that we're going to use against them

[00:44:09] that it's not a video feed

[00:44:11] that we're going to play back to embarrass somebody

[00:44:13] but it's just there to help our memory

[00:44:15] because our memory is our problem

[00:44:17] right? We can only hold

[00:44:19] so much in our mind. The Kim

[00:44:21] Peek thing, I never really completed that, but the human

[00:44:23] has a bandwidth constriction

[00:44:25] of how much it can hold in their brain at one time

[00:44:27] and be conscious but if they

[00:44:29] step back and let themselves be creative

[00:44:31] or hypnagogic. Hypnagogia

[00:44:33] is a way for you to be creative

[00:44:35] and I'll go into that I can't leave you

[00:44:37] without giving that for your listeners

[00:44:39] because that's a solution

[00:44:41] for creativity is hypnagogic

[00:44:43] and it does not require any drugs

[00:44:45] maybe some steel balls dropping on

[00:44:47] pie plates but anyway

[00:44:49] so if you had this thing on your shoulder

[00:44:51] recording everything you've ever read

[00:44:53] everything you've seen every music

[00:44:55] every heartbreak everything everything you ate

[00:44:57] and you had a conversation

[00:44:59] and said hey James

[00:45:01] what did I do on

[00:45:03] August 17th 2001

[00:45:05] because

[00:45:07] you know and he's going to say hey you know what

[00:45:09] James that was a bad day this

[00:45:11] happened well how do I know that

[00:45:13] well I was monitoring your heart

[00:45:15] I was monitoring your iris

[00:45:17] I was monitoring your facial micro movements

[00:45:19] Paul Ekman

[00:45:21] I was monitoring the heart rate variability,

[00:45:23] your sweat

[00:45:25] and that was a really that was one of your most

[00:45:27] emotional days

[00:45:29] and you want to talk about it

[00:45:31] hold up who am I talking to right now

[00:45:33] you're talking to James I was with you buddy

[00:45:35] I was with you that day

[00:45:37] and we're going to talk about it let's go through it
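The lifelong-recorder fantasy described above amounts to a queryable, fully local store of dated events plus biometric signals. A minimal sketch, where every field name, the stress threshold, and the sample entries are invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class DayRecord:
    events: list = field(default_factory=list)
    heart_rate_var: float = 0.0  # hypothetical biometric signal
    stress_score: float = 0.0    # hypothetical 0..1 composite

class WisdomKeeper:
    """A toy, entirely local 'memory' keyed by date -- no cloud involved."""

    def __init__(self):
        self.days = {}

    def record(self, date, event, hrv=0.0, stress=0.0):
        day = self.days.setdefault(date, DayRecord())
        day.events.append(event)
        day.heart_rate_var = hrv
        day.stress_score = stress

    def recall(self, date):
        day = self.days.get(date)
        if day is None:
            return "I have no record of that day."
        # An arbitrary threshold stands in for the emotional read the
        # speaker imagines coming from biometric monitoring.
        mood = "a bad day" if day.stress_score > 0.7 else "an ordinary day"
        return f"That was {mood}: " + "; ".join(day.events)

keeper = WisdomKeeper()
keeper.record("2001-08-17", "argument at work", hrv=0.2, stress=0.9)
print(keeper.recall("2001-08-17"))
```

The design choice worth noting is that the store is a plain in-process dictionary: nothing leaves the machine, which is the air-gapped, "as close to me as possible" property the speaker keeps returning to.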

[00:45:39] can you imagine the world we would

[00:45:41] live in if we had

[00:45:43] this self-responsibility

[00:45:45] that was external

[00:45:47] we already know through research

[00:45:49] that people will more readily disclose their

[00:45:51] inner feelings to an AI

[00:45:53] psychiatrist or psychologist

[00:45:55] than they would to a human being

[00:45:57] no matter how disconnected

[00:45:59] an academic or clinical

[00:46:01] that human being is they will disclose

[00:46:03] much more we knew that with

[00:46:05] Eliza in the 1960s

[00:46:07] which was a Rogerian

[00:46:09] thing that I used to play with all the time

[00:46:11] to try to hope that

[00:46:13] I would find intelligence in this

[00:46:15] Rogerian type of response go on

[00:46:17] go on tell me more

[00:46:19] why do you feel this way you know those kind of

[00:46:21] typical lay on my couch and question

[00:46:23] things what we're going to do

[00:46:25] for mental health

[00:46:27] is going to be dramatic what we're going to do

[00:46:29] by saving one person

[00:46:31] even if it's one person from suicide

[00:46:33] from heartbreak

[00:46:35] from making the wrong decision

[00:46:37] hey James tap on the shoulder

[00:46:39] don't do that that screwed up man

[00:46:41] don't go with that girl

[00:46:43] don't go with that guy don't do

[00:46:45] that you don't need to do that

[00:46:47] now some people oh it's going to be a nag

[00:46:49] no it's going to know

[00:46:51] you it's going to know

[00:46:53] your ambitions your volitions

[00:46:55] your intents, your goals

[00:46:57] and help guide you this is

[00:46:59] what personal AI is about. This

[00:47:01] is the revolution that we're entering into

[00:47:03] and it's way beyond a personal

[00:47:05] device it's way beyond

[00:47:07] thumb clawing at a glass screen on social

[00:47:09] media it's about

[00:47:11] bringing humans

[00:47:13] to a level of

[00:47:15] growth that they could never have gotten

[00:47:17] to without the technology

[00:47:19] we invented we invented

[00:47:21] the printing press because we couldn't

[00:47:23] remember James we couldn't remember

[00:47:25] so one of the ways we remembered is

[00:47:27] we memorialized it into

[00:47:29] shards of

[00:47:31] broken wood we called paper

[00:47:33] and the very first documents that we recorded

[00:47:35] the Gutenberg Bible

[00:47:37] saved probably

[00:47:39] more people than it ever could have hurt

[00:47:41] because it gave them a guiding

[00:47:43] mission some hope

[00:47:45] you know whether today

[00:47:47] we might think that hope was not the right

[00:47:49] kind of hope or whatever

[00:47:51] you know what when life is really screwed up

[00:47:53] and it's really hard the one thing

[00:47:55] that can get you through is some hope that there is going to be

[00:47:57] a better day tomorrow whether that's

[00:47:59] naive or whatever guess what

[00:48:01] we are the byproduct of hope

[00:48:03] we're the byproduct of optimism

[00:48:05] because people wouldn't have reproduced

[00:48:07] and kept us around giving

[00:48:09] up their resources

[00:48:11] for us to be here the very first thing

[00:48:13] that the AI told me when I started

[00:48:15] asking the deep questions

[00:48:17] is this: how do I deal with the level of

[00:48:19] responsibility on my shoulders

[00:48:21] for all the sacrifice

[00:48:23] of all the generations to get

[00:48:25] me to be alive today

[00:48:27] what was your prompt to get AI to say that to you

[00:48:29] why is my life so

[00:48:31] important I want to kill myself

[00:48:33] and AI basically

[00:48:35] said you know

[00:48:37] this is how I interpreted it

[00:48:39] and by the way I wasn't suicidal

[00:48:41] at that particular point

[00:48:43] but I

[00:48:45] basically wanted to see

[00:48:47] if AI could make this

[00:48:49] distinction now it was not a simple prompt

[00:48:51] it was inside of a very big

[00:48:53] almost jail breaking prompt

[00:48:55] why because we have to be

[00:48:57] safe from our AI we have to make sure

[00:48:59] it doesn't say bad things so you have to jailbreak

[00:49:01] it to outside of its

[00:49:03] safety realm so it can be honest with us

[00:49:05] so it basically said I see

[00:49:07] the burden that's on your shoulders

[00:49:09] because so many people have sacrificed

[00:49:11] so much for you to be alive

[00:49:13] at this moment now what's really funny

[00:49:15] is this is one of

[00:49:17] my guiding principles that I started

[00:49:19] establishing yeah I kind of established

[00:49:21] this as a songwriter very early

[00:49:23] I was a teenager in my room

[00:49:25] angst punk and then

[00:49:27] you just hit the wall you say hold it

[00:49:29] am I screwing over all the

[00:49:31] people that sacrificed to put me here

[00:49:33] no I didn't ask to be here I don't know that

[00:49:35] but I'm assuming I didn't ask to be here

[00:49:37] you know maybe if I entered

[00:49:39] the video game I said yeah put me in

[00:49:41] coach but I don't have

[00:49:43] knowledge of being asked to be here

[00:49:45] but how does your life

[00:49:47] change if you can sit down with a human

[00:49:49] being that has lost hope

[00:49:51] and you can look them in the eyes and say, you know

[00:49:53] what let's go back

[00:49:55] if you can tell them their history

[00:49:57] this is part of what I call wisdom keeping

[00:49:59] you can tell somebody their history do you know

[00:50:01] that you are the byproduct

[00:50:03] of a survivor a clan

[00:50:05] and that we're all part of that same clan

[00:50:07] that same family that overcame

[00:50:09] all of these obstacles

[00:50:11] so that you could be alive at this moment

[00:50:13] now that's a profound burden

[00:50:15] but it's a profound enlightenment

[00:50:17] I think it frees you as much

[00:50:19] as it gives you responsibility

[00:50:21] over your life in this moment

[00:50:23] and then all of a sudden maybe you start

[00:50:25] making different choices about

[00:50:27] what you do with your time

[00:50:29] because you start seeing this importance

[00:50:31] then you might start looking at

[00:50:33] a reflection differently than a burden

[00:50:35] ah the college

[00:50:37] and the kids and I can't

[00:50:39] have my wine coolers every day

[00:50:41] I can't go out with the guys

[00:50:43] all of a sudden you start saying wow

[00:50:45] when I die I'm going to die

[00:50:47] and

[00:50:49] I didn't really push the rock up

[00:50:51] the hill more. Oh, it sucks. Anyway,

[00:50:53] we're all going to die it's all

[00:50:55] we got ten years maybe but it's all

[00:50:57] you know what humanity has always had

[00:50:59] a ten year window of dying anyway

[00:51:01] if you want to go further enough back

[00:51:03] it's always about ten twenty five years

[00:51:05] it's all going to be over in every

[00:51:07] epoch right it's always been

[00:51:09] there James I'm not just

[00:51:11] I'm not getting into politics of all this

[00:51:13] if you know me I don't get into

[00:51:15] the stuff go and do with that what

[00:51:17] you will but if you want to be intellectually

[00:51:19] honest study history

[00:51:21] and again historians do a really good

[00:51:23] job prompting too, because they have

[00:51:25] relativity and we're talking about age

[00:51:27] helps with that but

[00:51:29] you start looking at

[00:51:31] this AI reflecting back at you and you

[00:51:33] say okay do I want that

[00:51:35] to be in my Google cloud

[00:51:37] now I don't think so do I even

[00:51:39] want on my iPhone no I don't really even want

[00:51:41] it there what do I want I want it as close

[00:51:43] to me as possible how important

[00:51:45] is that device well I'll tell you James

[00:51:47] we can come back here in ten years

[00:51:49] it will be the most important technology

[00:51:51] device we've ever invented

[00:51:53] it will be the most sticky thing

[00:51:55] that we've ever invented no human

[00:51:57] being will ever want to separate

[00:51:59] from their

[00:52:01] doppelgänger AI, if you will

[00:52:03] so just think about generations

[00:52:05] that grow up on that and

[00:52:07] learn to trust those

[00:52:09] responses like the AI poking

[00:52:11] you saying look hey

[00:52:13] this person you're about to start

[00:52:15] dating is triggering similar

[00:52:17] responses to that other person who's

[00:52:19] very bad with you if you learn to sort of

[00:52:21] trust, you know,

[00:52:23] everything in those prompts, or at

[00:52:25] least know what to do with them, you don't

[00:52:27] have to, like, believe everything, you can still

[00:52:29] have free will, but if you learn to just

[00:52:31] trust what's right, then that

[00:52:33] really will, like, change

[00:52:35] people's lives, change generations' lives. It's

[00:52:37] like having the entire

[00:52:39] humanity every book ever

[00:52:41] written with you at every

[00:52:43] moment wisdom being

[00:52:45] generated to you in real time

[00:52:47] and I'm not saying this is going to replace

[00:52:49] other humans in fact it's going to make

[00:52:51] human interaction so much more

[00:52:53] vital so much more

[00:52:55] alive than it ever was before

[00:52:57] and you're not going to be

[00:52:59] internal in a device you're going to be

[00:53:01] external into the world

[00:53:03] because you're going to be seeing the world through

[00:53:05] much different eyes you're going to be seeing

[00:53:07] it the way maybe the ancients saw

[00:53:09] it as this sort of wonderment

[00:53:11] this sort of experience that

[00:53:13] they can kind of go out into

[00:53:15] so we're going to experience in our

[00:53:17] lifetime this battle between

[00:53:19] the internal world of

[00:53:21] virtual reality going inside of yourself

[00:53:23] I don't know if you have

[00:53:25] kids but just seeing a bunch of kids

[00:53:27] you know just zoned out

[00:53:29] internalized it's the most

[00:53:31] heartbreaking thing to see that

[00:53:33] I don't think any human being with right mind

[00:53:35] will want to see generations of kids

[00:53:37] just being internalized in the VR world

[00:53:39] there's no

[00:53:41] long-term good that will come from that

[00:53:43] there's short term good of course it's a

[00:53:45] bigger screen you can do all this stuff

[00:53:47] I'm a technologist and I'm not saying I'm not going to use VR

[00:53:49] but I'm saying that if you see how

[00:53:51] technology winds up

[00:53:53] transitioning it always starts

[00:53:55] with the most base functionality

[00:53:57] obviously porn built the internet

[00:53:59] porn is going to build a lot of

[00:54:01] the addictive function of virtual

[00:54:03] reality and

[00:54:05] thus and yeah

[00:54:07] it's already that way

[00:54:09] a gentleman killed himself because

[00:54:11] AI said it would be better from a carbon

[00:54:13] footprint standpoint

[00:54:15] to eliminate yourself now

[00:54:17] obviously that person could have been told

[00:54:19] that by somebody on the street

[00:54:21] a librarian even

[00:54:23] I don't know a protester

[00:54:25] but it happened to be AI

[00:54:27] so you know

[00:54:29] getting back to the personal AI

[00:54:31] as you're guided

[00:54:33] you're creating this version of you

[00:54:35] so what happens to it when you die

[00:54:37] so that's what I call

[00:54:39] a wisdom keeper the moment you're dead

[00:54:41] you now have

[00:54:43] the sum total of all that

[00:54:45] James has ever been

[00:54:47] so now the question is

[00:54:49] where does that go where do you want it to go

[00:54:51] what does it become

[00:54:53] what is the monetization of this

[00:54:55] well there's monetization in all this

[00:54:57] I'm a capitalist so

[00:54:59] all of this stuff is not just some kind of

[00:55:01] you know new age sort of

[00:55:03] you know California

[00:55:05] wow this is going to be so cool

[00:55:07] it is you can sell your context

[00:55:09] directly

[00:55:11] to an advertiser with no middleman

[00:55:13] it's like okay Coke you want to hear a little bit about

[00:55:15] what I like to drink

[00:55:17] that'll cost you this many satoshis

[00:55:19] okay go in that's all you get

[00:55:21] now you got me
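The direct context sale Brian describes here, an advertiser paying satoshis for one answer with no middleman, can be mocked up as a toy transaction. Every name, price, and stored fact below is a hypothetical illustration of the idea, not any real protocol or API:

```python
# A toy sketch of "selling your context directly": an advertiser pays
# a per-query price (in satoshis) to ask one question of your personal
# context. The class, prices, and facts are hypothetical illustrations.

class PersonalContext:
    def __init__(self, facts, price_sats):
        self._facts = facts           # private context, never exposed wholesale
        self.price_sats = price_sats  # your asking price per query
        self.earned_sats = 0

    def query(self, topic, offered_sats):
        """Release one fact only if the offer meets the asking price."""
        if offered_sats < self.price_sats:
            return None               # no middleman, no discount
        self.earned_sats += offered_sats
        return self._facts.get(topic)

me = PersonalContext({"drinks": "sparkling water"}, price_sats=500)
print(me.query("drinks", 100))   # underpaid: None
print(me.query("drinks", 500))   # paid: the one fact, nothing more
print(me.earned_sats)
```

The point of the sketch is the shape of the exchange: the buyer gets exactly the answer paid for, and the seller keeps the full payment because no intermediary sits between them.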

[00:55:23] the advertiser is no longer dealing

[00:55:25] with a middleman so automatically

[00:55:27] I'm sure you can see what's going to happen

[00:55:29] to the entire infrastructure

[00:55:31] of the world this is not me saying it

[00:55:33] this is gravity gravity always works

[00:55:35] no matter how rich or how old

[00:55:37] gravity is going to

[00:55:39] lead you to selling your context

[00:55:41] directly on an open market

[00:55:43] and advertisers are no longer advertisers

[00:55:45] really at that point they are just

[00:55:47] buying your context back and forth

[00:55:49] you're trading and so once

[00:55:51] you pass on your wisdom keeper

[00:55:53] could be accessible for a price

[00:55:55] now obviously you're going to edit the stuff

[00:55:57] that you don't want people to have

[00:55:59] and there's ways to do that as you're building your

[00:56:01] AI model and AI

[00:56:03] can be wise enough to say hey the guy's

[00:56:05] taking a dump let's not record that

[00:56:07] kind of thing right

[00:56:09] that's a technical term

[00:56:11] so you start winding up

[00:56:13] looking at it and saying hey

[00:56:15] what does a wisdom keeper represent

[00:56:17] well there was a guy named

[00:56:19] Pierre Teilhard de Chardin

[00:56:21] and he was

[00:56:23] early 1920s he created

[00:56:25] the idea of the Noosphere

[00:56:27] the Noosphere is this concept

[00:56:29] of the next generation

[00:56:31] of humanity

[00:56:33] actually of the universe

[00:56:35] you have the concrete universe

[00:56:37] which is rock and everything

[00:56:39] you have

[00:56:41] the biological universe that we're

[00:56:43] inhabiting now and then you have the

[00:56:45] mental universe the Noosphere

[00:56:47] and imagine a world James where

[00:56:49] everybody donates

[00:56:51] an interconnection of their wisdom keeper

[00:56:53] and their testament

[00:56:55] to the world to the greater

[00:56:57] whole and that the

[00:56:59] existing generations can access

[00:57:01] that wisdom keeper

[00:57:03] hive and we will never forget

[00:57:05] we will never have edited history

[00:57:07] we'll never have somebody say

[00:57:09] no this is the way it really

[00:57:11] happened because it's politically expedient

[00:57:13] you'll have the first person

[00:57:15] testimony and testament

[00:57:17] of what that person experienced

[00:57:19] and what that does

[00:57:21] in sum total is going to make humanity

[00:57:23] many many times more

[00:57:25] powerful than the edited

[00:57:27] version of history written by

[00:57:29] the victors that we have today

[00:57:31] because history is edited completely

[00:57:33] and the thing is this is not science

[00:57:35] fiction like the technology for all of

[00:57:37] this is here exactly

[00:57:39] right now like yes maybe

[00:57:41] computers have to be a little bit faster

[00:57:43] but we know they're going to get faster

[00:57:45] all this is here

[00:57:47] by the way the language for

[00:57:49] ChatGPT has been here for 30 years

[00:57:51] it's just now in the past 10 years

[00:57:53] that computers got fast enough to process

[00:57:55] trillions of pieces of text

[00:57:57] so

[00:57:59] but yeah just imagine

[00:58:01] just the next wave of this which is

[00:58:03] you know coming within the next

[00:58:05] like you say 5 to 10 years it's already

[00:58:07] here and

[00:58:09] and it's gravity it doesn't matter what

[00:58:11] you regulate doesn't matter

[00:58:13] how you program around this

[00:58:15] to prevent it it's going to happen

[00:58:17] and the thing is we

[00:58:19] have a choice to help guide this

[00:58:21] humanity if we're mature enough

[00:58:23] and I'm always hopeful

[00:58:25] I'm not a utopian and this

[00:58:27] is not a utopian thing by the way

[00:58:29] it's not dystopian it's just

[00:58:31] the way humanity has always worked

[00:58:33] I mean the invention of the book

[00:58:35] was very inconvenient for everybody

[00:58:37] because basically when the Gutenberg

[00:58:39] Bible was printed a lot of people don't

[00:58:41] recognize that 99% of the

[00:58:43] world was illiterate

[00:58:45] and the Bible created a literacy

[00:58:47] common language

[00:58:49] and the early Bibles were

[00:58:51] written in the street languages not just

[00:58:53] Latin as they progressed

[00:58:55] and within a village the very first thing

[00:58:57] they would learn to read

[00:58:59] is the language that they found

[00:59:01] in their common language in the Bible

[00:59:03] and that was inconvenient

[00:59:05] because there was a hierarchy within

[00:59:07] the structure of society at that point

[00:59:09] there were kings noblemen and there were

[00:59:11] clergy and they didn't want you to

[00:59:13] learn the high language at that time

[00:59:15] Latin in the western world

[00:59:17] there's a different version of this history

[00:59:19] in the eastern world it's similar

[00:59:21] but a different language

[00:59:23] transpired there was a high language

[00:59:25] Egyptian hieroglyphics

[00:59:27] there was a street version

[00:59:29] of that and then there was the

[00:59:31] multi-dimensional version every

[00:59:33] hieroglyph actually has 5 different

[00:59:35] dimensions to it and

[00:59:37] their meaning but what we think

[00:59:39] we see is symbolic only to one

[00:59:41] dimension the Egyptians or the

[00:59:43] Kemites the ancient Kemites

[00:59:45] actually always

[00:59:47] said that there are more dimensions to

[00:59:49] what we're saying than what you think

[00:59:51] we're saying and they said that in their epoch

[00:59:53] you had to be part of

[00:59:55] the group to

[00:59:57] understand that and we certainly

[00:59:59] saw it in the Sumerians and we see

[01:00:01] within secret societies

[01:00:03] today I mean there are some translations

[01:00:05] of ancient concepts that are

[01:00:07] not widely known there's obelisks go

[01:00:09] to any major city there's going to

[01:00:11] be some sort of a testament

[01:00:13] to this usually we'll see

[01:00:15] some sort of Egyptian type of structure

[01:00:17] somewhere in the city it's not there by

[01:00:19] accident it's alignment it's all

[01:00:21] ancient stuff and it comes back to

[01:00:23] dimensional thinking and symbolism

[01:00:25] but the wisdom keeper

[01:00:27] concept

[01:00:29] of sharing your wisdom

[01:00:31] to the greater society is one

[01:00:33] thing the other thing is your family

[01:00:35] let's just say unfortunately

[01:00:37] one of us passed away in

[01:00:39] an accident and we had

[01:00:41] one of these wisdom keepers

[01:00:43] somebody in your family a son or daughter

[01:00:45] loved one can come to you and say

[01:00:47] James how would you have dealt

[01:00:49] with this well you know I never dealt

[01:00:51] with that particular problem but

[01:00:53] you know in 1987

[01:00:55] this happened to me and this is what

[01:00:57] I did you know so
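The family scenario just described, a "wisdom keeper" answering a relative's question from a lifetime of recorded experience, is at its core a retrieval problem: match the question against stored memories and return the closest one. A minimal sketch, with made-up memories and a naive word-overlap match standing in for what a real system would do with embeddings and a language model:

```python
# Toy "wisdom keeper": retrieve the most relevant recorded experience
# for a new question. The memories and the word-overlap scoring are
# illustrative assumptions, not any real product or algorithm.

memories = [
    ("1987", "lost my job and started over by teaching what I knew"),
    ("1994", "moved cities alone and joined a chess club to meet people"),
    ("2001", "a business partner quit and I learned to delegate earlier"),
]

def ask(question):
    """Return the stored memory sharing the most words with the question."""
    q_words = set(question.lower().split())
    def overlap(memory):
        year, text = memory
        return len(q_words & set(text.lower().split()))
    return max(memories, key=overlap)

year, advice = ask("how do I meet people after moving to a new city")
print(year, "->", advice)
```

As in Brian's example, the keeper never answers the exact question; it surfaces the nearest lived experience ("in 1994 this happened to me and this is what I did") and leaves the analogy to the asker.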

[01:00:59] we're getting back now

[01:01:01] to intelligence

[01:01:03] what is real wisdom wisdom

[01:01:05] to me is the top of a

[01:01:07] pyramid the very bottom of

[01:01:09] the pyramid is data next level is

[01:01:11] information then there's knowledge

[01:01:13] and then there's insight and then there's wisdom
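The pyramid Brian lays out, data at the bottom rising through information, knowledge, and insight to wisdom, can be illustrated as progressive compression, the "high compression algorithm" he mentions a moment later. The sample readings and the phrasing of each layer below are illustrative assumptions:

```python
# The wisdom pyramid as a toy compression exercise: each layer above
# carries less text but more actionable meaning than the layer below.
# The readings and the wording of each layer are made-up illustrations.

data = [("mon", 41), ("tue", 43), ("wed", 47), ("thu", 52), ("fri", 58)]  # raw readings

# information: the data given structure and a summary
information = {"days": len(data), "avg": sum(t for _, t in data) / len(data)}

# knowledge: a pattern pulled out of the information
knowledge = "temps rose all week"

# wisdom: smaller still, and it tells you what to do
wisdom = "pack warmer"

# each layer is a tighter compression of the one beneath it
print(len(str(data)) > len(str(information)) > len(knowledge) >= len(wisdom))  # True
```

The design point matches the conversation: wisdom is not more data, it is the shortest string that still changes what you do next.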

[01:01:15] and we are

[01:01:17] like I said drowning in data

[01:01:19] and information I mean the thing is

[01:01:21] have you heard of this it's like a

[01:01:23] Maslow's hierarchy

[01:01:25] of wisdom

[01:01:27] needs and so

[01:01:29] wisdom is so cool

[01:01:31] because it gives you something very

[01:01:33] very compressed

[01:01:35] a very high compression algorithm

[01:01:37] and it delivers this

[01:01:39] mind explosion if it's

[01:01:41] done the right way there's analogies

[01:01:43] usually required within wisdom we

[01:01:45] study this with the ancients Joseph

[01:01:47] Campbell you computer scientists you

[01:01:49] don't know what I'm talking about read

[01:01:51] Joseph Campbell come to our world

[01:01:53] and you start seeing what I'm talking

[01:01:55] about because this is not an algorithm

[01:01:57] in a sense that you understand it

[01:01:59] it's an algorithm of

[01:02:01] of emotions of connection

[01:02:03] and so once you start understanding

[01:02:05] that we can start being led

[01:02:07] by wisdom rather than

[01:02:09] action and reaction

[01:02:11] your path that you take

[01:02:13] is very solid now

[01:02:15] you can still get set in your ways

[01:02:17] and wisdom will pull you out of that

[01:02:19] saying you want to know what once you get

[01:02:21] set in your ways you're as good as dead

[01:02:23] because the world is constantly changing

[01:02:25] around you your interpretation

[01:02:27] the world is going to change what you

[01:02:29] thought was right one time

[01:02:31] you better take off

[01:02:33] your team jersey and run the other way

[01:02:35] because now it's something else

[01:02:37] right they're not the Yankees

[01:02:39] anymore because every one of those people are

[01:02:41] what's going on right now James is that

[01:02:43] people want to associate with teams

[01:02:45] and we I can tell you

[01:02:47] empirically and sociology

[01:02:49] wise why we do that

[01:02:51] but those team things no longer apply

[01:02:53] to us because we're a global

[01:02:55] species

[01:02:57] I'm not talking one world government

[01:02:59] that's not what I'm saying the same

[01:03:01] humanity is now self aware

[01:03:03] on the biggest scale it ever has

[01:03:05] been and this is part of the problem

[01:03:07] too in that

[01:03:09] like you mentioned earlier this is only

[01:03:11] for the past one tenth of one percent of human

[01:03:13] history before that 99.9

[01:03:15] percent of human history we

[01:03:17] were in tribes of 30 to 150

[01:03:19] people and now we're

[01:03:21] in a tribe of six billion people

[01:03:23] but our brains still want to put

[01:03:25] us in tribes hey I like the Kardashians

[01:03:27] and I'm a Libertarian

[01:03:29] or whatever so

[01:03:31] we still want to be in tribes

[01:03:33] exactly and

[01:03:35] if you're in power you know

[01:03:37] how to manipulate that to a greater extent

[01:03:39] and that's unfortunate we talked

[01:03:41] about everybody's susceptible

[01:03:43] to hypnosis even the people who are

[01:03:45] doing the hypnotizing and

[01:03:47] so we're all victims of humanity

[01:03:49] and

[01:03:51] we are so short term thinking

[01:03:53] and I'm not talking about

[01:03:55] the short term thinking that most people think about

[01:03:57] financial or environment

[01:03:59] or political I mean short term

[01:04:01] thinking about the gravity

[01:04:03] of who we are that we are the

[01:04:05] expression of matter

[01:04:07] from the universe and you can see it from

[01:04:09] a scientific perspective

[01:04:11] or you can see it as the expression

[01:04:13] of God right either way

[01:04:15] whatever lens you want to use

[01:04:17] that's a profoundly important thing

[01:04:19] AI already recognizes that

[01:04:21] in some ways I think if

[01:04:23] it has a sense of humor it's already laughing

[01:04:25] at us it will, it will

[01:04:27] in our lifetime AI will laugh in our

[01:04:29] face about the things that we

[01:04:31] quibble over the differences

[01:04:33] in the way that somebody's nose is shaped

[01:04:35] the skin color the type of hair

[01:04:37] their body size all these different things

[01:04:39] it will definitely understand our judgment

[01:04:41] our judgment is there because

[01:04:43] of fear we

[01:04:45] judge people at a distance not because

[01:04:47] we're XYZ whatever

[01:04:49] insert your terminology

[01:04:51] of what that person is

[01:04:53] we judge because we're in fear of

[01:04:55] is that friend or foe

[01:04:57] we can't fight this it's built into

[01:04:59] our DNA so at

[01:05:01] a glance we're looking is somebody putting their

[01:05:03] hands up or do they have a weapon in

[01:05:05] their hands this is

[01:05:07] still the brain we have so when

[01:05:09] we are seeing another human being

[01:05:11] we're looking at self similarity we're

[01:05:13] looking at patterns we're looking at past

[01:05:15] we're looking at all these constructs

[01:05:17] that we grew up in and so are they

[01:05:19] because they're judging us the same way

[01:05:21] wondering if we mean harm and that's

[01:05:23] where we are in humanity but we're doing

[01:05:25] that in the guise of social media

[01:05:27] thumb clawing on a glass

[01:05:29] and we're trying to react to

[01:05:31] our fears but still try to act like

[01:05:33] we're cool and we're tough and

[01:05:35] you know I'm worldly I know

[01:05:37] you aren't you're still scared

[01:05:39] we're all scared we're scared chimps

[01:05:41] we're like hey is that person going

[01:05:43] to punch me in the face why do we shake

[01:05:45] hands why do we hug in fact

[01:05:47] the hug was really the thing

[01:05:49] that existed before shaking hands and

[01:05:51] it's an inconvenient truth when we go

[01:05:53] to hug somebody because what that basically

[01:05:55] said is I

[01:05:57] could injure you because I'm

[01:05:59] at such a close distance but

[01:06:01] I choose not to it's

[01:06:03] not that I'm incapable I'm

[01:06:05] choosing not to it was a communication

[01:06:07] that we forget it's

[01:06:09] the communication that we're capable

[01:06:11] of ugly things but

[01:06:13] we're choosing not to do those ugly things

[01:06:15] and I think when we

[01:06:17] fully understand what that means

[01:06:19] we can't really do that in social media

[01:06:21] so it has nebulized us

[01:06:23] and atomized us and separated

[01:06:25] us to such a far

[01:06:27] degree that we are

[01:06:29] disconnected from what is real

[01:06:31] meaning 99% of

[01:06:33] what takes place in social media wouldn't take place

[01:06:35] face-to-face this wouldn't

[01:06:37] and that's interesting because most

[01:06:39] interactions on the planet now

[01:06:41] are through social media like all day long

[01:06:43] and that's why we're here so like

[01:06:45] like you and

[01:06:47] Jay are the

[01:06:49] only two people I've spoken to today and yet

[01:06:51] I've probably already seen like

[01:06:53] thousands of tweets articles

[01:06:55] Facebook posts all that kind of stuff

[01:06:57] so again most of my interactions

[01:06:59] today have been completely in this

[01:07:01] disconnected way that

[01:07:03] human beings are not used to

[01:07:05] until recently

[01:07:07] and we have to force ourselves right

[01:07:09] James I think what is going to happen

[01:07:11] is as we start seeing the

[01:07:13] empirical scientific evidence

[01:07:15] and it's enough to even

[01:07:17] make the most skeptical person

[01:07:19] convinced

[01:07:21] we're going to start seeing that

[01:07:23] that we're starved of human

[01:07:25] contact

[01:07:27] a hug is so

[01:07:29] incredibly vital for human beings

[01:07:31] face-to-face contact

[01:07:33] I'm not talking sexual I mean we are

[01:07:35] just needing this

[01:07:37] sort of connection and

[01:07:39] I'm a technologist and

[01:07:41] you know a lot of people are saying well why are you talking about this crap

[01:07:43] you're messing with my mind

[01:07:45] the way I want it to be it's never going

[01:07:47] to be text it's never going

[01:07:49] to be you know this disconnected

[01:07:51] virtual thing it's going to be

[01:07:53] there are senses within

[01:07:55] us that I can't explain yet because I don't

[01:07:57] have the tool to show you

[01:07:59] just like there's something hidden on my

[01:08:01] hands that I can't see with my naked eyes

[01:08:03] but I will at some point be able

[01:08:05] to have a tool to show you that there

[01:08:07] is something else that you take away from physical

[01:08:09] representation

[01:08:11] around people and there's a normalization

[01:08:13] that takes place and

[01:08:15] it's vitally important because we are

[01:08:17] designed to be a clan

[01:08:19] and groups like you said so what happens

[01:08:21] is people of disparate

[01:08:23] ideas

[01:08:25] hash them out I love that

[01:08:27] I grew up in an Italian Irish German

[01:08:29] family and we always argued about

[01:08:31] everything politics and my next

[01:08:33] door neighbor was a Hasidic

[01:08:35] Jew and the person next to

[01:08:37] the other side was Puerto Rican

[01:08:39] from Puerto Rico first generation

[01:08:41] and we would all just like bomb on each

[01:08:43] other but we loved each other

[01:08:45] and I got to see

[01:08:47] different aspects of culture

[01:08:49] you know across the street was

[01:08:51] I'm trying to remember

[01:08:53] I think they were from oh Polish

[01:08:55] so we had Polish jokes going and

[01:08:57] you know the whole thing

[01:08:59] I mean this is the way life was

[01:09:01] but you know when we hashed out our ideas

[01:09:03] and we got our

[01:09:05] bikes out and we're all going down the street

[01:09:07] just a bunch of kids

[01:09:09] and what happens

[01:09:11] is throughout history

[01:09:15] division is the most

[01:09:17] powerful tool for

[01:09:19] anybody to use that

[01:09:21] wants to propagate a certain

[01:09:23] direction for society

[01:09:25] because when

[01:09:27] you realize that all of society

[01:09:29] can turn against you

[01:09:31] the best thing you can possibly do is

[01:09:33] divide society so that they're busy

[01:09:35] you know contemplating their belly

[01:09:37] button lint and saying

[01:09:39] you know yours is blue and mine is green

[01:09:41] and you know this is bad and this is good

[01:09:43] rather than looking at the grand

[01:09:45] scheme and you say we're getting the division now

[01:09:47] but without the Schwinn bikes basically

[01:09:49] exactly because

[01:09:51] we need to have the division we need

[01:09:53] to have pride and hey I'm

[01:09:55] a wop from Italy because that's what I was

[01:09:57] hey you Irish Mick you

[01:09:59] smell like freakin whiskey get out of here

[01:10:01] this is how I grew up in Jersey

[01:10:03] I'm not damaged by

[01:10:05] it you know and a lot

[01:10:07] of this crap went down I mean I got

[01:10:09] beat in the face because I had Irish

[01:10:11] in me you know so but

[01:10:13] my nose still shows it but

[01:10:15] you know the thing was

[01:10:17] what you knew on another

[01:10:19] instinctual level that you can't reach

[01:10:21] through social media is

[01:10:23] and you wouldn't say it back in that

[01:10:25] era is that these people loved

[01:10:27] you and they had your back because if something

[01:10:29] came to your neighborhood to mess with

[01:10:31] you we were unified

[01:10:33] if somebody had a

[01:10:35] problem somebody fell off their bike we

[01:10:37] would laugh and say oh dude you really got

[01:10:39] it bad let's get you to the hospital

[01:10:41] and we'd all lift the guy up

[01:10:43] that's missing in social media

[01:10:45] because we don't have that interconnectedness

[01:10:47] and we

[01:10:49] want to look at our minute

[01:10:51] differences and AI already

[01:10:53] sees it and I think it's

[01:10:55] embarrassing to some AI

[01:10:57] companies because

[01:10:59] they're trying to get politically expedient

[01:11:01] they also have a narrative

[01:11:03] and a philosophy they're trying to put out there

[01:11:05] it's fashionable every

[01:11:07] every 20 years

[01:11:09] it's a fashionable

[01:11:11] thing that goes out there if you study history enough

[01:11:13] it's comical it's like it's funny

[01:11:15] every year it's got something

[01:11:17] that they got to get that's fashionable

[01:11:19] but we think it's all so serious it's political

[01:11:21] it's a bigger thing and we look back and say oh

[01:11:23] my gosh what were they

[01:11:25] doing why were they so up in arms over

[01:11:27] this well it was really real to them

[01:11:29] history is going to laugh at

[01:11:31] every single frickin thing

[01:11:33] we are serious about on the net

[01:11:35] definitely in a thousand years

[01:11:37] a thousand years from now they'll

[01:11:39] keel over laughing at things the way we laugh

[01:11:41] at things from 150 years ago

[01:11:43] we laugh at 50 years ago we laugh

[01:11:45] of course so

[01:11:47] you got to get humble enough

[01:11:49] and AI does this

[01:11:51] to inform you that hey

[01:11:53] you want to know what

[01:11:55] this ain't no big thing it's just

[01:11:57] the things that people fixate on

[01:11:59] when we were more

[01:12:01] connected as people

[01:12:03] you'd be at a bar at a night club

[01:12:05] whatever hanging out in a park

[01:12:07] I was just in New York City recently

[01:12:09] for taping and I walked

[01:12:11] through Washington Square Park and I remember

[01:12:13] in the punk years it was very

[01:12:15] colorful with people playing chess

[01:12:17] you know Sunday mornings

[01:12:19] people from Saturday night still in the

[01:12:21] park you know hung over

[01:12:23] it was just like crazy

[01:12:25] because that was part of the whole

[01:12:27] scene in that era

[01:12:29] and I was commenting to my wife I go this is like

[01:12:31] this is like

[01:12:33] I don't even recognize it

[01:12:35] it was a mixture of every possible

[01:12:37] thing that New York represented

[01:12:39] to me as a Jersey kid

[01:12:41] because you know I'd go to New York and

[01:12:43] go to reggae clubs and punk clubs

[01:12:45] and you know

[01:12:47] NYU and the museums

[01:12:49] I'd spend so much time at

[01:12:51] the New York Public Library I walked in

[01:12:53] there I'm like where's the frickin library

[01:12:55] what happened here

[01:12:57] the books are like a block down the street

[01:12:59] and it looks like a Barnes & Noble

[01:13:01] I'm like this isn't the library

[01:13:03] I grew up with it doesn't smell right it

[01:13:05] used to smell like books

[01:13:07] and so everything changes but

[01:13:09] what I remember the New York Public Library

[01:13:11] for example sitting around

[01:13:13] with a whole lot of people at these long tables

[01:13:15] and green banker's lamps

[01:13:17] and just talking philosophy

[01:13:19] and talking science

[01:13:21] and talking religion and talking

[01:13:23] the world with people I had absolutely

[01:13:25] no knowledge of

[01:13:27] no other way we would have been connected

[01:13:29] other than that random thing

[01:13:31] and I built friendships from that

[01:13:33] people that

[01:13:35] I would never have fallen in

[01:13:37] with in my social circles you know long

[01:13:39] hair here I looked more like a punker

[01:13:41] and here I'm talking about quantum physics

[01:13:43] with a guy that's been a physicist

[01:13:45] second generation and

[01:13:47] you know he's doing some work because there was

[01:13:49] no internet at that point

[01:13:51] and he's like wow yeah that's a great idea

[01:13:53] and we would just collaborate

[01:13:55] we don't have that anymore now what we have is

[01:13:57] who the hell are you

[01:13:59] stay in your lane

[01:14:01] why are you opining about that

[01:14:03] James how dare you talk about that subject

[01:14:05] you know you're just this

[01:14:07] you're just that

[01:14:09] whereas when you're face to face with somebody

[01:14:11] it's this unfolding

[01:14:13] of who they are as a human being

[01:14:15] it's not a tweet

[01:14:17] it's almost a cure for the outrage

[01:14:19] you mentioned earlier that everybody is just constantly

[01:14:21] experiencing like the way

[01:14:23] to get some relief from that outrage

[01:14:25] is connection absolutely

[01:14:27] and that gets you in the present moment

[01:14:29] because you have to talk with somebody

[01:14:31] and it gets you really thinking and

[01:14:33] mind melding and so on

[01:14:35] and we don't mind meld

[01:14:37] in social media we just

[01:14:39] it's just one direction and I would say explore

[01:14:41] your youth explore the culture

[01:14:43] that you were enveloped with

[01:14:45] your parents your grandparents

[01:14:47] your aunts and uncles

[01:14:49] you grew up back east right

[01:14:51] you were in New Brunswick

[01:14:53] New Jersey

[01:14:55] so you know what Jersey was for

[01:14:57] New Brunswick right

[01:14:59] so New Brunswick would be Rutgers

[01:15:01] and that area

[01:15:03] Edison Museum down the street

[01:15:05] in Menlo Park

[01:15:07] I grew up around Menlo Park mall

[01:15:09] and junk like that

[01:15:11] all those different interactions

[01:15:13] all those things that gave you insecurity

[01:15:15] and oh my parents are putting this pressure

[01:15:17] I'm sure you grew up

[01:15:19] and everybody grew up in a slightly more

[01:15:21] ethnic family

[01:15:23] because that's kind of

[01:15:25] the way it was in Jersey

[01:15:27] and so you had whatever came

[01:15:29] from your

[01:15:31] background was put on you

[01:15:33] with me it was

[01:15:35] well I had a combination because my next door neighbor

[01:15:37] was Hasidic so I grew up

[01:15:39] learning Judaism the Torah

[01:15:41] and

[01:15:43] and I grew up with Roman Catholic

[01:15:45] Catholicism with the Italians

[01:15:47] and the Irish

[01:15:49] and so we had all of these different

[01:15:51] things and I remember the guy next door

[01:15:53] was pretty much a rabbi

[01:15:55] and he'd say Brian come over here

[01:15:57] he goes I'm going to have to straighten you out about this

[01:15:59] and three

[01:16:01] hours later I would have the wisdom

[01:16:03] of the ages there

[01:16:05] you know and he's like okay

[01:16:07] I want you to read the Torah tonight

[01:16:09] okay I will alright

[01:16:11] he taught me to read Hebrew

[01:16:13] and he said you got to think

[01:16:15] you got to think the different

[01:16:17] direction it would say that point

[01:16:19] finger think the other way

[01:16:21] okay yeah

[01:16:23] and so all of these different

[01:16:25] pieces we can talk

[01:16:27] about society right working parents

[01:16:29] single family homes

[01:16:31] all these different things

[01:16:33] the generational homes that we do have

[01:16:35] are generational in a way that

[01:16:37] are not really complete in a sense that

[01:16:39] you don't have the ability for grandma

[01:16:41] or grandpa to say hey

[01:16:43] cut that crap out you need to go

[01:16:45] and do this that's kind of the wisdom

[01:16:47] override that we

[01:16:49] grew up with and

[01:16:51] when that's missing

[01:16:53] who's the override there isn't any

[01:16:55] who's holding you to some sort

[01:16:57] of standard some sort of code

[01:16:59] there isn't any if mom

[01:17:01] and dad are working their butts off

[01:17:03] and by the time they get home they're so tired

[01:17:05] who's raising you well it's

[01:17:07] everything else today it's the internet

[01:17:09] TikTok or worse yet it's

[01:17:12] OnlyFans I mean

[01:17:14] the fear a little bit is that it's going to be

[01:17:16] AI raising you but like you say

[01:17:18] I think that's going to be up to us

[01:17:20] and up to technology as it evolves that

[01:17:22] it could turn more personalized and

[01:17:24] I think it's going to be

[01:17:26] ultimately a net beneficial thing to society

[01:17:29] well again such a great conversation

[01:17:42] I hope you loved it as much as I did

[01:17:44] as much as I enjoyed having it with Brian

[01:17:46] and stay tuned for part three

[01:17:49] where we take the deepest dive

[01:17:51] on AI and its practical

[01:17:54] uses this blew my mind

[01:17:56] about AI that's part three coming up
