A Note from James:
Oh my gosh, I've been wanting to have this guy on my podcast for literally ten years, ever since I started. I am so impressed with him, and he speaks about a subject near and dear to my heart. Salman Khan, Sal Khan, is the creator of Khan Academy, which was really the first big online academy. It focused on teaching math, coding, and other subjects, and it actually got people to master topics that years of school had failed to teach them. Khan Academy reflects a profound understanding of education and has become a huge phenomenon.
Some 150 million students have used Khan Academy, whether counted as monthly users or registered accounts. Sal Khan recently authored a book on how AI will revolutionize education, titled "Brave New Words: How AI Will Revolutionize Education and Why That's a Good Thing." He discusses the use of AI in education for students, teachers, and employers, providing valuable insights into not only education but also AI and its impact on our lives. He addresses common fears about AI, its role in creativity and learning, and whether it will replace jobs or facilitate new employment opportunities.
I finally got the chance to interview Sal Khan about Khan Academy and AI. I learned so much, and I hope you will too.
Episode Description:
In this thought-provoking episode of The James Altucher Show, we embark on an exploratory journey into the future of education with none other than Salman Khan, the visionary founder of Khan Academy. As AI continues to seep into every facet of our lives, its potential to transform educational paradigms stands both as an opportunity and a profound challenge. Salman shares intriguing insights from his latest book, *Brave New Words: How AI Will Revolutionize Education and Why That's a Good Thing*, delving into AI's role not just as a disruptor, but as a potent catalyst for educational equity and innovation.
Salman's perspective is not just about theoretical possibilities; it's grounded in the tangible impact Khan Academy has had on democratizing education for millions globally. He recounts the Academy's genesis from humble beginnings — a series of YouTube tutorials for his cousin — to a global phenomenon. What stands out is his belief in AI's potential to further this mission, tailoring learning experiences to meet individual students' needs and inspiring both educators and learners to view AI as a partner, rather than a threat.
This episode is a beacon of optimism for educators, parents, and creatives alike, providing nuanced viewpoints on AI's implementation in classrooms, its potential to reshape content creation, and the critical role of humans in steering this technological revolution. Salman envisions a future where AI supports personalized learning journeys, making the exceptional accessible to many rather than a privileged few.
James engages Salman in discussions that span the philosophical to the practical, from concerns over AI-induced job displacement to the future of screenwriting in the age of algorithmic creativity. Yet, at its core, this dialogue returns always to the transformative potential of AI in enriching human understanding and connection — whether in interpreting Shakespeare or solving quadratic equations.
If you're looking for a blend of futurism with grounded optimism or curious about how technology could enhance human capabilities rather than replace them, this episode is an enlightening listen. As always, James brings his signature mix of curiosity and skepticism, pushing beyond surface-level concerns to uncover the deeper implications of our evolving relationship with AI. Listen in to reimagine what education could become in an AI-integrated world, and perhaps to catch a glimpse of how we might navigate these uncharted waters with wisdom and humanity at the helm.
Episode Summary:
00:00 Introduction to the Podcast and Sal Khan's Impact
03:00 Exploring Sal Khan's Personal Background
05:12 The Genesis of Khan Academy
08:26 Transitioning Khan Academy into a Nonprofit Giant
09:53 AI's Role in Revolutionizing Education
12:45 Addressing AI and Cheating in Education
16:03 The Future of Education and AI's Collaborative Potential
24:24 Reimagining the Role of Teachers in an AI-Enhanced World
29:43 Rethinking Education Systems for the Future
34:56 Personalized Learning and AI's Role
40:50 AI's Role in Education: Enhancing Teacher and Student Experiences
43:05 The Future of Education: Trends and AI Integration
44:37 Revolutionizing Assessments and Personalized Learning with AI
54:25 Addressing the Creative Industry's Concerns About AI
01:01:42 Parenting in the Age of AI: Opportunities and Challenges
01:15:34 The Future of Education Credentials and Access
01:20:26 Concluding Thoughts on AI's Impact on Education
Links and Resources:
- "Brave New Words: How AI Will Revolutionize Education and Why That's a Good Thing" by Salman Khan - For more information on the book: https://www.amazon.com/Brave-New-Words-Revolutionize-Education/dp/1119824848
- Khan Academy - A nonprofit educational organization offering free courses on a wide array of subjects: https://www.khanacademy.org
- OpenAI - Creator of ChatGPT and the GPT (Generative Pre-trained Transformer) AI models: https://openai.com
- Tyler Perry - Filmmaker discussing the impact of AI on his industry decisions: https://tylerperry.com
- Duke TIP (Talent Identification Program) - An example of advanced learning programs for youth: https://tip.duke.edu
------------
- What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
- Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!
------------
- Visit Notepd.com to read our idea lists & sign up to create your own!
- My new book, Skip the Line, is out! Make sure you get a copy wherever books are sold!
- Join the You Should Run for President 2.0 Facebook Group, where we discuss why you should run for President.
- I write about all my podcasts! Check out the full post and learn what I learned at jamesaltuchershow.com
------------
Thank you so much for listening! If you like this episode, please rate, review, and subscribe to “The James Altucher Show” wherever you get your podcasts:
Follow me on social media:
[00:01:08] Oh my gosh. I've been wanting to have this guy on my podcast for literally 10 years, ever since I started this podcast.
[00:01:16] I am so impressed with him and he speaks about a subject near and dear to my heart.
[00:01:23] So Salman Khan, Sal Khan is the creator of the Khan Academy, which was really the first big online academy.
[00:01:33] It was to learn math, coding, all these things.
[00:01:37] And it was producing people who really learned the topics, as opposed to people who go all through 10 years of school and never learn these topics at all because the schools failed to teach them appropriately.
[00:01:50] The Khan Academy really seemed to understand education and it became this huge thing.
[00:01:55] 150 million students have gone through the Khan Academy or use it every month or there's that number of registered users.
[00:02:02] Salman Khan, Sal Khan, and he just wrote the book about how AI will revolutionize education.
[00:02:10] The book is called Brave New Words, How AI Will Revolutionize Education and Why That's a Good Thing.
[00:02:17] And he addresses how to use AI with education, whether you're a student, teacher, employer.
[00:02:23] It was so valuable in terms of understanding not only education, but AI and how we can use AI in our lives.
[00:02:30] And it addresses all the fears people are having right now about AI and its role in creativity and learning.
[00:02:37] And will it replace jobs or will it make it easier to get a job?
[00:02:41] So Brave New Words, How AI Will Revolutionize Education and Why That's a Good Thing.
[00:02:46] I finally got to interview Sal Khan about the Khan Academy and AI.
[00:02:52] And I learned so much and I hope you will as well.
[00:03:05] Sal, can I start off by asking you a question that is peripherally related to the book,
[00:03:19] but you do mention this towards the end of the book, which is there was one line that I was curious about.
[00:03:25] And it was like a personal thing, but you brought it up in the book.
[00:03:28] Sure.
[00:03:29] So you mentioned that you thought your dad, who you didn't really know, you met him.
[00:03:34] He left early. Your parents separated when you were very young and you only met him once.
[00:03:40] He passed away when you were around 14 years old.
[00:03:43] But you said he probably suffered from depression.
[00:03:46] And I was curious why you thought that.
[00:03:49] You know, I don't have – no one told me that.
[00:03:55] So I don't have any strong – well, the evidence is he kind of disappeared.
[00:04:03] And when – the one time I met him when he was – when I was 13, we briefly went to his place.
[00:04:10] And let's just say it looked like he was down.
[00:04:14] And then it seems to fit with a lot of the narrative of what at least what we've observed from my end.
[00:04:25] What were the circumstances of the visit? Like why did you visit him?
[00:04:28] You know, I don't – I was 13 years old.
[00:04:31] I don't remember – if I remember correctly, he had, I guess, a cousin of mine who I don't know my father's side of the family that well,
[00:04:41] a nephew of his, who really wanted to put him in touch with us again.
[00:04:46] And I forgot all the context.
[00:04:49] I think we were up in – he was living in Philadelphia at the time.
[00:04:52] So we were up in that area anyway.
[00:04:54] And so this cousin, older cousin than me, probably 20 years older than me, facilitated this – us to get together and to be – me and my sister.
[00:05:04] And then we spent an evening together.
[00:05:06] Well, I was just curious about that particularly since you brought it up in the book.
[00:05:11] But look, I've been incredibly impressed with everything you've done.
[00:05:17] The Khan Academy, you know, 155 million users.
[00:05:23] You know, so many people have learned from the videos on the Khan Academy, the educational style of the Khan Academy.
[00:05:30] And I know people know this story and it's not central to the book also, but maybe, you know, if you could spend a few minutes describing how you started this.
[00:05:40] Like it started apparently you were tutoring one of your cousins, Nadia, about math.
[00:05:46] And then other cousins knew, hey, Sal's available for some free tutoring.
[00:05:50] We'd – you know, we want to help.
[00:05:54] And so you started creating tools and technology and videos.
[00:05:59] And this grew into the Khan Academy.
[00:06:02] Yeah, that's generally right.
[00:06:04] It was back in 2004.
[00:06:06] My original background was technology.
[00:06:08] By 2004, I had gone to business school.
[00:06:10] I was a year out.
[00:06:11] I was now working as an analyst at a hedge fund.
[00:06:13] I had just gotten married in New Jersey and I was living in Boston.
[00:06:18] And my family from New Orleans, which is where I was born and raised, were up visiting in the Northeast.
[00:06:23] And they came to – some of them came to Boston for the July 4th weekend.
[00:06:28] This was 2004.
[00:06:29] And just came out of conversation that my cousin Nadia was having trouble in math.
[00:06:32] Her mom told me, my aunt.
[00:06:35] And that's when I offered to – when I learned more about what was going on with Nadia,
[00:06:39] I offered to tutor her and Nadia agreed.
[00:06:41] And as you mentioned, well, first of all, that tutoring seemed to work with her.
[00:06:47] She was struggling with unit conversion, got her caught up even ahead of her class.
[00:06:51] And, you know, I joke sometimes I became a tiger cousin at that point.
[00:06:54] I'd call up her school.
[00:06:55] They let her retake placement exams.
[00:06:57] It really helped her.
[00:06:59] Then I started tutoring her brothers; word spread among friends and family about the free tutoring, as you mentioned.
[00:07:03] Before I know it, 10, 15 cousins, family, friends.
[00:07:06] And I've always been interested in education.
[00:07:09] So this wasn't a complete fluke that I was doing this.
[00:07:13] I'd done tutoring in other times in my life.
[00:07:17] And I have always – I had always dreamt of one day starting a school of some kind.
[00:07:23] So I was always interested by the problem and I thought I could help my cousins.
[00:07:27] And I was always interested in the intersection of, well, if you can solve a problem, can you use technology to help you scale any solutions that you had?
[00:07:34] So that was always in the back of my mind a bit.
[00:07:37] But in 2005, with that in mind, I did start to make some practice software for these cousins.
[00:07:44] That was the first Khan Academy.
[00:07:45] It had no videos or anything like that.
[00:07:47] But it was just to help scale a solution to a pattern I was seeing.
[00:07:51] A lot of my cousins just had gaps in their knowledge.
[00:07:53] They needed more practice.
[00:07:55] I, as their tutor, wanted to make sure that they were getting that practice and I wanted to monitor it.
[00:08:00] So that's why I wrote that software.
[00:08:02] And once again, just as a hobby.
[00:08:03] And then in 2006, a friend suggested that I help scale my lessons with videos.
[00:08:09] I thought it was – I legitimately thought it was a silly thing to do.
[00:08:13] I thought it was very low tech.
[00:08:15] I thought YouTube was kind of a place for entertainment, not a place for learning.
[00:08:20] But I gave that a shot.
[00:08:22] And my cousins famously told me they liked me better on YouTube than in person.
[00:08:26] And what they were saying was they liked the pause, the repeat, always accessible, no embarrassment if they had to review something from before.
[00:08:34] And so I kept going.
[00:08:36] And that was obviously very discoverable by a lot of people as well.
[00:08:39] So I just kept working on that software and kept working on those videos.
[00:08:43] By 2008, 2009, that's where my brain was focusing most of the time.
[00:08:48] And so my wife and I looked at our finances.
[00:08:50] We had some money saved up for a down payment on a house.
[00:08:53] At this point, we had moved out to Northern California.
[00:08:55] But we felt like there's something happening here.
[00:08:59] There was about 50,000 to 100,000 folks using the nascent Khan Academy on a monthly basis.
[00:09:05] And there were some philanthropists that were – there were actually both venture capitalists and philanthropists who were already signaling that they were interested.
[00:09:14] In fact, the venture capitalists were more interested back then.
[00:09:18] But it felt important that this would be a nonprofit organization, set up with the mission of a free, world-class education for anyone, anywhere.
[00:09:24] And yeah, 2009, I took the plunge to try to get that philanthropic support to become a real organization.
[00:09:31] It was a tough year, to be very clear, probably the most stressful year of my life.
[00:09:36] Our first child had been born and I'd given up a good career.
[00:09:39] But by 2010, we had our first real support.
[00:09:42] And everything we've been doing since then has really been that same pattern.
[00:09:46] How do we scale up that same personalization, that same student-centered learning that you could do when it was just myself working with Nadia?
[00:09:54] And most of our history has been improving on the software, creating more and more content, exercises, videos, articles, teacher tools, working with school districts.
[00:10:05] And obviously, more recently, it's been leveraging artificial intelligence on top of that.
[00:10:10] Right. So your new book, Brave New Words, How AI Will Revolutionize Education and Why That's a Good Thing.
[00:10:17] It's a really interesting book because on the one hand, your book is what the title says, how AI will revolutionize education.
[00:10:25] And you talk about developments in AI, developments that you've contributed to AI.
[00:10:30] You were an early user, tester, and so on of all the versions of ChatGPT.
[00:10:37] But the book is also, it almost could be titled, What Everyone Is Afraid Of in AI and How We Can Solve Each Fear.
[00:10:48] And I noticed that is the flip side of AI.
[00:10:51] I was at a dinner recently where there was a bunch of academics and journalists and scientists.
[00:10:57] It was all in the science domain.
[00:11:00] And every single one of these scientists or science writers was really afraid of AI.
[00:11:06] They thought AI was dangerous.
[00:11:08] There was no positive thing said about AI, except when I pointed this out to all of them.
[00:11:15] Why do you think so many people are afraid of AI?
[00:11:18] And then I want to hit how you address some of these things in the book.
[00:11:23] I think AI is interesting because it's especially threatening to folks who have always defined their identity by their ability to write and speak and think original thoughts.
[00:11:39] So, it's not a coincidence that the intelligentsia, so to speak, are the ones that are probably most concerned about this.
[00:11:48] And it's not to mean that these aren't legitimate concerns or a lot of them aren't legitimate concerns.
[00:11:53] But I think it is affecting folks with education, folks who write.
[00:12:00] It gets awfully close to home in terms of identity.
[00:12:05] And so, I think it immediately puts people into a little bit of a defensive posture and a little bit of a fear-based posture.
[00:12:12] And once again, I can't completely predict the future.
[00:12:15] We even saw that when we first got access to the technology before the rest of the world did.
[00:12:20] And I would say half of our team was really excited about it and wanted to go all in.
[00:12:25] And the other half had a lot of this trepidation.
[00:12:28] And to your point, maybe your title of the book would have been a better one because you're right.
[00:12:34] It really is this pattern that I probably have always tried to lean in this way.
[00:12:39] But it's definitely accelerated with AI, which is, all right, we should write down fears.
[00:12:44] One should not ignore fears and risks.
[00:12:47] But fears and risks are not reasons to not try to move forward and make positive use of it.
[00:12:52] They are things to address and to deal with so that you can have a positive outcome.
[00:13:00] Yeah, like, I mean, obviously the very first thing that comes up when you talk about AI in education,
[00:13:07] and all my kids are basically college age or graduate school age,
[00:13:12] the first thing that comes up is cheating.
[00:13:14] And I'll give a just transparent experience.
[00:13:21] One daughter was in the final stages of her finals for the year, and she had like one day to write an essay.
[00:13:29] And she hadn't even read the text she was supposed to be writing the essay about.
[00:13:34] And I said, look, there's this thing called ChatGPT.
[00:13:38] Just cheat and get it done and graduate.
[00:13:42] Just do it.
[00:13:43] And I don't know if she followed my advice or not.
[00:13:45] But that was my advice because for me, it was more important for her to graduate than for her to learn this one thing.
[00:13:52] But that is a big concern of teachers and parents and everything because a lot of people don't care about their education
[00:14:00] or they don't think that their college or high school or whatever is giving them a good education.
[00:14:04] So they're willing to use it to cheat.
[00:14:07] Yeah, it's a real thing.
[00:14:09] And what I write about in the book is I always like to start from first principles like, okay, you know,
[00:14:15] what was the state of things even before ChatGPT came on the scene?
[00:14:18] And it didn't take long to realize that cheating was already pretty rampant.
[00:14:22] And if anything, there does seem to have been a cultural shift,
[00:14:25] even since you or I were in college, in terms of probably more acceptance of cheating than ever before.
[00:14:32] And once again, before AI existed, probably the Internet, software tools,
[00:14:38] there are services that are happy to write essays for you or do other types of problem sets for you.
[00:14:43] There are quasi-legitimate publicly traded companies that will help you do your homework.
[00:14:47] And help is, I'm saying it very euphemistically, they'll essentially do your homework for you.
[00:14:52] So this existed well before.
[00:14:55] And so I think the question is, in this world, what are we hoping to get out of students when they do these types of tasks?
[00:15:05] And if part of it is to make sure that it is their work, and in the book I talk about, look,
[00:15:11] there's certain cases where you probably want students to use the tools.
[00:15:15] In fact, these tools are going to be part of their future.
[00:15:17] But there are certain cases where you definitely want to make sure that it's the student's own work.
[00:15:22] As for how you can address it: the AI can obviously be part of the problem, with things like ChatGPT.
[00:15:30] But in a lot of ways, I'm optimistic that it can be a very big part of the solution that addresses not just AI cheating,
[00:15:37] but cheating in general, where students won't have the work done by the AI,
[00:15:41] but the AI can be their coach, an ethical tutor, ethical writing coach.
[00:15:46] And then it can make transparent the process to the teacher.
[00:15:50] So the teacher doesn't just get the essay, they get the whole transcript of how the student worked on it,
[00:15:55] and the AI's analysis that it's similar to the student's other writing, or that they worked on it for four hours,
[00:16:00] or they had trouble with the thesis statement initially but eventually got there,
[00:16:04] and be able to give the teacher insights across the whole classroom that, look,
[00:16:08] 15 of your kids are having trouble with thesis statements.
[00:16:11] Here's a lesson plan that you might want to go and do.
[00:16:14] So I'm optimistic on that front.
[00:17:28] Right. And this is a general theme that runs throughout,
[00:17:36] which is that viewing AI or ChatGPT-like programs as a collaborator or a partner
[00:17:43] rather than as a secret weapon is kind of the basic solution to all of these fears.
[00:17:50] So for instance, teaching doesn't go away, but hey, just like with the Khan Academy,
[00:17:56] people were able to watch the videos and learn basic math lessons,
[00:18:00] so now the teachers can focus on more discussions and interesting topics in the classroom itself.
[00:18:06] So for instance, if the teacher knows the student is using ChatGPT
[00:18:12] to answer questions about the Great Gatsby,
[00:18:14] that might actually be a benefit to the educational experience.
[00:18:18] Yes and no. I think the key is that everyone is going into it with eyes wide open,
[00:18:23] that the teachers and the educators know why they're asking students to do things.
[00:18:27] They're very clear about certain tasks where it will be pedagogically valuable
[00:18:31] to use the ChatGPTs of the world, and other things where you don't want to use them.
[00:18:37] But I want to be very clear about use.
[00:18:38] There is the use of, say, a general AI platform like ChatGPT that would be,
[00:18:44] you know, it could be used to do very productive things like do research,
[00:18:49] help understand a concept, but those also could be used to write your essay for you
[00:18:55] or do things that a lot of educators would find,
[00:18:57] hey, that's probably undermining what I was trying to get out of this.
[00:19:00] There are other use cases of generative AI, and this is where Khanmigo comes in.
[00:19:04] Right now it's built primarily on GPT-4, but it's a different application than ChatGPT,
[00:19:09] one that's built explicitly to support students and teachers in an education framework.
[00:19:15] So it won't cheat, but it will Socratically work with the students.
[00:19:19] The bottom line is I think we're entering this phase where educators have to be more explicit
[00:19:25] about why they are giving certain tasks and also explicit on the types of tools
[00:19:30] that are acceptable and aren't, probably with some light oversight,
[00:19:35] potentially AI-empowered oversight of what the students are doing.
[00:19:39] And it's going to be more and more on the students also to be very clear of like,
[00:19:43] hey, this is not something that I should do, and this is something I should do,
[00:19:46] but also know that they're going to be held accountable by it.
[00:19:49] With Khanmigo, what we're doing at Khan Academy, there are tools now to monitor
[00:19:54] what the student is doing, make sure it is their own work
[00:19:58] when the teacher wants to make sure that it's the student's own work.
[00:20:02] Although it seems like every time there's an attempt to, oh, this is definitely created by AI,
[00:20:08] humans are smarter in this sense.
[00:20:12] They're always going to develop ways to override whatever tools are being used
[00:20:16] to detect AI usage in homework.
[00:20:19] Yeah, anyone who tells you that they have some type of an algorithm
[00:20:23] or watermarking technology or statistical algorithm, whatever it might be,
[00:20:26] that can detect AI-generated work is either significantly exaggerating or lying to you.
[00:20:35] There have been plagiarism detectors, but for the most part,
[00:20:41] what AI is generating is novel text.
[00:20:45] So it's very hard to detect.
[00:20:47] Now, the way that some teachers have detected it, just with their spider sense,
[00:20:50] is they have seen work that just seems different than what the student has turned in before
[00:20:57] or just seems surprisingly cogent or maybe a little bit bland
[00:21:01] compared to what typical people would write.
[00:21:04] Now, even those cases are usually because the student didn't use the AI artfully enough
[00:21:09] that if they really tweaked it a little bit, tweaked their prompts,
[00:21:12] maybe adapted some of what the AI was writing to make it a little bit more personal,
[00:21:16] it would have been very hard to detect.
[00:21:18] So our strategy here isn't to try to magically just read a paper
[00:21:23] and say this was written by an AI or human or 50-50 or some other percentage in between.
[00:21:29] It's to make the process transparent to the educator.
[00:21:33] So with Khanmigo, we aren't saying, oh, we ran some algorithm
[00:21:39] and there's an 80% chance this is AI written.
[00:21:42] Instead, we tell the teacher, here's the transcript of the student working with our AI,
[00:21:47] working with Khanmigo on their essay.
[00:21:50] It took them four hours. Here's the conversation.
[00:21:53] Here's a synopsis of the conversation.
[00:21:55] They had trouble with the thesis statement. We worked on outlining in this way, etc.
[00:21:59] Maybe a comparison to previous writing that it seems similar.
[00:22:03] So it's not trying to just detect based only on the output.
[00:22:06] It's more of giving a very strong indication based on making the entire process transparent.
[00:22:12] If the student goes to ChatGPT or gets their sister to write the paper
[00:22:17] and it just shows up inside of Khanmigo, Khanmigo is going to tell the teacher,
[00:22:21] hey, this just showed up. We didn't work on this. You should look into it.
[00:22:24] And by the way, it's very different than what the student has written before.
[00:22:27] I think that's the kind of lens that we need to really not only police cheating,
[00:22:33] but I don't want to make this just about policing and making it punitive.
[00:22:37] I also think this is how we're going to better support students and teachers
[00:22:41] because the student's going to be much more supported as they write
[00:22:44] and the teacher is going to get many more insights
[00:22:46] about where the student's strengths and weaknesses are in their writing.
[00:22:50] Yeah. So in that sense, the transparency that, hey, I'm using an AI tool
[00:22:56] and here's the final output. Part of the final output is the essay I came up with.
[00:23:01] Part of the final output is Khanmigo's, like your AI tool,
[00:23:05] Khanmigo's analysis of our sessions together.
[00:23:09] That's right.
[00:23:10] I think that's very valuable.
[00:23:12] I think this idea of AI as collaborator is an important reframing
[00:23:18] of the role of AI in our lives where just like we have a calculator now
[00:23:24] to do math equations, oh, I have kind of an intelligence assistant
[00:23:29] to help me with problems so I could really focus on the bigger problems.
[00:23:33] That's right. You know what we've been doing when we do professional development
[00:23:36] with teachers is we tell them, look, imagine if all of a sudden
[00:23:42] your school district discovered a few billion dollars
[00:23:45] and they want to use that money to hire three or four
[00:23:50] super hardworking teaching assistants for you
[00:23:53] that will help you do lesson plans, help you grade papers,
[00:23:56] help you write progress reports.
[00:23:58] They'll also analyze what your students are doing on a regular basis to let you know.
[00:24:02] And by the way, they're also going to be able to tutor the students.
[00:24:05] They're available 24-7. These are amazing teaching assistants
[00:24:08] and they'll be able to connect what you're doing in class
[00:24:11] to where the students need help and vice versa.
[00:24:14] Every teacher on the planet would say, yes, sign me up for that.
[00:24:18] I truly believe that's where it's going.
[00:24:20] This isn't about replacing teachers.
[00:24:23] Now, one thing we've always said at Khan Academy,
[00:24:26] our ideal is let's raise the ceiling when a student does have access
[00:24:29] to a reasonably good classroom and educators.
[00:24:32] But there are cases where students don't have access to a classroom.
[00:24:37] It could be a developing nation. There's no school at all.
[00:24:41] It could be even in the U.S., in a rural part of the country
[00:24:44] where there's not a calculus class within an hour's drive.
[00:24:48] Then we want Khan Academy, and Khanmigo by extension,
[00:24:51] to help raise the floor so that you can get more supports
[00:24:55] when there aren't any others.
[00:24:57] You talk about teachers being worried about
[00:25:00] they're going to be replaced by AI.
[00:25:02] I think this is happening in a lot of professions right now.
[00:25:05] There's this existential fear that we're all horse and buggy drivers
[00:25:10] and we're not going to be needed anymore once the cars are around.
[00:25:14] Whether it's writers, teachers, other professionals, and so on.
[00:25:21] But maybe we don't need teachers.
[00:25:24] Maybe AI and Khan Academy with Khanmigo can do the job.
[00:25:29] What is the role of teaching right now?
[00:25:33] I'm not saying what I'm about to say just because it's the right thing to say.
[00:25:37] It's what I genuinely believe.
[00:25:40] I would just stay quiet otherwise.
[00:25:43] I think any role that is about the human connection,
[00:25:50] about the human motivation,
[00:25:53] is going to be not only in very good shape in an AI world,
[00:25:58] but those are going to be the roles, the jobs, the careers
[00:26:01] that are most enhanced in an AI world.
[00:26:05] Even though historically many people associated teaching with,
[00:26:11] oh, I'm going to write a lesson plan, then I'm going to deliver this lecture,
[00:26:14] then we're going to have a quiz every two weeks,
[00:26:16] and then I'm going to grade those quizzes,
[00:26:18] and then we're going to rinse and repeat for two weeks.
[00:26:22] Any great teacher will tell you that's actually not their job.
[00:26:25] What's actually their job is on a very human level
[00:26:30] to connect with their students, either individually, in small groups,
[00:26:34] as a whole class, to motivate them, to see the wonder in the world,
[00:26:38] to make sure that these students feel seen and supported.
[00:26:43] And on top of that, the teachers have to grade papers,
[00:26:47] write lesson plans, progress reports, IEPs,
[00:26:50] prepare them for standardized tests.
[00:26:53] Because we're asking so much of teachers,
[00:26:57] and some of that latter half of stuff is more tangible and more measurable,
[00:27:01] it's actually squeezed out a lot of teachers' opportunities
[00:27:05] to do those very human things.
[00:27:07] I think anyone who goes into the teaching profession,
[00:27:10] they dream of these moments where they're able to have some time
[00:27:13] in a small group with some students or give that one student
[00:27:16] who was really down and out a pep talk,
[00:27:18] and it changes their life forever.
[00:27:20] And it will change their life forever.
[00:27:22] I think in my 12 years, 13 years in the K-12 system,
[00:27:25] I remember probably every moment when a teacher did single me out
[00:27:30] out of the 30 kids and say,
[00:27:32] Hey, Sal, can I talk to you about something?
[00:27:34] I really liked what you did, or I really didn't like what you did.
[00:27:37] It affected me. It changed who I am.
[00:27:40] And I think that's what...
[00:27:42] Or a teacher ran a Socratic conversation with 10 of us
[00:27:45] as opposed to all 30 at the same time.
[00:27:47] Or we had a field trip where we ran a simulation.
[00:27:50] And those are the things that I don't think the AI is going to be able to drive,
[00:27:54] but I think the AI will be able to support it.
[00:27:56] But once again, freeing up the teacher to do more of that.
[00:28:00] The jobs that I think will be under threat with AI,
[00:28:05] or there will be fewer of them,
[00:28:07] are the ones that you're...
[00:28:09] You know, great writers should not feel threatened,
[00:28:13] but if you're the person who writes the fairly generic text
[00:28:17] every day on the news sites about why the stock market went up or down,
[00:28:21] I'd be surprised if those aren't already written by AIs.
[00:28:24] I think a lot of them actually are.
[00:28:27] They feel like it. They feel like they've been written by AIs for 10 years.
[00:28:30] Yeah.
[00:28:32] I think if you are...
[00:28:34] You know, if I'm dealing with the education system,
[00:28:37] the registrar's office, I think, is going to be...
[00:28:41] You're going to see a lot of automation there.
[00:28:43] But once again, as just a citizen of the world,
[00:28:47] I think the more resources that we can save on the non-student-facing thing
[00:28:52] and put it onto the student-facing thing, that's a good thing.
[00:28:55] If you look at school districts like New York City,
[00:28:59] they spend an average of $40,000 per student per year.
[00:29:04] There are 25 students in an average classroom
[00:29:07] in the New York City Department of Education.
[00:29:09] So that means if you just took $40,000
[00:29:12] and multiplied it by those 25 students,
[00:29:14] that means that there's a million dollars.
[00:29:16] I guarantee you that no matter how good the teacher's benefits,
[00:29:21] pensions, etc., they're getting a small fraction of that million dollars.
[00:29:26] Maybe a very senior teacher with a great pension, etc.
[00:29:29] Maybe with great health insurance, maybe it's $150,000
[00:29:32] if you put all the benefits in there, approaching $200,000.
[00:29:36] I think I'm already being overly generous.
[00:29:39] No one's really making that much.
[00:29:42] So where are all the other resources going?
[00:29:45] It's actually not going to the real estate in most cases.
[00:29:48] Where is it going?
[00:29:50] It's going to all of this, let's call it back-office,
[00:29:52] administrative stuff that really isn't moving the dial with students.
[00:29:56] Actually, I don't write too much about that aspect of it.
[00:30:00] That's probably less interesting for a lot of folks.
[00:30:02] But yeah, registrar's office, scheduling,
[00:30:06] just a lot of the administrative things
[00:30:09] I think will hopefully get much more streamlined.
[00:30:15] What about the current education system is an antique at this point?
[00:30:19] I feel like roughly the current education system
[00:30:23] has been around for 200 or so years.
[00:30:26] That you go to a school, you go to a location
[00:30:29] where there's a bunch of subjects taught throughout the day.
[00:30:33] So every day it's like six different subjects.
[00:30:35] You go from class to class to class and you get homework.
[00:30:39] At the end, you get tested.
[00:30:41] So that's roughly the structure of the modern education system.
[00:30:44] And what part of that do you think is outdated now?
[00:30:48] Yeah, my first book, which I wrote back in 2012,
[00:30:50] The One World Schoolhouse, kind of goes into
[00:30:53] how did we end up with this system?
[00:30:55] It really does.
[00:30:57] It's not a coincidence when you say it's roughly 200, 250 years old.
[00:31:01] It came out of the Industrial Revolution.
[00:31:04] For the most part, it was a very utopian idea.
[00:31:07] For most of human history, we learned through apprenticeship.
[00:31:12] We learned through following around,
[00:31:15] learning from our cousins or our parents how to hunt or how to cook.
[00:31:19] Once society became more advanced and more specialized,
[00:31:23] we would hang out with the blacksmith
[00:31:26] or we would hang out with the carriage repair guy,
[00:31:31] whatever it would be to learn.
[00:31:34] Even things like law and medicine were apprenticeships
[00:31:36] until about 200, 250 years ago.
[00:31:39] But the issue with a lot of these is that they didn't scale.
[00:31:42] If you go back 300 years ago,
[00:31:45] even in more literate parts of the world,
[00:31:47] you only had a 30% or 40% literacy rate.
[00:31:49] Most of the world, you only had about a 10% or 15% literacy rate.
[00:31:53] And so when we had the Industrial Revolution,
[00:31:57] we had a more abundant society because of technology.
[00:32:00] And it's not a coincidence
[00:32:03] that these were some of the first societies to develop a middle class.
[00:32:07] The UK, what would eventually become Germany, Japan, the United States,
[00:32:13] they were the first to say,
[00:32:15] hey, let's have a mass public education system.
[00:32:17] Very utopian idea.
[00:32:19] But they said the only way we can afford to do that,
[00:32:21] the only way we can scale it
[00:32:23] is by leveraging the techniques of the Industrial Revolution.
[00:32:26] We're going to batch students together, usually by age,
[00:32:29] move them together at a set process.
[00:32:31] Literally a bell will ring every hour.
[00:32:33] That directly comes from a factory.
[00:32:35] We're going to have standards.
[00:32:37] We now take standards for granted in education,
[00:32:39] but this was a thing that came out of mass production in a factory.
[00:32:44] We will assess periodically.
[00:32:47] But we're not going to slow down that assembly line.
[00:32:50] Instead, we're going to assess,
[00:32:53] and some of the, quote, product is going to be labeled:
[00:32:57] this is the product that's going to become the doctors, lawyers, engineers.
[00:33:00] And for some of it, well, the information or the knowledge
[00:33:03] doesn't seem to be sticking.
[00:33:05] Well, those will be the less skilled laborers.
[00:33:07] So it kind of worked.
[00:33:09] We could talk about whether it was fair for folks.
[00:33:13] People have been talking about
[00:33:15] the inequities of tracking for decades.
[00:33:19] When you're 12, something's not sticking,
[00:33:21] and all of a sudden you're tracked into a slower track,
[00:33:23] and now all of a sudden no one expects you to go to college,
[00:33:26] and people expect you to be a lower skilled laborer
[00:33:29] versus becoming a doctor or lawyer.
[00:33:31] Is that fair?
[00:33:32] That's a very big question.
[00:33:33] I don't think it is,
[00:33:35] but it kind of worked for society
[00:33:37] in that we didn't need a huge number of people in the knowledge economy,
[00:33:42] and we did need a lot of people who were in either the middle income
[00:33:46] or lower income jobs,
[00:33:48] a lot of the mid-skilled or lower skilled labor.
[00:33:50] We did need it.
[00:33:52] But if you imagine the world that we are going into,
[00:33:55] we know what's going to happen to the less skilled jobs,
[00:33:59] or even the mid-skilled jobs.
[00:34:01] Obviously, we're talking about artificial intelligence,
[00:34:03] which is all about kind of that mid-skill processing of information.
[00:34:08] Everyone I talk to says robotics is not far behind.
[00:34:10] I think a lot of folks are saying in the next five or ten years
[00:34:13] we're going to have a ChatGPT-type moment with robotics as well.
[00:34:17] So that also means that some of the lower or mid-skilled labor
[00:34:21] is also going to be automated, probably in the next ten or twenty years.
[00:34:25] So where does that leave us?
[00:34:27] It leaves the high-skilled labor, the knowledge economy,
[00:34:29] being a researcher, being an engineer, being an entrepreneur
[00:34:33] is where all of the productivity, all of the wealth, will accrue.
[00:34:38] So as a society we have to decide,
[00:34:40] are we okay with only, right now, 10% of people participating there?
[00:34:44] That's not a stable society.
[00:34:46] Are we going to have to do redistribution,
[00:34:48] which to me is still pretty dystopian
[00:34:50] in terms of people's sense of purpose?
[00:34:51] Or do we leverage the technologies that we have
[00:34:54] to get as many people as possible into the top of that pyramid
[00:34:57] to be in the knowledge economy?
[00:34:59] Instead of having a labor pyramid, you can have an inverted pyramid
[00:35:02] where most people are able to be in that knowledge economy.
[00:35:06] And so I think that's where the imperative is.
[00:35:09] So how do you start making those changes in the educational system?
[00:35:13] And also I think there's some question as to whether people
[00:35:16] effectively learn by taking six different subjects in a day.
[00:35:19] Like, multitasking-style learning has been shown
[00:35:22] not to be the most effective way to learn.
[00:35:24] You even mentioned this in the book,
[00:35:26] that more one-on-one, more immersion is probably better for education.
[00:35:31] Yeah. The core ideas, and this is something I've been preaching
[00:35:35] well before artificial intelligence came on the scene,
[00:35:37] is instead of a traditional model, what we hold fixed
[00:35:41] is how long you get to work on something
[00:35:44] because that assembly line keeps moving.
[00:35:46] And what's variable is how well you learn it.
[00:35:48] And we get that in the form of grades.
[00:35:50] You and I take a class together, we're in algebra for this term.
[00:35:53] You got an A-, I got a C+.
[00:35:55] All right, it goes in my permanent transcript and we move on.
[00:35:58] Somehow expecting me, expecting someone who got the C+,
[00:36:02] to now understand Algebra 2 or something more advanced.
[00:36:06] And so what I've been advocating for the last 10 years or more
[00:36:11] is let's do it the other way around.
[00:36:13] What should be variable is when and how long you work on something
[00:36:17] and what's fixed is how well you learn it.
[00:36:19] And that's what a personal tutor does.
[00:36:21] When Aristotle was Alexander the Great's tutor,
[00:36:23] I'm sure that's what he did.
[00:36:25] If young Alexander hadn't mastered a concept yet,
[00:36:28] he would have slowed down a bit or given him a second chance
[00:36:32] to take that test to make sure that he really mastered
[00:36:35] whatever military strategy or governance,
[00:36:38] whatever was the topic for the day.
[00:36:41] Now, if I said this, say 30 years ago,
[00:36:45] people would have rolled their eyes.
[00:36:47] They'd say, well, okay, it's easy for you to say,
[00:36:49] but unless you have a lot of resources
[00:36:51] to provide that level of personalization,
[00:36:53] there's no way you can have 30 different students
[00:36:56] learning at 30 different paces
[00:36:58] and if one student is falling a little bit behind
[00:37:02] for them to have a second chance to take that test,
[00:37:05] that's just logistically impossible to do.
[00:37:08] Now, what's changed over the last 30 years,
[00:37:11] even before artificial intelligence,
[00:37:13] is you have tools like Khan Academy.
[00:37:15] You have software that can let every student
[00:37:17] practice at their own time and pace,
[00:37:19] give teachers real-time information.
[00:37:21] If a student needs to refresh their knowledge on something,
[00:37:23] they could watch an on-demand video,
[00:37:25] they could read an article.
[00:37:27] Now they can have a conversation
[00:37:29] with an artificially intelligent tutor.
[00:37:31] And once again, that tutor is also in contact with the teacher,
[00:37:34] so constantly guiding the teacher
[00:37:36] so that they can be a conductor of this class of 30,
[00:37:39] where they don't have to make every person
[00:37:41] play the same thing at the same time,
[00:37:43] regardless of the quality
[00:37:45] or how ready they are for it.
[00:37:48] Now you can personalize a lot more.
[00:37:52] And a corollary to that is,
[00:37:54] I talk about mastery learning,
[00:37:56] which is if you haven't learned it well yet, keep trying.
[00:37:59] And once again, this is how we've always done it, and still do.
[00:38:02] If you want to be an Olympic athlete,
[00:38:04] if you want to be a musician,
[00:38:06] you're doing mastery learning.
[00:38:08] We just don't do it in our school system.
[00:38:10] And you see the outcome differences,
[00:38:12] how good people can get when they have a personal coach.
[00:38:15] But a corollary to mastery learning
[00:38:17] is what I would call competency-based learning.
[00:38:19] Right now our entire credentialing system
[00:38:21] is based on seat time, for the most part.
[00:38:23] Did you sit in a chair for 12 years
[00:38:26] and kind of do what you were told,
[00:38:28] even though you don't really understand a lot of it?
[00:38:30] Okay, we'll give you something called a high school diploma.
[00:38:32] Did you sit in a chair for 12 hours a week for a term?
[00:38:37] Okay, we'll give you the 12 credit hours
[00:38:39] or however they want to count it, for that course.
[00:38:42] And you kind of learned it.
[00:38:44] We know in reality very few people actually retain
[00:38:46] most of what they, quote, are exposed to in school.
[00:38:49] 60-70% of kids going to community colleges
[00:38:52] have to get remediation not even at a high school level,
[00:38:56] essentially at a 7th grade level.
[00:38:58] So they sat in these classes called Algebra 1, Algebra 2,
[00:39:00] Trigonometry, some of them even take Calculus.
[00:39:02] And the colleges are saying,
[00:39:04] you're not even ready to learn Algebra yet.
[00:39:06] Go back, we're going to work on your Pre-Algebra.
[00:39:08] So a corollary to personalization and mastery learning
[00:39:11] is moving to a competency-based credentialing system.
[00:39:14] If you know it, take some type of a rich assessment
[00:39:16] and we'll give you credit,
[00:39:18] even if it only took you two days to learn it.
[00:39:20] But if you haven't learned it yet, that's cool too.
[00:39:22] Here are some resources to keep learning.
[00:39:24] Come back in a month, come back in two months,
[00:39:26] come back in a year, and I'll give you a second chance
[00:39:28] to actually learn the material.
[00:40:46] So now let's bring in AI.
[00:40:48] How would you use AI?
[00:40:50] Who would decide,
[00:40:52] would it be the student or the teacher or both or the AI?
[00:40:55] Who would decide, like, oh,
[00:40:57] James needs a little more work on his Algebra 2,
[00:41:00] as opposed to Sal, who got an A plus on every test?
[00:41:04] Who would decide how I,
[00:41:06] you know, how much more time do I need?
[00:41:08] And what would I be assigned to do?
[00:41:10] Would AI give me tests or teach me?
[00:41:12] Or like what would happen?
[00:41:13] It's a little bit of all of the above.
[00:41:15] Where even before we had artificial intelligence,
[00:41:17] we have these classrooms where
[00:41:19] students are working on Khan Academy
[00:41:21] and we were able to give signals to teachers
[00:41:23] as to which students are moving ahead
[00:41:26] and we want to encourage them to do so,
[00:41:28] which students are on track
[00:41:29] and which students are maybe falling behind.
[00:41:31] But we did leave it all to the teacher.
[00:41:33] We did professional development for teachers
[00:41:35] on how they might want to navigate that.
[00:41:36] Hey, for the kids who are ready to move ahead,
[00:41:38] let them move ahead.
[00:41:39] For the kids who are behind,
[00:41:40] why don't you do a focused intervention
[00:41:42] with those six or seven kids?
[00:41:44] And a lot of teachers were doing that type of thing,
[00:41:47] but you can imagine it can get pretty complex
[00:41:49] and pretty hard,
[00:41:50] even if you have some of these software tools.
[00:41:52] What we're able to do now
[00:41:53] with artificial intelligence
[00:41:54] is that it really acts like a data analyst
[00:41:57] and or teaching assistant for these teachers.
[00:41:59] So instead of a teacher every night
[00:42:01] having to look at a spreadsheet-like dashboard
[00:42:03] and say, okay,
[00:42:04] these are the kids who are struggling.
[00:42:06] Let me break them out
[00:42:07] or let me do another lesson plan,
[00:42:08] which is a lot of work for the teacher.
[00:42:10] They can just ask the AI, what's going on?
[00:42:12] And the AI says, look, here's what's going on.
[00:42:15] These are the kids on track.
[00:42:16] Here, you know,
[00:42:17] if we do a differentiated lesson plan tomorrow,
[00:42:19] I recommend these kids work on this task
[00:42:22] while you spend the first 20 minutes
[00:42:24] with these students.
[00:42:25] And the teacher's in charge.
[00:42:28] This thing is in support of the teacher.
[00:42:29] The teacher is going to be able to say,
[00:42:30] no, I don't really like that activity.
[00:42:32] Can you, and we already have this
[00:42:33] where they can highlight parts of it
[00:42:34] and they can say,
[00:42:35] let's come up with something
[00:42:36] that's a little bit more fun
[00:42:37] or a little bit more engaging
[00:42:38] or gets the kids out of their chair.
[00:42:40] And the AI will say, yes, sir, yes, ma'am,
[00:42:42] and do that.
[00:42:44] But as you can imagine,
[00:42:45] it dramatically lowers the amount of analysis
[00:42:49] and prep time that a teacher has to spend
[00:42:52] in order to create these really engaging
[00:42:54] and differentiated, personalized education experiences.
[00:42:59] It's also going to help with all these other trappings of school
[00:43:03] that as students we never saw,
[00:43:05] as parents we oftentimes don't see.
[00:43:08] Writing progress reports, grading papers,
[00:43:10] IEPs, which are these plans you have to write
[00:43:13] for students who need special supports
[00:43:16] (an increasingly large percentage
[00:43:19] of the student population),
[00:43:20] all of it takes a lot of the teacher's time.
[00:43:22] Teachers have to prep their own knowledge.
[00:43:24] The AI can dramatically improve that time.
[00:43:28] We've been getting signals from school districts
[00:43:29] that Khanmigo is helping the teachers
[00:43:31] with this non-student-facing work,
[00:43:34] saving them 5 to 10 hours a week.
[00:43:37] And so you're going to see it there.
[00:43:39] You're going to see,
[00:43:41] there's this phenomenon called learning management systems,
[00:43:44] which are increasingly the ways that teachers
[00:43:46] and students communicate with each other
[00:43:48] and make assignments and submit assignments.
[00:43:52] And right now these are web-based things.
[00:43:54] I see more and more the AI acts as that intermediary
[00:43:57] where it's working with the teacher on the planning,
[00:44:00] on the creation.
[00:44:01] It provisions it to the student.
[00:44:03] It works with the students on those things,
[00:44:05] and then it's able to report back to the teacher.
[00:44:07] So, you know,
[00:44:09] I can daydream more and more.
[00:44:10] You can imagine a world where an AI can start
[00:44:12] to even know what's going on in a classroom
[00:44:15] and support even more.
[00:44:16] But even before we go into those more sci-fi use cases,
[00:44:19] I think it's just going to be in the fabric
[00:44:22] of everything that the teacher and the students do,
[00:44:24] and hopefully in a way that supports the students
[00:44:26] and the teachers better and streamlines things.
[00:44:30] Do you think it will happen?
[00:44:31] Like, let's say, you know, AI is moving pretty fast
[00:44:34] and already there are tools like Khanmigo,
[00:44:36] just, you know, what is it, like a year and a half
[00:44:38] after ChatGPT was first released.
[00:44:41] Where do you see education five years from now?
[00:44:44] Like in K through 12?
[00:44:45] And then I'll ask about other types of education.
[00:44:48] Yeah, you know, I think there are certain trends
[00:44:50] that are going to happen that are not
[00:44:51] fully technologically related,
[00:44:52] and there's going to be things
[00:44:53] that are going to be AI-related.
[00:44:55] On the non-technological side, I hope we do move,
[00:44:57] as I mentioned earlier, to a competency-based system,
[00:45:00] less seat time, that we can go more
[00:45:03] to a mastery-based system.
[00:45:04] And that starts to involve a little bit more technology
[00:45:06] because if you're asking a teacher to say,
[00:45:09] hey, if a student got a C the first time,
[00:45:11] they should have another chance at that assessment.
[00:45:13] Well, who's going to write that assessment?
[00:45:14] That's work.
[00:45:15] Who's going to grade that assessment?
[00:45:17] That's work.
[00:45:18] And so I think that's where the technology plays into it.
[00:45:22] In terms of how the technology is going to manifest itself,
[00:45:26] for sure, all of that support work
[00:45:30] that we used to bury teachers with,
[00:45:33] I think is going to be dramatically streamlined.
[00:45:36] Once again, lesson planning, writing exit tickets,
[00:45:38] grading papers.
[00:45:39] Teachers are always in charge,
[00:45:41] but imagine being a seventh-grade English teacher
[00:45:44] and every two weeks you literally have to sit there
[00:45:46] and read 100, 150 papers about Great Expectations
[00:45:52] from seventh graders.
[00:45:54] I could imagine by the third or fourth paper
[00:45:57] it can get a little bit tedious.
[00:45:59] Instead, imagine you had a reliable AI that can give analysis.
[00:46:02] You can spot check it.
[00:46:04] You're the final arbiter, but it could take that thing
[00:46:07] that was ruining your weekend
[00:46:08] to a task that might take an hour instead of 10.
[00:46:13] So you're going to see that.
[00:46:15] You're going to see the AIs get better and better
[00:46:17] at supporting the students,
[00:46:18] not just conceptually, not just academically,
[00:46:20] but I think from an executive-functioning
[00:46:23] and a metacognitive standpoint as well.
[00:46:25] So one project we're working on,
[00:46:27] we call it Proactive Conmigo.
[00:46:29] It doesn't just act as a Socratic tutor.
[00:46:31] It's going to start messaging students,
[00:46:33] like, where are you?
[00:46:34] The things I used to do with Nadia back in the day.
[00:46:37] Hey, you said you were going to hit these goals by Friday.
[00:46:40] It's Saturday now and you're only halfway.
[00:46:43] Come on. What's going on?
[00:46:45] It's going to be able to facilitate
[00:46:47] a lot more communication between the teacher,
[00:46:49] the student, and the parent.
[00:46:51] When it finds that, hey, there's something off.
[00:46:54] Parent, you should know about this.
[00:46:56] And hey, teacher, by the way,
[00:46:57] I already told the parent that this is happening.
[00:46:59] I think you're going to see more of that.
[00:47:01] I think the whole area of assessment
[00:47:03] is going to be really, really interesting.
[00:47:05] A lot of the criticism of education
[00:47:07] over the last 20 or 30 years is
[00:47:09] we've tried to measure more things,
[00:47:12] which for the most part is a good thing.
[00:47:14] And we've been trying to measure it in standardized ways,
[00:47:16] which for the most part is a good thing.
[00:47:18] But because so many people started indexing
[00:47:21] on these things that you can measure
[00:47:23] in a standardized way,
[00:47:25] they maybe have lost sight of other things.
[00:47:27] I was talking to a senior administrator at Harvard
[00:47:31] and he was telling me that even at Harvard
[00:47:34] they're seeing almost an epidemic
[00:47:36] of kids who can't write.
[00:47:38] And if Harvard's seeing that,
[00:47:40] I can guarantee you it's 10 times worse
[00:47:42] pretty much everywhere else.
[00:47:44] And it's probably, no one knows exactly why,
[00:47:47] but think about what standardized testing
[00:47:49] has been for the last 20 years.
[00:47:51] There's no writing in it
[00:47:53] because writing has fundamentally been hard to assess
[00:47:56] or to be able to give someone standardized, clear feedback on.
[00:47:59] And so it's kind of maybe started to disappear
[00:48:02] in many cases.
[00:48:03] We see the same thing in math
[00:48:05] when people focus on multiple-choice assessment only.
[00:48:08] Multiple choice can have value.
[00:48:10] There's nothing inherently wrong with multiple choice,
[00:48:13] but putting too much emphasis on it
[00:48:15] might squeeze out other things.
[00:48:17] And so I think AI with its ability to make sense
[00:48:20] of more open-ended responses
[00:48:23] and even visual responses
[00:48:25] or even video responses,
[00:48:27] I think it's going to broaden the aperture
[00:48:29] of how we assess
[00:48:30] and how we even assess in a standardized way.
[00:48:32] And that's going to be good
[00:48:33] because it's going to allow school
[00:48:35] to go back to,
[00:48:37] let's not just focus on what can be assessed
[00:48:40] or narrowly assessed,
[00:48:41] but what's important.
[00:48:42] And now we can hopefully assess that in a broader way.
[00:48:45] So do you think, is this hopeful
[00:48:47] or do you think this actually will happen
[00:48:49] in the educational system?
[00:48:51] We're working on it.
[00:48:52] So it's, I'm both.
[00:48:56] It's, you know, there's definitely a possibility
[00:49:01] that a lot of folks, well-intentioned folks
[00:49:03] have good ideas
[00:49:04] and either they don't get traction
[00:49:06] in a system that historically
[00:49:08] has sometimes been slow to move
[00:49:10] or it manifests itself in unproductive ways.
[00:49:13] You know, that's one of my fears.
[00:49:15] That's why I tell our team at Khan Academy
[00:49:17] why our role is important.
[00:49:19] We're not for profit.
[00:49:20] We are focused on
[00:49:21] how does this actually drive impact?
[00:49:23] There are other players,
[00:49:24] there may be more sales focused
[00:49:26] and they're just happy to make the sale
[00:49:29] however it gets used.
[00:49:30] And that might not lead
[00:49:32] to the best possible outcomes
[00:49:34] for teachers and students.
[00:49:36] But generally speaking,
[00:49:37] obviously, I plan on devoting my life to this.
[00:49:40] So I wouldn't be doing it
[00:49:42] if I didn't think that
[00:49:44] this was going to happen.
[00:49:47] And so, you know,
[00:49:49] you mentioned earlier
[00:49:50] that a lot of times
[00:49:51] students don't retain what they learn.
[00:49:53] I think this is really true.
[00:49:54] It's what you learn in history class
[00:49:56] in ninth grade,
[00:49:57] I would say there's probably a 1% chance
[00:49:59] you remember it now
[00:50:00] unless it's something
[00:50:02] that particularly fascinates you.
[00:50:04] What do you think
[00:50:05] will change in the system
[00:50:07] that will increase retention?
[00:50:11] I think it helps if we are more explicit
[00:50:13] about moving to a competency-based system.
[00:50:15] For one thing,
[00:50:16] a competency-based system
[00:50:18] makes you have to get more clear
[00:50:20] about what you care about.
[00:50:22] Like, what are we really going for?
[00:50:23] And look, some of the things
[00:50:25] that we learned in ninth grade
[00:50:26] that we forgot,
[00:50:27] that might be okay
[00:50:28] because it was less about learning
[00:50:31] the text of, say, the Eighth Amendment,
[00:50:33] I don't even remember
[00:50:36] the text of the Eighth Amendment,
[00:50:38] but it was more about learning
[00:50:40] to critically analyze things, etc.
[00:50:42] And that might hopefully be a skill
[00:50:44] that we have retained.
[00:50:46] But there probably is content knowledge
[00:50:47] that we do want.
[00:50:48] In fact, there for sure is content knowledge
[00:50:50] that we want people to retain.
[00:50:52] But we need to get
[00:50:54] just more explicit about that.
[00:50:56] And then if we get more explicit about,
[00:50:58] look, everyone who graduates
[00:51:00] from high school should,
[00:51:02] for the rest of their life,
[00:51:04] be able to solve a simple equation.
[00:51:06] They should for sure know
[00:51:09] who Alexander the Great was.
[00:51:11] They should for sure know
[00:51:14] how to make sense
[00:51:16] of an informational text
[00:51:19] written at least maybe
[00:51:20] a seventh or eighth grade level,
[00:51:22] which is what most texts
[00:51:23] in our life are actually written at.
[00:51:26] And then if we can focus on those things,
[00:51:28] because competency-based learning
[00:51:29] also focuses you,
[00:51:30] because right now there's a lot of
[00:51:31] just hoop jumping and seat time
[00:51:34] and this or that
[00:51:35] that could be very idiosyncratic
[00:51:37] to a particular classroom.
[00:51:39] It allows you to focus on those things
[00:51:41] and make sure those things happen
[00:51:43] and then hopefully streamlines
[00:51:44] from some of the other busy work
[00:51:46] that doesn't have to happen as much.
[00:51:48] But do you ever see a point where,
[00:51:50] okay, this student's really good at math.
[00:51:53] Oh, he spent one week
[00:51:55] learning Algebra 1.
[00:51:56] That was all he was focused on.
[00:51:57] So now he's able to move on to Algebra 2
[00:51:59] and he goes at his pace.
[00:52:00] And another student actually goes
[00:52:03] at a slower pace.
[00:52:04] And so you do have,
[00:52:05] for a classroom of 30 students,
[00:52:07] you do have people going
[00:52:08] at 30 different paces
[00:52:10] and they're immersed.
[00:52:11] So that increases retention.
[00:52:13] There's less multitasking
[00:52:14] from going from biology to history
[00:52:17] to math or whatever.
[00:52:19] So some of that's already happening.
[00:52:21] After I wrote my last book
[00:52:23] and my oldest at the time
[00:52:25] was entering kindergarten,
[00:52:26] we started a school,
[00:52:27] Khan Lab School,
[00:52:28] that implements a lot of these.
[00:52:29] And I won't claim
[00:52:30] that we figured it all out.
[00:52:31] In fact, I'm always pushing the school
[00:52:33] to be thinking a little bit more
[00:52:36] about questioning a lot of the assumptions.
[00:52:38] But at that school, yeah,
[00:52:40] you do have a good number of students.
[00:52:43] And math is probably
[00:52:44] where we're seeing it the most,
[00:52:46] where they could easily be
[00:52:47] three, four, five, six grade levels ahead.
[00:52:50] There's a lot of students
[00:52:51] who are probably at grade level
[00:52:53] or one or two grade levels ahead.
[00:52:55] And there's a few who, you know,
[00:52:57] the faculty is working extra hard
[00:53:00] to make sure that they're going
[00:53:01] at least at a reasonable pace,
[00:53:02] that they're going to at least
[00:53:03] get to calculus
[00:53:04] before they leave high school,
[00:53:05] which in the broader world
[00:53:06] is already a win
[00:53:07] because most kids don't even get there.
[00:53:10] So we're already seeing
[00:53:14] that type of reality.
[00:53:15] But I think we need to see
[00:53:17] more experiments.
[00:53:18] What you just mentioned
[00:53:19] of being able to focus
[00:53:20] on certain things at a time,
[00:53:22] I'm intrigued with that idea.
[00:53:23] There's universities
[00:53:24] like Colorado College
[00:53:25] or colleges, I should say,
[00:53:27] that do do that.
[00:53:28] You take classes,
[00:53:29] instead of taking four or five classes,
[00:53:31] you take one at a time,
[00:53:33] but you do it for two weeks
[00:53:35] and that's all you do.
[00:53:36] And then you go and switch
[00:53:37] and you do another one.
[00:53:38] I think that's interesting.
[00:53:39] And I don't think
[00:53:40] it's just going to be
[00:53:41] the Khan Lab schools
[00:53:42] and the Colorado colleges
[00:53:43] of the world.
[00:53:44] We have partnerships
[00:53:45] with very mainstream school districts.
[00:53:47] In Newark, New Jersey,
[00:53:48] we're seeing some really good things.
[00:53:49] And they're doing a bit of a hybrid
[00:53:51] where the teachers
[00:53:52] are using Khan Academy
[00:53:53] to assign the daily practice.
[00:53:55] And by the way,
[00:53:56] the kids have support from Khanmigo.
[00:53:58] But the district says,
[00:54:00] hey look, if you finish your assignment
[00:54:01] and you did it well
[00:54:02] and you showed your proficiency,
[00:54:04] you can keep going.
[00:54:05] But if you did the assignment
[00:54:07] but you only got 30% of it right,
[00:54:09] you should have another go at it
[00:54:11] and now it's going to be different items
[00:54:13] because it's sampling
[00:54:14] from a very deep item bank.
[00:54:15] Or maybe you're not ready
[00:54:17] for that assignment
[00:54:18] because you're missing
[00:54:19] some prerequisite skills.
[00:54:20] How do we get you
[00:54:21] to get some more practice on that
[00:54:23] without falling too far behind?
[00:54:25] This is happening in Newark, New Jersey
[00:54:26] as we speak.
[00:54:27] It's interesting.
[00:54:28] In 1982,
[00:54:30] I was part of this program
[00:54:32] where the seventh graders
[00:54:34] would take the SATs
[00:54:35] and if you did well,
[00:54:36] you were invited
[00:54:37] to participate in this program
[00:54:39] at Duke University called TIP
[00:54:42] where it had this immersion experiment.
[00:54:45] And I always remember,
[00:54:46] finally, like in three weeks,
[00:54:47] I was able to go through
[00:54:49] a couple years of math.
[00:54:51] And then when I got back
[00:54:53] to regular school,
[00:54:54] so I was in a summer program.
[00:54:55] Then when I got back
[00:54:56] to regular school,
[00:54:57] I was able to advance more quickly.
[00:54:59] And I always thought
[00:55:00] that was an incredible experiment
[00:55:02] that they were doing.
[00:55:03] And then at the time,
[00:55:04] there were only 22 students
[00:55:05] in the program.
[00:55:06] Now there's like tens of thousands
[00:55:07] across various campuses.
[00:55:09] And I just wonder
[00:55:10] when the mainstream educational system
[00:55:13] is going to start adapting this.
[00:55:14] But it sounds like it is to some extent.
[00:55:17] It is.
[00:55:18] And if you look at communities
[00:55:20] where I live,
[00:55:21] I live in the middle
[00:55:22] of Silicon Valley.
[00:55:23] What you're describing
[00:55:24] is more the norm
[00:55:25] than the exception.
[00:55:27] But yeah,
[00:55:28] and I think the work is
[00:55:29] how do we make that accessible
[00:55:31] and the norm more broadly?
[00:55:32] And that's why most of our work
[00:55:34] at Khan Academy,
[00:55:35] most of the places
[00:55:36] where we are deploying Khanmigo
[00:55:37] are large-scale
[00:55:39] public school districts.
[00:55:40] But we're seeing more traction
[00:55:43] than I would have guessed
[00:55:45] a couple years ago.
[00:55:47] Now in terms of the existential threat
[00:55:50] for various industries,
[00:55:51] one area that has been a big concern,
[00:55:53] and you mentioned this quite a bit,
[00:55:56] is the ability to write.
[00:55:58] Like this has been a big debate
[00:56:01] in education.
[00:56:02] Like you mentioned at Harvard,
[00:56:03] some students don't even have
[00:56:04] the ability to write.
[00:56:06] And then in the creative industry,
[00:56:08] like screenwriting,
[00:56:10] I think there is a real question.
[00:56:14] Will AI,
[00:56:16] will the GPT-7
[00:56:19] be able to write
[00:56:21] an Oscar-winning movie?
[00:56:24] And I think the conclusion in general,
[00:56:27] and your conclusion is that it won't be,
[00:56:28] but it'll be a useful tool again
[00:56:30] for the creative
[00:56:32] to be even more creative.
[00:56:33] But I think people are very worried about this.
[00:56:35] And I will tell you
[00:56:36] just from conversations I've had
[00:56:38] with heads of major movie studios,
[00:56:41] they are looking for solutions
[00:56:43] other than screenwriters.
[00:56:45] Yeah, well I think there's a bunch in this.
[00:56:48] So on the first level,
[00:56:51] let's take a,
[00:56:52] you know most,
[00:56:53] writing essentially has two pieces to it.
[00:56:56] There is the putting words on paper
[00:56:58] in a structured,
[00:57:00] hopefully engaging way,
[00:57:03] grammatical way.
[00:57:05] And then there's the
[00:57:06] how do you come up with
[00:57:07] what you're going to write?
[00:57:09] So if you're a journalist,
[00:57:11] I would think most of your work
[00:57:13] should be going out there
[00:57:15] talking to people,
[00:57:16] looking through public documents,
[00:57:17] attending public hearings,
[00:57:19] being on the scene of the crime
[00:57:21] to report what's going on.
[00:57:22] So that's all the work
[00:57:23] that I don't think AI is
[00:57:25] going to be able to do anytime soon.
[00:57:27] And then the journalist takes all of that
[00:57:30] and then they write the article
[00:57:32] and all of that.
[00:57:34] So I think AI there
[00:57:36] very clearly has a role of,
[00:57:38] well if it can help the journalist,
[00:57:40] let's say take all of their notes
[00:57:42] that they just got from multiple sources,
[00:57:44] maybe different recordings,
[00:57:46] maybe even videos,
[00:57:48] and then help them get to a first draft
[00:57:50] pretty fast with a little bit of prompting.
[00:57:52] I think that's a win for that journalist.
[00:57:54] They're going to be able to spend more time
[00:57:56] on the information gathering
[00:57:57] and less time on the wordsmithing.
[00:57:59] But once again,
[00:58:00] I don't think it's that you're just going to
[00:58:01] take all that input
[00:58:02] and let the AI just pop something out
[00:58:04] that, okay, let's put it in the New York Times.
[00:58:06] No, the journalist is then going to tweak it
[00:58:08] so that they're going to act more like an editor.
[00:58:10] And they're going to say,
[00:58:11] no, we're burying the lead.
[00:58:13] Let's put this quote up front, etc., etc.
[00:58:15] Let's tighten it.
[00:58:16] Let's make it a little bit more punchy.
[00:58:18] So there's still going to be work.
[00:58:19] And the craft of writing
[00:58:21] and the ability to recognize good writing
[00:58:23] is still going to matter.
[00:58:25] I think if we go to the movie industry,
[00:58:29] I'm not sure exactly how this is going to play out.
[00:58:32] But, and I write about this in the book,
[00:58:35] I think what is going to happen
[00:58:37] is AI is going to lower...
[00:58:40] Like, I found it ironic
[00:58:42] that it was the screenwriters
[00:58:44] who were afraid of AI
[00:58:46] while the production companies really wanted it.
[00:58:49] I actually think
[00:58:50] that AI is going to bring
[00:58:52] the balance of power to the individual,
[00:58:54] to the screenwriters,
[00:58:56] and take it away from the gatekeepers
[00:58:58] who are the production houses.
[00:58:59] We already saw that before AI
[00:59:01] with things like YouTube and social media.
[00:59:03] I mean, you're even seeing this
[00:59:05] with mainstream news now.
[00:59:06] It's essentially been disrupted already
[00:59:09] by YouTube and social media.
[00:59:12] The ability for someone to publish
[00:59:15] and be discovered
[00:59:16] without having to go through gatekeepers now
[00:59:18] is completely transformed.
[00:59:20] You know, Justin Bieber is a YouTube artifact.
[00:59:23] I'm a YouTube artifact.
[00:59:25] People can publish podcasts now.
[00:59:27] You don't have to make a pitch
[00:59:29] to some executive at wherever
[00:59:31] to do that anymore.
[00:59:32] That's the way the world was 30 years ago,
[00:59:34] 40 years ago.
[00:59:35] With AI, I think you're going to see a similar...
[00:59:38] You're lowering the cost
[00:59:40] of getting into the game.
[00:59:42] Today, if I have a great idea
[00:59:44] for a science fiction movie,
[00:59:47] and I have a few,
[00:59:48] I dream about one day,
[00:59:50] not only would I have to write a screenplay,
[00:59:53] but I would have to get it in front of people
[00:59:55] who take me seriously.
[00:59:56] Even if I wrote a great screenplay,
[00:59:58] I might not even be noticed
[00:59:59] unless I'm connected in the right ways,
[01:00:01] unless I get the right people
[01:00:02] to read my screenplay.
[01:00:04] And then, someone will say,
[01:00:05] oh, it's a pretty good screenplay.
[01:00:07] I'll pay a couple hundred grand for it
[01:00:09] if it's really good.
[01:00:10] In most cases, I'll pay a couple of tens of grand for it.
[01:00:13] You know, go write another one
[01:00:14] while you're living off of ramen.
[01:00:16] And then that movie studio
[01:00:19] might throw some of them away
[01:00:20] but eventually say,
[01:00:21] okay, we're going to put $100 million
[01:00:22] behind this one.
[01:00:23] Hire actors, a director, blah, blah, blah, blah, blah.
[01:00:26] Five, ten years later,
[01:00:27] the movie comes out.
[01:00:28] If it's a blockbuster,
[01:00:31] you know, the movie studio
[01:00:32] is going to make hundreds of millions of dollars
[01:00:33] while the screenwriter
[01:00:35] probably made less than that.
[01:00:37] A lot less.
[01:00:38] If they're very savvy,
[01:00:39] they might have gotten a little bit
[01:00:40] of a cut of that movie
[01:00:41] but in most cases,
[01:00:42] they're getting next to nothing.
[01:00:44] In the world we're going into,
[01:00:46] if I have a great sense of story,
[01:00:48] if I know what a great movie should look like,
[01:00:52] I don't have to stop at the screenplay.
[01:00:54] I will be able to produce the entire movie
[01:00:57] probably for tens of thousands of dollars or less,
[01:01:01] including editing, sound.
[01:01:04] I'm not going to have to pay actors.
[01:01:06] So I would worry if I'm an actor, actually,
[01:01:08] in this world.
[01:01:10] But the creative,
[01:01:11] who's at the core of the idea,
[01:01:13] is going to be able to go much, much further.
[01:01:14] And then they're going to be able
[01:01:15] to self-publish it on YouTube
[01:01:16] and monetize it on YouTube.
[01:01:18] Or maybe there'll be paths,
[01:01:20] maybe there'll be a YouTube
[01:01:21] slash Netflix-like thing
[01:01:22] where slightly vetted people
[01:01:24] can surface and put their content out.
[01:01:26] So I would be much more worried
[01:01:28] if I was the production houses.
[01:01:31] The production houses
[01:01:32] probably are going to invest in it.
[01:01:34] They're going to figure out ways
[01:01:35] to do editing much cheaper.
[01:01:36] They're going to figure out ways,
[01:01:37] they are going to be able
[01:01:39] to write screenplays.
[01:01:41] But a commodity screenplay
[01:01:43] is very different than a great screenplay.
[01:01:47] And I think for society
[01:01:48] it's going to be good
[01:01:49] because think about how many
[01:01:50] hundred million dollar duds there are,
[01:01:52] how many bad movies.
[01:01:53] Frankly, most hundred million dollar movies
[01:01:55] are bad.
[01:01:56] What a waste of society's resources.
[01:01:59] Now we're going to have
[01:02:00] a bunch more of movies
[01:02:02] that probably cost
[01:02:04] ten or a hundred thousand dollars to make.
[01:02:06] And they're going to have
[01:02:08] a higher percentage of bad ones
[01:02:09] because it's going to be like YouTube,
[01:02:11] but we're also going to have
[01:02:12] a higher total number of good ones.
[01:02:14] And so it's interesting
[01:02:18] because Tyler Perry, for instance,
[01:02:20] he shut down plans
[01:02:21] to build a hundred million dollar studio space
[01:02:24] because he says AI is going to do this.
[01:02:27] He's a big movie creator.
[01:02:29] He has the same view as you.
[01:02:31] And I think I agree,
[01:02:33] but it's interesting to see
[01:02:34] how nervous the writers are.
[01:02:37] Maybe it's just like an insecurity of writers
[01:02:39] because they haven't been
[01:02:40] historically making that money.
[01:02:41] They've been kept down in the system
[01:02:43] so they just assume
[01:02:44] they're going to be kept down again.
[01:02:46] But you're right,
[01:02:47] it is going to give them power.
[01:02:48] But it's interesting,
[01:02:49] from your book,
[01:02:50] there's various groups of people
[01:02:54] that have these fears.
[01:02:55] So there's creatives that have these fears.
[01:02:57] There are teachers
[01:02:58] that have this existential fear.
[01:02:59] There's professors who worry
[01:03:00] the students are cheating.
[01:03:02] An interesting chapter
[01:03:03] is the one on parenting.
[01:03:04] Do you see parents being afraid
[01:03:06] AI could take their place?
[01:03:09] I don't think any parents,
[01:03:10] well, hopefully parents aren't afraid
[01:03:11] of AI taking their job as a parent.
[01:03:14] I think it's, you know,
[01:03:16] most of the fears of a parent are
[01:03:19] we see directly our kids
[01:03:22] getting addicted to devices.
[01:03:24] We either see directly
[01:03:26] in our own families or schools
[01:03:27] or we read about how things like
[01:03:30] social media and cell phones
[01:03:32] are affecting mental health
[01:03:33] of especially young people, teenagers,
[01:03:36] probably disproportionately young girls.
[01:03:39] And that's scary to parents.
[01:03:41] And so when we see a new technology
[01:03:42] that's as powerful as AI,
[01:03:44] okay, is this going to make everything worse?
[01:03:47] Not really having a clear idea
[01:03:48] of how it will make it worse.
[01:03:50] My hope once again is
[01:03:52] I actually think used well,
[01:03:55] I'm sure there's going to be
[01:03:56] use cases of AI that are not great,
[01:03:59] that can almost amplify
[01:04:03] a lot of the bad things
[01:04:04] that we already see on the internet.
[01:04:06] Whether it's marketing to you,
[01:04:08] whether it's forms of social media
[01:04:09] that make you feel insecure
[01:04:11] or you have permanent fear
[01:04:13] of missing out or whatever's going on
[01:04:15] or hurt your body image, whatever.
[01:04:17] But I think there's going to be use cases.
[01:04:19] These are the ones we're going to focus on.
[01:04:20] Hopefully others focus on it as well.
[01:04:22] Where the AI, beyond just helping you academically,
[01:04:26] it can help you even handle
[01:04:29] the world that you're dealing with.
[01:04:30] I write in the book about
[01:04:31] the AI acting as a guardian angel.
[01:04:33] Right now, when you surf the internet
[01:04:36] and our children surf the internet,
[01:04:39] they don't realize it,
[01:04:41] but they're already doing battle
[01:04:43] and they don't even know
[01:04:44] they're in a battle with AIs.
[01:04:45] Where these AIs are figuring out
[01:04:47] the next thing to show on their social media feed
[01:04:49] or the search results or the next ad.
[01:04:52] Their objective function is
[01:04:54] what's going to get you to click on that ad
[01:04:55] or what's going to make you watch longer.
[01:04:57] Unfortunately, it seems like
[01:05:00] triggering content, polarizing content,
[01:05:02] content that makes you feel bad about yourself
[01:05:04] is the stuff that makes you sit on it longer.
[01:05:07] Any of us who have fallen into it,
[01:05:10] we've all spent some time on the internet
[01:05:12] and clicked on an ad
[01:05:13] or spent more time on social media
[01:05:15] and we never feel good about it
[01:05:17] when we're done.
[01:05:18] We all feel like we kind of waste...
[01:05:20] Is that true?
[01:05:21] I feel like that's a little bit of a cliche.
[01:05:23] Sometimes I surf the internet
[01:05:25] and I enjoy the experience.
[01:05:27] For me, it's true.
[01:05:31] There's definitely times where
[01:05:34] my friend sends me a fun video
[01:05:37] or a meme
[01:05:38] or I fall into it a little bit on YouTube
[01:05:41] or on TikTok
[01:05:42] and I get a good giggle
[01:05:43] and that was exactly what I needed that day.
[01:05:45] But more often than not,
[01:05:48] we have fully developed frontal lobes
[01:05:51] and we can regulate ourselves
[01:05:52] after 10 or 15 minutes of that.
[01:05:54] But also even TikTok,
[01:05:56] which I've shut down my account
[01:05:58] and I had a reason to be on it.
[01:06:00] We were building out our social media following.
[01:06:02] I just took it off
[01:06:03] because I found even myself,
[01:06:04] I just went to TikTok
[01:06:05] to announce a new feature on Khan Academy
[01:06:08] and then I end up just swiping and swiping
[01:06:11] and then 10-15 minutes go by.
[01:06:12] I'm like, I don't feel good
[01:06:13] about what I'm doing with my day
[01:06:14] and I stop.
[01:06:15] I was able to stop at 15 minutes.
[01:06:17] There's a lot of young people
[01:06:18] who aren't stopping at all
[01:06:19] and they're going 2, 3, 4 hours.
[01:06:21] And look, it's great
[01:06:22] if someone has the self-regulation
[01:06:24] to be able to stop after 15, 20 minutes.
[01:06:26] That's probably healthy fun
[01:06:28] as long as it's not making them
[01:06:29] feel bad about themselves.
[01:06:31] But I am imagining a world now
[01:06:33] and we're working on this
[01:06:34] where an AI can act on your behalf
[01:06:38] or act on your parents'
[01:06:39] or your teacher's behalf
[01:06:40] where, yeah, okay,
[01:06:42] I'll let my daughter use a cell phone
[01:06:44] but I want to see a world
[01:06:46] where there is an AI agent on that phone
[01:06:48] that I am in conversation with
[01:06:50] and I always tell them,
[01:06:51] look, I'm cool with her
[01:06:52] spending some time on it
[01:06:54] but let's be careful about her
[01:06:56] seeing things that hurt her body image
[01:06:59] or things that make her feel
[01:07:01] like she's missing out on things
[01:07:03] or just put her into this polarizing haze
[01:07:06] or whatever it might be.
[01:07:08] And then the AI can make sense
[01:07:10] of what she's surfing and looking at
[01:07:12] and say, hey,
[01:07:13] we just spent the last 10 minutes
[01:07:15] looking at pictures
[01:07:16] of that friend's birthday party
[01:07:18] when you were out of town.
[01:07:20] Maybe we want to go do something else
[01:07:22] or if it can see patterns
[01:07:24] in what the daughter is doing,
[01:07:25] tell the parents,
[01:07:26] hey, did you know your child
[01:07:28] is spending an awful lot of time
[01:07:29] doing X, Y, or Z?
[01:07:31] One, that might just be something
[01:07:32] to police a little bit more
[01:07:34] or it might be an opportunity
[01:07:35] to have a connection.
[01:07:36] Like, did you know
[01:07:37] that they're really interested
[01:07:38] in crocheting?
[01:07:39] Maybe you should take it up.
[01:07:40] Maybe you should have
[01:07:41] a conversation with them.
[01:07:42] This is something to talk about
[01:07:43] at the dinner table.
[01:07:45] So I'm hoping that the AI
[01:07:46] can act much more as a...
[01:07:49] used well,
[01:07:50] a little bit of a guardian angel
[01:07:52] and it can help parents,
[01:07:53] it can support parents
[01:07:54] engage with their children more.
[01:07:56] I hope that, you know,
[01:07:57] right now we have devices
[01:07:58] like Siri and Alexa
[01:07:59] and Google Home
[01:08:00] and right now they're kind of like,
[01:08:02] you know, what time is it?
[01:08:03] Put a timer on.
[01:08:04] You know, what's the weather today?
[01:08:06] How many people live in Ukraine?
[01:08:08] That's the type of things
[01:08:09] we use it for.
[01:08:11] I would love a time,
[01:08:13] which I think is going to happen
[01:08:14] in the next year or two,
[01:08:15] where I'm...
[01:08:17] You know, sometimes
[01:08:18] we're having dinner together
[01:08:20] and we're having a conversation
[01:08:23] but I know that there's more
[01:08:25] that my kids have going on
[01:08:26] in their life
[01:08:27] that I'm not getting out of them
[01:08:29] when I just ask them,
[01:08:30] how was your day?
[01:08:31] How was school?
[01:08:32] How's history class going?
[01:08:34] And I could imagine
[01:08:35] a world where an AI says,
[01:08:37] I could say, hey Alexa,
[01:08:39] can you moderate a conversation,
[01:08:40] a fun conversation between us
[01:08:42] as kind of an icebreaker?
[01:08:43] And it does.
[01:08:44] And it says, all right,
[01:08:45] we're going to go around
[01:08:46] and everyone's going to say
[01:08:47] the best part of their day.
[01:08:48] And then when my nine-year-old says,
[01:08:49] oh, my best part of the day
[01:08:50] was getting to hang out
[01:08:52] with my, you know,
[01:08:53] my friend who was out sick,
[01:08:55] then the AI could say,
[01:08:56] well, why was that?
[01:08:57] But once again,
[01:08:58] it's not squeezing out the parent.
[01:09:00] It's creating a context.
[01:09:02] As a parent,
[01:09:03] I crave more context
[01:09:04] where I can engage
[01:09:05] with my children
[01:09:06] in a more meaningful way
[01:09:07] as opposed to just being transactional.
[01:09:09] Put that iPad away,
[01:09:10] go to bed,
[01:09:11] eat your food.
[01:09:12] Hey, it's time to get up.
[01:09:13] Hey, we're late for the bus.
[01:09:14] You know, I don't,
[01:09:15] we all have to do that as parents,
[01:09:16] but it squeezes out the like,
[01:09:18] what do you think is the meaning of life?
[01:09:19] Or what does it mean
[01:09:20] to be a great friend?
[01:09:22] Or how can we all
[01:09:23] be better family members?
[01:09:25] If we could have
[01:09:26] an expert facilitator
[01:09:27] all the time,
[01:09:28] that'd be pretty incredible.
[01:09:30] Do you think,
[01:09:31] are you going to get
[01:09:32] a wearable like AI
[01:09:33] like this Humane AI Pin
[01:09:35] or a Rabbit
[01:09:37] or maybe make
[01:09:38] Khanmigo a wearable?
[01:09:40] Yeah, it's,
[01:09:41] I have no imminent plans
[01:09:43] to do that
[01:09:44] because I'm also someone
[01:09:45] who really likes going
[01:09:47] off the grid, so to speak,
[01:09:48] and being completely unplugged
[01:09:52] in certain cases.
[01:09:53] But if there are devices
[01:09:57] that come out
[01:09:58] that could,
[01:09:59] that I can legitimately believe
[01:10:02] will enhance my life
[01:10:03] in some way,
[01:10:04] I'd be open to it.
[01:10:05] And as long as they have
[01:10:06] the data privacy
[01:10:07] and all of the right
[01:10:08] safeguards in place.
[01:10:10] Now,
[01:10:11] I'm wondering,
[01:10:12] one big question out there is,
[01:10:13] does AI somehow plateau?
[01:10:15] Like,
[01:10:16] is there only so much
[01:10:17] it can learn
[01:10:18] and simulate,
[01:10:20] you know,
[01:10:21] human conversation?
[01:10:22] So you mentioned how like
[01:10:23] GPT-5 is going to have
[01:10:24] a trillion parameters
[01:10:25] as opposed to GPT-4,
[01:10:27] which just had
[01:10:28] 175 billion
[01:10:30] as opposed to GPT-1,
[01:10:31] which had,
[01:10:32] I don't know,
[01:10:33] a million or whatever it was.
[01:10:34] At a trillion parameters
[01:10:36] or 10 trillion parameters,
[01:10:38] what can AI do
[01:10:39] that prior versions couldn't do?
[01:10:42] None of us know for sure.
[01:10:43] And I think it is
[01:10:44] a philosophical debate
[01:10:45] about if an AI is trained
[01:10:47] on human-created content
[01:10:49] for the most part,
[01:10:51] can it transcend
[01:10:52] human intelligence
[01:10:54] in certain ways?
[01:10:55] I think the answer is
[01:10:56] in certain ways
[01:10:57] it will be able to
[01:10:58] because as these,
[01:10:59] as the parameters increase,
[01:11:00] it will be able to find
[01:11:02] patterns that are not maybe
[01:11:04] explicitly obvious to us
[01:11:06] and maybe leverage those patterns
[01:11:08] to create things.
[01:11:09] But will it be like,
[01:11:11] you know,
[01:11:12] a truly transcendent intelligence?
[01:11:14] I don't know.
[01:11:16] I think one of the interesting
[01:11:20] things that is going to happen is
[01:11:23] they're going to be collecting
[01:11:24] more and more inputs.
[01:11:26] Right now,
[01:11:27] it's pretty much human-created
[01:11:28] written text for the most part.
[01:11:29] And now images, videos, sound files.
[01:11:33] I think when the AIs
[01:11:35] are getting more sensory perception,
[01:11:39] you know, once you have wearables,
[01:11:40] once they have cameras,
[01:11:41] I know this is a little bit dystopian,
[01:11:43] so we have to think about
[01:11:44] how this gets provisioned.
[01:11:45] It's going to get more training data.
[01:11:48] I also think robotics is interesting
[01:11:50] because when we learn,
[01:11:52] we don't just observe our environment,
[01:11:55] we play with the environment.
[01:11:56] I mean, we literally...
[01:11:57] that's what play is:
[01:11:58] play is literally learning.
[01:11:59] A kid pokes something,
[01:12:00] throws it, you know, shakes it,
[01:12:02] and figures things out; they're investigating the world.
[01:12:06] And I can imagine
[01:12:10] once you pair AI with robotics
[01:12:12] and then it can actively investigate the world,
[01:12:15] it will get even more input
[01:12:18] on, and maybe even,
[01:12:20] and look,
[01:12:21] it's going to be able to have
[01:12:23] sensory input that we don't have.
[01:12:24] It will be able to
[01:12:26] see the entire electromagnetic spectrum,
[01:12:28] not just the,
[01:12:29] you know, it's going to be able to
[01:12:31] hear every frequency of sound.
[01:12:33] It's going to be able to,
[01:12:35] you know, detect molecules
[01:12:37] that we can't at least consciously detect.
[01:12:40] So, it's an interesting philosophical debate,
[01:12:44] but, you know, once you start talking about
[01:12:47] orders of magnitude,
[01:12:48] more parameters than we have synapses
[01:12:50] in the human brain,
[01:12:52] and it's tireless,
[01:12:54] and it has access to all of this information,
[01:12:57] and it can process it faster than we can,
[01:13:00] it is interesting.
[01:13:01] And I know everything,
[01:13:02] even me talking out loud,
[01:13:03] I've kind of scared myself a bit.
[01:13:05] But, you know, once again,
[01:13:08] it's all about intent
[01:13:09] and how we use it
[01:13:10] and can we use it for good purposes,
[01:13:13] because there can be a lot of good purposes here.
[01:13:16] I mean,
[01:13:18] so some would say,
[01:13:19] let's say a good purpose is education.
[01:13:22] I do, and there's evidence that
[01:13:26] I think with a general AI
[01:13:29] that basically exceeds
[01:13:31] our current educational efforts,
[01:13:33] kids are going to get smarter.
[01:13:35] Meaning, they're just going to learn faster,
[01:13:39] they're going to retain more,
[01:13:40] they're going to get that feedback loop
[01:13:43] much more quickly,
[01:13:44] and so they're going to be able to advance
[01:13:47] more quickly than our generation was
[01:13:49] or the generations in between.
[01:13:51] I mean, an example that you can see
[01:13:53] in a specific domain,
[01:13:54] it's not a general domain,
[01:13:56] but look, computers have been better than humans
[01:13:59] at chess since 1997,
[01:14:01] and computers have evolved incredibly since then.
[01:14:06] So it's been over 25 years
[01:14:09] or 20, whatever, 27 years
[01:14:11] of computers improving
[01:14:13] since they were already better than humans,
[01:14:16] and kids use computers to learn chess
[01:14:20] with some coaching,
[01:14:21] but 99% is they're playing the computer
[01:14:24] and getting immediate feedback,
[01:14:26] and then there might be some coaching.
[01:14:27] And kids are so much better now
[01:14:30] than they were back in 1997,
[01:14:33] and their rate of improvement
[01:14:35] is so much faster.
[01:14:36] I mean, a 17-year-old just the other day
[01:14:38] became the challenger for the world championship.
[01:14:41] That never would have happened 25 years ago.
[01:14:44] So it's incredible how in that specific domain
[01:14:48] and other domains like that,
[01:14:50] kids have advanced and learned
[01:14:52] and retained so much faster than kids back then.
[01:14:55] And I wonder if the same thing is going to happen
[01:14:57] in general education.
[01:14:58] I obviously hope so.
[01:15:02] It seems like it definitely will.
[01:15:04] Like our kids will be smarter.
[01:15:06] And I typically traffic in optimistic scenarios,
[01:15:09] and so I hope what you're saying is right.
[01:15:11] There is a more dystopian,
[01:15:13] I don't know if it's dystopian,
[01:15:14] but a less optimistic scenario
[01:15:16] where you're going to have a segment of students
[01:15:19] who take these tools and run with it
[01:15:22] and will do exactly what you said.
[01:15:25] They're going to get to heights
[01:15:27] that we never thought were possible
[01:15:29] at very young ages.
[01:15:31] And look, that by itself
[01:15:32] is not in any way a negative thing.
[01:15:34] That is a positive thing.
[01:15:35] These are going to be the young people
[01:15:37] who cure diseases for us,
[01:15:39] who start the next great companies,
[01:15:42] who write the next great novel,
[01:15:44] who produce the next great movies,
[01:15:46] all of the above,
[01:15:48] push AI forward
[01:15:49] or make sure that AI is used in a safe way.
[01:15:51] But I think it's how do we make sure
[01:15:55] that the other whatever it is,
[01:15:57] 60%, 70% of students stay engaged,
[01:16:00] engaged enough to not be left behind
[01:16:06] by all of this.
[01:16:08] Because if they are,
[01:16:10] it's not good for them as human beings
[01:16:13] and it's not good for society.
[01:16:15] And so that's where a lot of our...
[01:16:16] What we're already seeing with Khanmigo
[01:16:17] in the classroom is
[01:16:19] there is a segment of students
[01:16:20] who just run with it
[01:16:21] and they are off to the races.
[01:16:23] And once again, no one should hold them back.
[01:16:24] I mean, sometimes the school system's temptation
[01:16:26] is to hold those students back.
[01:16:27] That is not a good idea.
[01:16:29] We want these students to move
[01:16:31] as fast as they can.
[01:16:34] But how do we make sure
[01:16:35] that everyone has a decent chance
[01:16:37] of being able to participate
[01:16:39] without holding people back
[01:16:40] is where a lot of our work is.
[01:16:43] Well, and you've been involved in...
[01:16:46] Obviously, you've been involved in education
[01:16:47] for almost 20 years.
[01:16:49] Khan Academy's been around almost that long.
[01:16:51] Why do we have the old system at all?
[01:16:54] Why do we have credentialing at all
[01:16:55] where, oh, in order to get a job
[01:16:57] in a high-profile place,
[01:16:59] I need a degree from Harvard.
[01:17:01] Why can't I just go to Khan Academy
[01:17:04] for a year or two, learn what I need to learn,
[01:17:06] and get a job at Goldman Sachs
[01:17:08] or Silicon Valley or Hollywood or whatever?
[01:17:13] What is the real role now of higher education?
[01:17:15] Why can't I just use online education for that?
[01:17:20] I agree with you 100%.
[01:17:21] And you're starting to see aspects of that exist.
[01:17:24] You're definitely seeing that
[01:17:25] in fields like software engineering
[01:17:27] where the same employers
[01:17:29] that 20 or 30 years ago might have said,
[01:17:31] okay, we're only going to hire from MIT
[01:17:33] and Caltech and Stanford,
[01:17:35] they're now saying,
[01:17:36] hey, anyone can take this boot camp
[01:17:39] or take this assessment.
[01:17:40] And if you pass it,
[01:17:41] we're going to interview you the same
[01:17:43] as we would interview a 4.0 grad from Stanford.
[01:17:47] So that's already happening in certain fields.
[01:17:51] I think what you're going to see,
[01:17:52] and this is a passion of mine,
[01:17:55] is I do want to create competency-based credentials
[01:17:59] that have the same or greater prestige
[01:18:02] as going to Harvard or going to Oxford.
[01:18:06] So much so that even if you go to Oxford or Harvard,
[01:18:08] you'll still want to get these things
[01:18:09] to show that you actually learned
[01:18:11] some very useful skills.
[01:18:16] Systems change is harder than technological change
[01:18:19] for a whole series of reasons.
[01:18:21] But this is something that I,
[01:18:22] and hopefully I have many decades left on this planet,
[01:18:26] but I'm hoping in the next 10 or 20 years,
[01:18:28] hopefully closer to five or 10 years,
[01:18:30] you're going to hear some things,
[01:18:31] even from us,
[01:18:32] about ways that you can get high school, college credit,
[01:18:35] potentially even job opportunities
[01:18:37] via a pretty streamlined route.
[01:18:40] Once again, it doesn't mean that it's an either-or.
[01:18:42] You might want to do both.
[01:18:44] But if you don't have access to Harvard...
[01:18:45] I was telling some very senior people at Harvard
[01:18:49] that right now there's this false tension
[01:18:52] where they feel,
[01:18:53] everyone writes about admissions
[01:18:55] and it feels like a false tension
[01:18:57] between equity and merit,
[01:18:59] where there's like, oh, affirmative action
[01:19:02] gets struck down by the Supreme Court.
[01:19:04] So how are we going to make sure we have equity
[01:19:07] and that we have good representation
[01:19:09] if we have to go more based purely on test scores
[01:19:12] or whatever else?
[01:19:13] And what I've told them is like,
[01:19:14] you know, it's a false tension
[01:19:16] because what you've held constant is capacity.
[01:19:19] Like you only admit whatever,
[01:19:22] 2,000 students every year.
[01:19:24] Even though there's probably
[01:19:26] 100,000 qualified students every year.
[01:19:29] If you could serve all of the 100,000 qualified students,
[01:19:33] you wouldn't have to have these weird,
[01:19:35] bizarre discussions about admissions
[01:19:37] and equity and all of this that people are having.
[01:19:40] You would just be able to,
[01:19:41] if you can handle the work,
[01:19:42] we are admitting you
[01:19:43] and you will get a credential
[01:19:45] that can open up the world to you.
[01:19:47] So yes, I hope that we can create some pathways like that.
[01:19:52] I think though, part of that tension is the scarcity principle,
[01:19:55] which is that Harvard prices their scarcity.
[01:19:58] And so they make money from their scarcity
[01:20:00] and Khan Academy or let's say a commercial equivalent
[01:20:04] makes money from scalability as opposed to scarcity.
[01:20:08] And Harvard doesn't want to have 100,000 students.
[01:20:11] They want to accept only 2,000
[01:20:13] because they could charge $300,000 a semester
[01:20:16] and they'd still fill up.
[01:20:18] And I think that's the tension really.
[01:20:22] Yeah, and obviously our motivation is,
[01:20:24] our costs go up as we scale.
[01:20:26] So our motivation is more of just like,
[01:20:28] well, how do we maximize impact?
[01:20:29] But you're right about there's always going to be
[01:20:32] a notion of scarcity
[01:20:33] and the reality of the Harvards of the world.
[01:20:36] And so much of the education debate
[01:20:38] focuses on let's call it about 30 universities,
[01:20:41] even though it's less than 1% of students
[01:20:44] go to those 30 universities.
[01:20:46] But those 30 universities are relevant
[01:20:49] because they have an outsized influence on society.
[01:20:53] And so I think it's not going to go away anytime soon
[01:20:57] that there's a certain,
[01:20:58] the US doesn't have a caste system
[01:21:00] in the traditional sense,
[01:21:01] but our elite institutions are the best proxy for one,
[01:21:06] where, oh, you're Harvard class of whatever,
[01:21:09] were you there?
[01:21:10] Oh, did you have this professor?
[01:21:11] Oh, wow.
[01:21:12] Like we're part of the same club.
[01:21:15] I don't think that's going to go away.
[01:21:18] And I think there's other positives
[01:21:21] of not just Harvard or Stanford,
[01:21:23] but college in general,
[01:21:25] of being there in person, forming bonds.
[01:21:27] I met my wife and many of my best friends in college
[01:21:29] and we have our best memories there.
[01:21:31] So I think we can always optimize
[01:21:32] for some of those in-person experiences,
[01:21:34] but I want to create also alternative pathways
[01:21:37] so that's not the only way
[01:21:38] that you can get into an upper middle class
[01:21:41] or promising career.
[01:21:44] On that optimistic note,
[01:21:46] and I really hope you're right.
[01:21:47] I've been writing about this also for a long time
[01:21:50] and I'm really looking forward to the utopian world
[01:21:54] that you have been outlining and creating
[01:21:57] and explaining particularly in your book.
[01:22:00] I'm going to say the title again
[01:22:02] because I always forget things really quickly.
[01:22:05] Brave New Words,
[01:22:06] How AI Will Revolutionize Education
[01:22:08] and Why That's a Good Thing.
[01:22:10] And I think what you've explained today
[01:22:12] has solved a lot of the fears
[01:22:14] that many people have about AI.
[01:22:16] But Salman Khan, creator of the Khan Academy,
[01:22:18] thanks once again for coming on the show.
[01:22:21] Thanks for having me, James.