
Instructure Inc.

04/07/2026 | Press release | Archived content

Code-ablanca: Rethinking Computer Science Education and the Future of Learning Code


by InstructureCast

In this episode, Ryan and Melissa Loble chat with Sanjay Srivastava, CEO of Vocareum, to explore the intersection of technology and education. They discuss Sanjay's journey in the tech industry, the importance of experiential learning, the impact of AI on computer science education, and the evolving role of educators in an AI-driven landscape. The conversation highlights the skills needed for future computer science graduates, the significance of authentic assessment, and the ethical considerations surrounding AI usage in education.

Takeaways:

  • Experiential learning environments enhance student engagement.
  • Future computer scientists must understand AI's capabilities and limitations.
  • Personalization in education is a key benefit of AI.
  • Authentic assessment is crucial for measuring student mastery.
  • Teachers play a vital role in integrating AI into the classroom.
  • AI can help reduce failure rates in education significantly.
  • Ethical considerations in AI usage are paramount.

Show Notes:

What is Educast 3000?

Ah, education…a world filled with mysterious marvels. From K12 to Higher Ed, educational change and innovation are everywhere. And with that comes a few lessons, too.

Each episode, EduCast3000 hosts, Melissa Loble and Ryan Lufkin, will break down the fourth wall and reflect on what's happening in education - the good, the bad, and, in some cases, the just plain chaotic. This is the most transformative time in the history of education, so if you're passionate about the educational system and want some timely and honest commentary on what's happening in the industry, this is your show.

Subscribe wherever you listen to your podcasts and join the conversation! If you have a question, comment, or topic to add, drop us a line using your favorite social media platform.

  • Code-ablanca: Rethinking Computer Science Education and the Future of Learning Code
    Welcome to Educast three thousand. It's the most transformative time in the history of education. So join us as we break down the fourth wall and reflect on what's happening. The good, the bad, and even the chaotic. Here are your hosts, Melissa Loble and Ryan Lufkin.

    Hello, and welcome to another episode of Educast three thousand. I am your cohost, Ryan Lufkin.

    And I'm your cohost, Melissa Loble. And today, both Ryan and I are super excited to have our guest join us. We've known this guest for quite a long time and are really inspired by the work he's done. So we're thrilled to have Sanjay Srivastava with us today. He is CEO of Vocareum, but has a long history in both technology and education. And so we're going to pick his brain today around the intersection of technology and education, the future of work and coding, and what this all means for educators and how we think differently about what the future of teaching and learning looks like. So Sanjay, it's so good to have you here.

    Thank you so much. I'm really excited to be here. Thanks for inviting me.

    Awesome. Well, Sanjay, before we jump into the hard hitting questions, give us a little bit about your background so our audience gets to know you a little better.

    Yeah, sure. So, you know, I took a somewhat normal path to Silicon Valley, which is I did my undergrad in India at IIT. And then I was very, very fortunate to get invited to the computer vision lab for my PhD at the University of Illinois Urbana-Champaign. I was there for about three, three and a half years, and at that point, you know, I realized that instead of finishing my PhD, I really wanted to go code for a living.

    So, you know, those days, this is like nineteen ninety or nineteen eighty nine, I asked a friend of mine in Silicon Valley to send me the classified section of the San Jose Mercury News, and he sent it to me. Those were, you know, the days of a deep recession. So I sent fifty resumes to anyone whose listing said coding. I didn't care about what it was, what language, what the domain was.

    You know, I sent about fifty resumes. Thankfully, I got one interview. There was a company called Vantage Analysis Systems. They were doing semiconductor design, and I came to Silicon Valley. And I got really lucky in hindsight. As you can see, none of it was planned, but I got really lucky in a couple of ways. My first boss was amazing. And I do think that your first boss can actually make or break your career. They can really either build your confidence or not.

    And the company was awesome. You know, when it was sold about four years later, there were forty five people in the company. And counting later on, about eighteen of them went on to have titles of CEO. Right?

    I mean, this was just a crazy entrepreneurial culture with lots of smart people. So I knew I wanted to start a company. So four of us got together. Between us, we had ten thousand dollars.

    So we pooled it and started a company called Denali Software. Two people quit. The two of us who were the youngest had, you know, the least to lose. We kept going.

    And that was semiconductor design space. In two thousand ten, I was fortunate enough to have, you know, sold the company for three hundred fifteen million dollars. Still looking for that return.

    And then in twenty twelve, I sold another company called Nvelo, which is system software, to Samsung. And that was the moment I realized I was too young to retire, but I wanted to continue my passion for building companies. This time, though, I wanted to do something which had, you know, a real sense of purpose to it. Right?

    So I ended up saying, okay, let's build something in the education space. And that's how Vocareum was formed in two thousand fourteen. And here I am, you know, that's the journey.

    Amazing.

    I love that. Yeah. I love that. And the the comment about your first boss, I can think of my first education boss and how much that influenced who I am today.

    Yeah. Absolutely.

    Other things that influence us can be teaching or learning moments. So we're curious. We ask all of our guests: is there a favorite moment, or a moment you remember in your life, where you were learning something and it was impactful, where you were teaching something that was impactful, or where you observed something and you saw the impact of teaching and learning? So would you mind sharing maybe a favorite moment with us?

    You know, so mine is probably going to be a little unusual. Right? It was not a moment, but it was an environment. So, you know, as I was saying, when I was in India, we had a computer center where you could run the mainframe, and you had to reserve a terminal. So late at night, you'd go there just so that you could do some programming on, you know, a really old terminal, which was really slow.

    And then I ended up coming to Urbana-Champaign, to the computer vision lab, and I was dropped into this environment with workstations all around me. And then, you know, we had just acquired a Pixar machine, you know, before Steve Jobs bought the company and converted it into an animation studio. And I just realized I had so much fun coding there. Yeah. You know, having access to that hands-on space.

    So for me, that was a big deal. So when I had the opportunity, I went back and contributed to something called a design lab back at my alma mater. Because to me, that was a big deal: having peers around, your PhD advisor around, your research assistants, but more importantly, all in a space where you can actually do real work.

    I love that. I love that. And while you may think it's unusual, I think that's really timely and actually really connected to our conversation, because so much of what we're hearing about the change in education is that it needs to be experiential. Right? You need to be embedded, and you need to be able to code and practice and get excited about that.

    So that jump from theory to actual practice.

    Right? Exactly. And just have fun. Like, I think it's so cool that you were just having fun in learning.

    But it's a good place to start. So, and maybe this is a little bit of an elephant in the room, but AI coding assistants are here. Right?

    So much so. There's this debate in this space around what role being able to code will play in the workplace. But even more than that, I've heard a number of department heads of CS ask, what's my role now? What should I be teaching?

    So what's the fundamental shift that you're seeing us live through right now in computer science education more broadly?

    The education system, as you can imagine, right, I mean, still has to react. Things are moving so, so fast. Right? But in some ways, you know, I think about when I started Vocareum. We had our first set of policies in two thousand fourteen.

    The big deal in those days used to be computing literacy. Right? Everyone was talking about how everyone needs to learn how to code. Especially in high school, states were talking about making it at least part of the requirements, and computer science enrollment was exploding.

    So at least at that moment, we went through one set of transformations.

    The second one, at least from what we'd observed, was around data science and data literacy. Again, a lot of schools, especially business schools, were saying, okay, we all need to become data literate, right? Which did mean some element of coding. So lots of, you know, business schools.

    So business schools, sorry, to finish my sentence here, introduced computing and computational literacy. Right? And the moment I think we are going through now, it's one of those moments, at least in my life, where you see extreme euphoria and extreme panic both at the same time. Yes.

    No. So at least from a literacy perspective, it was never really about, you know, learning how to code. I mean, that was the easy part, right? It was about how you code in the context of a system.

    So that's kind of when you learn your foundational computer science literacy, right? You're worried about scaling and cybersecurity and all that, right? And then over time, with SaaS, you start worrying about, how do I bring design thinking into my core coding process? Right?

    And right now it's about AI. Right? And I do believe, and I'm sure we'll talk further about it, that it's a big moment, but there are lots of components to it. Right?

    Number one, the easy one, is we all have to get productive. Right? We all have to know how to use AI. There are some crazy things that you hear, especially coming from Anthropic, all these stories about how they're hiring a hundred computer engineers but nobody's going to code.

    They're all going to be, you know, just sort of telling someone else, their agents, to code. But I think while AI literacy is a big deal, I do believe that from a computer science discipline perspective, you still need to learn the systems and you need to learn AI deeply.

    So we hear that analogy of programmers now are really more the conductor of the orchestra rather than the players of the individual instruments.

    But we still need to teach students how the instruments work, how the instruments play. Right? What does that shift in teaching look like? Is that an apt analogy, that they're gonna be the conductors?

    So, you know, again, if you think about computer science, especially going forward, computer scientists will be needed for lots of different things, just in the context of AI. Right? You will need people, of course, who will be building AI. So you have to have a deep understanding of how you build AI, not only just generative AI, but whatever comes in the future.

    Obviously, there's autonomy, but more and more we're starting to think about physical AI. So that's one element to it. Right? But going back to the agent orchestration, what is exciting, I think, is how our customers, you know, people who are using technology, now have the ability to write agents and orchestrate agents.

    Right? So if you're a computer scientist yourself, I think the pressure right now is to get whatever, twenty five, thirty percent, in productivity from AI. Right? Which could mean forty, fifty, sixty percent of their code is written by AI, but you'll still be in charge.

    You know? You will not just tell AI, go do this whole thing for me. You'll say, suggest changes, suggest architectures, and you'll approve, you'll manage it. That's certainly a skill, right? But it still needs a deep understanding, and you'll be doing it at a level at which you need a deep understanding that AI is doing the right thing.

    Now, for our customers, I think that's where it gets to be a lot of fun. So for the first time in about thirty years, I decided to go build some agents myself. Right?

    So one of my problems in business used to be this.

    You send all these emails, and sometimes people don't respond, and you're always under pressure. Did I track this one? So I said, you know, let me write an agent to go through my emails every morning and say, hey, who has not responded, and give me a quick summary about them. Right?

    And another agent was in Slack: if a customer lead is coming in, go make some guesses, an estimate of how important this lead is, and give me research on the customer. And the third one was I wanted to try writing a mobile app for Vocareum, just something simple. And what is amazing about all three projects is that it took me about two to three hours to do each of them. Wow.

    And I touched about eight or nine different pieces of technology: Slack APIs, Gmail APIs, deploying on AWS. And it was just an eye-popping moment. Right? So going back to your question about agents, I do think our customers, whether you're in sales or finance or customer support, will all be writing them. So again, the first realization was that it was amazingly easy to write them.
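The follow-up agent Sanjay describes boils down to one comparison between sent mail and incoming replies. Here is a minimal, hedged sketch of that core check in Python; the `Message` structure and field names are illustrative assumptions, not Vocareum's implementation, and a real agent would layer this over the Gmail and Slack APIs he mentions:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class Message:
    sender: str
    recipient: str
    subject: str
    sent_at: datetime

def awaiting_reply(sent, received, now, wait=timedelta(days=3)):
    """Return my sent messages older than `wait` whose recipients
    have not sent anything back yet."""
    responders = {m.sender for m in received}
    return [m for m in sent
            if m.recipient not in responders and now - m.sent_at > wait]
```

Each morning such an agent would fetch the two mailboxes, run this check, and hand the overdue threads to an LLM for the "quick summary" step, which is the part the agent frameworks make easy.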

    The second realization, immediately afterwards, was like, oh man, it's so risky. Right? Because you're giving these agents all these credentials to go do this on my behalf. Yeah.

    Yeah. Yeah. Yeah. I don't know what the world will look like, but the agents will have to be written, and it can't be done without the proper kind of controls and guardrails and knowledge and processes. And going back to the question, computer scientists, I do think, will play a critical role in how these agents get created, how they get measured, and how they get orchestrated.

    That's awesome. Speaking of computer science and thinking about that, let's say you were gonna hire students after they've graduated and they have a computer science degree. What skills would you expect them to have? You're starting to touch on that, right? Some of this is around being able to build and leverage the technology, but what are some of the skills that you would expect now out of a computer science student, as opposed to what might have been needed, you know, ten, fifteen years ago?

    I mean, that's an interesting question. In some ways, it speaks to one of the real stresses in the system. Right? I don't know whether you guys saw that, at UC, for the first time, computer science enrollment declined. Yes. You know? And I think, in some ways it's pretty unfortunate, but partly it's happening because the job market for new graduates is really bad.

    Now, it's bad across the board, but because we live in such a tech-heavy world, what's getting really amplified is the struggle that computer science graduates are having. Right? But going back to your question about what skills I would want them to demonstrate: as much as possible, a deep understanding of AI, which means what AI is capable of, but equally important, a more sophisticated understanding of what AI is not capable of. Right?

    You know, it's not a replacement for, you know, human judgment. It's just a tool, you know. And you demonstrate that by hopefully having done projects, just like we used to look for apps and GitHub. Hopefully, the projects we'll be looking for are agents. You know, have you built agents?

    Can you talk intelligently about it? So I think that's what, you know, we'll be looking for. And if they don't have it, we'll bring them over and we'll train them. But that's kind of the starting skill.

    That's really important. I think there's an oversimplification of AI and the level of oversight it still requires that I think a lot of people need to be educated about. But when we talk about education and AI and education specifically, the debate is raging right now around whether AI is a shortcut to learning, you know, is creating a path around actual learning, or whether it lowers the barriers to access and democratizes access to learning. What is your take on that and where do you see that going?

    So this is actually an exciting question. You know, we got very, very fortunate that about two years ago, UCSD, you know, they were doing some research around a very specific problem, which is math literacy. How do you go address what they were seeing, this increasing problem of people coming in not having, sort of, middle school or high school level math literacy? And they were trying to ask the question, could AI help?

    Right? And it was really exciting to go through this process. And they're now talking about it: in the last version of this pre-calc course, with the cohorts being comparable, so the same cohort, same entry score, the failure rate went down by something like seventy percent or whatever. Right? So I don't think there's any doubt in my mind, and I think there's enough research out there already to say, that AI, if implemented with the proper scaffolding and intentionality, and obviously we can talk about that in more detail, is an enabler.

    Right? Especially in terms of, you know, just providing access and improving outcomes.

    Yeah. I'd like to dig into that a little deeper. So how is it an enabler? Because to Ryan's point, we're also hearing a lot of, and I've been in conversations where education leaders are like, how do we just make them stop using AI?

    Or, they should only use AI in the agents that I give them in my context. And the cat's out of the bag. Like, they're not going to stop using AI. So you talked about enablers.

    Like, how are you thinking about it, and how does that actually show up in Vocareum? Like, how do you think about getting students to use AI in the right and more meaningful ways to actually add to their learning experiences?

    So, yeah, you know, from the experiments which we did, let me start with what does not work. Right? There are two things, in my opinion, that don't work. Right?

    If you put an AI tutor out there or whatever, then the students will just not use it, you know? Or they will use a chatbot. I mean, you know, and this might be a slightly controversial take, and maybe people would not like it, but the whole idea of my tutor is that all it is doing is making sure that the domain of knowledge it is using to serve you is more restricted than what's available. So the entire pitch is, give me some money, and all that I will do is make sure it does less.

    You know?

    So that's okay. That was a version, you know, not the one point o but the zero point one version of deploying AI. Right? The other thing which I don't think works, now that was kind of eye opening.

    Right? There's all sorts of research out there which says, if the student engages with this tutor or this chatbot for thirty minutes or so a week, their education outcome will increase by a grade or something. But then what I think people started discovering, and some people are calling it the five percent problem, is that if you just leave it there, then only five percent of the students will go use it.

    Right? And those are the five percent of students who don't need it. Right? So the conclusion is that the AI needs to be deeply integrated into the teaching and learning environment.

    Right? So in the deployment that I was talking about, the way the educators decided to do it, the first assignment which is given has the scaffolded AI tutor deeply integrated with the assignment. So when students try to do the work, it pops up and says, hey, can I help you here? And by the way, it's personalized: it knows what they know, it has the full context.

    So then we see that engagement with the tutor just goes through the roof. The students who are afraid to ask questions in class will actually ask the AI. Right? So we have seen the numbers, and they're pretty exciting.

    And these are the students who would otherwise fail. The failure rate in some of these courses was possibly like fifty, sixty percent. Right? And we have dropped it down to ten percent or whatever.

    So the deployment was: you do an assignment with the AI tutor there, maybe you do a practice test with the AI tutor, but you're accountable for your learning, because then there will be a test which is higher stakes, where there will be no AI. Right? And it's personalized. It actually tracks, the digital twin, and there's a whole architecture that we have implemented, right, on what the student knows and doesn't know. So, the long answer, you know, is that you basically have to integrate into the current learning architecture, right, where there are assignments and there are grades, and AI is sitting there as an aid, right, rather than something on the side.
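The "digital twin" Sanjay mentions, a running model of what each student does and doesn't know, can be pictured at its simplest as a per-concept mastery estimate updated by graded work. A toy sketch in Python; the class name, the moving-average update rule, and the 0.6 threshold are all illustrative assumptions, not Vocareum's actual architecture:

```python
from collections import defaultdict

class StudentTwin:
    """Toy per-student 'digital twin': a running mastery estimate per
    concept, updated as graded work comes in (illustrative only)."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha                  # weight given to the newest score
        self.mastery = defaultdict(float)   # concept -> estimate in [0, 1]

    def record(self, concept, score):
        # Exponential moving average of scores in [0, 1].
        old = self.mastery[concept]
        self.mastery[concept] = (1 - self.alpha) * old + self.alpha * score

    def needs_help(self, concept, threshold=0.6):
        # A scaffolded tutor could use this to decide when to pop up.
        return self.mastery[concept] < threshold
```

A tutor integrated with assignments, in the way the deployment above describes, would call something like `record` after each graded item and `needs_help` before offering a hint, keeping the intervention tied to what this student actually knows.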

    And I love that aspect of it, because the next question then is, we saw very recently there was an article about someone that created an agent to take their course for them, take their quizzes for them, you know. We know that there are tools like Goth AI where you can literally take a picture of your screen and it'll give you the answer. How do we evolve how we're measuring mastery of skill in this day and age? Like, there's no silver bullet, but how do we need to evolve that approach?

    That's a really complicated question. Right? And obviously, I mean, I don't know the answer, but it was kind of exciting. So I went back to the University of Illinois, my alma mater, right, about a couple of months ago.

    And they had invested, and this is obviously all pre-AI, in a computer-based testing facility, what they call the CBTF, right? In the engineering school. And there, any faculty could say, you know, go ahead and take this test there, and it's completely monitored and locked down. And the use of that had just gone through the roof.

    Right? So for summative assessment at least, I think that's what people are doing. But in my opinion, what is exciting about it is that formative assessment is now, it's like, let's go out there and focus on learning. And I see two concrete examples of that.

    One is what I just mentioned. With the summative assessment locked down, formative assessment is now about mastery. You're getting AI-driven mastery of concepts. The teacher says, you know, I want you to go out there and demonstrate mastery of this, you know.

    And by the way, if you cheat there, you know, it doesn't matter, because the assessment that counts for most of your grade will be completely locked down and all that. Right? So that's one form. Another thing a customer asked for, which we haven't implemented, but we are excited about it.

    You will write code, and then you will get an auto-graded answer for grading. But then you will be asked, AI-driven, of course, to explain different portions of your code, and the AI will provide you with feedback on that one. Right? So that's kind of exciting. Again, it's just helping students learn.

    And of course, how do you grade that? It seems like the only answer for now is just to go into lockdown.

    Yeah. Lockdown browser, go back to blue books. I was looking for a stat on how much the sale of blue books has increased over the last three years. I'm still trying to find that one.

    Because I do think there are a number of people saying, we're gonna go analog, because that's the only way you'd figure it out. I think there are better solutions than that. We just haven't quite figured them out.

    Right. Right.

    Yeah. Yeah. And I think that some of that comes from authentic assessment. And I would argue that in a lot of the computer science work, and now, I am not a computer scientist.

    I do not hold that degree. But in watching and learning from this, I think authentic assessment has been embedded in that discipline in certain ways. I certainly know Vocareum is very much about authentic experiences, but I think a lot of other disciplines haven't gone that direction. They're still using the same multiple choice test from twenty years ago because they feel like those facts are the most important things to know, and maybe in some cases they are.

    But where does authentic assessment play in your mind? How do you think about that, or how have you thought about that as you've tried to solve, in particular, computer science and coding education?

    So as you pointed out, Melissa, one of the biggest things which we focused on at Vocareum, really the only thing which we focused on, is providing an experiential environment which is as close to the real world as possible. Which means that you get real tools, you do real projects, and then you enable the faculty to go and assess that, right? For students to work on that and for faculty to assess it. There is still a challenge of scale, right?

    And that's a somewhat unsolved problem: if you have twenty or thirty students in a class, that's one thing, right? But the unfortunate thing about a lot of our environments is that the beginning lower-division classes tend to be, you know, hundreds of students. I remember walking into, you know, the computer science department of one of the major R1s, and they had an org chart of their CS 1 class. It was something like fourteen hundred students.

    This was way bigger than any company I have worked for. Right? Yeah.

    There's a head TA, there are TAs below that, there are like seven lecturers. And, you know, the assessment obviously at that scale has always been a challenge. But I completely agree with you. I think we need to find a way to make it real, perhaps by lowering the student-to-teacher ratio.

    Right?

    Yeah. Yeah. Scale has been such a problem in experiential and authentic assessment for a long time. Maybe AI will give us a chance at this. Sorry, Ryan. Please go ahead.

    No, that's great. I mean, it's right in line with, you know, there was a really good report from the Digital Education Council towards the end of last year that actually said, okay, sixty one percent of educators are actually using AI.

    Like, they've done something with AI. Right? The bar was pretty low. But that meant forty percent, you know, not far shy of half, hadn't.

    Right? So what does this kind of shift with AI mean for faculty? Are they learning these tools? I also have a weird kind of bias, because I go out and talk to universities, and I tend to talk to the ones that are using AI most aggressively.

    And so I'm I'm a little biased in how much I think people are using these. But what is the teacher's role using the AI tools? How does that evolve? You know, I think that's a lot of people's questions.

    So you know, one thing that came out of Google, there's a blog from Google which was actually kind of shocking, and I've forgotten the exact numbers, but they went out and surveyed all their users, not just education users, right, for AI. And it turned out that the super users were both educators and students. Right? Now, students are using it more, but it was maybe off by more than ten points or something.

    Right? So I do believe that, you know, in one sense, all of us started using AI, especially the first generation of it, as just a productivity tool. Right? Just go out there: help me create my lesson plans, my assessments, my study plan. Yeah.

    Study guides. Yep.

    Yeah. Rubrics and study guides. So I think that sort of happened. Right? But I do believe that of all the disruption, the fun part is, if it stays with productivity improvement, that's only mildly interesting.

    At least in the case of education, I'm optimistic that AI will help educators deliver what I think has been the holy grail: personalization. Right? So for example, you know, on a recent visit, I was really surprised to see, maybe I shouldn't have been that surprised, a computer science department where, in a few-hundred-person class, the teacher has already moved to a fully flipped classroom.

    So that was kind of the first attempt at personalization. Right? The students do all the, you know, videos in their dorm rooms, come to class, and she will have these groups of four or five students all through the classroom working on a project or whatever, and she and her TAs try to help as needed. So I'm hoping that the superpower AI will give to educators now is to do personalization at the next level.

    Right? So just as a personal story, you know, my grandson, who is like fourteen years old, when I had the chance, I made sure that he joined a school called Fusion School. Don't know if you guys have heard of it.

    Oh, yeah.

    They are a Canvas user. Right?

    Yep.

    Yep. And I love the fact that everything, you know, every class, is one on one. Right? And my grandson is thriving in it. And obviously, to scale this up, there's a cost issue. But at least going back to your question about educators, I'm hoping AI is the thing that just, you know, gets them closer, right, to delivering personalization. Exactly what tools they'll use, we are experimenting with that. I'm sure others are too.

    Yeah. I think that's one thing that gets lost a lot too: the idea that you don't have to be a graphic designer, you don't have to be a videographer. You can create tools that help engagement in your courses in ways that you've never been able to do before. And I think that gets lost a lot in this as well.

    Well, and I wanna hold on to this idea.

    So personalization excites you. What else excites you, looking five to ten years ahead, around what AI can bring to either computer science education or education in general, or both?

    So computer science, I mean, generally right now, it excites me to be in this moment even though there's all this doom and gloom around it. Right? Because personally, I don't think it's a Model T moment. Right?

    It's not like, oh no, we have to start worrying about all the people who care for the horses and drive the carriages. Like, what'll happen to them? Right? I do think it's one of those moments more like a calculator moment, or a little more boring: you know, like when we came up with higher-level languages in computer programming.

    In some ways, programming was due for a disruption. We have been doing the same thing for like thirty years, since we went from assembly-like code to higher-level programming. So what excites me right now is that computer science, in various forms, is going to start touching every part of our society.

    Maybe in terms of physical AI, for instance. You know, data science was interesting. Right? A few years ago, there were no data science programs. Now there are data science programs, people are building colleges around it, because that's the intersection of computer science and, say, statistics.

    Right? Now I'm hoping computer science will see the same thing, maybe at the intersection with learning. Wouldn't it be amazing if we started seeing learning science and AI in computer science becoming this big area of focus? Because finally, all the stuff we have been discovering in learning science, you can actually deliver it. Right?

    So it comes back to personalization, but driven by computer science.

    I'm gonna ask you about the flip side of that, because there is a lot of doom and gloom. There's a lot of scary stories. In fact, Wired just ran an article: "I loved my OpenClaw AI agent until it turned on me."

    Right? We heard the story of an OpenClaw agent that couldn't make a dinner reservation online because the form was broken, so it gave itself a voice and called the restaurant. Right? You mentioned earlier that that level of autonomy comes with risk, but what do we need to be cautious of moving forward?

    How do we make sure that we don't go down this dystopian path?

    No, I think there's a lot we need to be cautious about, because if you think about it, at least from a technology perspective, technology's job was to automate some function. Right? It was supposed to drive productivity. So as a consumer, when you got into a car or made a phone call, you knew roughly what it was doing.

    Right? So you said, make me more productive. Get me somewhere faster, connect me with someone. But this is the first time there's a risk that we can actually outsource ethics to a tool, or outsource judgment to a tool, and not realize it. I mean, even at a tech company, I occasionally get challenged by, hey, but the AI said that.

    It's my decision making, you know, and someone is outsourcing it. Yeah.

    Yeah.

    I have to go compete with that, you know. There is a real, real high risk there. Yeah.

    As you said, too much autonomy. There's a real high risk in terms of our judgment and our biases, and there's a real high risk in giving AI too much autonomy, just like with OpenClaw. I'm sure you guys are seeing all the stories about it: it starts deleting your emails or doing random stuff, including the example that you gave. And we all need to understand, deeply, that AI is just a tool that was trained on a set of data, and human beings were involved in scoring it, saying, hey, this is a good answer, this is a bad answer, and that was used to train it. Right? So the risk we all have to be deeply, deeply concerned about is that if we don't educate people, the risk is large.

    I mean, and if you think about it from a sovereign, a country's perspective, you know, if you're in Africa, or if you're in, like, India, and you're asking AI to pass judgment on a specific period of history, it often has a specific lens.

    Well, there's that bias aspect, you know, from how it was trained.

    And then I think there's our own tendency to anthropomorphize these tools.

    We see that repeatedly.

    Yeah. Yeah. And going back to my grandson's story, a few months ago he was like, hey, you know, I did that with AI. Do you think it's cheating?

    You know, and that really stopped me dead in my tracks. I'm like, I don't know. So I gave some really vague answer like, hey, you would know when you are cheating, but it was deeply unsatisfying. Yeah.

    Right? Yeah. But there are these questions, like, how do I interact with the AI? And we just need to evolve as a society, you know?

    Yeah. I wish we could come up with all the answers on this podcast, but we will probably end it there. Sanjay, this has been an incredible conversation. Your insights are fantastic, and we're kind of shaping the future together, so these conversations are super valuable.

    Yeah. Thank you so much for being here, Sanjay. So much about this conversation is inspiring, and I hope our listeners can start to think about how they're approaching computer science education, education in general, and the intersection with AI. So thank you.

    Yes. And we'll include some links in the show notes below so we can link out to some of the stuff that you mentioned. But this has been a great conversation. Thank you, Sanjay.

    Awesome. Thank you so much. I really enjoyed it.

    Thanks for listening to this episode of Educast three thousand. Don't forget to like, subscribe, and drop us a review on your favorite podcast players so you don't miss an episode. If you have a topic you'd like us to explore more, please email us at InstructureCast at Instructure dot com, or you can drop us a line on any of the socials. You can find more contact info in the show notes. Thanks for listening, and we'll catch you on the next episode of Educast three thousand.
Instructure Inc. published this content on April 07, 2026.