J.D. Mosley-Matchett (00:30) It's time for another episode of AI Update, brought to you by InforMaven. I'm J.D. Mosley-Matchett, the founder and CEO of InforMaven. And our guest today is John W. Munsell, the CEO of Bizzuka and the author of a new book called Ingrain AI. John is a leading expert in helping organizations rapidly adopt AI across non-technical departments through proven frameworks and scalable training. He's passionate about making AI accessible to everyday professionals, especially those under pressure to do more with less. Welcome to the podcast, John. John Munsell (01:07) Thanks! Great to be here. I appreciate you having me on. J.D. Mosley-Matchett (01:11) Your book, Ingrain AI, is truly powerful, but it opens with a dystopian race to economic oblivion that's triggered by what initially appeared to be AI-based advantages. You called it every CEO's nightmare, and it really paints a doomsday scenario that's truly terrifying. So, without giving away all the important points that your book covers, how can we avoid the AI disaster that so many people believe is inevitable? John Munsell (01:41) Great question. So the idea at the front end of the book was to let people know: look, you can achieve a lot with AI, but everybody else is doing the same thing. Right? And so if you're using AI and it makes you more efficient, then you initially go, wow, this is amazing. I got my whole team more efficient. So now we've taken 40 hours' worth of work and pulled it down to, let's say, 35 or 30 for every person. Okay, cool. What do you do with that other 10? Do you take those other 10 hours, sell them, and pull production into that new additional bandwidth? Or do you just tell people, hey, go home, play with your kids, do whatever you want? Or do you reduce prices, right? And so over time you're faced with these decisions. And initially you're like, well, shoot, let's just sell into it. And then your competitors do it.
And so instead of selling into it, they cut prices. And so now you're looking at it like, okay, what do I do? Well, maybe I should cut prices. So then you cut prices, and then they lay off a couple of people because they've gotten even more efficient. And now you're like, shoot, maybe I should lay off people. And then as everybody starts laying off people, then there's a smaller economy, because people no longer spend. They're like, shoot, I don't have a job. I need to pull back on spending. And so now you can't sell into the economies of scale you just got from AI, because there's a shrinking economy. And you can see how this thing starts to snowball. That's what everybody's afraid of. That's why you hear all these people sounding the alarm, saying, look, we need to pump the brakes on AI just a little bit, because it's going to affect the economy. Now, the reality is there are a couple of things that are going to slow this down. One is security. To me, that's the big thing that nobody's really addressing. The other is skills. So they keep talking about how 86% of businesses, or 90-some-odd percent of businesses, have adopted AI, when the reality is that they were really talking about maybe the Fortune 1000. When I go talk to, you know, small-business America or small-business Australia or wherever, they maybe have Copilot, you know, Microsoft Copilot. They maybe have something in there, but they haven't really done anything, for a couple of reasons. One, the CEO doesn't really fully understand the power of AI. Two, the CEO is aware that people below him or her are using AI, and that CEO isn't really sure how to lead a pack that maybe knows more about a technology than they do. The other is security. Like I mentioned before, they're afraid to let this thing loose because they don't really know how to control it.
So you've got a lot of things mixing up in there that are, in my mind, going to delay the adoption curve and therefore this impending doom. John Munsell (04:55) So as a CEO, that gives you time, but not much. So the more you can make something happen, and the more you can get acclimated to AI, the better off you're going to be. The key is teaching your employees how to use it at their desktops. Don't get caught up in a vertical application. Teach everybody how to use it at their desktops. When you do that, your flywheel starts moving faster than everybody else's. J.D. Mosley-Matchett (05:21) That makes a lot of sense. Rocking the boat is frowned upon in most organizations, but colleges and universities often seem to take complacency and inertia to a whole new level. So can you share some suggestions for how AI can start making an impact in non-academic areas like financial aid, enrollment, or compliance without triggering IT concerns or setting off culture alarms? John Munsell (05:49) You're never going to not set off culture alarms, because they're already there. I mean, look, what it takes is a university president, or some level of leadership, to say, this is the biggest shift in human history, and we need to do it across the board. But at a minimum, we need to operate more efficiently. More and more universities are really struggling for enrollment. The prices have gone crazy. When I graduated, the tuition at LSU was $375 a year. Put a couple more zeros on there and you're getting close to what it is now. So it's kind of out of control. John Munsell (06:34) I graduated with, I think, a $10,000 student loan, nine thousand of which was spent on a fraternity. But I know people who are graduating with $100,000 or $120,000 student loans. That's a mortgage. So that's crazy. Enrollment is going down. Funding is going down, especially for state colleges. John Munsell (06:58) So the best way to address that is by making your workforce more efficient. So let's just take the academic side of it out of the equation momentarily, and let's just focus on getting the workforce more efficient. AI can help in a number of ways. You just mentioned a couple. Fundraising is another one, frankly. So there's a ton of ways that you could use AI. It just takes somebody to say, look, let's try to learn how to use this to our advantage. Most businesses look at AI like, we have to build an AI tool to solve a singular problem. Let's say a customer-facing chatbot to take a load off of our customer service lines. It can be the same thing with a school: why don't we create a customer-facing chatbot for our website so that we can answer student questions, blah, blah, blah? It'll take a load off of us. That's very narrow. Will it have an ROI? Sure. But it's a very narrow application of AI. But imagine if you had 2,000 employees in your university on the administrative side, and all 2,000 of them learned how to use AI and were able to increase their productivity by 15 to 20%. That's an enormous gift to your university in terms of productivity, right? And it's an enormous stress reliever for the staff, too. Having 2,000 people be that much more productive, that's a way bigger ROI than a little vertical app. But when you do that, J.D. Mosley-Matchett (08:28) So true. John Munsell (08:38) the ideas for more vertical apps just start to explode, right? And that's where you start to see a flywheel of productivity and efficiency really start to roll. J.D. Mosley-Matchett (08:50) I love that. So true. Now, there's a lot of hype these days around AI agents and automation, but what kind of results are possible when institutions skip all that and just train their staff to use AI as a personal assistant? John Munsell (09:07) Well, I mean, that to me is the goal.
So everybody at their desktop can use a tool like ChatGPT, Claude, Perplexity, Gemini, any number of things. And inside of those there are similar features: ChatGPT has custom GPTs and projects, Claude has projects, Gemini has gems... They're almost like automations, but you're not connecting other tools. But you can build a knowledge base into these custom GPTs to actually take a load off of you. Like, we had one of our professors who went through our training literally build a GPT to help the students go through her syllabus, which was like a hundred-page syllabus. She was getting a ton of questions. She was actually a professor at two different universities. And so she built the GPT to handle that. Two other people built GPTs to help them with grant applications. And again, these aren't connecting databases. These aren't doing API calls. They aren't doing any of that. It's just a desktop tool that allows them to respond faster. As an administrator, if you're dealing with a lot of enrollments or a lot of student activities or any number of things, you can build these things to just manage the workflow without any API calls or anything. So there's a lot that you can do with your own fingertips on your desktop without involving IT. J.D. Mosley-Matchett (10:39) Now, when IT administrators hear "build a GPT" or "use AI on your desktop," their first worry is data security. So how do you help teams choose the right tools and models while staying compliant? John Munsell (10:55) Well, I mean, the first thing is, it starts with education. So IT's concern is everybody else's concern. What if they start putting confidential data in there? What if they start putting all of the students' personal IDs and information in there, or just a database of students, or any number of things, and then it's used to train the models for future people? Now it's all of a sudden accessible by the world. Yeah, that's everybody's concern.
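That concern about confidential data being pasted into prompts is mostly handled by policy and training, but part of it can be automated. Below is a minimal, illustrative sketch of a "pre-flight" scrubber that masks obvious identifiers before text reaches an LLM. This is an assumption on my part, not a tool John mentions, and nowhere near a complete FERPA or PII control:

```python
import re

def scrub_identifiers(text: str) -> str:
    """Mask common identifiers before text is sent to an LLM.
    Illustrative only: real compliance controls need far more than regexes."""
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", text)      # SSN-style numbers
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)  # email addresses
    text = re.sub(r"\b\d{8,9}\b", "[STUDENT-ID]", text)         # 8-9 digit ID numbers
    return text

print(scrub_identifiers("Jane Doe, ID 30214577, jdoe@lsu.edu, SSN 123-45-6789"))
```

A scrubber like this pairs with, rather than replaces, the training John describes: people still have to know not to touch the hot stove.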
If people are trained what to do and what not to do, and if the IT people have configured the LLM, the large language model they're using, so that inputs aren't used to train the model, then you're at least in a better spot, right? But it really starts with telling people, you know, don't put your finger on a hot stove. That's really it. J.D. Mosley-Matchett (11:53) A lot of administrators are afraid that using AI will make them look like they don't know how to do their jobs. How do you help teams move past that stigma and see AI as a professional advantage rather than some kind of crutch? John Munsell (12:07) Well, one of the ways is to just get them together and have a little workshop. It could be a 45-minute workshop or whatever, but it's easy to do in a university environment because they've got labs, computer labs or whatever. What you want to do is just give somebody an opportunity to see what it's like to work with it. And you have to give them a very specific example. So that's one of the things that we do as a first step when we're working with a corporation or a university or a school: we have a little workshop. We have very specific exercises we go through that solve problems. And then all of a sudden people go, whoa, this could take a load off of me. Especially when you're talking about the administrative layers that are trying to do more with less money, they're stressed. But when they see how much this improves things, I mean, when you can take a six-hour process and boil it down to something that takes three to eight minutes, man, you're getting people excited now, right? So they're no longer seeing it as an opponent. They're seeing it as an assistant. And that's what we want to do. J.D. Mosley-Matchett (13:16) Why is letting every department figure out AI for themselves one of the most expensive mistakes an institution can make? And what should they be doing instead?
John Munsell (13:33) Well, I mean, look, there's nothing wrong with trying to figure it out yourself. The problem is that there's no continuity. There's no consistency. There are no shareable, scalable skills present. So when everybody figures it out by themselves, some will do it well and some won't. Some will get frustrated, throw their hands up, and essentially say, this is stupid, and others are going to excel. What you want to do is have them all learn it the same way, learn it from the same level of understanding, which is why we created the thing that we call the AI Strategy Canvas. When we started teaching, we taught this concept called scalable prompt engineering. And what that is, is a process where somebody could look at your prompt and know exactly what it does. And they can say, look, if I just swapped out this variable for that one, I could use that same prompt to do something completely different in my area. When you have that capacity, now all of a sudden the learning accelerates, the sharing accelerates, and knowledge isn't just stuck in one person's head. Does that make sense? That's what we're trying to do. If everybody just teaches themselves, it's like you've got a whole lot of kids loose on the playground and nobody really knows how to play baseball, right? And so they're just swinging balls and bats. J.D. Mosley-Matchett (14:49) Absolutely. John Munsell (15:00) We want to organize that a little bit. J.D. Mosley-Matchett (15:03) That makes sense. John, I know that you've worked with many organizations across a lot of different industries. So what's one thing that higher education should be borrowing from the private sector when it comes to using AI in operations? John Munsell (15:20) A mindset, I think. You know, the private sector is looking at it like, we've got to move fast, because we have competition, and we can't survive if the competition is not just nipping at our heels but chewing our legs off.
And that's what AI will enable them to do. The competition, if they learn it and accelerate their use of it beyond what you're doing, you're going to get so far behind that it's hard to catch up. So I think the one thing I can tell you from dealing with both is that the private sector has a different mindset. The private sector says, let's move fast. And the academic sector is just used to moving slowly, right? I mean, most of the time they're used to using textbooks, and by the time the textbooks get into the hands of the students, through a publishing process, they're two years old. There's nothing useful in the textbook. So the idea that something you put in a student's hands could be outdated in a week is a crazy idea, but that's how fast this is moving. I'll give you a quick for-instance. I was supposed to teach some students on Friday about the different models that are present in ChatGPT, Grok, Perplexity, all of them, right? And so I had these big slides prepared, you know, there are six models here and seven models there and four models here, and blah, blah, blah. And on Wednesday, OpenAI came out with GPT-5, and they took away your access to the other models unless you were paying $200 a month, and then you had access to them again. I'm not paying that. But, J.D. Mosley-Matchett (16:57) Yep. John Munsell (17:08) all of a sudden I had a choice between two flavors of GPT-5, and I'm like, whoa, I just spent an hour making this beautiful slide, and I've got to throw the whole thing away. And I thought I was doing a good job because I was updating it from what I presented two months ago. So that's a hard thing to get your head wrapped around in a university setting, but you have to move fast, and you have to know that this thing is going to change. That's another thing that I noticed, by the way, with the release of GPT-5, and I've been noticing it every time any model releases an update or a new version.
The fear three years ago was that we're not going to need prompting anymore; we're just going to have conversations. A prompt is a request, just like you would ask me to do something for you. That's all a prompt is. Imagine having an employee where you never had to give that employee any instructions. That's not going to happen. You have to have a conversation to make something work. So what I've noticed as these models progress is that prompting is actually becoming more important. Clarity is more important, and context is more important. And if you never learned those skills, then things actually get more challenging, because the models start to take on minds of their own. They try to do things on their own by inferring from the words in your requests. And if you don't word them properly, they'll make an inference and go down a rabbit hole, and then you've got to figure out how to fix it. J.D. Mosley-Matchett (18:51) Can you give us some idea of what AI fluency really looks like for administrative teams? And how does it change the way they work on a day-to-day basis? John Munsell (19:03) Sure. So you hear a lot of people say, look, we really need to focus on AI literacy. To me, here's the difference: if I'm trying to conquer a Spanish-speaking market, do I want to hire somebody who's literate in Spanish or fluent in Spanish? I want somebody who's fluent. Okay. So with AI, literate just means that you know if you ask it a question, it'll give you a response. Fluent means you know how to ask the question, you know how to structure it, you know where you're going to have conflict, and you know how to share your skills and expertise with others. Right? That's really what fluency looks like. Once you have that present in your organization, it's a flywheel that just gets better and better and better. This is kind of like what I was saying at the beginning.
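The fluency John describes, prompts structured so a colleague can read them and reuse them, lines up with his earlier "swap out this variable" idea of scalable prompt engineering. A minimal sketch of that pattern using a plain string template; the field names here are illustrative assumptions on my part, not the actual AI Strategy Canvas:

```python
from string import Template

# One shared prompt skeleton; each department swaps only the variables.
REUSABLE_PROMPT = Template(
    "You are an experienced $role at a university.\n"
    "Task: $task\n"
    "Audience: $audience\n"
    "Constraints: $constraints\n"
    "Format the output as $output_format."
)

# Financial aid fills it in one way...
financial_aid = REUSABLE_PROMPT.substitute(
    role="financial aid officer",
    task="draft an email explaining a revised aid package",
    audience="an incoming first-year student and their parents",
    constraints="plain language, under 200 words, no jargon",
    output_format="a short email with a subject line",
)

# ...and enrollment swaps the variables to reuse the same structure.
enrollment = REUSABLE_PROMPT.substitute(
    role="enrollment counselor",
    task="summarize this week's application pipeline for leadership",
    audience="the VP of enrollment management",
    constraints="bullet points only, flag any week-over-week drops",
    output_format="five bullet points",
)

print(financial_aid)
```

Because the skeleton is explicit, anyone reading a filled-in prompt can see exactly which parts to swap for their own area, which is what makes the skill shareable rather than stuck in one person's head.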
When people build a custom GPT that they can use on their own, then all of a sudden they have more ideas on how to use AI to make their jobs better, more efficient, and more effective. And then when you get them together and you create an environment where people share that, the ideas flourish even more. The enthusiasm accelerates; everything just does a whole lot better. But you have to have a process for that. And you have to work on the culture shift, because look, J.D., I'll tell you, this is not so much about technology adoption as it is about change management. And that's a critical thing that universities need to get their heads wrapped around. That's why they're so slow. Because there's inertia. Because there's a "Well, we've always done it that way." That mentality's got to shift. J.D. Mosley-Matchett (21:01) Do you have any final parting words that you want to offer? John Munsell (21:06) If I were going to give any kind of advice to a university... it's ironic that you would ask me this, because literally in 30 minutes I'm having a conversation with a university. I think the thing that I want to get through to them is: listen, challenges are always going to be coming your way. But if you just see AI as your assistant and not your opponent, like I was saying earlier, if you just give it a couple of minutes to see how it will solve problems for you, then the light bulbs will start to go on and you'll start to realize, okay, now I get it. But you can't get stuck in this idea of let's just wait and figure it out. Like, I had a conversation with a university two weeks ago, and they're like, yeah, we're going to be in a good position to start working with this in October of next year. And I'm like, you don't have until October of next year. You just don't. You need to really be thinking about what you can roll out in January and how you can start getting people using it before September of this year. J.D. Mosley-Matchett (22:08) Oh my gosh.
John Munsell (22:19) Start with baby steps, but you need to get that flywheel running now. That would be my biggest piece of advice. J.D. Mosley-Matchett (22:20) Thanks. This has been great, John. You've provided some really helpful insights for integrating AI into our workflows as we enter this new academic year. And I want to encourage everyone to buy a copy of your book, Ingrain AI, because it's filled with practical and actionable higher-education examples of successful AI use cases. John Munsell (22:52) Well, thanks, J.D. I appreciate it. J.D. Mosley-Matchett (22:56) For more information about AI news and trends that are directly impacting administrators in higher education, please follow InforMaven on LinkedIn and visit our website at InforMaven.ai.