J.D. Mosley-Matchett (00:37) It's time for another episode of AI Update, brought to you by InforMaven. I'm JD Mosley-Matchett, the founder and CEO of InforMaven. Our guest today is David M. DiSabito, Jr. He's the AI Liaison and AI Working Group Chairperson for the College of Business and a Professional Educator of Business Analytics and Information Management at Western New England University. It's December, so we're spending this month considering how AI has changed since the beginning of the year. David was our AI Update guest in March, and he has some new thoughts regarding a question he answered back then. Welcome back, David.

David M DiSabito Jr (01:17) Thank you, JD. It's a pleasure to be with you again.

J.D. Mosley-Matchett (01:20) The question you said you'd like to revisit is, how can AI enhance academic assessment? So what's changed about that since March?

David M DiSabito Jr (01:30) The first thing that comes to mind is the speed at which we can assess student evidence, student artifacts. For instance, we can now score 100 five-page papers in under one minute, which is pretty amazing. Think about how long it would take you to score 100 five-page papers. The other thing that's come a long way is the prompting process in a product we've developed called Walter; it's much improved. We use this thing called a Rubric Wizard that makes entering a rubric much quicker and much more efficient. It does things like checking to make sure that the criterion scores add up to the total score. And we now have a library of rubrics that you can choose from. So you can load a rubric, you can modify the rubric, and you can even create your own custom rubric from scratch, which is pretty exciting. So the speed and the streamlining of what we can do now are amazing, because it's basically zip up your artifacts, choose your rubric, and run your process. Then you get a very detailed report that gives you artifact-level scoring. It's basically a spreadsheet with columns: each of your rubric criteria gets a score, so you can do statistical analysis on each criterion as well as on the total score. The system is also very cool because it does a two-run-through process: it scores the work to get your criterion-level scores, and then it does a feedback cycle. It looks at the scores, looks at the student evidence again, and gives feedback on the artifacts based on the rubric. And again, it's pretty amazing to see that kind of information, those data points, within one minute. They're super useful to assessors.

J.D. Mosley-Matchett (03:58) So you...

David M DiSabito Jr (04:02) ...to provide evidence for credit.

J.D. Mosley-Matchett (04:05) What's an artifact?

David M DiSabito Jr (04:07) Well, it's a term used in the assessment community to represent evidence, or student work. It could be a paper, it could be a report, it could be an essay, it could be an R script, it could be a Python script; it could be any of those student-generated pieces of work.
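The rubric check and the artifact-level report David describes might look roughly like the sketch below. This is a minimal illustration only, assuming a simple in-memory rubric; the class and function names (Criterion, Rubric, summarize) are hypothetical and are not Walter's actual code or API.

```python
# Hypothetical sketch: a rubric whose criterion points must sum to the total,
# plus a per-criterion summary over many artifacts. Not Walter's implementation.
from dataclasses import dataclass
from statistics import mean


@dataclass
class Criterion:
    name: str
    max_points: float


@dataclass
class Rubric:
    title: str
    criteria: list[Criterion]
    total_points: float

    def validate(self) -> None:
        # The "Rubric Wizard"-style check: criterion points must add up to the total.
        criterion_sum = sum(c.max_points for c in self.criteria)
        if criterion_sum != self.total_points:
            raise ValueError(
                f"Criterion points ({criterion_sum}) do not add up to "
                f"the rubric total ({self.total_points})."
            )


def summarize(scores_by_artifact: dict[str, dict[str, float]], rubric: Rubric) -> None:
    # The report is "basically a spreadsheet with columns": one column per criterion
    # plus a total, so each criterion can be analyzed statistically on its own.
    for criterion in rubric.criteria:
        column = [scores[criterion.name] for scores in scores_by_artifact.values()]
        print(f"{criterion.name}: mean {mean(column):.2f} / {criterion.max_points}")
    totals = [sum(scores.values()) for scores in scores_by_artifact.values()]
    print(f"Total: mean {mean(totals):.2f} / {rubric.total_points}")


# Example: a 100-point rubric whose criteria sum correctly, summarized for one artifact.
rubric = Rubric(
    "Essay",
    [Criterion("Thesis", 20), Criterion("Evidence", 50), Criterion("Style", 30)],
    100,
)
rubric.validate()
summarize({"paper_001.pdf": {"Thesis": 18, "Evidence": 42, "Style": 25}}, rubric)
```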
J.D. Mosley-Matchett (04:30) Well, let's talk about the rapid rubric iteration process. Start from the top and tell me about it.

David M DiSabito Jr (04:38) All right. Well, one of the things we've learned in our research at Western New England University is this concept that we're now calling the rapid rubric iteration process. As I mentioned before, think about how long it takes you to go through 100 five-page papers. With the rapid rubric iteration process, if you wanted to change your rubric and then see the results, traditionally that would take you weeks, or months, or even years to actually change the rubric and get the new results.

J.D. Mosley-Matchett (05:12) Yes.

David M DiSabito Jr (05:16) But with Walter, with the AI integration, what you can do is change the rubric and get instant results. You can have 100 new results in one minute. So you can go through this rapid rubric iteration process, where you make a change and get your results, make a second change and get the results, make a third change and get the results. And what that lets you do is fine-tune your rubric to get to the assessment results you're really looking for, to pull the information from the student evidence that supports whatever your learning objective or learning goal might be.

J.D. Mosley-Matchett (06:07) And that makes a lot of sense. Now, I know you developed Walter primarily for the assessment community, but it sounds like it could be used for grading in the classroom as well.

David M DiSabito Jr (06:19) Yeah, it can, and I've been doing that on occasion with our students' permission. They're actually really interested in how it works as well. They're interested in the feedback it can provide. So it's just a crazy place to be right now in education, working with artificial intelligence.

J.D. Mosley-Matchett (06:32) Mm-hmm.

David M DiSabito Jr (06:46) With the students, with the faculty, with the administration. As part of our working group, I'm getting all kinds of perspectives from faculty that are quite enlightening.

J.D. Mosley-Matchett (07:00) I can imagine. Well, thanks for that update, David. It's really amazing how much AI has improved the way higher education operates over the past few months.

David M DiSabito Jr (07:10) Yeah, it's changing under our feet. We can't keep up.

J.D. Mosley-Matchett (07:14) So true. For more information about AI news and trends that are directly impacting administrators in higher education, please follow InforMaven on LinkedIn and visit our website at InforMaven.ai.
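The two-run-through scoring cycle and the rapid rubric iteration loop described in the episode could be sketched roughly as follows. This assumes a generic text-generation callable (llm) and hypothetical function names and prompts; it illustrates the workflow David describes, not Walter's actual implementation.

```python
# Hypothetical sketch: pass 1 scores each criterion, pass 2 generates rubric-based
# feedback, and the iteration loop re-scores every artifact under each rubric revision.
from typing import Callable


def score_artifact(artifact_text: str, rubric: dict, llm: Callable[[str], str]) -> dict:
    """Pass 1: ask the model for a score on each rubric criterion."""
    scores = {}
    for name, description in rubric["criteria"].items():
        prompt = (
            f"Score the following student work on the criterion '{name}' "
            f"({description}). Reply with a number only.\n\n{artifact_text}"
        )
        scores[name] = float(llm(prompt))
    return scores


def feedback_pass(artifact_text: str, scores: dict, rubric: dict, llm: Callable[[str], str]) -> str:
    """Pass 2: look at the scores and the evidence again, and give rubric-based feedback."""
    prompt = (
        f"Given these criterion scores {scores} for the rubric {rubric['criteria']}, "
        f"write brief feedback on the student work below, tied to the rubric.\n\n{artifact_text}"
    )
    return llm(prompt)


def rapid_rubric_iteration(
    artifacts: dict[str, str], rubric_versions: list[dict], llm: Callable[[str], str]
) -> dict:
    """Re-score every artifact under each rubric revision so versions can be compared."""
    results = {}
    for version, rubric in enumerate(rubric_versions, start=1):
        results[version] = {
            name: score_artifact(text, rubric, llm) for name, text in artifacts.items()
        }
    return results
```

The design point the sketch tries to capture is that the artifacts stay fixed while the rubric is revised: each pass through rapid_rubric_iteration produces a fresh criterion-by-criterion result set, so successive rubric versions can be compared side by side.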