AI in a Final Exam? "An absolute disaster."

An accounting prof explains why and shares his AI use cases.

[image created with Dall-E 3 via ChatGPT Plus]

Welcome to AutomatED: the newsletter on how to teach better with tech.

Each week, I share what I have learned — and am learning — about AI and tech in the university classroom. What works, what doesn't, and why.

In this week’s piece, I share an interview with an accounting professor covering how he uses AI to help his students reflect, how he uses AI to save time and produce course content, what challenges he’s faced (including a disastrous AI-enabled final exam), and why he thinks AI-independent skills and knowledge are key to effective AI use — and why we need to train students to have them.

🗣️ An Accounting Professor on AI

Today, I share an interview with one of our very own Premium subscribers, Jose Luis Ucieda Blanco, who is a Professor of Accounting at the Faculty of Economics and Business Sciences of the Autonomous University of Madrid.

“Using artificial intelligence effectively requires that you know what you're doing and what you're looking for.”

[…]

When I let my students use AI in a final exam, “it was a disaster, Graham. An absolute disaster. I mean, 50% of my students failed the exam, and I think I was very generous. Let me explain to you why.”

[…]

“I am confident that artificial intelligence is here to stay. You can resist it, you can reject it, you can have all kinds of objections to it, but it's definitely here to stay.”

Jose Luis Ucieda Blanco


Graham Clay: Can you please tell our readers what courses you teach, as well as the pedagogical context of these courses (the subject, the content, the class size, the institutional environment, etc.)?

Jose Luis Ucieda Blanco: I teach at a public university – what we call a "state" university in Spain – in Madrid. And I mainly teach financial accounting classes. Recently, I have been teaching sophomores and juniors, and this semester I taught two classes addressing different accounting topics at different levels. I typically have between 50 and 60 students per class. The sophomore class is part of a bilingual degree, so it's actually taught in English, but the other one is in Spanish.

I also teach another class on social skills – nothing to do with accounting – to a wide variety of students from mathematics to biology. It has only 15 students, and most of them are in the last year of university. What I try to teach them, in a very practical and applied way, is different social skills, like teamwork, leadership, communication, time management, and so on.

Graham: Can you describe a specific way that you've integrated AI tools into your teaching of these courses?

Jose: One of the things I try, in terms of in-class experimenting, is letting the students choose whatever they want to use. I tell them that ChatGPT has a free option and the same for Copilot. (I would tell them about Claude, but Claude cannot be accessed from Spain, or at least it wasn't when I was using it. Of course, you can use it through a VPN, but I can't ask the students to do that.)

Then I say something like, "Today we're going to learn about liquidity." I put the students into small teams, and I give each team one concept that they're going to be working on during class. So, for instance, I would tell one team: "Your team is going to work on solvency." And I would tell another team: "Your team is going to work on debt." I tell them to go and work for five, seven minutes on a definition of that concept. I tell them: "You can use Google, any artificial intelligence, or any other source to formulate a definition."

They have to spend a few minutes just looking for different definitions and then work to agree on one, as well as be able to justify why they chose it. It's not just a matter of copy-pasting, but also a matter of discussion within each team about their choices. Finally, we have a little discussion after that, as a whole class. They explain why they chose those definitions, and I just try to pick out the most important and relevant ideas about those concepts.

Next, my question is: "How do you measure it? For example, when you have to measure the solvency of a company, how do you do it?" So, they have to go back and use Google or artificial intelligence to find how it is measured. Of course, they're going to find a lot of stuff. I circulate in the class helping and asking: "How are you searching? What problems are you facing? Etc." They have a lot of ideas on how to do it.

After this period, we have another discussion with the whole class. I write on the blackboard what they found, how they found it, as well as their reflections on these metrics.

My last question to them is this: "Now you have a way to measure your liquidity. Why don't you apply that to a real company?" I then suggest one company. They are encouraged to go to the Internet, find the company's financial statements, and apply/measure their metric and come up with a number. We then discuss what they find as a whole class, as before.
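[Note from Graham: to make the measurement step concrete, here is a minimal sketch of the kind of ratios Jose's teams end up computing. The function names and all figures are my own inventions for illustration, not taken from any real company's statements.]

```python
# Illustrative ratios a team might compute; all figures are invented.

def current_ratio(current_assets: float, current_liabilities: float) -> float:
    """Liquidity: can the company cover its short-term obligations?"""
    return current_assets / current_liabilities

def debt_to_assets(total_debt: float, total_assets: float) -> float:
    """Debt/solvency: what share of assets is financed by debt?"""
    return total_debt / total_assets

# Numbers pulled (hypothetically) from a balance sheet:
print(current_ratio(30_000, 20_000))    # 1.5
print(debt_to_assets(45_000, 100_000))  # 0.45
```

The point of the exercise is what comes after these one-liners: defending the definition, the formula, and what the number means for a real company.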

Graham: What have you learned from your integrations of AI about its utility? What are some positive ways in which it has impacted your teaching?

Jose:

  1. Improved Lesson Planning: One way I use it, before my social skills class, is to ask it for something very applied and practical, like activities, that I can do with my students. It gives me lots of ideas. I tell it, "I like this. I don't like that. Let's focus on this. Develop that." So, that has really helped me, especially when I am not an expert in a topic. I can get better ideas on how to involve students in an activity where they can actually practice what that social skill is about.


    Similarly, I use it in my financial accounting class and ask it, "Give me an idea to help the students understand this concept with an example, with an activity, or something like that."


    I will say that it often comes up with role playing activities – which is actually a very good dynamic if you can do it – but I can't do that with 60 people.

  2. Streamlined Basic Assessment Evaluation: I use it to review essays to make sure they have the necessary parts. It streamlines the process of finding parts or spotting missing ones. However, one issue is that it is overly positive. It always claims the student is doing a great job, or that they are doing excellently in this part. When I read that part, it's like, "There's no way. What are you talking about?" So, I have to be careful.


    Relatedly, when students submit to me a PowerPoint presentation, it can quickly review 30-40 slides and get some highlights to help me quickly analyze it.

  3. Faster Rote Emailing: Every couple of months or so, I get someone, like a PhD student, asking me to be their PhD tutor or mentor. If I don't have the time right now to do it – because I'm not interested in the topic or I'm already tutoring others, whatever the reason – I need to quickly convey this to them. So, I copy their email and I say to the LLM, "Please say 'no' in a polite way. Try to be as kind as possible, encouraging them to look for someone else." Whether or not they – or I – write perfect English or perfect Spanish, and regardless of what country they come from, its response email is absolutely perfect. And I waste less time while giving a very good answer.

  4. Ideation: After I have finished a meeting – or when I am just going about my day – I think of ideas. So, I get my phone, I open a new Google Doc, I turn on voice input mode, and I just start talking: "So this is my idea, and I want to do this, and I want to do that." Or, after a meeting: "I had a meeting with this person, we talked about this, I think next steps could be that, he said or she said whatever." Then I copy all that text and put it into ChatGPT or Gemini or whatever. I direct it to rewrite it in a formal way that is more logically constructed. That's been very helpful to me.

Graham: What challenges have you faced in incorporating AI? Has anything gone poorly as you try to use AI?

Jose:

  1. Gaps in AI Literacy: The number one challenge is to make students understand that artificial intelligence is not just going to give you the right answer all the time. When I talk to students, they basically say, “The artificial intelligence is helping me a lot. It's very useful.” And that is a bad thing. [Note from Graham: a lot more on this below.]

  2. Keeping up with Change: Another one is that technology is rapidly changing – it's rapidly evolving. Something I used at the beginning of the fall semester of this year, like in September, is now out of date. New things have come out and it's difficult to adjust. It's hard to keep up with that.

  3. Biases in Training Data: When I try to prepare exercises, like multiple choice questions about a topic, it's not very straightforward. You can't just say, “Give me ten questions on financial accounting for inventory.” It doesn't work well. I've noticed, for instance, that most of the accounting information is taken from the United States' accounting rules. If I ask it to respond in Spanish, it's based on information about Mexico. I have to be very specific about the Spanish terminology I use. It works better to give it specific content via upload.

  4. Struggles with Advanced Assessment Evaluation: I've been trying lately to use it to assess the responses of my students to multiple choice questions, subtract points for wrong answers, and look for the quality of each question (the difficulty and the discrimination index). I'm stuck at that point. I'm not getting there yet.
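[Note from Graham: for readers unfamiliar with the two item-quality metrics Jose mentions, here is a sketch of how they are commonly computed in classical test theory. The function names, the numbers, and the upper/lower grouping scheme are my assumptions for illustration.]

```python
# Classical item-analysis sketch; names and figures are illustrative.

def difficulty_index(num_correct: int, num_students: int) -> float:
    """Share of students who answered the item correctly (higher = easier)."""
    return num_correct / num_students

def discrimination_index(upper_correct: int, lower_correct: int,
                         group_size: int) -> float:
    """How much better the top-scoring group did than the bottom-scoring group."""
    return (upper_correct - lower_correct) / group_size

# Say 45 of 60 students got an item right, and of the top and bottom 15
# students (ranked by total exam score), 14 and 6 answered it correctly:
print(difficulty_index(45, 60))         # 0.75
print(discrimination_index(14, 6, 15))
```

The arithmetic is trivial; the hard part Jose describes is getting an LLM to carry it out reliably across a whole answer sheet.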

Graham: Are there skills your students need to develop to effectively use AI tools? If so, how do you develop these skills?

Jose: I think we definitely need, as instructors, as professors, to learn how to teach the use of artificial intelligence. And I think critical thinking is critical: we need to know how to practice it and how to train students in it, not just as an independent skill, but as it applies in the context of your course and your use of artificial intelligence. Students need to be aware that not everything that comes out of it is good, so they need to learn how to actually use it and evaluate it. You need to know exactly what you need and what you want so that you can prompt it to be more specific and concrete.

In short, you cannot just let them use it. You have to teach them how to use it first. Otherwise, they just eat up everything they read.

To put this into context and illustrate my point, there was one semester where my students could use artificial intelligence throughout. I told them, "You know what? I'm going to let you bring your own laptop to the final exam. You will be able to have all the notes, all the presentations, all the content, all the material of the class. Plus, you can use the Internet and you can use artificial intelligence, like we've been using it in class. You are absolutely free to do it." I was experimenting. I thought to myself that it would be good to see what happens.

The final exam had 30 multiple choice questions (which I prepared with the help of artificial intelligence, of course). I also gave them a case involving the financial statements of Costco, and I directed them to prepare a report on how the company did financially in the last year, according to the statements. (And I gave them guidelines, like a few milestones or points that they needed to address, like revenues, margins, etc.)

It was a disaster, Graham. An absolute disaster. I mean, 50% of my students failed the exam, and I think I was very generous. Let me explain to you why.

One of the things they had to do was talk about the profitability of the company. They then went to ChatGPT and asked it to compute the return on assets – the profitability of the company. Their answer was that profitability in the last year was 3%, this year is 4%, so the company is doing great.

But if this is all you can say, this is limited in value. We already knew this; this is something that ChatGPT or computers can already do. It's in the statements. What value are you adding to the analysis? You need to explain why the company was able to do that, looking at all their information.
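[Note from Graham: the ratio the students asked ChatGPT for is simple to compute. With invented figures in the same ballpark as the ones Jose quotes (not Costco's actual statements), it looks like this; the number itself is the easy part.]

```python
# Return on assets with invented figures, for illustration only.

def return_on_assets(net_income: float, total_assets: float) -> float:
    """ROA = net income / total assets."""
    return net_income / total_assets

prior = return_on_assets(3_000, 100_000)   # 0.03, i.e. 3%
latest = return_on_assets(4_200, 105_000)  # 0.04, i.e. 4%
print(f"{prior:.0%} -> {latest:.0%}")      # 3% -> 4%
```

Explaining *why* the ratio moved – margins, asset turnover, one-off items – is the analysis the exam was actually asking for.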

To me, that was a turning point. And I kept seeing similar unreflective reliance on artificial intelligence again and again.

Graham: And so, do you think their inability to give deeper and more insightful answers to "Why is this company profitable?" has to do with their inability to prompt AI to answer that question? Or do you think the problem is that they lack independent understanding or skills?

Jose: It absolutely requires independent understanding and skills. Using artificial intelligence effectively requires that you know what you're doing and what you're looking for. It's going to help you find information faster, but – in this case – you need to understand what the concepts around profitability are, how it's computed, and how it's measured. You need to keep digging a little bit more, not just focusing on one ratio or number, but actually being able to link all the analysis that you're doing with the company to get an overall picture.

Funny story... After this happened, I was talking with my children, who are 19 and 21 and in university, and conveying my frustration.

My eldest says, "Dad, it's so clear what happened." And I said, "Oh, please, illuminate me a little bit."

And he said, "If you were a student – a 20-year-old student who had, like, seven final exams in a two-week period – and in only one of them the professor tells you that you can use all the materials and artificial intelligence, while in the other exams you are not allowed to, what class, what course do you think you would spend less time studying for?"

And I said, "Oh, damn." He was absolutely right. I mean, if I were a student, I would optimize my time.

I think I would have had more success if I had made it very clear that I still had high expectations for the exam, and that the students still need to have all the material understood.

On the flip side, last week I gave the same exam – multiple choice questions plus the case – but I told these students that they would not be allowed to bring their laptops or use artificial intelligence at all. Only paper, pencil, and a calculator.

They were so mad. They said, "Why did you change it?" They literally even said, and I quote, "With this change, now I have to study all over again."

And I said, "Oh, please, that is what you should have done at the beginning."

All of this has been very illuminating for me. Artificial intelligence is something that really is new, not just for them, but also for us. We have to be careful how we teach them how to use it because they'll always try to go and find a shortcut.

✨A Timely Premium Interlude

Coming July 3 - Guide: How to Train Students to Use AI

A step-by-step, comprehensive guide to the pedagogy of teaching students to use AI in the context of university courses, weighing in at over 8,000 words — with roughly half dedicated to a wide range of specific examples of course objectives, sub-objectives, in-class group activities, solo in-class activities, take-home assignments, and assessments, all focused on training students to use AI in field-specific ways.

🗣️ The Interview Continues

Graham: How do you foresee the role of AI evolving in higher education, particularly in your field or fields like yours? What new AI developments are you most excited about?

Jose: It's changing rapidly. Very rapidly. Since this January, we have had two or three new models, and each one is an improvement on the previous ones.

It's hard to keep up, but I am confident that artificial intelligence is here to stay. You can resist it, you can reject it, you can have all kinds of objections to it, but it's definitely here to stay.

I understand that if I were 63 or 65, I probably would not feel as much pressure to care about it, but that's because of my career stage, not the technology itself. Whether I am here or not, it's something that is going to be here. It's going to stay. So, we must try to adjust and deal with it. I think it is similar to when the calculator first arrived in schools.

The new AI development I am most excited about is definitely agents. I really want to look into this further, but I haven't had the chance (most of them are paywalled, and I haven't had the time yet to go and test them). But I think agents will help us a lot.

Graham: Can you give an example of why you're excited about agents, for accounting or in general?

Jose: I think agents are going to help automate tasks much faster and more easily. Whether it's going through essays or reviewing the answers of a multiple-choice test, agents will be able to help us complete these tasks much faster.

I want an agent Copilot right now. I want to go first thing in the morning and say, "Hey, Copilot, do I have any urgent emails I need to look at? Do I have email from a student that requires my attention before lunch or something? And if they're looking for a meeting with me, check my agenda and why don't you just provide a suggestion to them?" Something like that.

I think that's something that's coming sooner rather than later. I realize Microsoft is on its way to this, and Google can do it to some degree as well with Gemini being able to look in all your documents.

And sure, I understand the privacy risks. But, in my case, I want to allow the Copilot to do it. I want you to help me.


✉️ What You, Our Subscribers, Are Saying

“Yes. I teach at a law school and lawyers (or at least large law firms) have embraced generative AI and the potential for its use in providing legal services. A law school would be remiss if it did not prepare students to use the tools they will use following graduation.”

Anonymous Subscriber

“No. I am sure I will use it once in class, mostly out of curiosity about how well AI answers a given question and to see if it creates a teachable moment. But AI seems too general-purpose relative to the courses I will teach (Finance 101 and 201) to spend time on.”

Anonymous Subscriber


“Yes. We are gearing up to use AI throughout our courses.”

Anonymous Subscriber

“Yes. I teach various marketing courses — Principles of Marketing, Consumer Behavior, Marketing Research and Marketing Communications.”

Anonymous Subscriber

“Yes. I'm already using it to develop assignments and create React things.”

Anonymous Subscriber

“Yes. Having a teacher modify a lesson plan as she creates it.”

Anonymous Subscriber

✨Recent and Upcoming Premium Posts

Coming July 3 - Guide: How to Train Students to Use AI

July - Tutorial: How Professors Should Use Gemini 1.5 Pro

July - Tutorial: How Professors Should Use GPT-4o

Graham

Expand your pedagogy and teaching toolkit further with ✨Premium, or reach out for a consultation if you have unique needs.

Let's transform learning together.

Feel free to connect on LinkedIn, too!