3 Strategies to Discourage AI Misuse

Now is the time to implement them. Here's what to do.

[image created with Dall-E 3 via ChatGPT Plus]

Welcome to AutomatED: the newsletter on how to teach better with tech.

Each week, I share what I have learned — and am learning — about AI and tech in the university classroom. What works, what doesn't, and why.

In this week’s piece, I discuss how you can set up your students for success and avoid the headaches of AI misuse, before sharing some Premium updates and your comments from last week about my genie thought experiment.

🍎 An Ounce of Prevention is Worth a Pound of Cure

As many of us approach the new academic year, there’s no better time than now to focus on preventing and discouraging AI misuse.

By taking proactive steps in our course design, we can create an environment and an incentive structure that makes AI misuse less appealing and less effective. This channels our students’ energy towards learning and saves us time and stress. It’s a lot easier to stop AI misuse before it happens than deal with it after we suspect it has occurred.

Today, I'll explain three key strategies from my comprehensive ✨Guide on How Professors Can Discourage and Prevent AI Misuse — our most popular Premium piece by far, for unfortunate reasons — that require you to plan ahead to implement them effectively:

  1. Pair susceptible assignments with those that are not

  2. Design AI-immune assignments

  3. Kick up the motivation meter

For those who are ready to move beyond the AI misuse conversation: don’t worry! Next week I'll be focusing on positive uses of AI in the classroom. I'll also be diving deep into this topic in my webinar on September 6th on training your students to use AI (pre-order before August 31 for 50% off).

Strategy 1: Pair

One powerful approach to discourage AI misuse that has worked for me and many of my clients is to implement pairing strategies in your course design.

This method involves linking a first assignment that might be susceptible to AI misuse with a second one that cannot be completed by students who haven’t achieved the learning objectives of the first assignment — including those who misuse AI as a shortcut. (Often, this strategy requires you to weight the second assignment more heavily.)

By planning these pairings before the semester starts, you can create a course structure that naturally incentivizes students to engage with all assignments in the way you think is most conducive to their learning.

Here are some effective pairing strategies to consider:

  1. Take-home written assignments paired with in-class presentations: 

    Require students to present and defend their written work in class or in an oral exam. This approach ensures that students deeply understand the content they've written about. (Note this is compatible with them using AI to gain this understanding, in some cases.)

  2. Problem sets paired with in-class application exercises:

    After students complete problem sets at home, have them apply similar concepts to new problems in class without access to AI tools.

  3. Take-home assignments paired with follow-up discussions: 

    Schedule small group discussions where students elaborate on or discuss the ideas in their take-home work with their peers. This not only verifies their understanding — their peers can be tasked with evaluating their take-home work or their expression of it — but also provides valuable feedback opportunities.

For example, in my philosophy courses, I have often paired substantial take-home essays with 30-minute oral examinations (only feasible, I have found, when I have fewer than 80 students). Students write their essays independently, but then must discuss and defend their arguments in a one-on-one session. This pairing not only discourages AI misuse in the written portion but also deepens students' engagement with the material.

As I have written about at length before, pairing essays with oral exams takes a bit of planning and effort from me, but it is a pedagogical “home run” that my students love — and I enjoy grading a lot more.

When implementing pairing strategies, consider these tips:

  1. Clearly communicate the pairing structure in your syllabus so students understand expectations from the start. I flag to my students repeatedly that the assignments are weighted towards the latter one, so they understand that they need to take the first one seriously to succeed at the one that matters more.

  2. Ensure that the paired assignments are closely related in content and skills assessed. Ideally, the first one builds to the second one.

  3. Balance the workload so that the pairing enhances learning without overwhelming students. You’re assigning two assignments — or a two-parted assignment — not one.

Strategy 2: Immunize

Another way to prevent AI misuse is to create assignments that are inherently resistant to AI-generated responses.

As I discuss in the Guide, there are two main approaches:

  1. Format-based immunity:

    • Handwritten submissions or in-class writing exercises.

    • Assignments that use version history tracking in word processors.

    • Oral presentations or recorded video submissions.

  2. Content-based immunity:

    • Assignments that require engagement with unique, course-specific content that AI tools struggle with natively — and cannot effectively be supplied with via prompting.

    • Tasks that demand high field-specific standards that AI tools struggle to meet, regardless of how they are prompted.

    • Assignments that require students to apply concepts to personal experiences or current events that are challenging to prompt.

Obviously, format-based AI immunity is easier to achieve, although it may come with other costs.

Determining whether a given assignment is more or less AI immune in virtue of its content takes time, experimentation, and some skill with the relevant AI tools.

Remember, the goal isn't to make assignments unnecessarily complex, but to create tasks that genuinely assess and develop the skills you want your students to gain by “forcing” them to deploy these skills earnestly without the use of AI — if you think this is the best way for them to learn those skills. When designing these assignments, always keep your learning objectives at the forefront.

Strategy 3: Motivate

One of the most effective ways to prevent AI misuse is general good practice: motivate students to value the learning process itself.

Before the semester starts, take some time to reflect on your course content and how you can make it intrinsically motivating for students. Consider these questions:

  1. How can I demonstrate the real-world relevance of my course material?

  2. What engaging activities or discussions can I incorporate to spark genuine interest?

  3. How can I design assignments that students will find meaningful beyond just earning a grade?

For example, in my philosophy courses, I've found success in connecting abstract concepts and arguments to students’ personal worldviews. I might design an assignment where students apply ethical theories to a dilemma from their own lives.

This doesn’t prevent them, in its own right, from misusing AI to complete the assignment. Yet, it lowers the probability that they do so, especially when it is part of my overall pitch to them to use my course as an opportunity to grapple with what matters to them.

Another crucial aspect of motivation is creating a classroom culture that values learning and integrity. Before the semester begins, consider how you'll communicate these values to your students, perhaps via your syllabus AI policy.

Of course, it's important to note that a student's desire to use AI for assignments doesn't necessarily indicate a lack of interest or unwillingness to learn. Some students might view AI as a tool to enhance their learning or to work more efficiently. Depending on the context, this may mean that AI use doesn’t equate to AI misuse.

I will cover ways to lean into the use of AI next week and in my webinar on September 6th on training your students to use AI.

✨ Updates to Three Premium Guides

With the semester approaching, I am updating some of our ✨Premium pedagogy Guides. Here’s the schedule:

August 21 - AI Misuse Guide, AI Policy Guide

August 28 - AI Assignment Guide

Remember: accessing the Premium Archive — with all 15 Guides and Tutorials, plus our 3 Premium-only GPTs — and the next year’s worth of new pieces will cost $99 starting on September 1, so be sure to lock in the current price today if you haven’t already.

✉️ What You, Our Subscribers, Are Saying

Last week, I presented the following scenario:

Suppose you had a genie who you were certain is (almost) always correct when he judges that a bit of writing is AI-generated.

Would you rely on him as an educator to judge your students’ work?

What if his reasons for his judgments are completely opaque to you?

That is, suppose you have to tell any student who you accuse [of AI misuse on the basis of the genie’s judgments] “I don’t really know how he came to his judgment, but I know he is reliable. He’s been proven!”

Then I asked subscribers if they would rely on the genie if his evidence were opaque to them and their students. Only a dozen votes came in, but many voters had something to say…

Would you rely on the genie if his evidence was opaque to you and your students?

“Accusing someone of cheating is a pretty big accusation so I would want to have some proof of how the AI arrived at that conclusion.”

“I want to know how it works.”

“How can any argument be supported without evidence? Without evidence, it opens the door to counterclaims of biased, stereotyped, emotionally-loaded intuition. Having faith in God is hard enough, having faith in an AI detector is ... ”

“The consequences of false positives is too high. Also, naive AI usage will result in low grades anyways.”

Anonymous Subscribers (each quote is from a different person)

🤖 Our Course Design Wizard GPT

As I update the aforementioned Guides, I am continuing to work to improve our Course Design Wizard, a custom GPT (which is now freely usable by all) that can assist you in updating your courses for the AI era.

It can help you customize the 3 strategies from the start of this piece to your specific course context, it can design assignments, it can help draft course AI policies, and more.

In essence, it embodies the guidance of many of our Premium Guides.

Give it a whirl if you’d like an interlocutor as you work to update your courses for the next semester — and the challenges and opportunities of the AI era.

And if you want to learn how to make custom GPTs like it — and see some of its secret sauce — check out our ✨Tutorial on How to Build (Better) Custom GPTs.

What'd you think of today's newsletter?

Graham

Expand your pedagogy and teaching toolkit further with ✨Premium, or reach out for a consultation if you have unique needs.

Let's transform learning together.

Feel free to connect on LinkedIn, too!