✨Guide: Crafting Your Syllabus's AI Policy
How to set norms for both student and professorial AI use in your university/college course.

[image created with DALL·E 3 via ChatGPT Plus]
Welcome to AutomatED: the newsletter on how to teach better with tech.
Each week, I share what I have learned — and am learning — about AI and tech in the university classroom. What works, what doesn't, and why.
In this fortnight’s Premium edition, I present a guide to help professors develop and express their overall stance on AI use in their courses. This guide covers both student use of AI and professorial use.

🖼️ The Big Picture
In this Guide to developing AI policies for university-level syllabi, I recommend that you, the professor, take a holistic approach that addresses both student AI use and your own professorial AI use. This creates not only an equitable culture of honesty and trust but also an opportunity for dialogue between you and your students about the evolving role of AI in (their) education.
When you develop and refine your student AI use policy, I…
recommend aligning your views on AI integration with institutional policies, departmental and field standards, and course objectives
suggest that you consider how students would ideally use AI to achieve your courses’ objectives before turning to incentivizing students to use AI in those ways (or, perhaps, incentivizing them not to use AI at all)
discuss providing guidelines demarcating responsible AI use from AI misuse, as well as the importance of assessing students' familiarity with AI, ensuring equitable access, and providing AI training opportunities if needed
Regarding your development of your professorial AI use policy, I…
advocate that you start by clearly defining the scope of your AI use cases in connection with your course, ranging from brainstorming to working with personally identifiable student data
summarize some of the insights of our ✨Guide on ethically using AI with student data in discussing a range of data use strategies you might want to deploy, such as restricting AI use to non-student data or obtaining explicit consent when sensitive student data is involved
stress the importance of transparency and flexibility depending on the context.
In both cases, I provide two generic examples of AI use policies, one AI-inclusive and one AI-exclusive, that encapsulate the lessons of the Guide and that can act as starter policies as you develop your own.
Developing Student AI Use Policy
The following steps offer a roadmap for creating an AI policy for your syllabus that effectively manages AI's role in your students’ engagement with your course, ensuring that AI aids in their learning rather than hinders it — and does not make your job more difficult.
This policy should align with institutional and departmental policies, respect the utility of AI use in your educational context, take into account accessibility issues with the relevant AI tools (including, if necessary, training opportunities), guide students on when and how AI tools can be responsibly used in the course, and set clear boundaries that demarcate how you will handle what you believe to be AI misuse.
Let’s cover each of these in turn.
👀 1 - Review Institutional Policies
The first step in developing an AI policy for a syllabus is to thoroughly review your institution’s policies, as well as those of your department. Universities and colleges often have detailed policies regarding technology use, academic integrity, and student conduct, which are essential to understand when integrating AI tools into your course.
I see these constraints — along with more general legal constraints — as guardrails that define the outer limits of your AI policies, regardless of your own views about what should happen within them.
Additionally, departmental guidelines can provide insights into discipline-specific considerations and best practices. Consider whether your AI policy should not only comply with institutional rules but also resonate with the academic culture of your field and the standards of your department, if relevant.
You can add language from both your institution and your department to your policy if you think it is necessary or if it would be helpful to your students.
✅ 2 - Consider Course Objectives Relative to AI Use
The second step involves a careful examination of your course objectives in relation to the potential benefits and drawbacks of AI use. This step is key to determining how AI tools can align with or detract from the educational goals you have set. Begin by listing or reviewing the key learning outcomes or objectives of your course. Then, critically assess how AI tools can enhance or undermine these objectives.
We have covered aspects of this topic at greater length elsewhere, like in my ✨Guide on how to train students to use AI and my ✨Guide on how professors can discourage and prevent AI misuse, but broadly the two ends of the pedagogical spectrum are AI-inclusivity and AI-exclusivity…
Considerations in Favor of AI-Inclusivity
When considering AI-inclusive assignments or assessments, professors face two primary scenarios.
First, there are assignments with learning objectives that explicitly involve an AI tool or a set of them. This scenario is ideal for courses where understanding and utilizing AI tools is central to the learning outcomes. For example, an architecture course might include assignments where students develop skills in using AI for sketching conceptual renderings or mock-ups of the exterior of buildings. If students need to finish your class with skill in using a particular AI tool — or a kind of AI tool — then incorporating that tool into the course is essential.
The second scenario involves more general learning objectives that can be achieved through the use of AI tools. Here, AI is not the central focus but an effective means to an end. For instance, in a philosophy course, ChatGPT could assist in argument analysis, where the primary goal is to understand and critique philosophical arguments, not to learn about AI itself or how to use it. (I can personally attest to the value of using ChatGPT as an interlocutor, especially for students whose groupmates are not the best at questioning or supporting them.)
Considerations in Favor of AI-Exclusivity
AI-exclusive pedagogy involves designing assignments or assessments where students are not allowed or encouraged to use AI tools. This approach is often chosen to encourage deeper engagement with the material, critical thinking, and independent problem-solving skills that students need irrespective of their access to AI tools.
In these cases, it is crucial to have strategies that motivate students to align with these restrictions. Professors should instill in students an appreciation for the value of learning without AI assistance, emphasizing the importance of developing their skills and knowledge.
This approach often requires professors to be innovative in assignment design, ensuring tasks are engaging and sufficiently challenging to be completed without AI (whether in virtue of their format or their content). I covered this topic at length in the aforementioned guide and also in my ✨Guide on designing assignments and assessments in the age of AI.
This step requires relating the pedagogical opportunities and risks AI presents to the educational values you aim to uphold.
The goal is to determine the role that AI would ideally play in your course, such that you can define the terms of students’ engagement with AI to incentivize and encourage them to use it in that way. AI in your classroom should meaningfully contribute to your students’ learning journey without compromising the integrity of that journey.
The result of this step won’t necessarily be any verbiage in your syllabus, but it will determine the contours of everything that you do write into your policy. If it is essential that students use a given AI tool, your policy must not preclude its use, must account for your students’ varying abilities and access to the tool (see the next step), and so on.
🏋️ 3 - Assess Student Abilities and Access, and Consider Training Options
Once you have figured out the role that AI would ideally play in your course — relative to your course’s specific objectives — you need to consider what you can expect of students.
Start by evaluating and estimating the current level of familiarity and proficiency your students have with AI technologies relevant to your course. This understanding is crucial as it influences how you integrate AI into your curriculum. For instance, in a majors-level computer science course, students will likely already have significant experience with coding assistants — perhaps they use GitHub Copilot regularly — whereas students in a first-year humanities course might not have used LLMs like Claude for drafting or ideation.
Next, assess the accessibility of AI tools for your students. This includes considerations of cost, technical requirements, availability, and accommodations.
Finally, you should consider how you can address any gaps in abilities through training and support. This could involve providing introductory sessions on AI tools, offering resources for self-learning, or incorporating AI tool tutorials into your course structure. Consider also the need for ongoing support, such as help desks or discussion forums, to assist students as they navigate and utilize AI tools throughout the course.
Ensuring equitable access is key; all students should have the opportunity to engage with the AI components of your course without burdensome barriers, whether financial, technological, or otherwise.
This step ensures that you are aware of what it will take to get your students equipped to engage with AI in a meaningful way, aligned with the course's learning objectives.
The results of these assessments, plans, and evaluations should be manifested in the language found in your syllabus. For instance, if you expect your students to use an AI tool that comes with a financial cost, you should consider whether you want to assign it as a requirement, like textbooks or other software. Whether this is reasonable depends on their means, expectations at your institution, the cost of the tool, and many other considerations. If, on the other hand, you determine that the cost of a very useful tool would be too great for some students, you need to address the fact that others do have the financial means, thereby granting them a potentially unfair advantage. This may require outright bans or other assignment structures that level the playing field.
🥕 4 - Reflect on Incentivization and Accountability Frameworks
Developing an accountability framework for AI use in the classroom involves a cultural element first and foremost. Practical constraints are important, but setting norms of ethical accountability is foundational.
For instance, you should consider the level of transparency you expect from students regarding their use of AI. This could involve requiring students to disclose their use of AI in assignments, submit chat logs (as recommended by AI for Education), or include reflective components in their submissions where they explain how and why they used AI tools in the ways they did. (In case you missed it, I discussed how to cite AI at some length earlier this year, including what the MLA and APA recommended.) These measures can foster a culture of openness and honesty in the classroom.
Next, it is important to think about practical methods for regulating AI use. While it is natural to think that the best strategy is to use software that can detect AI-generated content — and to signal to students, in the syllabus and elsewhere, that it will be used — there are reasons to worry about this strategy (consider the reservations I have expressed about these methods).
An alternative is to incorporate AI-resistant elements into your assignments. Design tasks that require personal reflection, in-depth analysis, or application of course-specific knowledge that AI tools cannot easily replicate. This approach not only challenges students to engage more deeply with the course material but also upholds the integrity of the learning process. Another strategy is using AutomatED's “pairing” method, which involves pairing AI-susceptible assignments with AI-immune ones to incentivize students to complete the former honestly and earnestly. For instance, you could assign a take-home essay that students then must defend in an oral exam. This method not only helps in maintaining academic integrity but also in assessing students' understanding and application of the course material independently of AI assistance.
With respect to your AI policy on your syllabus, the advantage of these alternatives is that, well, they do not require much of a policy! The structure of your course itself acts as the safeguard against AI misuse, rather than an outright ban or threat of detection.
⚖️ 5 - Reflect on Remediation and Disciplinary Plans
Establishing clear guidelines and consequences for AI use is crucial. Develop and communicate guidelines that clearly outline what constitutes acceptable and unacceptable use of AI in your course. Pair these guidelines with specific consequences for misuse to ensure that students understand the boundaries and expectations set forth in your course.
(Note that a background assumption here is that you have provided support for students who might overly rely on AI due to gaps in understanding or skills. By providing these resources, you can help ensure that all students, regardless of their starting points, have the opportunity to succeed in the course without undue reliance on AI — and are not incentivized to misuse AI and violate your policy because you have not sufficiently prepared them for success. This should be addressed from step 3 above.)
Developing a disciplinary plan for AI policy violations requires a comprehensive and thoughtful approach. A graduated response to violations is wise. Begin with less severe consequences, such as warnings or the requirement to redo assignments for initial or minor infractions. Escalate to more significant consequences, like grade penalties, for serious or repeated violations. This approach ensures fairness and proportionality in response to various levels of infractions. It is also important to consider individual student circumstances when determining consequences. Factors such as intent, level of understanding, and previous record should influence the decision-making process, ensuring that each case is treated with due consideration.
In general, incorporating elements of restorative justice can be beneficial. Encourage students to reflect on their actions and understand the impact of their behavior. This approach promotes long-term understanding and respect for academic integrity and can be more effective than purely punitive measures.
Lastly, maintain a well-documented process for handling violations. Documentation is not only crucial for transparency and consistency but also for any necessary reviews or appeals. This ensures a clear and accountable process that upholds the integrity of your educational environment.
Transparency and consistency in the application of the disciplinary plan are key. Make sure students are well-informed about the guidelines and potential consequences at the start of the course by explicitly stating them in your syllabus AI policy.
🔬 6 - Implement, Get Feedback, Revise
Once the policy is integrated into your syllabus and classroom practices, it is vital to observe its impact and effectiveness. Solicit feedback from students to gauge how well the policy is understood, its effectiveness in addressing the concerns it was designed for, and its impact on student learning and engagement. This feedback can be gathered through surveys, open discussions, or informal feedback during office hours.
Creating a space for students to discuss their understanding of AI use and its implications can facilitate a deeper comprehension of the policy and its rationale, even if student feedback ultimately does not move you to change it.
This dialogic stance is especially important given the range of perspectives students will have on AI use and misuse. Making students feel heard is key, not only because it requires you to listen to them but also because it requires them to articulate how they understand what you express with your policy.
Equally important is being open to revising the policy based on this feedback. It is unlikely that you will land on the perfect policy this semester (or the next, or the next, …). Adaptations might be needed to address unforeseen challenges, evolving AI capabilities, or changing student needs. This iterative process of implementation, feedback, and revision ensures that the AI policy remains relevant, effective, and aligned with your course objectives.
Examples
A Generic AI-Inclusive Example
Student AI Use Policy for English Literature Course
Our classroom will integrate artificial intelligence (AI) as a supplemental tool to enhance our critical engagement with the assigned texts. This policy outlines how AI will support our analyses, discussions, and creative work. Adhering strictly to our university's academic integrity guidelines, we will use AI to expand, not bypass, our intellectual exploration.
Equitable Access and Skill Building: I will provide resources, including tutorials and workshops, to ensure that all students are comfortable using the designated AI tools, which will include ChatGPT and Claude. Our library sessions will include demonstrations of how to interact with AI ethically and effectively.
AI-Assisted Assignments: When we analyze the narrative structure or themes of a novel, AI may serve as a preliminary tool to identify patterns or generate a list of motifs. However, the critical interpretation, the articulation of thematic significance, and the development of thesis statements remain solely our intellectual task. For creative writing exercises, AI can offer suggestions on stylistic choices or assist in brainstorming, but the narrative voice and all final content must originate from you, the student. The guiding principle here is that the core of the intellectual work — for any assignment — lies with you, not AI. AI can help brainstorm, outline, plan, suggest, etc., but no more. If you have questions about this standard, reach out.
Documentation and Transparency: To maintain transparency, all AI-assisted work must be footnoted. In the relevant footnote, a statement must specify the AI's role. For instance, "AI suggested possible themes in 'Pride and Prejudice' which were then evaluated, selected, and reflected upon by the student." Likewise, all AI-assisted work must contain a citation to your chatlog with the relevant AI tool, so that I and other readers can review your use of it. Finally, all AI-assisted work must be completed in a writing program that allows for the tracking of changes, like Google Docs, so that your reader can see the process of creation. If significant portions of text are pasted into a Doc rather than typed, that text will be assumed to have been generated by an AI tool. This honesty ensures that the final submission is a product of your own thinking, even if bolstered by AI in the ways documented.
Ensuring Academic Integrity: Our collective commitment to academic integrity means any student work I determine to contain uncredited AI-generated content will be addressed in an in-person conference. If AI-generated content is submitted without proper disclosure, and if the student has no prior violations, then the response will be to resubmit the assignment with a reflective piece on the role of AI in literary analysis. This is not a punitive measure but an opportunity to realign with our academic values. Only if a student violates this policy twice will they lose credit for the relevant assignment. Further violations will result in referral to the honor system.
Open Communication and Policy Evolution: I welcome your thoughts on how AI is shaping your learning experience. Regular feedback sessions will be incorporated into our schedule to reflect on our AI policy's effectiveness and to adapt it to serve our educational mission better.
A Generic AI-Exclusive Example
Student AI Use Policy for Mathematics Course
In our exploration of the fundamentals of mathematics, we will cultivate an environment of creative problem-solving and independent reasoning. In alignment with our commitment to foundational learning, this course will not rely on AI tools like Wolfram Alpha for in-class or take-home work.
Equitable Access and Maintaining Skills: In a field where solving problems with one's own mental faculties is the primary goal, our classroom will focus on sharpening these skills. I will provide all students with equal access to traditional resources such as textbook excerpts and supplementary problem sets. During office hours, I will be available to discuss and offer guidance on concepts and problems.
Manual Problem Solving Emphasis: We will tackle equations, proofs, and algorithms through pencil-and-paper calculations and in-class discussions without devices. All major assessments will occur in class and on paper. This approach ensures that each student develops a strong, personal understanding of mathematical principles without reliance on computational shortcuts.
Documentation of Work: All submitted work must be accompanied by a step-by-step handwritten account of the problem-solving process, clearly showing the progression from question to solution. This documentation is crucial as it reflects the student's journey through the logical reasoning and analytical thought that is central to mathematics. If you have an accommodation in connection with handwriting, I will be in touch after our first class session to discuss alternative methods of submission.
Ensuring Academic Integrity: Our academic integrity will be upheld through a culture of honesty and personal effort. I trust each student to engage with the material authentically. However, should there be instances where AI-generated or AI-assisted work is submitted, we will address it as a serious breach of our course policy. The initial step will involve a conversation to understand the situation, a failing grade if necessary, and then a remediation plan tailored to address any skill deficits and reaffirm the value of manual problem-solving skills.
Open Communication and Policy Reaffirmation: I encourage an open dialogue about the role of technology in mathematics. While our classroom remains AI-exclusive, I value your perspective on technology's evolving role in education. Our discussions will help us stay connected to the broader conversation about technology in learning while reaffirming our commitment to foundational mathematical skills.
Developing Professorial AI Use Policy
Here’s my pitch for including a professorial AI use policy in your syllabus:
If you place an AI policy within your syllabus that outlines your own expected uses of AI tools in relation to the course, you are contributing to a reciprocal culture, where you hold yourself to a standard of transparency parallel to the one to which you hold your students. By clearly articulating your use of AI, you model the principles of openness and integrity you seek to instill in your students.
You also provide students with the knowledge to critically assess the influence of AI on their learning experience. When you explain how AI might be used in improving your feedback or in lesson planning, you are making your students informed participants in their education.
Such a policy also opens avenues for dialogue and critical reflection. In revealing the role of AI in your own work, you underscore your role not as a unilateral decision-maker but as a facilitator of a collaborative educational experience. This syncs well with a stance on student AI use where you are open to feedback on your policy and to the possibility of revision.
If you decide to include a professorial AI use policy in your syllabi, it should outline how AI tools will be utilized in your teaching and how you will handle student data, if necessary. The following steps provide a method for creating a policy that makes room for you to leverage AI for educational enhancement while remaining mindful of the ethical and cultural issues above.
