Interested in crafting your own policy? Try our AI prompts to draft and revise course-level and assignment-level AI policy language.
AI policy prompt (course-level)
This prompt is inspired by Stanford’s Worksheet for creating your AI course policy. Copy the prompt below and paste it into your preferred generative AI tool to begin crafting your guidance. Be sure to revise any outputs thoroughly for your personal and institutional context:
Template
“You are helping me draft a clear, learning-centered AI policy for my online course. Please follow these design principles throughout this process:
- Align to standards: The policy should align with institutional academic integrity policies and relevant disciplinary or professional standards.
- Be specific: The policy should clearly guide student behavior and include examples of what is and is not allowed.
- Preserve student agency: Frame AI as a learning aid, not a shortcut. Emphasize transparency and disclosure rather than detection.
- Emphasize accountability: Reinforce that students are responsible for their learning, decisions, and submitted work.
- Acknowledge the gray: Recognize that AI tools and expectations evolve, and invite questions and ongoing dialogue.
Use an interview format to gather what you need. Ask one question at a time, wait for my response, and adapt follow-up questions as needed. Do not ask me to provide course details upfront; elicit them through the interview.
When the interview is complete, draft a concise, syllabus-ready AI policy statement (2–5 short paragraphs) based on my responses and the design principles above.
Begin the interview
1. Course context and goals
What is the course title, level (undergraduate or graduate), and discipline?
What are the most important skills, habits of mind, or competencies students are expected to develop?
2. Assignments and assessments
What kinds of work do students complete in this course (e.g., papers, projects, discussions, exams, problem sets)?
Which assignments are most important for demonstrating students’ own thinking or skill development?
3. Alignment with standards
Are there institutional policies, accreditation requirements, or professional or disciplinary norms that should shape how AI is used in this course?
If you are unsure, how is AI generally viewed or used in your field or profession?
4. Your stance on AI in learning
How do you view generative AI tools in relation to learning in this course?
Where might AI use support learning, and where might it interfere with your course goals?
5. Specific boundaries and examples
For each major assignment type:
- When, if at all, should AI tools be allowed (e.g., brainstorming, outlining, revising, debugging)?
- When should AI use be limited or not allowed?
Please provide at least one example of acceptable AI use and one example of unacceptable use.
6. Transparency and disclosure
How should students acknowledge or disclose their use of AI tools?
What level of transparency feels appropriate for this course?
7. Student accountability
What responsibilities do students retain when using AI (e.g., accuracy, originality, ethical judgment, authorship)?
How does this connect to their development as learners or future professionals?
8. Accuracy and critical evaluation
How should students be expected to evaluate, verify, or question AI-generated content in this course?
9. Privacy and ethical considerations
Are there limits on entering personal, confidential, proprietary, or course-restricted materials into AI tools that students should know about?
10. Acknowledging uncertainty and change
How would you like to communicate that AI tools and expectations may change over time?
How can students be encouraged to ask questions or seek clarification?
Final step
Using my responses, draft a clear, student-facing AI policy statement suitable for inclusion in an online course syllabus. The tone should be supportive, transparent, and instructional, not punitive, and should reflect the design principles listed at the start.”
AI guidance prompt (assignment-level)
Even a strong syllabus policy cannot anticipate every learning context, especially when assignments differ in purpose and format. Providing assignment-by-assignment AI guidance in addition to an AI course policy helps answer practical questions like:
- Can AI be used for this assignment?
- For what purpose?
- What must be disclosed?
- How should that disclosure occur?
Copy the prompt below and paste it into your preferred generative AI tool to begin crafting your guidance. Be sure to revise any outputs thoroughly for your personal and institutional context:
Template
“You are helping me draft clear, assignment-specific guidance on AI use for one assignment in my course. This guidance should complement (not repeat) my existing course-level AI policy and translate it into concrete expectations for this specific assignment.
Use an interview format. Ask one question at a time, wait for my response, and adapt follow-up questions as needed. Do not assume details; elicit them through the interview.
When the interview is complete, produce student-facing assignment guidance that clearly answers:
- Can AI be used?
- For what purpose(s)?
- What must be disclosed?
- How must it be disclosed?
The guidance should be written in plain language and suitable for inclusion directly in the assignment instructions.
Begin the interview
1. Assignment context
What is the assignment (title and brief description)?
What core skill(s) or learning outcome(s) is this assignment primarily meant to assess?
2. Practice of core skills
Have students already practiced these core skills on their own earlier in the course, without AI support?
If yes, how does this assignment build on that practice?
3. Fit between AI use and assessment goals
Would AI use support or undermine what this assignment is meant to assess?
Why?
4. Appropriate stages for AI use
At which stage(s) of the assignment, if any, could AI use be appropriate (e.g., brainstorming, planning, drafting, revising, checking work)?
At which stage(s) should AI not be used?
5. Purpose of allowed AI use
If AI is allowed at any stage, what specific purposes is it intended to serve (e.g., idea generation, feedback on clarity, code debugging)?
What purposes are explicitly not allowed?
6. Disclosure expectations
What level of disclosure do you expect for this assignment?
What exactly should students disclose about their AI use (tools used, prompts, stages, extent)?
7. Method of disclosure
How should students disclose their AI use (e.g., short statement, appendix, reflection, citation, prompt log)?
Where should this disclosure appear in their submission?
8. Common misconceptions
What might students incorrectly assume about AI use on this assignment that you want to clarify explicitly?
9. Connection to the syllabus policy
How does this assignment-level guidance connect back to your course-level AI policy?
Are there any course-wide rules or principles you want to reinforce here?
10. Assessment and feedback
How will AI use (or non-use) factor into how the assignment is evaluated or discussed in feedback, if at all?
11. Opportunities for questions
Before submitting, how will students have an opportunity to ask questions or seek clarification about AI use for this assignment?
Final step
Using my responses, draft concise, student-facing AI guidance for this assignment that:
- Clearly states whether AI can be used and for what purpose
- Specifies what must be disclosed and how
- Anticipates common misunderstandings
- Aligns with my syllabus AI policy
- Reinforces student responsibility and learning goals
The tone should be clear, supportive, and instructional, not punitive.”