Faculty work independently during AI Coffee Talk.

AI Guidance for Teachers

While the fundamentals of teaching and learning have not changed, we recognize that students must be equipped for a world increasingly shaped by AI. Whether or not we choose to use AI as a tool in the classroom, it’s important to acknowledge its growing influence and work collaboratively to chart an ethical, equitable, and meaningful path forward.

Two key concerns we often hear from educators about the rise of AI in education are the potential for learning loss and questions surrounding academic integrity. We’re here to support you in navigating these challenges and exploring AI tools with purpose and confidence. As the technology evolves, we look forward to learning from your experience and updating our resources to reflect your perspective.

Students must be prepared to lead in an AI-infused future.

Report of the Yale Task Force on Artificial Intelligence (2024)

Yale’s Approach to AI

In June 2024, the Yale Task Force on Artificial Intelligence released its report and recommendations.

The University’s primary approach is to rely on instructors to decide how and whether students should use AI in their courses, while providing a set of centrally available tools and resources for learning more about how AI impacts teaching and learning. As outlined in Yale’s Undergraduate Regulations, any violation of course policies is considered an academic integrity violation. For questions about how to handle potential violations of AI course policies, please reach out to the Executive Committee (yc.executivecommittee@yale.edu).

We encourage you to share our AI Guidance for Students page with your class, which includes helpful ideas for how to leverage AI effectively to support learning. Students should also understand that violations of AI course policies are considered academic integrity violations, as outlined in Yale’s Undergraduate Regulations, and will be treated as such.

For questions about research support, please visit Yale Center for Research Computing.

AI Course Revision Grant

Interested in revising your course to incorporate AI into the teaching and learning experience?

Apply

Share Your Thoughts and Ideas

We want to hear about how you are integrating AI tools into your lessons or adjusting your teaching in light of them!

Email us

Guidance for Instructors

The Poorvu Center recommends crafting a statement that clarifies your course policies on the use of AI. If we assume our students are already using AI tools in various ways, it is helpful for them to understand how you, as an instructor, want them to use or not use these tools and the rationale behind those choices. Students need to understand what you consider to be an academic integrity violation, as your policy may differ from a colleague’s. The simplest way to state a policy is to address it in your syllabus, but you can also have a discussion with your students in class. For additional resources, we offer examples of AI policy statements from various Yale instructors.

Instructors should be direct and transparent about which tools students are permitted to use and the reasons for any restrictions. If students are permitted to engage with AI tools, be explicit about how you expect them to use those tools and how they should disclose that use.

Some questions for you to reflect on when writing an AI course policy are:

  • What are your personal views on AI?
  • What do your department and field say about the use of AI?
  • For which course objectives could AI be useful or harmful?
  • Are there skills students will need after Yale that AI could help them develop?
  • Will you be using AI in your instructor role in the course (e.g., providing feedback, writing emails), and if so, how will you communicate that to students?

If you are co-teaching or working with teaching fellows, ULAs, or tutors, be sure that everyone on your teaching team is on the same page regarding your AI course policy.

There is no tool or method to detect AI use with any certainty. Turnitin has acknowledged a reliability problem with its detection software, and Yale has not enabled this feature in Canvas. However, if you are concerned about a student’s use of AI, this can be reported to the Executive Committee.

If you suspect misuse of AI, as discussed in the Handbook for Instructors of Undergraduates in Yale College, it is good practice to talk to the student about your observations of the level of work or any changes in the work that may have caused you to suspect AI use. This becomes a discussion focused more on standards and the writing process and less on accusations.

Authentic Assessments

Authentic assessments are assignments that:

  1. have a sense of realism to them.
  2. are cognitively challenging.
  3. require authentic evaluation.

Authentic assessments also align with practices that prioritize student learning and make it harder to generate generic responses with AI tools, including:

  • Asking students to use resources that are not accessible to generative AI tools, including many of the amazing resources in Yale’s collections.
  • Using alternative ways for students to represent their knowledge beyond text (e.g., facilitating a discussion, recording a podcast episode). Consider adopting collaboration tools like VoiceThread for students to use for their assessments. 
  • Asking students to share their work and respond to questions about it in-person, either as a presentation to the class or a short discussion with you or another member of your teaching team.

AI Literacy

Long & Magerko (2020) define AI literacy as a “set of competencies that enable individuals to critically evaluate AI technologies, collaborate effectively with AI, and use AI as a tool online, at home, and in the workplace.”

If students are expected to lead in an AI-infused future (Report of the Yale Task Force on Artificial Intelligence, 2024), then it is crucial for instructors to help students develop the skills necessary to engage with these technologies. We have listed examples of AI literacies from Ng et al. (2021). As you review these, consider how these literacies can be integrated within your classroom (e.g., learning outcomes, assessments, in-class activities).

  • Knowing the basic functions of generative AI and understanding how generative AI applications work.
  • Applying AI knowledge, concepts, and applications to different contexts and fields.
  • Situating the creation and evaluation of AI within higher-order thinking skills (e.g., evaluate, appraise, predict, design).
  • Understanding human-centered considerations (e.g., fairness, accountability, transparency, ethics, environment, safety) when using AI.

You can review our AI Literacy handout to learn more about these examples or engage with Yale Library’s AI Literacy framework. In addition, you can review this webinar series from the University of Kent.

Available Tools

The best way to learn about AI tools is to try them yourself. We recommend that faculty and students use Yale’s provided tools when exploring the potential of AI in the classroom because these tools offer security and privacy that are not guaranteed by other publicly available platforms. These tools will also allow you to explore how these programs work and help you better understand how your students might use them.

Please visit Yale’s official AI webpage to learn more about all of Yale’s provided AI tools, news and insights on AI, and updated guidance and resources.

Data Training

If you choose to use tools outside of those provided by the university, you risk the material you submit becoming part of the program’s training data and development. Yale’s Clarity Platform and CoPilot with Data Protection encrypt chat data securely, and your data is not used to train any of the AI models. It is up to the instructor to let students know about these risks and to provide alternative assessments if they do not feel comfortable.

FERPA Considerations

Use caution and do not submit your students’ work directly to outside tools, as this violates their intellectual property and could be a FERPA violation. Yale’s Clarity Platform states that you can include most high-risk data in the chatbot but must refer to the data classification guidelines before proceeding. Please consult with a Data Steward (CAS authentication required) if you have any questions or concerns related to AI and data.

How to Use AI

There are many ways students and instructors are using generative AI tools, from lesson planning to creating personalized “tutor bots” that quiz students on course content. If you are getting started with exploring the potential of these tools, here are some recommendations:

  • Ask the AI tool to...
      • Write a response to an assignment from your class to see the quality of answer it can provide.
      • Teach you about a subject you are already familiar with and then quiz you at the end of the lesson.
      • Help with a task you’re working on, like choosing the next step toward completing a project.

The response you get from a chatbot changes depending on how you ask your questions, so it’s worth experimenting to craft the right phrase, or “prompt.” Here are some recommendations on how to craft an effective prompt when starting out with AI (a brief sketch of how these elements fit together follows the list):

  • Craft a persona when writing the prompt (e.g., “You are a patient/student/instructor/philosopher,” etc.).
  • Define constraints of knowledge (e.g., only look at research from the past 5 years).
  • Provide it with examples (e.g., type or attach examples of data or responses).
  • Refine your prompts through iteration. You can do so by responding to the chatbot as if you are in conversation.
  • Always analyze the validity of findings/outputs.
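
If you would like to experiment with these ideas programmatically rather than through a chat interface, the sketch below shows one way the elements above (a persona, constraints, context, and iterative refinement) might map onto a scripted conversation. It is a minimal illustration, not a recommended workflow: it assumes the publicly available OpenAI Python client (openai>=1.0), an API key in your environment, and a placeholder model name; Yale’s provided tools, such as the Clarity Platform, have their own interfaces.

    # Minimal sketch: assumes the public OpenAI Python client (openai>=1.0),
    # an OPENAI_API_KEY environment variable, and a placeholder model name.
    from openai import OpenAI

    client = OpenAI()

    messages = [
        # Persona: frame the role you want the tool to take.
        {"role": "system", "content": "You are a patient tutor for undergraduates."},
        # Constraints and context: say what you want and how it should be delivered.
        {"role": "user", "content": (
            "Teach me the key causes of the French Revolution, then quiz me. "
            "Ask one question at a time and wait for my answer before continuing."
        )},
    ]

    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(response.choices[0].message.content)

    # Iterate: append the reply and a follow-up, then call the model again,
    # refining the prompt as if you were in conversation.
    messages.append({"role": "assistant", "content": response.choices[0].message.content})
    messages.append({"role": "user", "content": "Make the next question more challenging."})
    followup = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(followup.choices[0].message.content)

As with any chatbot output, analyze the validity of what comes back before relying on it.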

See Examples from Yale Faculty

We have curated examples of the novel ways Yale instructors are engaging their students with AI.

AI Teaching Examples

AI Prompt Literacy

The University of Michigan provides some helpful strategies when it comes to AI prompt literacy.

Prompt Literacy in Academics

Benefits and Limitations

Some of the possible benefits of using AI in the classroom can include: 

  • Clarity of expression when trying to formulate ideas or responses (helpful for ELL students)
  • Finding patterns with vast amounts of data
  • Synthesizing and/or summarizing large amounts of information quickly
  • Generating code
  • Brainstorming ideas
  • Role-playing and simulation
  • Personalizing instruction or feedback 

Some of the possible limitations or challenges of using AI in the classroom can include:

  • Inaccurate or fabricated outputs that require careful verification
  • Over-reliance on AI tools and the potential for learning loss
  • Questions surrounding academic integrity when course policies are unclear
  • Data privacy, intellectual property, and FERPA considerations
  • Human-centered concerns such as fairness, bias, transparency, and environmental impact

AI in Education - Your Thoughts Requested

Please tell us if or how you are integrating AI into your teaching.  Your experience and perspective will help strengthen our support and resources.

Take the Survey

Recommended Reading

New articles on this fast-developing topic are appearing every day. Below is a selection of thoughtful pieces that address various elements of AI in teaching and learning.