If you choose to use AI tools other than those provided by the University, you risk the material you submit becoming part of that program’s database and development. Yale’s Clarity Platform and Copilot with Data Protection encrypt chat data securely, and your data is not used to train any of the AI models.

It is up to the instructor to make students aware of these risks and to provide alternative assessments for students who do not feel comfortable using these tools.

Student Privacy Rights

Your use of AI tools in the classroom must comply with the Family Educational Rights and Privacy Act (FERPA), which protects the privacy of student educational records. In particular, you cannot require students to create external accounts for tools Yale does not directly license.

Privacy in an AI Era: How Do We Protect Our Personal Information?

Read Stanford University’s Institute for Human-Centered Artificial Intelligence (HAI) article on protecting personal information in the AI era.

AI at Yale | Guidance

Review a growing collection of Guidance resources curated on the AI at Yale website.


Your Data

Remind your students that your intellectual property and data deserve the same respect you grant theirs. Consider creating a syllabus policy along these lines:

Sample Language

Do not submit instructor-created materials to AI tools without explicit permission. This includes: 

  • Course readings, handouts, and lecture materials 
  • Assignment prompts and rubrics 
  • Exam questions or problem sets 
  • Any proprietary content shared in class 

Uploading these materials to AI platforms may violate copyright, expose sensitive academic content to commercial entities, and compromise the integrity of course materials for future students.