Yale instructors are considering how to redesign elements of their courses to address ChatGPT and other AI platforms. Examples across the disciplines demonstrate some of the novel ways instructors are engaging their students as this technology evolves.
AI Guidance for Teachers: broader guidance on ChatGPT for instructors
Humanities
Edward S. Cooke, Jr., the Charles F. Montgomery Professor of American Decorative Arts in the Department of the History of Art
Cooke invited students to use ChatGPT to write labels for objects from the Yale University Art Gallery. He assessed students on the quality of the prompts they used to develop the AI-generated labels and on their critiques of those labels, demonstrating how an instructor can support students in becoming critical consumers of generative AI tools.
Alexander Gil Fuentes, Senior Lecturer II & Associate Research Faculty of Digital Humanities
In SPAN 846: Introduction to Digital Humanities II: Algorithmic Approaches to Culture, Fuentes gave his graduate students the option to engage with ChatGPT for their final paper instead of completing a computational final project. His assignment description included: “Option B: The ChatGPT Final Paper. This final paper may be the strangest final paper you have submitted for a grade in your whole life. You won’t be writing this one alone. You won’t be writing it with another person either, not directly in any case. You also won’t even be writing the first draft. No, AI will do that. The way this works is simple: pick a topic related to your current research. Using GPT3, or GPT4 (if it’s out already) you will have the machine write the first pass. Your job is to correct and edit the work to bring it up to your standards. You will submit the original AI draft and your final version.”
Anna Iacovella, Senior Lector I in Italian, Language Program Director of Italian Studies
This AI-related assignment for ITAL 140: Intermediate Italian II sits within a unit on media and technology. The unit asks students to compare how different generations around the world, and their own age group in Italy, use news media, websites, and social media. The assignment incorporates a ChatGPT tutorial video from YouTube, adapted with Edpuzzle, which poses comprehension questions for students to answer in Italian about the limitations and advantages of using ChatGPT, among other related topics.
Ryan Wepler, Director of the Graduate Writing Lab at the Poorvu Center for Teaching and Learning
This short, ungraded assignment for ENG 429: Writing Humor asked students to reflect on which elements of their writing could be replicated by ChatGPT and which were unique to a human author. Wepler hoped that a deeper understanding of the difference between what a machine can draw from a corpus of existing texts and what a human can create would nudge students away from the predictable and toward originality in their writing. He shared that he is “not sure this was ultimately the outcome, though their reflections on seeing how a machine wrote a paper they’d already authored were deep and compelling—and [they] talked in class about the implications of what they figured out for the practice of writing.”
Social Sciences
Justin Farrell, Professor of Sociology at the Yale School of the Environment
Farrell asked students in ENV 770: Western Lands & Communities Field Clinic: Research to Practice to use ChatGPT to dig deeper into the problems they wanted to solve in their course research projects. In this AI-specific assignment, each student posed a question relevant to their problem statement and research question to ChatGPT and then annotated what it produced, focusing on the ways the AI-generated write-up might be inaccurate, misleading, incomplete, or unethical. Students also considered how ChatGPT helped them refine their research project or see their problem in a new way.
Jennifer Gandhi, Professor of Political Science & Global Affairs
For the first assessment in her course, GLBL 5040: Comparative Politics for Global Affairs, Gandhi asked ChatGPT the following question: “Are democratic elections good at translating citizens’ preferences into public policy?” She then asked her students to improve upon the AI response by writing an essay identifying which features of the ChatGPT response may be superficial, which important points are missing from the response altogether, and how they would elaborate on them.
Joanne Lipman, Lecturer in Political Science
Lipman redesigned an op-ed assignment in her Media and Democracy course to include the use of AI. Students were required first to prompt AI to write their op-ed; then to interrogate it (fact-checking every line and writing a memo about what worked and what didn’t); and finally to rewrite it themselves. Students found that interrogating the AI’s output helped them spot flaws in its facts and arguments, and after examining those flaws, they could see similar shortcomings in their own reasoning. AI helped improve the students’ own original thinking and analysis.
Brian Macdonald, Senior Lecturer and Research Scientist in Statistics & Data Science
Macdonald has encouraged students in S&DS 361: Data Analysis to use ChatGPT but has cautioned them that it can struggle with technical topics where precise language is essential and small changes in wording can make a statement incorrect. In this question, part of an assignment on the assumptions and appropriate uses of Poisson regression, students were asked to identify two errors (one of which students often make themselves!) in ChatGPT’s response when it was asked about those appropriate uses.
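One commonly cited pitfall when applying Poisson regression is fitting it to counts whose variance is much larger than their mean (overdispersion). The sketch below is a hypothetical illustration, not taken from Macdonald’s assignment: it fits a Poisson model to simulated overdispersed counts with statsmodels and computes a rough dispersion check; the variable names and simulated data are assumptions made purely for illustration.

```python
# Hypothetical illustration (not from the course materials): fit a Poisson
# regression and check the equidispersion assumption, i.e. that the
# conditional variance is roughly equal to the conditional mean.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)

# Simulate counts with extra variance (negative binomial), so a Poisson
# model is mis-specified even though the mean structure is correct.
mu = np.exp(0.3 + 0.8 * x)
y = rng.negative_binomial(n=2, p=2 / (2 + mu))

X = sm.add_constant(x)
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

# Rough overdispersion check: Pearson chi-square divided by residual degrees
# of freedom should be near 1 for a well-specified Poisson model.
dispersion = poisson_fit.pearson_chi2 / poisson_fit.df_resid
print(f"Estimated dispersion: {dispersion:.2f} (values well above 1 suggest overdispersion)")
```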
Lisa Messeri, Associate Professor of Anthropology
Messeri published an essay in Anthropology Now on her experience experimenting with GPT in the classroom; it includes sample exercises, GPT responses, and student work.
David Morse, Writing Program Director at the Jackson School of Global Affairs
Morse created a comprehensive guide for faculty on Designing Writing Assignments in the Era of AI. Although specifically written for instructors in the Jackson School of Global Affairs, the guidance is applicable across most disciplines. If an instructor is brand new to AI and primarily uses writing-based assessments, this guide is a great place to start.
James Sundquist, Lecturer at the Jackson School of Global Affairs
For the midterm exam in GLBL 275: Approaches to International Security, Sundquist asked ChatGPT to answer the following prompt: “Does anarchy mean that war is inevitable?” He gave ChatGPT’s answer to his students and asked them to write their own essay evaluating the response, with an emphasis on what it misses, what it gets wrong, and how it could be improved. He noted that he learned it is important to ask an essay question that requires ChatGPT to take a position rather than provide a summary; students then have the opportunity to evaluate the response in a principled, evidence-based way.
Sciences
David Moore, Assistant Professor of Physics
In PHYS 678: Computing for Scientific Research, a graduate-level physics course, Moore and his colleagues encourage students to use ChatGPT when developing code: to write simple functions, to correct and debug syntax, and in general to complete their exercises in class and on the homework. In less than a year, generative AI tools of this type have become important for developing code more quickly, including among working physicists, so Yale Physics faculty want students to learn and take advantage of these tools early in their graduate careers. In many cases, Moore and his colleagues have found that this allows students to focus on the physics ideas, problem solving, and algorithms covered in the course while reducing the development time for their code. Students are encouraged to make use of these tools, but they must treat them like discussions with classmates or other web resources: they can consult them freely, but they must understand and write up their code independently and cannot directly copy and paste code from any source, including ChatGPT.
Jakub Szefer, Associate Professor of Electrical Engineering & Computer Science
Szefer and PhD student Sanjay Deshpande co-authored a short paper, Analyzing ChatGPT’s Aptitude in an Introductory Computer Engineering Course, about ChatGPT’s performance in EENG 201. They report that, as a text-only tool, ChatGPT cannot handle questions with diagrams or figures, nor can it perform hands-on lab experiments. It does well on quizzes and short-answer questions but can confuse students by generating incorrect, yet still plausible, human-sounding answers.