These questions reflect common concerns that TEP, UO Online, and the Office of the Provost have heard from faculty about designing assessment in the age of AI, student use of AI, and faculty use of AI in teaching. If you have other questions not reflected below, please reach out to us!
Questions
- What are my baseline instructional responsibilities with respect to AI?
- Can I assign/insist on student work with AI (Copilot, other platforms)?
- Can I penalize student work I think is generated by AI?
- Can I use AI detectors to identify student work generated by AI?
- Can I move toward in-person assessments that are not impacted by AI (live exams, etc.)?
- I use Canvas quizzes and exams: what should I keep in mind?
- Can I use GenAI to create course materials?
- Can I use AI to generate feedback on student work?
- Should faculty use of GenAI be disclosed to students?
- What policies and resources supporting student engagement with AI should I know about?
- Are students allowed to put my course materials into AI (Copilot, others)?
1. What are my baseline instructional responsibilities with respect to AI?
All UO instructors should (1) have a syllabus policy statement explaining what GenAI use is allowed and disallowed in their course. These decisions should be made by individual faculty and their units based on the learning objectives of the course. (See sample course policy statements.) Instructors should also (2) act to preserve the integrity of their assessments from AI misuse by staying aware of the degree to which AI can produce passable responses to assignments, Canvas quizzes, etc. This may require regular revision of assignments as part of engaged teaching.
Teaching note: Students are navigating a wide variety of instructional approaches to AI, so clarity and open conversation are important. Students report appreciation for instructors’ efforts to engage them in purposeful discussion about GenAI by, for example, explaining the rationale for their specific policies as grounded in essential learning and discussing how AI is impacting their disciplines and the professional fields students are poised to enter. Faculty who assign, or simply allow, work with AI should anticipate teaching and modeling positive uses.
Consultation and support are available to faculty and graduate instructors through the UO Teaching Engagement Program and UO Online (request a consultation).
2. Can I assign/insist on student work with AI (Copilot, other platforms)?
Instructors can assign engagement with AI through UO’s established software systems: specifically, UO’s Copilot with Data Protection or coursework platforms used by units (for example, the ALEKS system used by the Math department or Westlaw CoCounsel used in the Law school). In these cases, students don’t need to make individual accounts with third-party vendors.
Like any instance in which instructors ask students to engage with third-party digital tools, instructors need to consider the security, accessibility, and cost of these tools. When students are required to create an account with external vendors, instructors should convey to students what information will be captured/stored; allow pseudonyms or controlled access to student contributions; and provide alternatives for students who do not want to use the service. (See External Vendor Digital Tools in Teaching.)
Teaching note: Some UO students report significant ethical concerns with using AI tools, including about their environmental impacts. Some instructors who work with GenAI do live demonstrations or provide AI outputs to the whole class for analysis to decrease the number of AI prompts students need to make as individuals.
3. Can I penalize student work I think is generated by AI?
You can, of course, penalize student work that falls short of your grading criteria (say, the quality of the citations or the specificity of the engagement with the prompt). But as with any case of suspected academic misconduct, you may not apply a grade penalty for suspected AI misuse (which may be considered plagiarism, cheating, or fabrication under UO’s Student Conduct Code) without a finding of academic misconduct by the Office of Student Conduct and Community Standards (SCCS). Use the Reporting Academic Misconduct form to initiate SCCS’s process.
4. Can I use AI detectors to identify student work generated by AI?
UO does not provide a central AI detection tool because such tools are notoriously inaccurate and known to flag the writing of nonnative speakers of English as AI-generated. (The University of San Diego Legal Research Center’s “Generative AI Detection Tools” guide thematizes and links to multiple relevant articles.) The Office of Student Conduct and Community Standards does not use AI detection tools and notes that a detection-tool finding is not dispositive of student AI use; additional information would be required to substantiate a violation of the Student Conduct Code.
5. Can I move toward in-person assessments that are not impacted by AI (live exams, etc.)?
Many UO instructors are moving toward in-person assessments including proctored exams, oral exams, and live writing. Some UO students express appreciation for faculty efforts to prevent AI misuse and, thus, better maintain the integrity of course grades.
Still, the rise of easily available AI tools comes at the same time a record number of UO students have accommodations with the Accessible Education Center (AEC): “the 3,559 students registered with AEC in 2023-24 represents an 8.67 percent increase from the prior year, and an almost 50 percent increase in [the] last five years,” AEC reports.
Students’ accommodations related to assessment must be honored as long as they don’t fundamentally alter the nature of the course (a determination only AEC can make). AEC runs a Testing Center to help ensure your students with accommodations can complete in-person assessments with modifications. AEC and the Teaching Engagement Program can brainstorm teaching adaptations to preserve universal design in the age of AI, including offering students choices in assessment types.
6. I use Canvas quizzes and exams: what should I keep in mind?
Colleagues who teach with online exams in Canvas should know that Google Chrome (the recommended browser for Canvas) includes AI integrations (Google Lens, or the recently paused Homework Help). These make it easy and almost instantaneous for students to see AI-generated answers to multiple choice, true-false, and even essay question prompts.
Solutions to this challenge could involve:
- moving to handwritten or oral exams, or using Scantron for efficient scoring of some question types. (A central Scantron is available to all faculty in the Biology department’s office, 77 Klamath Hall. Learn more from the IS Knowledge Base article Scantron Services. Note that on-demand support is not available.)
- allowing AI use on low-stakes Canvas quizzes that are important preparation for in-person assessments to motivate student learning.
- allowing students to take Canvas exams in person, while printing parts of the exam on paper (case studies, word problems, etc.), so AI can't easily access this material.
- shifting toward questions that reference specific course experiences, asking students to recall, reflect, or analyze locally significant moments from lectures or discussions ("a key objection our class raised to idea X was...").
- moving to other assessment types (See our page on Assessing Learning as a Process.)
Lundquist faculty are participating in a pilot of the Respondus LockDown Browser, and UO is bringing a new technology-aided assessment tool, like Gradescope, to campus in the months ahead with special funding from the Budget Advisory Group; this tool will make it possible to efficiently grade a wider range of handwritten exam answers.
7. Can I use GenAI to create course materials?
AI can be a useful timesaving tool for instructors, provided that the instructor’s expert judgment is always in play. Brainstorming unit topics, developing lesson outlines and alternative sets of multiple-choice questions for makeup exams, and creating images for slides (provided AI use is cited to the same standard to which we would hold students accountable) can all be positive uses of AI for instruction.
8. Can I use AI to generate feedback on student work?
The provision of feedback on student work is a core part of faculty responsibilities, including as a condition of meeting expectations of research-informed teaching, one of UO’s teaching quality standards. While no existing UO policies specifically prohibit faculty from using AI to generate feedback on student work, faculty must retain their human judgment in the assignment of grades.
Faculty who are seeking efficiencies in feedback and grading must keep in mind:
Data Privacy, IP
Faculty should not put student material that includes FERPA-protected information, like the names of individual students or class lists, into any AI tool, including UO Copilot for Web with Data Protection. While Copilot for Web with Data Protection is the recommended tool for educational uses because UO users do enjoy some data protections, this platform is not yet FERPA compliant; the Office of the Registrar is considering the question of FERPA compliance now.
Moreover, faculty should consider students’ intellectual property rights to much of their coursework and their data privacy, taking care not to enter any student IP into any non-UO platforms that would add students’ work into its data or might compromise personal or sensitive information.
9. Should faculty use of GenAI be disclosed to students?
Faculty should promote credibility and trust by modeling best practices. Specifically, faculty who choose to use GenAI as a tool in creating teaching materials or assessing student performance should adhere to the same standards they ask students to follow. Proactive disclosure may be appropriate, and student questions about faculty use of GenAI should receive timely, candid responses.
10. What policies and resources supporting student engagement with AI should I know about?
UO’s Student Conduct Code has been updated to redefine plagiarism as:
Presenting another’s material as one’s own, including using another’s words, results, processes, or ideas, in whole or in part, without giving appropriate credit. […] Plagiarism also includes the submission of material generated by others. This may include artificial intelligence (AI) content generators and generative AI tools such as ChatGPT; websites with a question-and-answer feature such as CourseHero, Chegg, and Bing; assistance from tutors or online language translators that results in unoriginal work; and work that is purchased or otherwise prepared by another individual.
AI misuse may also constitute Cheating or Fabrication in the Student Conduct Code.
The mandatory IntroDUCKtion modules on Academic Integrity (and a shorter module for faculty to adapt and use in individual courses) have been updated to address AI by defining misuse of AI as academic misconduct and encouraging students to talk with faculty about their AI course policies.
UO Libraries is working on student-facing modules on AI literacy that could be integrated into the IntroDUCKtion materials and made available for faculty to adapt and use. Moreover, UO Libraries has added an AI Service Point to its walk-up supports in the Knight Library Commons as a pilot beginning in Fall 2025.
11. Are students allowed to put my course materials into AI (Copilot, others)?
Because individual faculty, or in some specific cases the University of Oregon, hold the intellectual property rights to educational materials, students who add those materials to an AI tool other than Copilot for Web with Data Protection without the faculty member’s explicit permission could be in violation of IP rights and the Student Conduct Code (IV.1.g).