Guidance for Instructors


In the classroom, our first goal is AI literacy, for our students and ourselves. We want our students to know how AI technologies work and how they may be applied appropriately and ethically. Beyond literacy, our students can learn to evaluate existing applications. As they gain skills, they can move on to creating new technologies appropriate to their fields. Step by step, we prepare students to enter their careers AI-ready.

Considerations for Teaching AI at Your University

AI permeates nearly every aspect of our lives. We believe faculty, staff, and students need to be AI literate to effectively weigh the exciting opportunities and complex challenges it presents.

Opportunities

  • Increase efficiency in course preparation: AI can assist instructors with course preparation including structuring lesson plans, aligning student learning objectives with materials, and creating custom content.
  • Generate diverse content: AI can create case studies from diverse viewpoints, generate custom datasets for assignments, and develop role-playing scenarios.
  • Personalize learning: AI can help instructors provide supplemental learning through prompts for student self-quizzing, study guides, and tutoring.
  • Prepare students: AI can simulate real-world scenarios, provide instant feedback, and give learners opportunities to critically evaluate experiences.
  • Support different abilities: AI can assist students with different abilities, as well as non-native speakers, who may need help with interpretation, structure, or alternative formats.
  • Increase access and assistance: AI can extend learning with 24/7 access for students who may not be able to reach their instructors. AI-powered tutoring can offer personalized support outside traditional hours, regardless of geographic location or time constraints.

Challenges

  • Ensure AI literacy and readiness: As with other digital literacies, students, faculty, and staff will need to develop AI literacy skills, including appropriate use of AI tools and critical evaluation of AI output.
  • Anticipate biases and limitations: AI can replicate stereotypes and biases present in society and in its training data, so prepare students for the biased or inaccurate output they may encounter; AI’s accuracy cannot be guaranteed.
  • Acknowledge issues of authorship and ownership: Reflecting on ethical considerations for generative AI use within your discipline is crucial. The texts used to train large language models may have been gathered without consent from writers, artists, and content creators, and purely AI-generated output is generally not copyrightable. Decide how AI use will be cited and acknowledged in your course.
  • Evaluate privacy and security: Protecting student privacy is an important consideration as you select AI tools. You should examine the information students must provide when registering and the data the system collects about the users.
  • Consider equity and access: It is important to ensure affordable and equitable access to the AI tools you select.
  • Prevent academic integrity concerns: It is essential that you provide clear expectations to guide student work in your course. Because generative AI is built to replicate human language, detection software cannot be relied on to identify AI-generated content; current AI detectors are unreliable and biased against non-native English writers.

AI Integration: Three Levels

These general approaches may help address AI use in your course.

AI Permitted

Generative AI tools may be required in this course. Their use is encouraged in some assignments and will be specified in the assignment instructions. Any work produced using generative AI must be cited in your submission.

Some AI

Generative AI tools may be used to enhance some assignments in this course. Assignment instructions will distinguish between human tasks and AI tasks. Any work produced using generative AI must be cited in your submission.

No AI

The learning that takes place in this course requires your unique perspective and human experience, and the use of AI would make it harder to evaluate your work. Generative AI tools are not permitted in this course, and their use will be treated as an academic integrity violation.

Integrating AI Into Teaching Practice

As with any new technology or teaching strategy, incorporating AI into your classroom comes with important considerations and best practices for course design, facilitation, and grading. The following principles and strategies will help you select appropriate tools and methods for implementation.

Explicitly state, in your syllabus, in a Start Here/Getting Started module, or in assignment instructions, when and to what degree students may use AI tools. Follow this transparent assignment template for all assignments to ensure clarity.

Syllabus guidelines may vary by college, department, and course. Consider your plan for integrating AI into assessments and include clear statements in your syllabus and assignment instructions that align with your approach.

AI tools commonly used in the classroom span several categories, ranging from assistive technologies (e.g., autocorrect, autofill, and text-to-speech) to large language models (LLMs) that enable generative and conversational AI.

Consider these best practices as you explore AI tools:

  • Sign up for the tools and familiarize yourself with their main functions. If you plan to ask students to share a conversation they had with an AI tool, verify that this capability is enabled.
  • If you plan to use AI in an assignment, test prompts and assignment instructions in the tool you selected before sharing it with students. Niche subject matter may reveal inaccuracies or limitations.
  • Refer to Fast Path Solutions for updates on data usage and compliance.

Guide students through the basic use of AI tools by including the following:

  • Instructions on how to create an account or access an existing account for the tools used in the class.
  • An introduction to the AI tool through a low-stakes or ungraded assignment, such as this AI Introductory Discussion.

Promote opportunities for students to critically evaluate issues related to privacy, ethics, and biases in AI. Consider the following:

  • Ensure students understand safety and security considerations and/or use a protected account, such as their UF login with NaviGator AI or Microsoft Copilot. Refer to Fast Path Solutions for updates on data usage and compliance.
  • Ask students to evaluate AI output for potential biases or inaccuracies, or purposely design assessments that incorporate evaluation practice, such as this AI Literacy assignment.

In backward design, writing student learning objectives (SLOs) prior to designing assessments helps instructors align assessments and measure student performance against course and program outcomes. Refer to the CITT Course Mapping Resource Guide for assistance with writing SLOs for various learning levels and domains, and with aligning SLOs to program outcomes, assessments, and content.

When AI is introduced in an assignment or course, instructors may need to differentiate between AI capabilities and human skills, as noted in Bloom’s Taxonomy Revisited. This should be considered in the assessment design and explicitly written in the assignment instructions.

Assessment Design and Grading

Building authentic assessments helps students apply their knowledge and experience to real-world situations and problems. Authentic assessments include personal reflections, analysis and problem-solving in case studies, role playing, debates, and simulations.

These assessments are often challenging to prepare, facilitate, and grade, but AI can make them more feasible when used as a conversation partner or to generate content for diverse cases, problems, or datasets.

You can explore a variety of authentic assessments that leverage AI in the Elevated Recipes (Authentic + Artificial) section of the AI Prompt Cookbook.

Transparency in Learning and Teaching (TILT) provides a framework that emphasizes transparency and clarity in purpose, tasks, and criteria for success. Use this transparent assignment template, based on the TILT model, which also includes ways to clarify expectations for AI usage in each assignment.

Scaffold assessments to create frequent, low-stakes opportunities for feedback. Breaking assessments into multiple stages, or scaffolding, gives students multiple opportunities to receive feedback and guidance.

This is also a great strategy to reduce the pressures students may feel with higher-stakes assessments, potentially reducing academic integrity concerns. Scaffolding can also help clarify at what stage(s) of the assessments AI is appropriate to use.

Follow Universal Design for Learning (UDL) principles to optimize the relevance, value, and authenticity of assessments. UDL principles recommend multiple means of engagement, representation, and expression in the teaching and learning experience.

Providing multiple ways for students to engage with course materials, along with multiple submission options (e.g., video, text, and audio), allows students to show more creativity and autonomy in their submissions. Giving students the option to choose a creative submission type (e.g., an infographic or video) could help reduce the use of AI where it is not needed.

Read the Full Guide

The advice on this page is drawn from “Best Practices for Generative AI.” Find additional guidance about the implementation and use of generative AI for instructors, students, researchers, and HR professionals.
