Published September 15, 2025
Accurate assessment of student knowledge and abilities is foundational to the work of education. Generative AI tools present an existential threat to that accuracy because students can outsource many assessments to AI, bypassing real learning and thus violating academic integrity. This raises a perplexing question for instructors: whether, when, and how to allow AI use in their courses.
This Instructional Insight provides guidance for addressing that question through the concept of “cognitive offloading,” or allowing tools to perform thinking tasks. It makes sense to permit AI when the tools can help students fulfill course learning objectives in the same way that it makes sense to prohibit AI when the tools inhibit students from achieving the objectives. Ultimately, clarity of learning outcomes and careful assessment design to measure them are more important than ever.
I’m Kelly Ahuna, the director of the Office of Academic Integrity here at UB.
I know I don’t have to tell you how the widespread availability of generative artificial intelligence tools has dramatically affected the work of teaching and learning. With generative AI capabilities built into most platforms, its pervasive availability and ease of use make it a serious temptation for students. This can be problematic when students can’t distinguish between AI use that increases the efficiency of their output and AI use that actually advances their learning. These are two different things. You can produce a product with an AI tool that you don’t actually understand and can’t explain.
When student use of AI mimics learning (or gives students the feeling that they’ve learned because they were able to produce a product) but actually prevents it (because students don’t need to put in the mental labor to grow and develop), academic integrity is in jeopardy.
Academic integrity is simply the principle that students turn in work that they themselves are able to create. Academic integrity is important for many reasons, not the least of which is that when we graduate students, we are certifying their learning to the world. If what we are assessing is not what students have learned, but what AI can do, that certification is false. As a result, our UB diplomas only retain their value when they accurately confirm learning. Additionally, our alumni can only make positive contributions to their larger communities if they are truly prepared to face the demands of their chosen professions.
So I want to make the following recommendation as part of our serious obligation to educate the next generation of global citizens.
As you consider if and how to allow your students to use AI in your classroom, I would encourage you to consider the concept of cognitive offloading. If you are unfamiliar with this idea, cognitive offloading is the practice of reducing the cognitive demands of a task. We have many tools available to help us with this practice: calculators, translation tools, spellcheckers, Google Maps, citation tools, things that take away some of the thinking tasks. Whether cognitive offloading is acceptable depends on the learning objectives you have set for your students.
Let me give you an example. If you are teaching an introductory Spanish class, you are unlikely to allow your students to use a translation tool. The learning objectives of the course require students to memorize the words and their meanings. They simply cannot advance to the next level of Spanish without that knowledge. It cannot be cognitively offloaded. But if you are teaching a capstone Spanish class and students are reading Don Quixote, you will likely not care if they use a translation tool for some of the more obscure words. The use of the tool will not get in the way of successfully meeting the learning objectives, and in fact, cognitive offloading to a translation tool will likely help in this context.
So I encourage you to think deeply about the learning objectives you set for your courses: what students should know or be able to do when the course ends that they don’t know or can’t do when it begins. Perhaps your learning objectives should be revisited in the era of AI, but perhaps not. Once you establish what the learning objectives are, use them as your North Star in making decisions about assessments and about how students can or can’t use artificial intelligence in completing those assessments. As the expert in the course, you are in the best position to judge whether using AI tools will help advance students toward fulfillment of the objectives or stand in the way.
Once you make these decisions, please be as specific as you can with your students about the parameters of AI use in your class. Know that they are getting different messages from each instructor, and this can feel frustrating to them. If you explain your rationale for why cognitive offloading to AI tools is or is not appropriate, students will be more likely to stay within the parameters you set. You can put language in your syllabus, discuss this in class, and revisit the topic before every assessment.
Another consideration is how you yourself rely on AI tools. Students may see faculty use of AI as a double standard. If you prohibit them from using the tools, but you develop your slides or your test questions with the help of AI, students may feel emboldened to disobey the rules you put in place for them. Transparency can be helpful here. If you are transparent about how you, the expert, use the tools, and about why some cognitive offloading in those ways does not diminish the quality of your teaching, you can help students see that true learning and understanding are the prerequisites to AI use.
Ultimately, it is our responsibility to educate students and prepare them for their future world of work. This requires us to investigate the available AI tools and allow their use when we can, for students will certainly need to use AI tools throughout their working lives. But this must be weighed against accurate assessment of what students know and can do. Perhaps reframing the relationship between these two goals through the lens of cognitive offloading can help you as the instructor make some important decisions about if and how to allow AI in your classroom.
