Ensuring That AI Tools Support Student Learning

Monday, September 29, 2025
How can instructors encourage students to use artificial intelligence in ways that develop ethical awareness and enhance critical thinking skills?
  • Instructors face three primary challenges when students use AI: providing all students with equal access, detecting when AI is used, and ensuring that its use doesn’t interfere with learning.
  • To promote the ethical use of AI, faculty must stay aware of technology trends, be honest about when they employ AI, and help students develop human-centric skills.
  • Instructors and institutions should learn why students want to use AI—whether to save time, get better grades, or improve their skills—and support those goals.

 
Does this sound familiar? A struggling graduate student in my Business Ethics class submitted a stellar literature review. The language was not his usual style, the citations and references were abundant (and mostly accurate), and the conclusion was valid and insightful. The work was almost too good to be true.

As a business professor, I am increasingly faced with the challenge of knowing what to do when I suspect student work is authored by artificial intelligence (AI). In this instance, the first question I asked myself was: How do I prove or disprove my suspicions? But I have come to believe the more important question might be: How do I approach AI in the classroom so that it becomes a valuable learning tool for students?

Don’t get me wrong: I support students’ use of AI tools in research projects. I am from the camp that believes AI marks a powerful evolution of our civilization. But I realize that, if I am going to accept that students will use AI in their submissions, I must do so in a manner that not only supports critical learning but also contributes to their personal development in business ethics.

Challenges and Responses

To learn more about how AI is being used in college classrooms, I consulted current research on the topic. I have identified three primary challenges and come up with my own response to each one:

First, student use of AI is nearly undetectable. The conundrum for instructors is that we might suspect a submission has been written by AI because it is so different from a student’s usual writing style and ability level, but we can’t accuse students of using AI because we cannot prove it conclusively. AI detectors such as Winston AI, QuillBot, Originality.ai, and Phrasly.ai sometimes can help identify assignments that were generated by AI. However, these tools may become less useful as large language models and large general models evolve in their abstract reasoning abilities.

To meet this challenge, I do use Winston AI to check student submissions, but my intent is not to accuse or punish. When I find that students have incorporated significant AI into their assignments, I take the opportunity to work directly with them to help them build content that is their own. While students might be a bit defensive at first, they quickly come around when I show them how to craft quality material that reflects their knowledge and builds understanding. It is gratifying to see students gain pride in work that they have created on their own, without the use of AI.

Second, access to AI is inequitable. I know—because I ask—that many of my students use AI to help them complete assignments or author academic papers. These students have ready access to reliable internet service and AI tools such as ChatGPT. But I know that other students can’t afford to pay for subscriptions and might not be aware of free resources.

I respond by making use of the tools available at my school, Colorado State University Pueblo. In fact, many university libraries are embracing technology and can serve as valuable educational partners. At CSU Pueblo, Library Services provides technology resources and instruction for both students and instructors. These include an information literacy coordinator, as well as a dedicated resource center, LINC (the Learning, Innovating, and Networking Center).


Third, AI can interfere with student learning. Instructors know that students might use AI to build content or complete academic work, yet learn nothing in the process. The danger is that students might be motivated by the goal of earning the highest grades possible, when they should be concentrating on mastering the material.

To help students understand best practices for using AI in academic work, I hold introductory Zoom meetings in which I am very clear about my approach to AI. Over four weeks of sessions, I cover these topics:

  • The Value of AI in Coursework: Content, Time Savings, Writing Assistance, (Grades?)
  • How I Monitor Sloppy/Lazy AI Submissions
  • My Favorite AI Research Tool; How to Generate Research Questions
  • How AI Could Impact Your Profession in the Future

Students appreciate these candid briefings, which enable them to use AI tools in ways that provide them with the most support.

Advice for Instructors

While it can be challenging for instructors when students use AI in their coursework, there is a positive side: AI platforms can produce high-quality work, and they have the potential to be powerful learning tools. But that’s true only if educators and curriculum developers ensure that students use new technologies ethically and derive the greatest benefit from them.

Banning AI in education is not an option. We need to be smarter than—or at least as smart as—our students when it comes to using AI platforms and AI-assisted learning. We do not need to be experts, but we do need to be knowledgeable.

Recently, CSU Pueblo’s Hasan School of Business provided all instructors with access to an AI training course offered by Auburn University. This effort significantly leveled the playing field between students and instructors—and between instructors!


It takes time and effort for educators to ramp up their AI IQ. But I would recommend that instructors and curriculum developers take the following three steps as they guide students toward the ethical and positive use of AI platforms:

First, maintain awareness. Stay up-to-date on popular current AI platforms and be an active user of AI technology in your own research. For instance, to enhance personal publication efforts, use academic AI platforms such as Elicit.com. This nonprofit AI research assistant searches 125 million research papers for specific content to help refine research questions.

Second, candidly outline your approach to using AI in research and coursework. Adopt a positive, structured policy for accommodating the use of AI in the classroom. Be up front with students and let them know which AI tools and methods you use. Many state universities already have established their own guidelines related to the ethical use of AI for research and studies, but instructors need to apply these guidelines to individual class dynamics and incorporate the information into course syllabi.

What is most important is to listen to student voices when crafting these policies. Understand why students use AI. Do they want to save time, get better grades, or improve their skills? Support the ways they want to use AI to enhance their learning and develop their abilities.

Third, empower students to be critical thinkers. In course assignments and discussion questions, ask students to answer “what if” questions and to show the “why” behind their answers. Initiate conversations that require them to explain their thought processes. Include discussions and information on human reasoning versus analogical reasoning. When students possess the human skill of critical thinking, they will be prepared to work with AI throughout their careers.

A Look Toward the Future

There is no doubt that AI has significantly impacted teaching and learning. As educators always have done in the past, we must recognize and understand the challenges raised by new technology so we can prepare future business leaders to understand it and use it ethically once they’re in the workplace.

Even more important, we must help students understand how, in the future, they can co-evolve with AI technology in ways that benefit both individuals and humankind.

Authors
Martha J. Wilcoxson
Adjunct Professor, Hasan School of Business, Colorado State University Pueblo
The views expressed by contributors to AACSB Insights do not represent an official position of AACSB, unless clearly stated.