How to Teach Students to Think Differently About AI
- Business schools are turning classrooms into AI labs, where we call on our students to question the fairness, inclusivity, and accountability of AI outputs.
- A pedagogy based on inquiry and empathy teaches students to treat AI not as a shortcut, but as a tool requiring creativity and human oversight.
- This approach prepares future leaders to use AI to benefit enterprises, communities, and society.
It was a lively afternoon in a web development class taught by Deepak Singh Parihar, a professor of practice at Woxsen University in Hyderabad. The room buzzed with anticipation as students opened their laptops for what seemed like just another coding session. What they did not realize was that the class would turn into a much-needed lesson about artificial intelligence (AI).
As class began, Parihar said, “Today, I’d like you to try something new. Type a single prompt into this AI tool. Then let’s see what happens.”
Anamika, a curious BBA student, typed confidently: “Create a clean, professional website for a startup selling an organic skincare product, with sections for About Us, Products, Testimonials, and Contact.”
In less than a minute, the AI tool produced a ready-to-use website with a complete responsive design and layout, along with surprisingly detailed content. The homepage carried a welcoming headline, and the “About Us” section told the story of a young startup whose founders were passionate about organic skincare products. The “Products” page neatly displayed descriptions with pricing options. The AI even provided a “Testimonials” section, filled with placeholder customer reviews, and a professional-looking “Contact” form ready for use.
The class gasped. The website was fast, functional, and seemed almost too good to be true.
As the first wave of amazement faded, Parihar reminded the students that this was just the beginning of their learning journey. “The question is not whether AI can do this,” he said, “but how we can evaluate, improve, and use its output responsibly.”
The real learning began when the professor asked Anamika several probing questions:
- “Does the website communicate the company’s personal touch, or does it feel cookie-cutter?”
- “Does the content generated reflect the brand’s true story and values, or is it just generic text?”
- “What privacy concerns should you consider if the site collects customer data?”
- “If this website went live today, how would it affect the business’s reputation?”
- “Would customers trust a business whose story is written entirely by AI?”
This was the moment when innovation met accountability. It was the moment when the students began to think about AI differently—not as a magic tool they can exploit to supplant their own thinking, but as a strategic tool they can use to enhance it.
How to Use AI Effectively in the Classroom
AI is no longer just an idea. It’s already in classrooms, helping students build websites, write essays, and analyze data in ways that were once impossible. But with the excitement surrounding any new technology comes a big question: How can students and faculty use AI in education responsibly?
Both faculty and students must remember to find a balance. Whether students turn to AI to enhance their studies or faculty use it to streamline their teaching and research, they can’t simply marvel at its speed. They must learn to ask a deeper question: Does its output deliver real value?
For students, using AI to enhance their learning and develop their critical thinking requires them to adopt several habits:
- They must view AI outputs as first drafts that they will need to customize with their own ideas and creativity.
- They must question the evidence behind those outputs, as well as their accuracy and fairness.
- Finally, as they use AI to come up with new thoughts and approaches, they must make sure the final product reflects their own voices.
Once these habits become second nature, students will know not only how to use AI from a technical standpoint, but also how to manage and control its output from a strategic and creative standpoint.
For faculty, using AI in the classroom presents different challenges. When AI first emerged, many instructors’ impulse was to ban its use. But that approach is no longer an option. Today, teachers instead must decide how to integrate AI into their classrooms and prepare students for an AI-supported workplace. They can accomplish this objective in several ways:
- By providing assignments, tools, and software that require students to use AI to obtain real-time feedback on their work and to enhance their critical thinking.
- By modeling responsible behavior, showing when, how, and why they use AI in their own work.
- By regularly asking students, “What did you learn from using these AI tools?” This helps learners think about their application of the technology, builds trust in their ability to use it effectively, and reminds them to adopt AI responsibly.
At its best, AI in education isn’t a substitute for hard work; it’s a tool that can be used to encourage critical thinking, practical application, and social responsibility. Well-designed AI-integrated curricula will turn out future leaders who know how to use AI in ways that are efficient, fair, and beneficial to all stakeholders. When teachers and students use AI intelligently, it improves business and society.
Guiding AI-Driven Learning
This way of teaching and learning is based on what Parihar calls the “Pedagogy of Polaris”—a teaching mindset that is especially important in AI-driven learning environments. Named after the North Star, which shows travelers the way, this framework is meant to help students find their own paths in the exciting but sometimes confusing world of technology. It gives students clarity, purpose, and ethical guidance in their AI projects.
The Polaris framework asks faculty to focus on three priorities in their classrooms:
Encouraging students to learn with an objective. Faculty can encourage students to ask themselves, “Why am I using AI?” and “What problem do I want to fix in the real world?”
For instance, Anamika used AI to make a website for a startup, but the pedagogy reminded her that the goal wasn’t just to make a website that worked. It was to tell the startup’s true story in an honest way, include all users in that story, and serve the community.
Acting as ethical guides. AI can get things done fast, but it fails to understand what’s fair, what’s inclusive, or how AI-informed approaches will impact society. Using the Polaris teaching framework, faculty ask students to think about whether an AI output could unintentionally cause harm, promote bias, or reinforce stereotypes. Faculty then ask students to consider how they can make sure that AI solutions are open, ethical, and accountable.
Pushing further exploration and thought. Faculty should ask students to experiment with AI outputs—to try out different methods and think about potential outcomes.
For example, Anamika improved AI’s website design over time and added more human elements such as customer stories, accessibility features, and material that was sensitive to different cultures. She learned that AI is not a tool to be used without thinking, but one to be employed with careful consideration and human input.
When AI Meets Human Judgment
To show students that the true power of technology extends beyond its speed and ease of use, Parihar engaged openly with his students about how he has integrated AI into his own work, drawing on his previous experience in digital marketing.
For example, he told his students about working with a small, eco-friendly skincare brand on a new marketing campaign, similar to the project Anamika had worked on. He explained that members of the brand’s marketing team started the campaign by using AI-driven analytics to predict consumer behavior; then, based on that data, they created advertisements and selected influencers for collaboration.
Initially, the campaign was successful: Engagement rates soared, conversions were impressive, and the data seemed promising. However, as the weeks progressed, the marketing team observed a significant decline in consumer interest, prompting a reevaluation of their strategy. Loyal customers began sharing critical comments about the campaign, such as “This doesn’t sound like you anymore,” and “Your posts feel robotic.”
That was the pivotal moment. The members of the team resolved to identify the factors contributing to the downturn and adjust their approach before it was too late. They realized that while AI had enhanced efficiency, it had compromised the brand’s authenticity. They needed to balance technological advancement with a commitment to their core values.
The team members then reassessed the campaign. While they maintained AI’s insights regarding audience timing and keyword effectiveness, they rewrote the narrative to reflect the founders’ voices, not perfect algorithms. They used data-driven strategies alongside real community stories, behind-the-scenes content, and customer testimonials.
This revised approach not only brought new life to the campaign, but also struck a chord with the brand’s loyal customers. By blending AI with genuine human connection, the company created a brand story that was both innovative and relatable.
Through this story, Parihar showed the class that brands can rebuild trust and keep customers returning by creating real interactions—AI output should be only a starting point. “When we relaunched the campaign, the numbers didn’t just appear impressive; they resonated,” he told the class. “Engagement doubled, but the most important part was that customer trust was restored.” This shift, he added, strengthened customer loyalty and sparked a renewed passion within the team.
After sharing that story, Parihar emphasized that bringing AI and human judgment together is what makes a brand real. In response, Anamika said, “AI helps us be more efficient, but empathy keeps us grounded.”
Everyone in the room stopped talking and thought for a moment. This was a reminder that the goal wasn’t just to use AI tools such as ChatGPT, Wix, or Canva AI Designer well, but to use them responsibly.
At the end of the session, Parihar asked the class, “What will you add to your next AI-built website that no algorithm can?” The class responded, “Heart.”
It was a powerful moment that illustrated the value of human connection in a digital age. The students realized that while technology can enhance their creations, it is their unique perspectives and emotions that will make their work truly meaningful.
Educating for Inclusive Innovation
The memory of that first AI-built website stayed with the students long after that class session ended. On day one, Anamika and her classmates were amazed at how quickly AI could complete tasks that once demanded hours of thinking, coding, and planning. But as they reflected, they realized that the true lesson wasn’t how fast AI could create, but how thoughtfully it should be used.
On the last day of class, Parihar reminded the students, “AI can build in seconds, but wisdom takes years to develop.” Actual understanding, judgment, and moral insight, he told them, come with time and experience. “It will be you, the world’s future leaders, who must know how to use AI to improve efficiency,” he emphasized. “You will have to identify ways to balance speed with careful decision-making, empathy, and social responsibility.”
As future leaders, our students must be aware of their roles in deciding how to use new technology. As business faculty, we have the responsibility to ensure that they understand that new ideas and approaches should benefit people, strengthen organizations, and improve society rather than harm it.
Only by accomplishing that objective in our classrooms can we help ensure that technological advancements have a positive effect on businesses and communities.