Integrating AI Strategically in Challenging Times

17 March 2026
Illustration by iStock/Yurii Karvatskyi
Now that business schools have moved beyond the “panic stage” of AI adoption, it’s time to craft comprehensive strategies that will work long term.
  • As AI accelerates, business schools must move from ad hoc experiments to strategic, scalable integration.
  • Schools can adopt the 4P framework, encompassing People, Policy, Pedagogy, and Platform, to embed AI consistently across teaching, research, and operations.
  • By making thoughtful choices about staff development, governance, curriculum, and tools, business schools can make AI adoption more inclusive, ethical, and sustainable.

 
Over the last two years, artificial intelligence (AI) has entered business education at speed. Business schools initially responded with urgency: drafting academic integrity statements, updating misconduct codes, and experimenting with detection tools or chatbots. Given how quickly AI reshaped education, that “panic phase” was understandable.

But we are now in a different moment. AI is no longer a future issue. It is reshaping how organizations make decisions, design products, and manage people right now.

At the University of Leicester School of Business in the U.K., we recognized the strategic importance of AI early on, creating a dedicated position: the dean of AI. When recruiting for the role, we did not simply look for someone with technical competence in AI. We looked for a strategic educational leader who could translate fast-moving industry developments into scalable, evidence-informed practice.

This individual also needed to have demonstrated leadership in curriculum innovation, made an impact through scholarship on AI-enabled business education, developed credibility with staff and students, and built networks connecting academic practice with external employer ecosystems.

In this sense, the dean of AI plays a “bridging” role in the school. As someone who works alongside the dean of education, dean of research and enterprise, and head of executive education, this individual ensures AI is embedded coherently across teaching, research, and operations. Creating these bridges reinforces the idea that AI cannot succeed as a standalone initiative; a single role, center, or tool is not enough. To be meaningful, AI must be woven into everyday practice.

Integrating AI widely, however, does not mean buying or adopting every new platform. Instead, it means asking: How can we integrate AI in ways that are responsible, equitable, and achievable with the resources we have? One practical approach is to organize an AI strategy around four interconnected dimensions: People, Policy, Pedagogy, and Platform.

People: Building Confidence Before Capability

Any AI initiative can succeed or fail depending on the people who implement it. If faculty and professional staff feel anxious, judged, or overwhelmed, AI will remain peripheral. The starting point is therefore confidence-building, not procurement.

Create spaces for safe experimentation. No school should expect all of its academics to become AI experts overnight. Instead, schools can take these actions to build faculty’s confidence gradually:

  • Host best-practice sessions where early adopters demonstrate how they use AI to design activities, case studies, or feedback in specific disciplines.
  • Offer AI clinics and informal drop-in spaces (on campus or online) where staff can work with a learning technologist or “AI champion” to evaluate and refine their assessments, teaching plans, and grading processes.
  • Focus on one concrete use case at a time. For example, a single clinic might focus only on “using AI to generate alternative exam questions” or “using AI as a co-designer for group projects in finance.” Don’t ask faculty to adopt too many novel approaches too quickly.

These structures normalize experimentation. Staff see that the school does not expect them to “get it right” immediately; they can try, reflect, and iterate.

As they do, a dean of AI provides visible leadership for professional development, building staff confidence and capability through training pathways, communities of practice, and “safe-to-try” spaces that normalize experimentation and sharing of best practices.


Support AI for assessment, teaching design, and feedback. Where policy allows, schools can help staff use AI to:

  • Draft and refine assessment briefs and rubrics, saving time on first drafts while keeping academic judgment firmly with faculty.
  • Generate formative feedback templates that can be tailored quickly, especially in large cohorts.
  • Pilot AI-assisted marking or moderation in tightly defined contexts. For example, staff can use the technology to tackle short-answer questions with clear criteria while maintaining human oversight for complex judgment.

In challenging times, this attention to people is essential. It signals that AI is not about replacing staff. It’s about supporting them so they can design richer learning experiences and spend their time on what matters most.

Policy: Creating a Living Framework

AI governance in business schools works best when treated as a living framework, closely tied to values and regularly reviewed. No governance policy should be viewed as a one-off document.

Refresh policies to match AI’s evolving landscape. An effective AI policy framework will:

  • Be reviewed on a regular schedule (for example, annually or when major tools or regulations change).
  • Link explicitly to institutional values such as academic integrity, inclusion, equity, and student well-being.
  • Involve students, faculty, and professional staff in its development and revision.

Importantly, policies should be clear about what the school encourages, not only what it forbids. For instance:

  • Encouraged activities might include using AI to brainstorm ideas, structure arguments, or rehearse presentations (with disclosure).
  • Restricted activities might include using AI to generate complete assignments without transparency or to bypass required learning processes.

Pair policy with practical guidance for staff and students. Policy by itself rarely changes behaviors. Schools can support implementation by developing guides that offer clear direction.

Student-facing guides can explain, in accessible language, how AI can be used appropriately in programs and disciplines. Staff-facing guides can provide sample AI statements for module outlines, examples of AI-inclusive assessments, and scenarios that show acceptable and unacceptable use of the technology. By aligning policy with realistic guidance, schools help their communities move from fear and guesswork to informed, responsible practice.

A dean of AI acts as a cross-college coordinator for institutional AI policy development, contributing to universitywide direction by serving as a member of the AI strategy group. In this capacity, the dean of AI translates university direction into clear, usable guidance for staff and students, ensuring policy is practical, current, and consistently applied across colleges.

Pedagogy: Designing AI-Enabled Learning

The heart of AI integration is pedagogy. Technology should follow from learning goals, not the other way around. For business schools, this means being explicit about what AI literacy looks like in each discipline and how it develops across a program.

A dean of AI provides the overarching vision and strategy for integrating AI into the business curriculum. This includes working with each discipline to define what AI capability and AI literacy mean and coordinating program-level mapping so integration is systematic rather than ad hoc. The dean also can ensure that assessment and learning activities develop students’ judgment, verification practices, and ethical use of AI.

Define discipline-specific AI literacy. AI literacy is not one-size-fits-all. Schools can work with disciplinary leads to answer questions such as:

  • What should a final-year finance student be able to do with AI for portfolio analysis or risk modeling?
  • How should a marketing student evaluate AI-generated insights, personas, or campaign ideas?
  • What does responsible AI use look like in human resources management, in areas such as recruitment, performance evaluation, or learning and development?

Schools can map these expectations across the curriculum so that students build AI-related knowledge and skills progressively, from foundational awareness in early years to applied, critical use in later stages.

At the University of Leicester, for example, we guide this process by using the AI in Teaching and Learning Framework developed by Xue Zhou, an author of this article, and Lilian Schofield. Using this approach, we map where AI literacy is already taught and where it can be further integrated, so that students develop all four dimensions of AI literacy: knowing and understanding the technology, using and applying it, evaluating and creating with it, and engaging with AI ethics.

Integrate AI into learning and assessment. Practical examples for incorporating AI into the curriculum include:

  • Designing case-based activities where students compare human and AI-generated analyses, critique both, and decide which elements to trust and why.
  • Asking students to document and reflect on their AI use: what tools they used, how they verified outputs, and what they changed, rather than allowing them to pretend that they did not use AI in their work.
  • Engaging students in workflow-based tasks where they design and explain a step-by-step problem-solving process that integrates AI tools at specific stages (such as data exploration, pattern identification, or drafting insights). They also should clearly articulate where human judgment, contextual understanding, and ethical decision-making were required to refine the output.

Where finances allow, programs can ensure that students gain hands-on experience with at least two AI applications directly relevant to their fields: for example, an analytics tool specific to their chosen discipline and a general-purpose language model. The emphasis should be on developing critical thinking skills and workflows that transfer across platforms, not on mastering a single AI tool.

When AI is embedded into the curriculum in this way, students see it not as a shortcut, but as part of how their professions work. They experience business school as a place where they can experiment safely and ethically.

Platform: Implementing Strategy

The platform question is often the hardest. Many business schools face significant financial constraints and cannot purchase every new AI product or license. Here, the key shift is from technology-led to problem-led decision-making.

Start with the problem to be solved. Before committing to any new AI platform, school leadership can ask:

  • What is the most pressing challenge we want to address with AI? (Do we want to reduce attainment and progression gaps? Improve staff efficiency and workload sustainability? Enhance student satisfaction and engagement?)
  • What existing systems could be incrementally augmented with AI, rather than replaced? (Learning management? Assessment? Student support?)

This framing prevents “shiny object” purchases and instead encourages schools to run targeted pilots with clear success criteria.

Leverage and extend existing platforms. Systems that schools already use often have features that can help develop students’ AI capabilities. Practical options include:

  • Activating AI features within existing learning platforms. For example, faculty might use AI-assisted feedback, quiz generation, or early-alert analytics (with appropriate governance).
  • Running small-scale pilots of AI functions in one or two modules. From these pilots, AI strategy committees can gather evidence on learning outcomes, staff workload, and student perceptions before scaling successful pilots.
  • Ensuring that any tools adopted align with institutional standards for data protection, accessibility, and transparency.

Consider creating campuswide AI environments where feasible. For example, solutions such as ChatGPT Edu can provide campus-branded AI tools. Another emerging model, now adopted by the California State University system, involves offering a controlled AI environment to support personalized learning and prepare students for an AI-enabled economy.

Institutions that cannot make such investments can still make meaningful progress by experimenting with carefully selected free or low-cost tools, offering strong guidance, and promoting curriculum-embedded AI literacy. Schools can teach students how to think with AI, question AI, and document AI use even when funds to invest in different platforms are limited.

Strategic, Inclusive, and Intentional

An idea stated above bears repeating: AI will not succeed as a standalone initiative. Its thoughtful, strategic use must run through research, teaching, and operations. The 4P framework is one way to keep that integration intentional, ethical, and realistic.

In a challenging financial environment, business schools do not need to do everything at once. By focusing on People, Policy, Pedagogy, and Platform together, they can move beyond short-lived pilots to build AI practices that are inclusive, evidence-informed, and mission-aligned. They can prepare graduates not just to use AI, but to lead with it.

Authors
Xue Zhou
Academic Head of the AI Education Centre, Queen Mary University of London
Dan Ladley
Professor, Pro Vice-Chancellor and Executive Dean, University of Leicester School of Business
The views expressed by contributors to AACSB Insights do not represent an official position of AACSB, unless clearly stated.