Leading the AI Transition
- When business schools do not adopt systems of coordinated governance, early AI innovators can face fatigue, with their curricular innovations producing inconsistent and limited institutional impact.
- A six-pillar governance framework can align vision, curriculum, faculty incentives, assessment, infrastructure, and resources, turning scattered efforts into a coherent and responsible AI strategy.
- By applying good governance to AI adoption, business schools strengthen learning, clarify faculty roles, boost institutional relevance, and actively shape how AI transforms management education.
Two business professors meet at a conference, where they compare experiences integrating generative AI (GenAI) into their teaching. Both share how they have designed AI tutors, adopted AI to create realistic cases and datasets, and guided students in using AI for case analysis. Applied well, they agree, AI can enhance and accelerate learning, supporting deeper engagement, faster feedback, and more time for exploration and problem-solving.
Yet both also describe a familiar pattern: their schools provide little support or guidance, workshops about teaching with AI draw limited participation, and early adopters experience fatigue because innovation feels risky, isolated, and disconnected from school priorities.
Their experiences are becoming too common—and increasingly unsustainable.
What’s missing at both of their schools? A structured governance framework that supports the transition from ad hoc innovation to sustainable impact—that makes innovation easier and more coherent. The findings of a 2025 AACSB survey of deans and faculty suggest that while some business schools are developing GenAI policies, nearly 45 percent lack the necessary coordinated governance to translate those policies into practice.
We propose a governance framework to help schools systematically navigate the GenAI revolution. The framework is based on six interlinked pillars: vision and priorities; curriculum innovation and scaling; faculty empowerment and recognition; learning integrity and assessment; resource allocation and long-term planning; and data, infrastructure, and access.
Pillar 1: Vision and Priorities
Governance begins when schools establish a shared institutional vision of how AI will influence teaching, research, and outreach. School leadership can take several actions to support this pillar:
- Write a clear statement of intent that aligns faculty experimentation with the school’s mission.
- Form a group to establish the vision through a task force, steering committee, or dedicated lab. An AI steering committee, for example, can review progress, recommend resource allocations, stay attuned to emerging developments, and engage in multiyear planning. Its work makes forward planning a routine part of governance rather than a reactive exercise.
Crucially, schools should ask how their missions and programs must evolve as AI reshapes business. The Wharton School’s Pincus AI Lab for Organizational Innovation at the University of Pennsylvania in Philadelphia exemplifies this broader view; its role is to explore how GenAI transforms leadership, decision-making, and collaboration through “human–AI co-intelligence.” Governance, in other words, should connect institutional vision with changing realities in learning, research, and outreach rather than merely manage technology.
- Set clear guidelines, determined either by the business school or by the parent university. George Mason University in Virginia, for example, has established campuswide AI guidelines. Its Costello College of Business uses these guidelines as it builds its strategic roadmap for AI use.
- Deliver an annual AI “vision update” that communicates priorities and showcases successful initiatives. Ongoing communication ensures that AI integration is visible, informed by evidence, and anchored in academic values, not just buoyed by short-term enthusiasm.
Pillar 2: Curriculum Innovation and Scaling
Meaningful AI integration starts in the classroom. Faculty should be encouraged to experiment with AI to determine how it can promote experiential learning, enrich case analysis, help students reflect more deeply, and provide personalized feedback and self-paced learning. No formal approval process is necessary for low-risk innovations. A simple notification to the department can create visibility and foster peer learning. This decentralized approach builds faculty confidence and AI literacy across the school.
Two ways to bolster curricular innovation include:
- Creating a repository of resources. Successful practices can be captured in concise AI teaching guides or practical playbooks that summarize objectives, learning processes, sample materials, assessment options, and AI tool setup notes.
For instance, Wharton’s Generative AI Labs (GAIL) curate a Prompt Library for teaching and research, helping faculty experiment without duplicating efforts; GAIL provides a shared infrastructure that lowers individual risk. A university-level task force at another school has published AI Guidelines for Instructors, a document that includes sample syllabus policies.
An author of this article used GenAI to design cases and data sets tailored to specific learning objectives in teaching analytics; this approach could be captured in a playbook and adapted to courses in marketing, finance, or human resource management. Similarly, a marketing instructor who uses AI to facilitate deeper reflection in case analysis might create a playbook later adapted by colleagues in operations or entrepreneurship.
- Establishing an AI curriculum and learning (AICL) committee. Composed of faculty innovators and AI-savvy instructional designers, an AICL committee can scale and curate effective practices, promote curricular coherence across programs, coordinate with instructional designers and assessment experts, and close the loop to build on successful pilot outcomes and faculty insights. It should include faculty with experience in assessment who can ensure consistency with learning integrity and assessment principles.
An AICL committee does not bypass existing governance: it can recommend changes based on high-value AI teaching practices, but its recommendations still move through departmental and college committees for approval. Its role is to make formal curriculum review more responsive and evidence-based.
From Innovation to Governance: The Role of the AICL Committee
| AI Focus Area | The AICL Committee’s Key Actions |
| --- | --- |
| Strategic Direction | Set two to three annual AI teaching priorities and scan industry and technology trends to identify new opportunities. |
| Faculty Enablement | Facilitate peer learning circles, coordinate grants and incentives, and advise on recognizing AI-related teaching contributions in faculty workloads and evaluations. |
| Curation and Scaling | Turn successful experiments into AI teaching guides, maintain a shared inventory of playbooks and pilots, and use pilot outcomes and faculty insights to refine playbooks and set future priorities. |
| Program Coherence | Ensure horizontal alignment of AI expectations and provide shared prompt libraries and templates. |
| Agile Governance | Streamline approvals, allow low-risk pilots via simple notification, and provide provisional endorsements. |
For example, the Leeds School of Business at the University of Colorado–Boulder has adopted a coordinated approach to incorporating GenAI across 14 core courses involving nearly 50 instructors, so that all students receive a baseline competency. Nottingham Business School in the United Kingdom also focuses on broad GenAI adoption, encouraging faculty to embed practical, in-context skills across the curriculum.
Another college created a committee to conduct a curriculum audit, reviewing all graduate syllabi and surveying faculty to map existing AI use. This analysis led to a competency-based proposal for a Business of AI graduate certificate built on two tracks: one for analysts and another for managers.
Pillar 3: Faculty Empowerment and Recognition
When AI integration is left to individual enthusiasm, efforts can become fragmented and difficult to sustain at scale. To make AI a routine part of the curriculum, schools must view it as a strategic priority that requires focused attention. To support the third pillar, they can consider:
- Forming faculty learning circles. Small cross-departmental groups can connect instructors exploring similar ideas, encourage co-creation, and help innovations spread organically.
- Including AI integration in the promotion and tenure process. A brief workload note can clarify that faculty are expected to contribute to the school’s AI integration goals. Substantive efforts on course redesigns, resource development, and peer mentoring should count meaningfully toward faculty’s teaching and service contributions. Schools can also recognize rigorous research on AI for teaching and learning—including publications in educational or applied outlets—as valued scholarly output. Such recognition can be time-limited while schools build capacity and norms around AI use.
- Offering rewards and recognition. Microgrants, teaching fellowships, workload adjustments, and student assistants can offset the time faculty need to design and assess AI-enhanced learning. Similarly, AI innovation spotlights, semester-end showcases, and public recognition in the dean’s communications bring visibility to early adopters and diffuse promising practices across departments.
These initiatives cultivate a culture of exploration and reinforce the school’s shared sense of purpose.
Pillar 4: Learning Integrity and Assessment
A recent survey at a public university revealed that students’ top concerns about GenAI included “over-reliance on AI,” “inconsistent faculty policies,” and a “lack of clear guidance.” To address these concerns, schools must make responsible AI use an explicit, assessable competency. They can support learning integrity and assessment by:
- Moving beyond plagiarism concerns. Assurance of learning (AoL) committees and academic integrity offices should focus on embedding professional standards of responsible AI use into curricula and assessments.
- Recognizing that AI is reshaping what students must learn. With automation taking over routine tasks, graduates must master higher-order capabilities such as problem framing, critical reasoning, evaluation of AI outputs, communication, and ethical judgment. At the University of Chicago Booth School of Business, for example, AI is taught through a multidisciplinary lens that integrates tools, use, and impact, helping students think critically about how AI shapes decision-making.
- Updating learning outcomes to include AI competencies. When AI takes on analytical or writing tasks, learning goals shift from producing outputs to interpreting, critiquing, and improving those outputs. Departments can conduct light-touch, periodic reviews to keep learning outcomes aligned with evolving expectations.
- Designing new assessments. Departments can create assessments that incorporate “AI-on” tasks, where students are graded on how they guide and critique AI use. These efforts can be balanced with in-class checks to verify individual understanding.
A case exercise by an author of this article takes this approach in an MBA course called Management of IT. There, student teams take competing roles, using AI to strengthen their arguments and challenge their opponents’ positions. They then revise their analyses in response to an AI-driven “what-if” scenario that stress-tests initial assumptions.
Departments, AICL committees, and academic integrity offices can use AoL findings to tweak course content, align policies, guide responsible use, share incident patterns and instructional resources, and identify overlaps in AI use across courses. These efforts make expectations for AI use more consistent across courses and departments.
Pillar 5: Resource Allocation and Long-Term Planning
Over time, digital learning support from AI will reduce the time faculty spend on routine tasks, enabling them to focus on experience-rich learning and higher-value activities. In anticipation of these structural shifts, administrators must create governance policies that:
- Scale instruction strategically. AI integration will allow schools to blend AI-guided preparation with instructor-led sessions, adjust course sequences and credit structures, and help faculty balance their workloads and deliver personalized AI-assisted learning at scale. For example, Imperial College Business School in the U.K. has partnered with OBRIZUM to build an AI-based infrastructure for executive education that enables adaptive and efficient learning delivery.
- Invest in people and partnerships. Schools can leverage restructured teaching models and efficiency gains to strengthen faculty development, build research capacity, and hire instructional designers and graduate assistants to support evolving pedagogy. Schools that invest in faculty capability, cross-disciplinary collaboration, and practitioner engagement will be prepared to adapt as technology transforms education and business. The Wharton School, for example, emphasizes such investment through its AI and Analytics Initiative, which funds research and curriculum innovation while fostering collaboration between academia and industry, a model that mitigates adoption risk.
- Collaborate with industry. Partnerships with companies extend a school’s resources and impact by expanding access to tools, data, and evolving practices. Co-developed cases, practitioner-led projects, and internships enhance teaching relevance, generate research insights, and reinforce the school’s role as a trusted contributor to responsible, AI-enabled business practice.
Governance for long-term planning should remain light, adaptive, and academically anchored, coordinating academic and financial leadership. An AI steering committee can align priorities and resources, translate strategy into investment roadmaps, monitor learning and structural shifts, and ensure alignment with broader priorities.
Pillar 6: Data, Infrastructure, and Access
With most infrastructure decisions managed at the university level, business schools can add a mission-aligned layer to foster innovation, agility, security, and equitable use. A data and technology working group can bridge central governance with college needs.
This group can maintain a list of approved AI tools, ensure student access and privacy, and manage shared AI resources. It also can establish guidelines for student-paid tools to ensure transparency, affordability, and fairness.
For instance, the group might adopt a tiered risk model to keep governance practical, responsive, and secure. When faculty want to pilot low-risk tools that do not store student data (such as ChatGPT), simple notifications to the group may suffice. However, the working group might require additional reviews and approvals before faculty can adopt moderate-risk tools that temporarily store anonymized data (such as cloud-based analytics platforms) or high-risk tools that collect identifiable student information.
By conducting consistent review cycles and maintaining transparent tool lists, a data and technology working group can ensure that faculty’s AI use stays within clear standards. This effort protects students and the institution, while giving schools the flexibility to innovate.
From Isolated Efforts to Sustainable Impact
GenAI adoption can easily produce the isolated efforts and lost momentum experienced by the two professors mentioned earlier. To avoid this outcome, schools should channel faculty energy rather than dampen it with bureaucracy.
This holistic six-pillar governance framework offers the light, transparent structure needed to achieve this goal. It aligns classroom innovation with schoolwide strategy, builds shared capability, and helps schools move from ad hoc innovation to coherent, sustainable, and transformative impact.