AI's Challenge to Higher Education

When my university, Bocconi, introduced OpenAI access to all faculty, staff, and students in June 2025, a colleague approached me and asked what we should do “now that the students have access to AI”. The concern in their question was genuine and understandable. But it also betrayed an illusion: students have had access to AI since November 2022. AI is not around the corner; it is already here. What has changed is not AI’s availability, but our perception of it.

This attitude toward AI – the ostrich approach – is common, but hoping it will all blow over is misplaced here. Other inventions have affected education, but none has had such an all-encompassing impact. We cannot continue teaching as before.

In the course of my academic career, AI has gone from a niche academic interest to a central societal and educational challenge. It touches and changes many aspects of our lives and work, but to higher education it poses an existential question: how do we equip our students with the knowledge they need for the world, for work, and for themselves, without compromising their ability to think critically?

Academia is not a place where things move fast, but the development of AI is forcing us to think fast. Because the challenge is so new, our traditional classroom experience does not serve us well: what we did even a handful of years ago has little bearing on the current situation. This means rethinking some of the fundamentals.

Here is how that might look:

  • Reconsider the student journey. Traditionally, a bachelor’s degree would teach foundational knowledge: theories, methods, a particular way of thinking to make these aspects of learning second nature. But now, foundational knowledge is truly available at students’ fingertips, forcing us to rethink that linear progression.

  • Make AI literacy a core skill. Instead of focusing on largely discipline-specific basics, we first need to train everyone in AI literacy, critical thinking, and bias detection. At Bocconi, we have devised horizontal AI courses for all incoming students to teach basic AI literacy and responsible use and have updated our code of conduct so that students know, in detail, what we expect from them when using AI.

  • Change the format. Full-frontal ex-cathedra teaching made sense in the Middle Ages, when books were expensive and access to knowledge was limited. Every new tool has given students more independence and agency. None more so than AI. We need to assume students use AI and give them the skills to do it right, so it becomes another tool in their arsenal rather than a crutch.

  • We need to teach with AI. We can use it for case studies, simulations, and data analysis in the classroom, as well as for specialized tutoring and TA work. AI is not just a tool for students; it is also for us.

  • Rethink assessments. Traditional methods assessing students’ ability to regurgitate knowledge on command no longer serve us. We need to design assignments that require students to critically interrogate AI output, to justify their choices, verify evidence, and explain how they arrived at a conclusion. At Bocconi, for instance, faculty can now choose for each assignment whether it can be solved with the help of AI or not. Applied fields can design independent projects with AI in mind. Oral exams do not scale well, but they are a better form of assessment than multiple-choice in an AI-suffused world.

  • Finally, we need to reconsider our role as teachers, from the sole arbiter of knowledge to a sparring partner and guide. That will mean stepping away from pre-scripted lectures and linear slides. And yet, students still tell us they want professors in the classroom, because self-learning is hard and rarely successful, as the low completion rates of otherwise excellent MOOCs show.

In practice, that means less reliance on pre-scripted material and more workshop-style, inverted classrooms. Here are some methods that are already gaining traction: We can give students AI prompts for a task and ask them to improve upon them, either in preparation (flipped classroom) or in class, alone or in groups. We can then live-critique and compare their solutions and guide them through critical audits of the AI output to test its sources, assumptions, bias, and logic.

This builds on existing alternative pedagogical concepts – studio-style sessions, flipped problem-based work, team-based clinics, Socratic debates, in-class case simulations, and live data analysis – and we see versions of it cropping up already.

However, to be most effective, that approach requires us to start treating the classroom less like a structured end-to-end event and more like an improv class: use an outline of intended learning outcomes, give students agency, and see where it takes us, together. That shifts our role from transmitting prerecorded answers to provoking better questions, giving us a more active, participatory role. It requires us to embrace open-endedness and, yes, challenges to our own knowledge. Granted, it is initially stressful to give up so much control over the lesson plan, but it has led to some of the most rewarding classes I have taught. If we embrace it, it can put us on the learner side again, helping us to see old material in a fresh light and rediscover the joy of learning our field anew.

None of this will be easy, and there is bound to be an uncomfortable transition period. Some disciplines and formats will lend themselves more easily to this profound change, and some educators will be more successful in adapting than others. Eventually, higher education will look wildly different to how it did even a few years ago. At Bocconi, AI has become a core part of our strategic plan for the next five years, including for hiring and teaching offers.

There are, though, plenty of reasons to be hopeful: the calculator, the Internet, Wikipedia, and MOOCs have not destroyed higher education. Learning has a key social component, as the pandemic made painfully clear. And students, like all people, are inherently curious and easily bored. Most will not use AI to shortcut thinking, but to learn and study in novel ways, and expand their own and our collective knowledge.

It is up to us to guide their curiosity toward deeper understanding, not toward simpler answers. That will have to take new forms, but it will not spell the end of higher education. Nor of humanity as a whole.


An edited version of this post was published on Times Higher Education