We should be very worried about AI taking over the classroom

ChatGPT and other applications appear like neutral tools, but they are deeply entangled with the interests of companies that own them

The responsibility lies with all of us - educators, policy makers and students - to resist the framing that paints AI as unstoppable progress. File photograph: Mikayla Whitmore/The New York Times

Much has been written recently about artificial intelligence (AI) in education. Commentators, academics and students find themselves divided into camps: enthusiastic advocates who celebrate its efficiency, accessibility and breadth; sceptics who warn of its risks; and the rest, caught somewhere between the tech enthusiasts and those with serious concerns about its impact on education.

While the current debate often revolves around issues like the need to develop AI-proof assessments, plagiarism concerns and AI’s impact on creativity, much more worrying and far-reaching implications of the technology on our education system are at risk of being ignored.

Many of the concerns expressed are about surface-level issues and overlook the deeper transformations under way within higher education. Among them is a question we and our co-authors explore in our recent research: why and how is AI reshaping the very ownership and production of knowledge in higher education? And what does this mean for society and democracy?

This is not just about production, but a question of knowledge governance itself. At its heart is an issue fundamental to academia: who decides what counts as legitimate knowledge, and who is accountable for the social effects of AI-generated knowledge?

Our analysis shows that AI applications like ChatGPT are slowly reshaping education, moving the production and dissemination of knowledge away from researchers and educators and into the hands of so-called “Big Edtech” companies. What appears to be a neutral tool is, in fact, deeply entangled with the economic and ideological interests of the companies that design, own and profit from it. Meanwhile, according to advocates of AI, the role of researchers and educators will be reduced to that of mere verifiers and facilitators of knowledge. But whose knowledge are they verifying and facilitating?

Our concern is pressing as leading institutions, such as the University of Oxford, partner with Big Edtech to embed and distribute ChatGPT Edu to faculty and students free of charge. While this may seem innovative at first, there is an urgent need to examine whether such partnerships are quietly locking education systems into a dependency on corporate platforms. What costs will arise for the freedom of knowledge creation and dissemination when they are defined by the limitations of AI and its reliance on codified information?

The problem is not that AI exists as such, but that we lack robust empirical evidence, and so much of the conversation is centred on often-overconfident marketing claims. AI poses significant risks to fundamental aspects of education: the cultivation of critical thought, and the freedom to question, doubt, debate and seek truth on one’s own terms. When algorithms structure and filter knowledge – and when academics and students alike make liberal use of such tools – both turn from producers of knowledge into mere consumers of this limited, watered-down version of it.

AI-generated outputs are constantly recycled into new outputs, creating feedback loops that reinforce what already exists, narrowing diversity of thought even further – and ultimately normalising particular ways of thinking. The result is a form of knowledge that is flattened, less dialogical, less contextual and more homogenous. Is this the direction we want knowledge in higher education to take?

The answer is, of course, no: knowledge is a social accomplishment, and knowledge is vital to democracy. When users of AI transform from knowledge producers into knowledge consumers, this has important implications for what we, following the academic literature, refer to in our research as “organised immaturity”.

This means that technology like AI constrains individual capacities for the public use of reason. This happens in three ways: through infantilisation, when we defer our reasoning to automated systems; through reductionism, when human judgment and creativity are replaced by statistical patterning; and through totalisation, when technology becomes so embedded in everyday work that it feels impossible to imagine research or teaching without it.

If left unchecked, these AI trends risk narrowing the space for critical, original and context-rich thinking in higher education. That space is essential not only for producing a skilled labour force for a digital economy but, more crucially, for cultivating citizens who are able and willing to participate meaningfully in democratic life. When knowledge production shifts from humans to technology, democratic processes become vulnerable to knowledge that is increasingly codified, context-stripped and scaled in ways that reduce diversity of thought.

Universities must actively safeguard the human capacity to create, question and interpret knowledge. This means fostering critical digital literacy, encouraging students and academics to scrutinise AI outputs, and ensuring that human judgment remains central in teaching and research.

The responsibility lies with all of us – educators, policy makers and students – to resist the framing that paints AI as unstoppable progress. Instead, there are some urgent questions we need to ask.

Who benefits from its integration into our classrooms and academic institutions? What kinds of knowledge are being privileged and what kinds are being erased? How do we ensure that higher education remains a space for independent thought rather than allowing it to become merely an extension of corporate ecosystems?

The answers are not simple, but the questions are essential. To ignore them is to surrender ownership of education itself. Are we willing to do that?

Eimear Nolan is an Associate Professor in International Business at Trinity Business School, Trinity College Dublin. Dirk Lindebaum is Professor of Management and Organisation at the School of Management, University of Bath.