The integration of artificial intelligence (AI) into higher education is proceeding with startling speed. Universities, once deliberate and reflective spaces, are now scrambling to appear cutting-edge, reshaping curricula and reallocating resources in response to economic and ideological pressures to embrace the technological future. Amid this transformation, the humanities are being steadily deprioritised. Literature departments are shrinking. Philosophy courses are being cut. History programmes are fighting for survival. Facing financial pressures and the rise of AI-focused education, universities worldwide are downsizing: Kingston University initiated plans to shut down its entire humanities department, including English and Philosophy, to prioritise vocational training, while Cardiff University announced plans to cut around 400 staff positions and discontinue programmes in ancient history and languages. The challenge is global, too: Fudan University in China announced a potential 20% reduction in its liberal arts cohort to emphasise AI education. The shift is not just administrative. It signals a profound cultural and existential crisis.
AI is not the direct cause of this decline, but it is accelerating a trend that has been building for years: the erosion of deep, humanistic inquiry in favour of marketable, data-driven skills. The appeal of AI in education is undeniable. It promises efficiency, scalability, and even creativity. It can draft papers, summarise novels, generate translations, and simulate conversations in the style of any historical thinker. For overburdened students and academic staff alike, these capabilities seem to offer relief. But the cost of that convenience is dangerously high.
To understand what’s at stake, we must begin with the essence of the humanities themselves. These disciplines—literature, history, philosophy, religious studies, the arts, and others like them—are not just collections of facts or methods. They are practices, embodied and dialogical, that train individuals in the slow, sometimes painful work of interpretation, reflection, and expression. Writing an essay is not merely about producing a correct answer. It is about discovering what one believes, making sense of texts, histories, and human experiences that resist easy summary. It is a crucible for identity, ethics, and imagination.
What happens when that crucible is outsourced to a machine? What becomes of the intellectual formation that occurs not in the final product, but in the process of wrestling with ambiguity and complexity? Increasingly, students are turning to AI not as a tool of support, but as a substitute for that process. According to the Higher Education Policy Institute (HEPI) and Kortext, 92% of undergraduates reported using generative AI tools such as ChatGPT for academic tasks in 2025, up sharply from 66% in 2024. With AI, papers are generated in seconds. Difficult readings are condensed into shallow overviews. Critical thinking becomes optional. And this is not just about cheating or academic dishonesty, though that is certainly a concern. It reflects a broader epistemological shift in how we understand knowledge itself.
We are witnessing the rise of what might be called ‘proxy thinking’, where students no longer learn how to think, but how to prompt. They learn how to query a machine to simulate the appearance of understanding, rather than cultivating understanding themselves. AI becomes a mirror, reflecting simplified versions of their questions, but never challenging their assumptions, never pushing them into the discomfort where real learning begins. The result is not just a decline in academic rigour—it is a flattening of intellectual life.
The humanities have always resisted flattening. They insist on nuance, context, and contradiction. They remind us that human meaning is not reducible to patterns, data, or probabilities. Shakespeare is not just a sequence of lexical tokens to be replicated; he is a witness to the human condition, whose works must be read in time, with care, and in dialogue with others. Likewise, history is not just a timeline of events. It is an ongoing debate about power, memory, and justice. Philosophy is not a database of arguments, but a method of inquiry that begins in doubt and leads—sometimes—to insight.
AI cannot participate in these traditions in any meaningful way. It can simulate their surfaces, but it cannot inhabit their depths. It does not feel the weight of history or the burden of responsibility. It does not mourn or hope. It does not read with wonder or write with conscience. And yet, in classrooms around the world, we are beginning to accept its simulations as sufficient. That shift—subtle, often unspoken—is what threatens to flatten the humanities: not just the replacement of human labour, but the erosion of the values that make that labour worthwhile.
This crisis is especially acute for those whose work emerges from positions of marginality. The humanities have long been a space where those outside dominant cultural paradigms could speak, disrupt, and imagine alternatives. Feminist theory, postcolonial studies, Black literature, queer theory—all of these have challenged the academy, demanding recognition for voices historically silenced or distorted. AI, trained on the vast digital archive of human writing, often reproduces the biases and exclusions of that archive. AI cannot deconstruct the centre; it assumes the centre.
When universities begin to rely on AI to teach or assess humanistic knowledge, they risk enshrining a mechanised version of culture that is less diverse, less critical, and less alive. The stakes are not just academic—they are political and ethical. Who gets to define what counts as knowledge? Who gets to speak? Who is listened to? These are questions that the humanities are uniquely equipped to ask. AI, by design, cannot.
This is happening not because the humanities have failed, but because their success is hard to measure. The work of forming a thoughtful, articulate, ethically aware person does not fit neatly into performance metrics or job placement rates. The insights of a philosophy seminar may not yield an immediate return on investment. A poem may not be patentable. But these are the things that sustain a society’s soul. They foster empathy, critical citizenship, historical awareness, and the ability to imagine alternatives to the status quo.
And so, what we face is not merely a pedagogical problem, but a cultural reckoning. Are we willing to fight for an education that prioritises depth over convenience, thought over output, voice over velocity? Are we willing to assert that some things must remain stubbornly, beautifully human?
This is not a call to ban AI from the classroom. That would be both impractical and unwise. Instead, it is a call to reassert the purpose of the humanities in a time of technological acceleration. We must teach students not just to use AI, but to question it. To understand its limitations. To resist its flattening effects. We must reaffirm that the slow, uncertain work of interpretation is not an obstacle to learning, but its very heart.
In the end, the humanities are not about information—they are about transformation. They ask us to become more aware, more attuned, more responsible. If we lose them, we lose more than departments or disciplines. We lose part of our capacity to be human. Let us not make that trade too easily.
Words by Cass Fong
