AI risks undermining the heart of higher education

If students don't make the effort to comprehend, synthesise and relate ideas for themselves, they will miss out on meaningful academic growth, says Zahid Naz
April 21, 2025
A human rests next to a computer with AI on its screen
Source: Diy13/iStock

The debate surrounding AI in higher education often centres on issues like plagiarism and the potential for more efficient learning. However, it tends to overlook deeper concerns regarding the erosion of academic rigour and the exacerbation of inequality. It is vital to approach the integration of AI into education with caution, as there are two key issues that require attention.

Over the past year, I've attended several workshops on AI in teaching, where the focus was on its ability to help students summarise texts and paraphrase – functions generally deemed acceptable in academic guidelines. But while large language models are praised for streamlining research tasks and clarifying complex ideas, they overlook a crucial point: when AI generates summaries, students bypass the cognitive engagement required to actively interpret academic work.

It is precisely this engagement – the effort to comprehend, synthesise and relate ideas – that catalyses learning. The more students rely on AI for these tasks, the less they engage in the intellectual struggle that underpins meaningful academic growth. This tendency towards passive consumption of information risks reducing learning to a mere act of "sorting", rather than critically engaging with complex ideas. As a result, it engenders intellectual laziness, not only in reading and writing but, more importantly, in thinking.

The process of learning involves much more than reading texts; it requires students to grapple with intricate concepts, compare and contrast ideas, and navigate the nuanced arguments presented in academic literature. This sharpens critical thinking, cultivates original thought, and builds the foundation for intellectual independence. The tools provided by AI, while efficient, cannot replicate this process of active cognitive engagement and may lead students to forgo the critical, though often lengthy and challenging, process of reflecting on their learning and identifying areas in need of further attention.

This failure to actively engage with content could impair memory formation and hinder the consolidation of knowledge into long-term memory, ultimately undermining students' ability to retain and apply what they have learned. It is no surprise to me that a study involving 494 university students found that frequent use of AI tools like ChatGPT correlates with reduced academic performance and poorer memory retention.

In short, the unchecked use of AI could ultimately undermine the very intellectual rigour that makes higher education meaningful.

The other key issue we need to address around AI is that the digital literacy and technical expertise that are prerequisites for engaging with AI technologies are unevenly spread. That is because access to the education and training necessary to develop these skills is also unevenly distributed. Affluent individuals, households and organisations are better positioned to invest in and capitalise on AI technologies. This digital divide risks exacerbating pre-existing social and educational disparities.

Although many young people are familiar with digital tools such as smartphones and computers, their proficiency often remains limited to basic or social uses. Advanced competencies, including programming, data analysis and understanding AI technologies, are inconsistently taught and often inaccessible in underfunded or low-income schools.

A particular concern in universities is AI-mediated assessments: those who can afford advanced, paid-for versions of AI are better positioned to perform well in these, which is patently unfair.

To ensure that higher education remains a space for critical thought and inclusive opportunity, safeguards are needed so that disadvantaged students don't suffer from barriers such as high costs, low digital literacy, poor internet access and limited availability of essential resources.

We must balance the benefits of AI with safeguards that preserve academic rigour and address systemic inequities. Without this balance, we risk eroding higher education's role as a space for intellectual growth and inclusion.

Zahid Naz is senior lecturer in academic and professional education at Queen Mary University of London.

Reader's comments (6)
IF shown how to use AI well and responsibly, students can achieve all these goals. It is NOT AI yes or no, but how AI is useful. Academics are far too late in paving that path. Stop whining: learn and lead or at least collaborate!
Showing students how to use AI well and responsibly is an idea that hardly anyone disputes, and it is what most of us already do. However, we must remember that not every classroom is the same, nor is every student within it. How educators and students respond to the risks of uniformity of ideas and shortcut thinking will also be context-specific.
Do you think so? Why didn't you mention it? And why is it actually done insufficiently?
Excellent article, and good to see the equity/access issues taken seriously. To these points I'd add the obvious third one of energy/water consumption and environmental impact. This adds to the cognitive imperative to work out (in collaboration with students) when and how what types of AI are beneficial or counterproductive. There is a real paucity of subject and module-level evidence (as opposed to fiercely held opinion) on the impact of reliance on AI summaries. Surely as research organisations universities should be systematically analysing this.
Indeed, of all the tasks that AI can help students with, I'd think that summarising texts and paraphrasing are exactly the two tasks we should not be allowing or promoting. Personally, I question the idea of "paraphrasing" either way, whether done by AI or by the student themselves. Students shouldn't be paraphrasing sources. The information should be going into their brains, to be encoded symbolically and independently of words, and then coming out afresh.
Realistically, if you haven't done the summarising, planning or analysing of the text yourself, then that's one bit of cognitive workout you've missed. Even though we have cars, we see people going to gyms to keep their bodies healthy despite the "leisure" of the current age; those who don't have access, or who are put off from exercising, have less fit and often unhealthy bodies, and are often despised for it. The parallel with cooking is that just as some people can no longer cook, in the future some people will no longer be able to do the summarising, planning, organising and thinking now done by AI, and will just go for a ready-made answer without understanding. Remember "Little Britain" and "computer says no": it's not so far from the truth. This is one step towards a Brave New World where the top of the intelligence-based social hierarchy is taken by AI, with a human elite to give it a human face.