Opinion

Big tech is bullying teachers into premature adoption of AI. We need to slow down

Most of the Irish debate concerns the potential for cheating in the assessment components of the new senior cycle. Why are widespread calls for a delay being ignored?

If AI performs a task for us, it removes the learning. Photograph: Gabriela Bhaskar/The New York Times

Ever wonder why clicking on the squares containing cars, buses, bicycles or bridges proves that you are human? Those irritating "captchas" (Completely Automated Public Turing test to tell Computers and Humans Apart), owned by Google, are suspected of collecting your responses and using them as labelled data to train the machine-learning models used in self-driving cars.

Manually labelling data is time-consuming and expensive for a company, and poorly paid for the humans who typically carry out this mind-numbing task. If the suspicions about captcha technology are correct, Google is crowdsourcing this work for free.

GenAI is all about monetising our data. DeepSeek's disruption of the market, with its claim to cost a fraction of what previous models did, does not remove the ethical dilemmas.

How do we educate young people to navigate a world that is being reshaped by AI?


Leon Furze, an author, consultant and PhD candidate researching GenAI, is co-author of an influential AI Assessment Scale (AIAS), which has been adapted by UCC, and he believes it can be used effectively in education. However, he has also said that "it's hard to view a technology that's inherently racist, classist, sexist, bad for the environment and basically designed to line the pockets of a handful of billionaires and trillionaires as anything but a tool for corporate greed and oppression".


Assessment is broader than terminal assessments like the Leaving Cert. Every time a student submits an essay to a teacher, that constitutes an assessment.

Level 1 of the AIAS, as adapted by UCC, means no use of AI at all. Level 2 allows AI for brainstorming or outlining, but no AI-generated material can appear in the final submission. At Level 3, AI can be used for grammar and fluency, with the original work provided for comparison.

Level 4 moves into meta-cognition, reflecting on the process of production. AI is used to complete elements of the task, with students then documenting how good, accurate or biased the AI-generated material is, and how it helped to shape their thinking. At Level 5, full use of AI, the focus of assessment is how well the student directed the AI while using critical thinking.

Most of the Irish debate concerns the abundant potential for cheating in the additional assessment components, worth 40 per cent, in the reformed senior cycle. There are widespread calls for a delay, but this week Minister for Education Helen McEntee refused.

A recent survey by Studyclix, a popular exam-focused study website, found that 20 per cent of 1,300 respondents admitted to already using ChatGPT or other generative AI tools for existing exam project work.

It did not establish whether they had referenced this use, but given that current guidelines state that using material generated by any outside source will be considered plagiarism, it is unlikely.

Brighid Hennessy, a teacher and former adviser with Oide (a State support service for educators), recently wrote in The Irish Times that the EU AI Act labels educational AI systems as high risk. From this month, schools must train teachers and leaders on how AI tools work and ensure compliance with the Act. This includes safeguarding student privacy and data security.

While Oide has a good introductory online course on GenAI which 1,000 teachers have taken, it only begins to address the issues. In this light, the proposal to proceed with additional assessment components in their current form begins to look reckless.

Learning depends on an optimum level of friction. We practise maths problems to embed mathematical principles. We write essays to improve thinking. If AI removes that friction by performing the task for us, it removes the learning.


One possible process, akin to Level 4 of the AI Assessment Scale, would have students handwrite a brainstorm on an assessment under supervision. They would then submit it to AI for elaboration, reflect on what the AI tool suggests, write a draft under supervision, and have it polished by AI. The final step would be reflection on what changed and why, and whether the changes are better or worse. This process would embed reflection at every step. The problem is that it would demand a lot of time and equitable access to AI. To use and document AI in this way, students would need a sophisticated level of higher-order thinking.

The irony is that the most able students would benefit most from this process, not to mention those who could afford paid subscriptions to higher-tier GenAI. Similarly, teachers would need a level of GenAI expertise that most do not yet have.

Teachers need time to wrestle with GenAI, explore its possibilities and create guidelines as professionals and educational experts on how it should be used.

The corporations that ruthlessly scraped every iota of creative work and knowledge from the internet are busily engaged in bullying schools into premature adoption of AI. Delaying the introduction of the new senior cycle by a year and introducing intensive professional development for teachers that honours their insights is the only way forward.