Why are teachers so worried about the threat posed by artificial intelligence (AI) to the Leaving Cert?
Most Leaving Cert subjects nowadays require projects, practicals or other forms of coursework to be completed in advance of the written exams in June.
Under State Examination Commission (SEC) rules, students’ coursework must be completed under the supervision of a teacher and authenticated as the candidate’s own work by the teacher and school principal.
Teachers, however, are worried that tools such as ChatGPT could easily be used by students to generate this work.
If a teacher cannot confirm that coursework is a student’s, the SEC will not accept it for assessment. This could cause students to lose marks or have their results withheld.
Conversely, a teacher could authenticate coursework as a student's own that is later found to be AI-generated, which could also cause the candidate to lose marks.
Either way, some educators fear they could get caught up in civil proceedings, which is why the Association of Secondary Teachers Ireland is seeking an indemnity against legal actions.
But ChatGPT has been around for a few years. Why are teachers suddenly worried?
Not all subjects have projects or practical work, but that is about to change.
Under Leaving Cert reforms due to roll out from next September for fifth years, students in all subjects will eventually be required to complete additional coursework components worth a minimum of 40 per cent of marks during the senior cycle.
The first group of subjects to be reformed includes biology, business, chemistry and physics.
How real is the AI threat?
AI is not, of course, a threat to oral exams in languages, performances in music or demonstrations in other subjects such as physical education, where students must display their skills in person.
Teachers are more concerned about potential cheating in written projects.
For example, science students will be required to produce “in-practice investigations” involving research and experimentation during fifth and sixth year; these will include investigative logs where students will be expected to give an “authentic account” of how their work unfolded.
Some teachers worry that tools such as ChatGPT could generate convincing-looking project work within minutes, which students could then present as their own.
Surely teachers supervise their students’ work and would spot this kind of blatant cheating?
Yes, under SEC rules, students’ project work must be completed under the supervision of a teacher and authenticated as the candidate’s work by the teacher and school principal.
Students will be allocated about 20 hours in class to complete these projects.
However, some teachers say students will inevitably end up working on these projects at home, where their projects cannot be supervised.
Are students allowed to use AI in any form?
Yes, SEC rules state that any work generated by AI will be treated in the same way as any other material the candidate has not generated themselves.
Including AI-generated material without acknowledging it as the work of AI will be considered plagiarism. Where AI output is included and properly referenced, no credit will be awarded for that material.
Is there any way to test work to see if it has been generated by AI?
There are tools which claim to do so, but given the rapid advances in the area, few are reliable.
As a result, many third-level institutions are adjusting their assessments and are moving away from essays and towards more oral presentations and problem-solving tasks.
Is this issue of indemnity caught up in wider concerns about education reform?
Both second-level teachers' unions want a pause in the planned Leaving Cert reforms, citing concerns over a lack of training, resources for schools and equity issues. Minister for Education Helen McEntee has pledged to press ahead with the changes but says she will engage with the unions on their concerns.