Secondary teachers are seeking an indemnity against any legal actions arising from students losing Leaving Cert marks over the improper use of artificial intelligence (AI) in their project work.
Under State Examination Commission (SEC) rules, students’ coursework must be completed under the supervision of a teacher and authenticated as the candidate’s own work by the teacher and school principal.
Teaching unions, however, say members will struggle to authenticate students’ work due to a growing array of sophisticated AI services that can generate convincing-looking project work within minutes.
Students have been warned that use of tools such as ChatGPT in their project work must be disclosed and that failure to do so could result in candidates losing marks, having their results withheld or being debarred from entering State exams.
Updated rules have been in place since 2023, but anxiety is growing among teachers in advance of the rollout of Leaving Cert reforms which will soon see at least 40 per cent of marks allocated to coursework across a growing number of subjects.
The Irish Times understands that the Association of Secondary Teachers of Ireland (ASTI) has raised the issue of an indemnity with Department of Education officials.
In addition, a motion due to be debated at the union’s annual conference next month demands “all teachers be indemnified against any subsequent actions or legal cases taken by students against teachers” in the event of suspected cheating, while other motions call on members not to co-operate with the changes.
“It is a high priority for the ASTI to ensure that teachers are indemnified in their work in relation to additional assessment components,” a union spokeswoman said.
The Teachers’ Union of Ireland (TUI) also has concerns over how to validate students’ work, given the array of new AI tools at candidates’ disposal.
“Our members are very nervous about AI and its possibilities,” said TUI general secretary Michael Gillespie. “We have serious concerns about authenticating students’ work in some of the new additional assessment components, given the rapid expansion of AI.”
He said members were also worried about equity, given that students from more affluent homes or with greater parental support could benefit disproportionately when it comes to completing project or coursework.
A spokeswoman for the department said it was “not considering an indemnity at this time”.
The SEC said regular engagement with each candidate’s coursework “enables teachers to authenticate any work being submitted” and that “candidates themselves, as well as teachers and school leaders, all have roles in ensuring the authenticity and integrity of the work completed and submitted to the SEC for assessment”.
Students are also advised that all exam coursework must be the candidate’s “unaided work, verified by the candidate”.
Since the advent of AI tools, SEC instructions have been updated to state that any material generated by AI software will be treated in the same way as any other material a candidate has not generated themselves.
Including AI-generated input without citing it as such will be considered plagiarism, which may result in the forfeit of all marks for the coursework component.
“Where any material generated by AI software is included in a coursework submission and is properly quoted or referenced, no credit will be awarded for any of that material itself. Credit can only be awarded for the effective use of this material in the support or development of the candidate’s own work,” the SEC rules state.
The issue of a legal indemnity last arose during the use of predicted grades for Leaving Cert students, when the ASTI expressed concern that members could be liable for legal costs in the event of civil proceedings brought by students.
The union said it ultimately secured necessary assurances, although the department at the time said it merely clarified existing arrangements.