Education authorities do not want to draw attention to the “worrying” volume of upgrades in the Leaving Cert as it could undermine public confidence in State exams, according to a UK academic.
Prof John Gardner of the University of Stirling, who has studied the UK and Irish exam systems, said almost one in five (18 per cent) Leaving Cert results appealed last year resulted in upgrades.
“These 1,600 successful appeals represent considerable anxiety and a worrying level of misclassification for the students involved, and their families, due primarily to judgment errors by examiners,” he said.
A breakdown of this year's upgrades shows the proportion varies significantly across individual subjects.
For example, upgrades were more likely in agricultural science (22 per cent), biology (19 per cent) and geography (17 per cent). They were significantly less likely in subjects such as higher-level Irish (11 per cent) and higher-level English (12 per cent).
The only subject where results were appealed without a single upgrade was religious studies, though just 26 grades were appealed in this subject.
Main factors
Prof Gardner said research indicates that some of the main factors that lead to errors include assessor experience, fatigue, speed of working, handwriting or the contrast effect of a good piece of work followed by a bad piece.
“However, in some quarters there is a reluctance to raise public awareness on the issue,” he said. “Some feel that attempting to increase public understanding of the uncertainty in assessments would precipitate a decrease in public confidence in national examinations.”
Prof Gardner questioned whether we should “turn a blind eye to the continuing erosion of trust and confidence in national examination systems” or focus on more transparent systems such as those used in the United States.
In response, the State Examinations Commission (SEC) said the Leaving Cert was among the most transparent exams globally, with published marking schemes and marked scripts available for candidates to view before deciding on an appeal.
In a statement, it said that in a system as large as the Leaving Cert – with 400,000 individual grades – it fully acknowledged that examiner error can happen.
“For this reason, the SEC has a robust and transparent appeals process in place,” it said.
The commission pointed out that while the proportion of upgrades has been about 18 per cent, it fell to 14.5 per cent this year, reflecting the move to broader grade bands.
“The SEC is satisfied that it operates a transparent, accessible and effective Leaving Certificate appeals system in which candidates can raise issues of concern regarding the marking of their work and have these fully and transparently addressed,” it added.
Fallible process
Prof Michael O’Leary, an assessment expert at DCU’s Institute of Education, said it was important to strike a balance between maintaining confidence in the exam system and accepting that creating and marking exams is a fallible process.
“I am very struck by how the public has come to accept the idea of sampling error in polling – the plus or minus 3 per cent concept – while at the same time accepting that opinion polls provide us with valuable information,” he said.
He added: “The reality of measurement error in educational assessment can also be used to argue for the importance of using multiple sources of evidence about learning – teacher judgments, class assignments, in-school tests, terminal exams.”
Case study
When Ryan Bell opened the envelope with his Leaving Cert results last August, he got a shock.
The straight-A student was convinced he had scored enough to secure a top grade in his strongest subject, higher maths.
As he scanned through the results, he saw he had scored H1s in all his subjects – bar maths.
Instead, he had received a H3, between 70 and 80 per cent.
“It was always my favourite subject,” says Bell, who was then a student at Oatlands College in Stillorgan. “I had taken extra maths classes in UCD, I had taken part in maths competitions across the country, so to see the H3 in maths – of all the subjects – was really shocking,” he said.
“I asked myself, what if I misread something? What if I made mistakes I hadn’t realised I had made in the exam? I felt as if I had let myself down and I felt I had let my teacher down as well.”
When he inspected his exam paper a couple of weeks later, he saw he had actually scored 99 per cent. A simple clerical error was to blame.
The resulting upgrade placed him among an elite group of students nationally who secured top grades across eight subjects.
“I was lucky. I had the points required to get into my course, but someone else could easily have missed out and might never have thought to check their result,” says Bell, who is now studying mathematics in Trinity.
“Another student in my college course also had an error in her maths paper and didn’t have the points.
“She was subsequently upgraded and ended up joining the course five or six weeks later. That really puts you on the back foot.”
Bell agrees that most students simply assume the result they get is correct and feels there is little appreciation that errors can occur in significant numbers.
However, he feels the facility to view scripts that was introduced in recent years is a very positive development.
“It makes it much more transparent, at least. If a mistake has been made, at least you have a chance to see it,” he says.