In seeking legislation to allow the use of facial recognition technology (FRT) in policing, Ireland risks introducing a technology that scientific evidence has demonstrated is ineffective, inherently flawed, opaque and discriminatory. Rights and civil liberties advocates across the globe have formed coalitions to warn of its dangers. If Ireland goes ahead with this controversial technology, it is only a matter of time before the country finds itself in another cautionary international headline.
The idea of teaching a computer to recognise a face emerged as an offshoot of computer vision research, with the first notable proposal – eigenfaces – put forward in 1991 by Matthew Turk and Alex Pentland. The early versions were practically useless. Funded by the US defence agency Darpa and fuelled by America’s fear of being bested by the Soviet Union, computer vision techniques improved over time, and the idea of using FRT to surveil the public gained traction.
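For readers curious what eigenfaces actually involves, here is a minimal sketch in Python: face images are flattened into long vectors, the principal components of a training set become the “eigenfaces”, and two faces are compared by their coordinates in that reduced space. The data, dimensions and threshold here are illustrative assumptions, not Turk and Pentland’s original setup.

    # A minimal sketch of the eigenfaces idea. Illustrative only;
    # the random "faces" stand in for real grayscale images.
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical training set: 100 face images of 64x64 pixels,
    # each flattened into a 4,096-dimensional vector.
    faces = rng.random((100, 64 * 64))

    mean_face = faces.mean(axis=0)
    centered = faces - mean_face

    # Singular value decomposition gives the principal components;
    # the top rows of Vt are the "eigenfaces".
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = Vt[:20]  # keep the 20 strongest components

    def project(image):
        # Coordinates of a face in eigenface space.
        return eigenfaces @ (image.flatten() - mean_face)

    # Faces are compared by the distance between their projections:
    # a small distance suggests the same person, by this crude measure.
    a, b = project(faces[0]), project(faces[1])
    print(np.linalg.norm(a - b))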
The early trials on crowds were at Super Bowl games in the US. The technical, legal and ethical red flags were obvious, and enthusiasm for the technology waned following interventions from groups including the American Civil Liberties Union. But interest in FRT surged anew after 9/11.
The Government here first announced plans for the use of FRT by An Garda Síochána in May 2022. But this was met with a wall of opposition from experts across Irish universities and 13 NGOs, including the Irish Council for Civil Liberties (ICCL) and Digital Rights Ireland (DRI). However, the Government injected fresh energy into its plan in the wake of the Dublin riots.
In Christmas week last year, the heads of a Bill providing for FRT were published.
Garda Commissioner Drew Harris argued that in order to “vindicate the human rights of citizens in a digital society, An Garda Síochána must have access to modern digital image analysis and recognition tools”. But FRT, as experts have demonstrated, impedes human rights, including the rights to privacy and freedom of assembly, expression and movement.
The Joint Committee on Justice held a series of hearings on the Bill as part of pre-legislative scrutiny, during which numerous bodies – including the Law Society of Ireland, the Data Protection Commission, Rape Crisis Network Ireland, ICCL, DRI and several academic experts – raised serious concerns. The issues identified ranged from glaring gaps in the Bill and incompatibility with European Union law to FRT’s ineffectiveness, its disproportionately negative impact on communities at the margins of society and its violations of human rights.
Criticism came thick and fast. As data-protection lawyer Simon McGarr points out, “One way to gauge if a piece of legislation has been successfully drafted is whether the State’s data regulator, civil society, a wealth of leading international academic experts and the representative for an entire legal profession comes in and points out that it would be illegal. And that’s what happened.”
One significant peculiarity emerged during the hearings when An Garda Síochána chief information officer Andrew O’Sullivan asserted that the technology would not be used to make “definitive identifications. We do not have a reference database, as is sometimes misunderstood, that we can compare against. We are not attempting to make identifications.” This, as McGarr points out, is “akin to a bald man demanding legislation for free combs”. Yet, the use of FRT for identification by reference to a database of images is provided for in the Bill.
In order to run FRT in public, a reference database to match images against is essential. Despite repeated questions from experts, An Garda Síochána has not provided clear answers regarding the reference database, its size, content or source. However, given how broadly the Bill is worded, it could allow the public services card database, a biometric database containing the majority of the Irish population’s facial images, to serve as this reference database.
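A minimal sketch, with invented names and numbers, of why one-to-many identification cannot work without such a database: a probe face’s embedding is scored against every enrolled face in a “gallery”, and the best match above a threshold is returned. No gallery, nothing to compare against.

    # Hypothetical one-to-many identification against a reference database.
    # All identities, dimensions and thresholds are illustrative assumptions,
    # not details of any Garda system.
    import numpy as np

    rng = np.random.default_rng(1)
    # The "gallery": 1,000 enrolled identities, each a 128-dim face embedding.
    gallery = {f"person_{i}": rng.random(128) for i in range(1000)}

    def identify(probe, threshold=0.8):
        # Cosine similarity between the probe and an enrolled embedding.
        def score(v):
            return probe @ v / (np.linalg.norm(probe) * np.linalg.norm(v))
        best = max(gallery, key=lambda name: score(gallery[name]))
        # Return the closest identity only if it clears the threshold.
        return best if score(gallery[best]) >= threshold else None

    print(identify(rng.random(128)))  # with no gallery, this function has no answer to give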
FRT mislabels and miscategorises faces with dark complexions. In the US alone, six people have so far been wrongfully arrested and detained due to FRT errors, all of them black. It is unknown how many misidentified people may have taken plea deals. Robust scientific research confirms FRT’s higher error rates in identifying darker-skinned people. Yet at the pre-legislative hearing Mr O’Sullivan claimed FRT has a 99 per cent accuracy rate, citing a report from the US National Institute of Standards and Technology (NIST).
For this 99 per cent claim, Mr O’Sullivan cherry-picked the best-performing of the hundreds of algorithms NIST evaluated, one called “cloudwalk_mt_007”. That algorithm was developed by CloudWalk Technology, a Chinese company known for aiding mass surveillance of the Uyghur population, and crucial information on how its algorithms operate is kept secret.
NIST evaluated the algorithm using high-quality, clean images collected from US visa applications, border kiosks and mugshots. But An Garda Síochána’s stated intention is to use any imagery it legally holds or can access. The gardaí's input images will be drawn largely from CCTV footage that is unreliable owing to camera angle, distance, position and lighting, is often blurry, and frequently contains multiple faces in a single frame – a major challenge for FRT algorithms. To cite NIST’s laboratory study in support of the proposed Irish real-world use is, by accident or design, misleading.
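A hedged back-of-the-envelope calculation shows why a headline accuracy figure misleads in a surveillance setting: when genuine matches are rare in a scanned crowd, even a highly “accurate” system produces mostly false alarms. The numbers below are illustrative assumptions, not figures from NIST or the Bill.

    # Illustrative base-rate arithmetic; every number is an assumption.
    crowd = 50_000           # faces scanned at a public event (assumed)
    watchlisted = 10         # people in the crowd actually on a watchlist (assumed)
    true_match_rate = 0.99   # the headline "99 per cent" figure
    false_match_rate = 0.01  # plausible real-world false positive rate (assumed)

    true_alerts = watchlisted * true_match_rate
    false_alerts = (crowd - watchlisted) * false_match_rate

    print(f"true alerts:  ~{true_alerts:.0f}")    # ~10
    print(f"false alerts: ~{false_alerts:.0f}")   # ~500
    print(f"share of alerts that are wrong: "
          f"{false_alerts / (true_alerts + false_alerts):.0%}")  # ~98%

Under these assumptions, roughly 98 per cent of the system’s alerts would point at the wrong person, even at the claimed accuracy.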
Still, even with clean test data, the “cloudwalk_mt_007” algorithm fails on border kiosk images, which are the closest to a real-world application of FRT. The algorithm also shows error-rate discrepancies across ethnicities: West African women are by far the worst affected, meaning its use could lead to catastrophic misidentification of black women.
For relevant studies of FRT’s performance, audits of FRT deployed in contexts similar to Ireland’s should be considered. Audits by Cambridge researchers found that deployments of FRT in England and Wales failed to meet minimum legal and ethical standards.
The debate about “accuracy” only obscures the bigger issue: FRT is a threat to fundamental rights, accurate or not. Global expert groups with hundreds of stakeholders – including Access Now, Amnesty International, European Digital Rights, Human Rights Watch, Internet Freedom Foundation and Instituto Brasileiro de Defesa do Consumidor – have formed coalitions to push for a ban on FRT in public spaces, and the technology has already been rolled back in many US cities.
The Garda Commissioner’s framing of the introduction of this technology as “vindicating the human rights of citizens”, when leading human rights experts have demonstrated how it diminishes those rights, is ironic to say the least.
The Government’s attention should be on providing enough resources to the Garda, currently beset by numerous problems affecting retention and recruitment of gardaí, including low pay, pensions, rostering and training. Introducing controversial facial-recognition technology is a misstep that will only backfire.
Abeba Birhane is a cognitive scientist, currently a senior adviser in AI accountability at Mozilla Foundation and an adjunct assistant professor at the School of Computer Science and Statistics at Trinity College Dublin