Coimisiún na Meán, the new incarnation of the former Broadcasting Authority of Ireland (BAI), is approaching its first year in operation as regulator for broadcasting and online media.
It’s an odd mix. There’s no compelling reason why two sectors with significant and complex regulatory demands, superficially related by video, should be lumped into one body.
Especially so when it’s the body overseeing the Irish implementation of the massive Digital Services Act, one of the most sweeping pieces of online regulation in the world. Given the commission’s complex brief – one likely to rival the Data Protection Commission (DPC) in international impact, because so many of the companies are based here – online safety and services regulation should have been the responsibility of a regulator tasked with this sole area.
But this hybrid is what we got. The new commission was launched last March with the immediate job of producing an enforceable Online Safety Code, which it published in draft form in December. The foreword states the code is intended to create “concrete steps platforms will take to build a safer space” online, such as “preventing the sharing of illegal and harmful content, operating effective age-verification systems [and] requiring effective content moderation and complaint handling.”
That’s about as fraught a challenge as could be imagined, especially on this short timeline. The whole world has grappled for years with how to define and ensure online safety.
Two outcomes of that undertaking offer some insight into the commission’s functionality. First, the (potentially) good. One code proposal is that “recommender systems” be turned off by default for sensitive data. Currently, big tech algorithms serve people video content based on their data profiles gleaned from personal information, which may include sensitive data such as sexuality, political or religious beliefs, ethnicity or health status.
This proposal has been lauded by the Irish Council for Civil Liberties, which has argued it be adopted globally. The council recently conducted a national poll, in which 82 per cent of Irish people said they wanted such “toxic algorithms” shut off. Commenting on the poll, the council’s senior fellow Johnny Ryan said: “These findings show that the vast majority of the Irish public do not want toxic algorithms interfering in their online lives.”
Generally, I fully agree. But “generally” is the problem. This is just a general proposal. The new commission doesn’t yet have a code. Consultation on the draft document was to have ended last Friday (one has to question why the draft code was released, with a relatively brief consultation deadline, during a big holiday period).
Coimisiún na Meán hasn’t yet determined what will be covered by the code, either, so coming up with solutions to still-unspecified problems is a confusing cart-before-horse endeavour. So, while I agree in principle with throttling recommender systems, we need concrete detail.
On the bad side – oh, so woefully bad – is the suggestion from the commission’s executive chairman, Jeremy Godfrey, that the State build a porn user register, based on people submitting passport details and uploading facial scans. The idea is to verify ages and identity to protect children from the harm of accessing such sites. But this utter nonsense would create greater harm and risk.
As solicitor and data expert Simon McGarr states in his online newsletter The Gist: “This is the national internet regulator proposing that it would require that everyone, adult and children alike ... upload their state ID and live selfies, to porn sites to have biometric processing of their facial images performed. Resulting, among other things, in an effective register of porn preferences for adults and a collection of selfies of children kept by the porn sites for six years (required to prove they have complied with the regulation, you see).”
And, McGarr adds, the age-verification element could be applied to large video-sharing platforms more broadly - handing, say, Elon Musk’s X/Twitter your biometric data and passport details. It’s all absolutely, mega-scale bonkers. Imagine the many rogue porn and video sites based in uncontrollable regions such as Russia thus receiving people’s – children’s! – passport details and biometrics.
The good news is that this will never come to pass: the proposal would not meet the GDPR’s requirements for proportionality and necessity, would create a gobsmacking data privacy and protection risk, and would produce an information security nightmare.
That such a proposal has even been exposed to public oxygen – that it wasn’t immediately vaporised at conception on internal legal advice – brings us full circle to questions about the commission’s odd incarnation as a bolt-on to a former media regulator.
Seriously: does the commission have the structure or expertise needed to regulate this sector and its deep, democracy- and society-roiling problems? Because sure, I’d like to see recommender algorithms tightly regulated, but not at the unthinkable cost of also having to upload my passport details and a facial scan to companies just to view a cat video.
The commission has decided its original January consultation deadline was, ahem, too short. So you’re in luck. Until January 31st, you can offer your own views on the draft code to VSPSregulation@cnam.ie.