The decision by Google and Facebook to block abortion-related ads in the run-up to the last referendum in the Republic was "too little, too late," according to a leading expert on fake news.
Speaking at an event in Dublin, Samantha Bradshaw, a researcher at the Oxford Internet Institute, said the inability of online platforms to manage fake news during the abortion referendum was of great concern.
“If the platforms can’t handle it [fake news] in a small country like Ireland, how are they going to deal with it on a global scale?” said Ms Bradshaw, whose research into the manipulation of social media has been widely reported.
Facebook announced plans to block ads related to abortion that didn’t originate from advertisers in Ireland, just over two weeks before the referendum was held on May 25th. A day later, Google said it would suspend all ads related to the plebiscite.
Ms Bradshaw said she was “with the critics” who argued that Google and Facebook’s move to ban ads related to the abortion referendum had been ineffective.
Complicated
“I think it was a little bit too late and it points to just how complicated these problems really are. You can’t just leave it up to the platforms to resolve [them]. There is a role for regulators here, as well as for industry,” she said.
Ms Bradshaw said this was particularly the case given that the user and advertising agreements of the leading social media giants rarely change, even when the big players express remorse over incidents such as the Cambridge Analytica scandal.
“There are a lot of announcements that go out in reaction to events but if you actually read through to see the changes [in agreements], there is usually no difference,” she said.
Ms Bradshaw was co-author of a report published in July which found that the problem of computational propaganda is growing at a fast pace despite efforts to combat it. The report suggests manipulation of public opinion over social media platforms is now a critical threat to public life, and it asserts that many government agencies are now taking part in promoting fake news.
The number of countries experiencing formally organised manipulation of social media has jumped from 28 to 48 over the last year alone, she said. Much of this growth is due to political parties spreading disinformation and junk news around election periods.
Ms Bradshaw said political parties are increasingly learning from strategies deployed during Brexit and the 2016 US presidential election, with campaigns using automated bots, junk news and disinformation to polarise and manipulate voters.
Game the system
“We’re increasingly seeing government actors trying to game the system. The techniques they are using have already been used in marketing and advertising for a long time but are now being used to sell us politics,” she said.
She considers micro-targeted political ads particularly dangerous. “Propaganda has changed. Back in the day, you had to have access to printing presses and maybe have planes to distribute leaflets across a wide area. Now you can just spend $100 and target thousands of people with a message, get instant feedback on it, find out how they engaged with it, and then refine if necessary,” she said.
“Social media platforms have not just lowered the cost for spreading propaganda, they also provide a feedback mechanism to allow you to do both mass distribution and micro-targeting. This poses huge risks to democracy,” she said.