As surely as the seasons change and the tide turns, we face another Facebook furore involving a feature that has provoked angst and outrage by misleading, or simply irritating, users.
The latest instalment in the ongoing series concerns revelations on tech site Gizmodo, citing former Facebook staff, that the “trending topics” widget was not an automated feed but was actually curated by humans. The widget, which sits in the top right corner of the desktop version of Facebook, displays a series of potentially interesting headlines to capture your attention and your clicks. Worse, those human curators are said to have exhibited biases, including suppressing conservative news media and promoting a liberal agenda.
"This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics, from appearing in the highly influential section, even though they were organically trending among the site's users," Gizmodo reported.
Facebook fired back with an emphatic denial; Tom Stocky, the company’s vice-president of search, wrote that “Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin, and we’ve designed our tools to make that technically not feasible”.
Inquiry
That wasn’t enough to stop Republican senator John Thune, of the US Senate’s commerce committee, launching an inquiry into the allegations.
Certainly, there are some caveats to the Gizmodo story, with the main source sounding as if he were primed to see bias everywhere he looked. However, this is uncharted territory for all involved. For one thing, Thune would never have launched such an investigation into editorial practices at the New York Times or CNN, as journalists obviously enjoy first-amendment protections; Facebook’s status as a publisher is far less clearly defined.
That illustrates Facebook’s first problem here: the word “trending” has a definite implication of abstract algorithmic authority about it, a veneer of automated neutrality. In its help section on what gets featured, Facebook says “the topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location”.
If it had openly acknowledged from the beginning that a curation team audited topics for quality control – a perfectly reasonable safeguard – this scandal would be far more muted.
What is also getting slightly lost in the debate is that the “trending topics” section is a peripheral feature with limited engagement compared with the main Facebook news feed. For some reason, I never see “trending topics” when I log in to Facebook on a desktop computer, while on mobile it is buried far down the search results.
The news feed, on the other hand, is a far more powerful tool of political and social influence, providing most of the media links people see and engage with.
The algorithm that determines what you see is utterly inscrutable, to the point that an estimated 60 per cent of users don’t realise their feed is algorithmically filtered at all, believing the news feed to be an unfiltered stream of links and updates from their friends.
The news feed is also subject to rather heavy moderation, with seemingly arbitrary enforcement of community guidelines and a puritanical zero-tolerance policy towards nudity. (No satirical portraits of a naked Donald Trump allowed, thank you very much.)
Somewhat forgotten in all this is that Facebook has already proven its ability to influence voting behaviour and political engagement. To cite just one example, back in 2010 the social network conducted “a randomised controlled trial of political mobilisation messages delivered to 61 million Facebook users” during the US congressional elections. The experiment showed that “the messages directly influenced political self-expression, information seeking and real-world voting behaviour of millions of people”.
Echo-chamber effect
The news feed is driven by your connections, and thus fuels polarisation far more than a curated list of trending topics, by creating an echo-chamber effect that reinforces tribal loyalties. We like what we like, we’re interested in what we’re interested in, we believe what we believe, and Facebook profits by catering to those impulses, not challenging them.
Ultimately, I suspect what is at issue here is the notion of objectivity. Modern journalism traditionally aspires to rigorous impartiality, but in the dialogue between journalist and reader there is an implicit understanding that true objectivity is impossible, and what we get instead is an ongoing pursuit of fairness and trust.
On the other hand, the computer, the algorithm, the automated feed, all promise a perfect sort of objectivity, rigorous and mechanical. It sounds like the holy grail in the quest for impartial news, but it rests on a misconception: algorithms are not objective. Just like newspapers, they are created by humans, and are similarly vulnerable to all sorts of biases.
For readers, there is a kind of transparency in knowing your newspaper was edited by other people, whose viewpoints and personalities are ultimately discernible. But when that relationship is obscured by the unknowable personality and biases of an algorithm, that transparency is lost.
The current scandal reveals a gnawing fear as the realisation dawns that Facebook wields unprecedented power, swiftly becoming the world’s pre-eminent publisher on a scale that was impossible to conceive of just a few years ago. We are only beginning to get a sense of what that power looks like, and we are understandably uneasy about what we see.