The Irish Times view on Facebook: when revenue trumps morality

The only shift in Facebook’s typically inadequate response has been a new show of defiance rather than the usual abject apology

Amongst the charges is that Facebook made changes to its algorithm in 2018 which greatly increased the likelihood users would see inflammatory posts. Photograph: Justin Sullivan/Getty Images

"A lot has been said about Facebook this week." So begins a September blog post by the social media giant, an attempt to refute serious accusations made in a series of Wall Street Journal articles collectively entitled The Facebook Files. These emerged from a whistleblower's allegations and leaked documents.

Wearily, the general public might well ask, "Which week?" Facebook's statement could apply to almost any point in recent years. Nor is it a convincing return salvo to explosive revelations about what Facebook goes on to call "some of the most difficult issues we grapple with as a company". And yet the sentence exemplifies Facebook's strangely detached, vague responses when under public fire, and reminds us how often we have been here before, and how little changes afterwards. But perhaps, with this exposé, a moment of global reckoning has finally come.

We now have shocking evidence that inside Facebook, executives were aware the company's own research indicated alarming problems, even as they publicly claimed otherwise.

This week, the whistleblower revealed herself, coming forward on the US TV programme 60 Minutes. Frances Haugen worked on Facebook's "civic integrity" team until the company dissolved it. She left in dismay at what she had seen. "The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook," she said. Facebook chose to "optimise for its own interests".


Amongst the charges is that Facebook made changes to its algorithm in 2018 which greatly increased the likelihood users would see inflammatory posts. Outrage prompts user activity, and hence, drives revenue. Instead of calming social media fires, the algorithm fuelled them.

Facebook also knew Instagram caused mental harm. One in three teenage girls, and a tenth of boys, told the company that posts – often by "influencers" with digitally enhanced photos – made them feel unhappy about themselves. And while new rules were introduced to police content, these were enforced against ordinary users while high-profile figures got a pass. The only shift in Facebook's typically inadequate response has been a new show of defiance rather than the usual abject apology.

But this week’s sudden international outage of Facebook’s trio of platforms – Facebook, Instagram and WhatsApp – is a reminder of its phenomenal reach, power and hold. Facebook says a breathtaking two billion people used its platforms last month. Much of the world’s online interaction is in thrall to a company prioritising revenue and growth over the health and wellbeing of its users. Regulators must find solutions, as the company clearly will not.