Explainer: How TikTok’s algorithm ‘exploits the vulnerability’ of children

Britain’s data watchdog found that up to 1.4m children under 13 use the app – and experts say they are being flooded with harmful content designed to promote addiction

TikTok’s algorithm has been called the ‘crack cocaine of social media’. Photograph: AP/PA

It is the home of dance tutorial videos and viral comedy sketches. But it is also host to self-harm and eating disorder content, with an algorithm that has been called the “crack cocaine of social media”.

Now, Britain’s information commissioner has concluded that up to 1.4 million children under the age of 13 have been allowed access to TikTok, with the watchdog accusing the Chinese-owned firm of not doing enough to check that underage children were not using the app.

“There is self-harm content, there is nonsensical content about cures for mental health [conditions],” said Imran Ahmed of the Center for Countering Digital Hate, which produced a report last December that suggested TikTok’s algorithm was pushing harmful content to some users within minutes of their signing up.

He said he found the most dangerous aspect to be the sheer prevalence of such content on social media platforms. “The truth is that they are being flooded with content that gives them an extremely distorted view of themselves, their bodies, their mental health, and how they compare to other people,” he added.

Ahmed said his organisation’s research suggested that changing a user’s name from “Sarah” to “Sarah Lose Weight” on TikTok could result in its algorithm serving 12 times more self-harm content.

“The algorithm recognises vulnerability and, instead of seeing it as something it should be careful around, it sees it as a potential point of addiction – of helping to maximise time on the platform for that child by serving them up content that might trigger some of the pre-existing concerns.”

Ahmed said that the centre’s research, which was based on 13-year-old users in the UK, US, Canada and Australia, suggested that, within about two-and-a-half minutes of setting up a TikTok account, young people could be pushed self-harm content – and eating disorder content within about eight minutes.

A more recent investigation, he said, suggested a 14-year-old boy on the platform was likely to be pushed content from the virulently misogynistic Andrew Tate less than three minutes after setting up an account.

“That is deeply worrying [because] what underpins both of these, of course, are incredibly dangerous views of what a young girl should look like, and how a young man should behave and see young women. There is a fundamental misogyny that underpins both.”

According to Britain’s information commissioner, more than 1 million underage children could have been exposed, with the platform collecting and using their personal data. “That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll,” it said.

A TikTok spokesperson said: “Many people struggling with their mental health or on a recovery journey come to TikTok for support, and we aim to help them do so safely.

“Our community guidelines are clear that we do not allow the promotion, normalisation or glorification of eating disorder, suicide or self-harm content. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.

“We are open to feedback and scrutiny, and we seek to engage constructively with partners who have expertise on these complex issues, as we do with NGOs in the US and UK.”

Ahmed added that TikTok was by no means the only social media platform he saw as doing too little to police the harmful content being shared, describing an arms race among platforms to devise ever more effective ways of keeping users’ attention – even if it meant putting them at risk. – Guardian