
TikTok ruling raises more questions about Chinese company’s safety standards

Irish watchdog’s €345m fine of TikTok - the fourth-largest penalty against Big Tech - shows Beijing-owned video-sharing app is firmly in sights of data regulators

A view of TikTok signage in the TikTok office canteen during a press tour of ByteDance and TikTok's offices in Singapore in August. Ireland's data regulator has fined the social media giant €345 million. Photograph: How Hwee Young/EPA

In an abrupt decision last April, the Government told civil servants to remove TikTok’s video-sharing app from official devices. The move reflected serious security concerns about data held by the Chinese-owned business. Now Ireland’s data regulator has imposed a huge fine on TikTok for violating children’s privacy on its app. The ruling raises yet more questions about the company’s safety standards.

The €345 million fine from Data Protection Commissioner Helen Dixon is the fourth-largest imposed on a Big Tech company since she assumed sweeping European powers in 2018 to supervise social media giants such as TikTok that have their European Union headquarters in Ireland.

Each of the three larger fines was against Facebook owner Meta, the biggest being a €1.2 billion penalty in May that took total fines against that company to some €2.5 billion.

But TikTok, owned by ByteDance of Beijing, is now firmly in the sights of data regulators.


In addition to the fine for breaching children’s data rights, the company faces a separate investigation by Dixon’s office into transfers of data to China. That inquiry could yet prove the most contentious her office has undertaken, given swingeing clampdowns on TikTok by assorted authorities in the United States, Europe, Canada, Britain and India because of security concerns.

TikTok has struggled to win over its many critics, although some observers cast western moves against the company as a “dig” at China at a time of rising geopolitical tension.

In the case at hand, however, TikTok has been found to have committed grave privacy violations against child users and to have failed to protect their data. In total there were eight infringements of Europe’s General Data Protection Regulation (GDPR), a body of law in force for five years that is supposed to impose better control over the exploitation of personal data by big business.

Dixon’s office found “unverified” adult users of the app could enable direct messages for some teenagers with whom they had no family connection. TikTok’s “family pairing” feature could also link child accounts to adults who were not their parents or guardians.

These especially serious findings point to a lax or cavalier attitude to child safety during the period under inquiry, six months in 2020 (July-December) when TikTok was growing fast in the first year of the global pandemic. Yes, the company insists the offending features were removed “well before the investigation even began”. But that is cold comfort for parents who see teenagers and even pre-teens in thrall to TikTok’s wildly popular videos.

The findings are damning. TikTok failed to tell child users its public-by-default processing of accounts “meant that an indefinite audience, including non-registered users, would be able to view their personal data.”

Moreover, Dixon’s office found TikTok’s language seemed to “trivialise the decision to opt for a private account” on the app. That was despite the fact that the implications of having a public account were “particularly severe and wide-ranging” because published content “could be accessed, viewed and otherwise processed beyond the control of the data subject” and TikTok.

The investigation cast light on the “cascading implications” of TikTok’s public-by-default account setting on other settings for child users, with videos and comments posted publicly by default and some features enabled by default. It also found the selection of a “skip” button on the registration feature used by children had a “cascading effect of allowing many further platform settings [to] be rendered public – including the accessibility of comments on video content” that children made.

In addition, the language and colours used on the settings incentivised children to opt to post videos publicly. Where videos were posted publicly and the user held a public account, this had the effect of making them viewable and accessible to an “unlimited audience”.

No matter what TikTok claims about the size of the penalty, all of this is very bad for its reputation. If the company didn’t exercise due care with children’s data entrusted to it then, how can anyone be certain it does now?

As with her previous Meta inquiries, Dixon’s TikTok findings ran into resistance at the European Data Protection Board in Brussels. This is the powerful pan-European assembly of national and regional data bodies, which must approve draft cross-border penalties handed down by the Irish regulator before they are made final.

The objections this time came from the Italian national regulator and German regional regulators in Berlin and Baden-Württemberg. After a cumbersome dispute-resolution procedure in Brussels, the number of GDPR infringements in the ruling was increased from seven in Dixon’s original draft to eight.

The additional infringement related to TikTok’s use of “dark patterns” on the app to nudge child users in a particular direction. Such features, the Germans said, made it harder for children “to make a choice in favour of the protection of their personal data, rather than to the detriment of their data protection.”

Still, the European board rejected a push by the German and Italian regulators to increase the fine that Dixon proposed.

That it took a full 12 months to settle the divisions between regulators seems tardy, but that is the fact of it. The Irish regulator first circulated her draft TikTok ruling last September, setting in train the unwieldy process that culminated only on Friday.

Data rights campaigners have always said enforcement should be sharper and swifter, a criticism Dixon is quick to dismiss. But the cause of data protection and the interests of TikTok’s many child users were not served by inordinate delays.

This may be but one battle in a simmering global conflict over the reach and role of TikTok, yet it is no less important for that.