TikTok isn’t just facing financial penalties in the US over claimed child privacy breaches. The UK’s Information Commissioner’s Office (ICO) has warned TikTok that it could face a £27 million (about $29.2 million) fine after the watchdog determined that the social network may have broken data protection law by “failing to protect” kids’ privacy between May 2018 and July 2020. The company may have handled the data of children under 13 without parental consent, processed “special category” data (such as ethnicity, sexual orientation or health) without a legal foundation, and failed to provide users with the necessary information in a “concise, transparent and easily understood” fashion.
The ICO began investigating TikTok in February 2019, soon after the US Federal Trade Commission fined the social media heavyweight $5.7 million over reported child privacy infringements. At the time, the UK overseer was concerned about both TikTok’s “completely open” direct messaging and its transparency tools. Sexual predators were found messaging users as young as eight years old, and it was relatively easy for kids to bypass the app’s age gate.
The office stressed that these were preliminary findings, and that there was no definitive conclusion that TikTok had broken the law or would pay a fine. The ICO added that it would “carefully consider” TikTok’s stance before making a final decision. We’ve asked the company for comment, and will let you know if we hear back.
There’s mounting pressure on TikTok to protect kids. In the US, members of Congress and state attorneys general are grilling TikTok over possible harms to child users, including attempts to keep them riveted to the app. A UK fine may not be the end of the company’s troubles, at least not until politicians and regulators are satisfied that it’s keeping young people safe.