The UK on Tuesday imposed a 12.7-million pound fine on Chinese video app TikTok for a number of breaches of data protection law, including failing to use children’s personal data lawfully. The Information Commissioner’s Office (ICO), the country’s information watchdog, estimates that TikTok allowed up to 1.4 million UK children under the age of 13 to use its platform in 2020, despite its own rules not allowing children of that age to create an account.
The fine follows the UK government’s decision last month to ban TikTok from all government phones amid security concerns around the Chinese-owned social media app.
The ban brought the UK into line with the US, Canada and the European Union (EU), as well as India, which has banned TikTok from the country entirely, even as the company strongly denies sharing user data with the Chinese government.
UK data protection law says that organisations that use personal data when offering information services to children under 13 must have consent from their parents or carers.
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws,” said John Edwards, UK Information Commissioner.
“TikTok should have known better. TikTok should have done better. Our 12.7-million pound fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform,” he said.
TikTok said it is reviewing the decision and its next steps.
According to Edwards, under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering “harmful, inappropriate content at their very next scroll”.
TikTok is also accused of failing to carry out adequate checks to identify and remove underage children from its platform. The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view, TikTok did not respond adequately.
Giving details of the contraventions, the ICO found that TikTok breached the UK General Data Protection Regulation (UK GDPR) between May 2018 and July 2020 by providing its services to UK children under the age of 13 and processing their personal data without consent or authorisation from their parents or carers.
It also breached UK law by failing to provide people using the platform with proper information, in a way that is easy to understand, about how their data is collected, used and shared.
Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it. The company also failed to ensure that the personal data of its UK users was processed lawfully, fairly and in a transparent manner.
A TikTok spokesperson told the BBC that its “40,000-strong safety team works around the clock to help keep the platform safe for our community”.
“While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering the next steps,” the spokesperson said.
The watchdog had previously issued the Chinese social media firm with a “notice of intent”, a precursor to a potential fine, warning that TikTok could face a penalty of 27 million pounds for its breaches.
The ICO said that after taking into consideration the representations from TikTok, it had decided not to pursue the provisional finding related to the unlawful use of special category data.