TikTok fined £12.7m for misusing children’s data

25 April 2023

The Information Commissioner’s Office (the ICO) has recently concluded its investigation into TikTok’s use of children’s personal data. On 4 April 2023, the ICO announced that TikTok had breached a number of requirements under data protection law and fined TikTok £12.7m for these breaches, making this one of the largest fines issued by the ICO to date.

Under Article 8 of the UK GDPR, organisations are not permitted to offer “information society services” (e.g. social media sites and gaming sites) to children under the age of 13 without the consent of an adult with parental responsibility and, in such cases, the organisation must make “reasonable efforts” to verify that the person providing parental consent does, in fact, hold parental responsibility for the child.

The ICO found that between May 2018 and July 2020, TikTok breached the UK GDPR by:

  • allowing approximately 1.4 million UK children under the age of 13 to use the platform without the required parental consent, even though “it ought to have been aware” that children under the age of 13 were using the platform;
  • failing to conduct adequate checks to identify and remove underage children from its platform;
  • failing to provide proper information to all of its UK users about how their data was collected, used and shared in a way that was easy to understand. Based on the information provided, children in particular were unlikely to be able to make informed choices about how to engage with the platform; and
  • failing to process the personal data of all its UK users in a manner that was fair, lawful and transparent.

As mentioned above, the fine is one of the largest issued by the ICO to date (a £20m fine was levied against British Airways and an £18.4m fine against Marriott, both in 2020). However, it is much lower than the £27m figure proposed in the ICO’s original notice of intent issued to TikTok last September. This is likely because the ICO’s initial figure rested on the argument that TikTok had unlawfully used “special category personal data” (i.e. personal data revealing particularly sensitive matters such as health, racial or ethnic origin or religious beliefs) – an argument that was ultimately not pursued following representations from TikTok to the ICO. Nonetheless, the decision serves as a reminder to all organisations that the ICO is willing to impose substantial financial penalties for breaches of the UK GDPR.

Against this backdrop, it should be noted that, in September 2022, the ICO published the Children’s Code (we previously reported on its implementation phase and scope and the ICO’s draft guidance on “likely to be accessed by children”), which sets out standards for protecting children’s personal data and is aimed at online services such as apps and social media sites.

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws. TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

John Edwards, Information Commissioner