TikTok has been fined €345 million (£296 million) for breaching EU data law, including failing to shield underage users’ content from public view and mishandling children’s accounts.
Data watchdogs in Ireland, which oversee the Chinese-owned video app across the EU, announced that TikTok had violated multiple GDPR rules in its operation.
The investigation found that TikTok violated the GDPR by placing users’ accounts on a public setting by default; failing to give transparent information to child users; allowing the “family pairing” option, intended to let a parent link to a child’s account, to enable direct messaging for those over 16; and failing to properly consider the risks to children who were placed on the platform in a public setting.
Children’s personal information was not sufficiently protected by the Chinese-owned app because it made accounts public by default and did not adequately address the risk of under-13s gaining access to the platform, according to the decision published by the Irish Data Protection Commission (DPC).
In a statement released on Tuesday, the DPC said the company had violated eight articles of the GDPR, the EU’s primary data protection law. These articles cover several legal aspects of data processing, ranging from the lawful use of personal data to protecting it from unlawful use.
The profile settings of most children’s accounts were set to public by default, so that anyone could see any content posted there. A feature called Family Pairing, intended to allow parents to link to an older child’s account and manage settings such as direct messages, in practice allowed any adult to pair with a child’s account, with no warning to the child of the risk this posed.
When users registered and posted videos, TikTok did not provide child users with the information it should have, and instead resorted to what are known as “dark patterns” to nudge users toward more privacy-invasive options during registration.
Separately, the UK data regulator fined TikTok £12.7m in April after finding it had illegally processed the data of 1.4 million children under the age of 13 who were using the platform without parental consent.
Despite the platform’s popularity, the UK regulator said TikTok had done “very little or nothing, if anything” to protect its users from illicit activity.
TikTok said the investigation examined its privacy settings between 31 July and 31 December 2020, and that it has since addressed all of the issues raised.
Since 2021, all new and existing accounts belonging to 13- to 15-year-olds have been set to private by default, meaning only people the user has authorised can view their content.
The DPC also noted that the European Data Protection Board (EDPB), a body made up of data protection regulators from EU member states, had overruled it on certain aspects of its decision.
The EDPB required the DPC to include a finding, proposed by the German regulator, that the use of “dark patterns” – the term for deceptive website and app design that steers users toward certain behaviours or choices – violated the GDPR’s provisions for the fair processing of personal data.
According to the Irish privacy regulator, between July and December 2020 TikTok unlawfully made the accounts of users aged 13 to 17 public by default, which effectively meant anyone could watch and comment on the videos those users posted.
Moreover, the company failed to adequately assess the risk of users under the age of 13 gaining access to its platform. The decision also found that TikTok still nudges teenagers who join the platform to share their videos and accounts publicly through manipulative pop-ups.
The regulator has ordered the company to change these misleading designs, known as dark patterns, within three months to prevent further harm to users.
In the second half of 2020, children’s accounts could also be linked to unverified adult accounts.
The regulator also found that the video platform failed to explain to teenagers, before their content and accounts were made public, the consequences of that exposure.
The board of European regulators also said it had serious doubts about the effectiveness of TikTok’s measures to keep under-13 users off its platform in the latter half of 2020.
The EDPB found that TikTok had failed to check the ages of existing users “in a sufficiently systematic manner” and that its mechanisms could be easily circumvented. However, it said it was unable to establish an infringement because of a lack of information available during the cooperation process.