‘Parents and children are being duped’: Why ex-children’s commissioner Anne Longfield is suing TikTok for billions

Ms Longfield has accused the platform of collecting “excessive” amounts of data on children, including geolocation and draft videos.
Anne Longfield, Children's Commissioner for England, pictured at her home in Ilkley.

In May 2018, millions of online businesses scrambled to adapt to the EU's tough new General Data Protection Regulation (GDPR), aimed at protecting the privacy of citizens online.

Within a single day of the regulations coming into force, Facebook and Google were hit with lawsuits worth $8.8 billion alleging failure to comply, while major US news sites temporarily withdrew access for EU users as they struggled with the new legal requirements.

GDPR brought about tough rules for adult web users, and even tougher protections for children, who “require specific protection with regard to their personal data”, according to the UK version of GDPR.

Yet while other online businesses and social media platforms adapted their offering to become compliant, video sharing platform TikTok has continued to collect an “excessive” amount of data on children since the law came into force, alleges former English Children’s Commissioner Anne Longfield.

Ms Longfield, who lives in Ilkley and is now embarking on a legal claim against TikTok, is joining the ranks of several others who have accused the platform of breaching data protection laws in recent years. The company said today it intends to “vigorously defend” itself against the action and that it believes the claims “lack merit”.

In February 2021, the European Consumer Organisation (BEUC) accused the app of breaching users’ rights on “a massive scale”, while the company was fined $5.7m (£4.2m) by American courts for illegally collecting personal information from children under 13.

Late last year, a 12-year-old girl, granted anonymity by an English court, began a similar claim against TikTok, accusing the platform of using her and other children's data illegally.

Taking up the case with Scott+Scott law firm as a “litigation friend” to the young claimant, Ms Longfield is now bringing a billion-dollar claim against TikTok on behalf of the millions of children whose data she says has been harvested without sufficient warning, transparency or the consent required by law.

“They’re collecting what we would call an excessive amount of data on children,” explained Ms Longfield, whose post as England’s Children’s Commissioner ended last year.

“That includes data on names, contacts, habits and interests, who children follow on TikTok, but also information around children’s geolocation, videos they’ve made, even draft videos, and potentially, we believe, facial recognition.”

This data, some of which could reveal “where a child is at any time while using the app”, is being collected from children younger than 13, claims Ms Longfield, in spite of these children not legally being able to consent to processing of their data under UK GDPR law.

While TikTok’s own policy says its users should be 13 or older, a wealth of evidence points to high numbers of under-13s using the app in the UK. A recent Ofcom survey found that 44 per cent of British eight to 12-year-olds are regular TikTok users.

“The site’s not meant for children under 13, but huge amounts are using it. They can’t in any way consent to the use of their data because they’re not meant to be on there,” explains Ms Longfield.

Data collection is just one half of the story, with even more opacity over where it ends up, claims Ms Longfield, citing a “lack of transparency” over which third parties may be purchasing data for advertising purposes.

TikTok’s parent company, the Cayman Islands-based ByteDance, made over two-thirds of its $30 billion revenue in 2020 from advertising. “It’s a reasonable question,” says Ms Longfield, “to ask how data is being used with regard to advertisers.” The problem, as it stands, is that “literally nobody knows” where collected data is going, she says.

The claim accuses TikTok and ByteDance of being “deliberately opaque” about who has access to the data it collects, but notes that the company makes billions of dollars from advertising revenue generated by providing advertisers with information about users.

Such “excessive” data collection could have consequences for children later in life, she adds, with young people “profiled in a way that will follow them into adulthood”. More immediately, information such as geolocation, ordinarily shared only with the closest of family members, could have “highly dangerous consequences in the wrong hands,” says Ms Longfield, with bad actors potentially given access to children’s location data.

Ms Longfield, who has long been passionate about digital rights for children, points out that children have “been at the forefront” of testing out what it’s like to live highly digital lives, with society still in a “disorganised and often dangerous” stage of working out how to live in a digital world without compromising human rights.

“As an adult, when your data is being collected you usually, though not always, have the reference points in your experience to realise that it’s happening. But as a child you absolutely don’t,” she says.

If successful, Ms Longfield’s case could see TikTok forced to pay out billions in compensation to the children affected, delete the data it holds on young users and improve its data collection policies and transparency.

“This isn’t about trying to stop people using TikTok,” explains Ms Longfield, who notes “how important it’s been, particularly over the last year” for young people.

“It’s about getting the organisation to live up to its responsibility for those users. That means being open, transparent and acting within the law. At present I feel that parents and children are being duped by this company.”

Ms Longfield hopes the claim, if successful, could prove a “landmark” case with implications for other social media platforms and for the future protection of children’s digital rights.

“What I hope is that the balance of power shifts - so children are able to realise their rights to transparency, the right to be able to choose what happens to information about their lives.”

TikTok is failing to live up to its responsibilities to protect its young users, claims Ms Longfield.

“I want to ensure families, parents and children know what’s happening to their information,” she explains.

“I want this to be a wake-up call to TikTok and other social media platforms that there needs to be a better balance of power - being so close to children’s lives comes with great responsibility.”

When contacted for comment, a TikTok spokesperson said: “Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to help protect all users, and our teenage users in particular.

“We believe the claims lack merit and intend to vigorously defend the action.”
