The US Federal Trade Commission and Department of Justice are suing TikTok and ByteDance, the app’s parent company, accusing them of violating the Children’s Online Privacy Protection Act (COPPA). The law requires digital platforms to notify parents and obtain their consent before collecting and using personal data from children under 13.
In a press release issued Friday, the Federal Trade Commission’s Bureau of Consumer Protection said that TikTok and ByteDance were “aware” of the need to comply with the Children’s Online Privacy Protection Act, yet spent “years” knowingly allowing millions of children under 13 to use their platforms. The FTC alleges that TikTok did this even after settling with the agency in 2019 over COPPA violations; as part of that settlement, TikTok agreed to pay $5.7 million and take steps to prevent children under 13 from registering.
“As of 2020, TikTok had a policy of retaining accounts of children it knew were under 13 unless the child explicitly acknowledged the age and met other strict conditions,” the FTC wrote in the press release. “TikTok’s human reviewers allegedly spent an average of just five to seven seconds reviewing each account to determine whether the account belonged to a child.”
According to the FTC, TikTok and ByteDance retained and used data from underage users, including for ad targeting, even after employees raised concerns and TikTok reportedly changed its policy to no longer require users to explicitly state their age. Even more damning, TikTok continued to allow users to sign up using third-party accounts, such as Google and Instagram, without verifying that they were over 13, according to the FTC.
The FTC also found an issue with TikTok’s Kids Mode, a mobile experience that is supposed to be more compliant with the Children’s Online Privacy Protection Act. The FTC alleges that Kids Mode collected “significantly more data” than required, including information about users’ in-app activities and identifiers that TikTok used to build profiles (and share with third parties) to try to prevent attrition.
The FTC also said TikTok made it difficult for parents to request the deletion of their children’s accounts, and often failed to comply with those requests.
“TikTok has intentionally and repeatedly violated children’s privacy, threatening the safety of millions of children across the country,” FTC Chair Lina Khan said in a statement. “The FTC will continue to use the full scope of its authorities to protect children online — especially as companies deploy increasingly sophisticated digital tools to monitor children and profit from their data.”
TikTok shared the following statement with TechCrunch via email: “We disagree with these allegations, many of which relate to past events and practices that were inaccurate or mishandled. We are proud of our efforts to protect children, and we will continue to update and improve the platform. To that end, we provide age-appropriate experiences with strict safeguards, proactively remove suspected underage users, and have voluntarily launched features like virtual screen time limits, family pairing, and additional privacy protections for minors.”
The FTC and DOJ are proposing civil penalties against TikTok and ByteDance of up to $51,744 per violation per day and a permanent injunction to prevent future COPPA violations.