TikTok fined $15.9 million for misusing children’s data in Britain

    The UK’s data protection authority on Tuesday fined the popular video-sharing app TikTok $15.9 million for failing to comply with data protection rules designed to protect children online.

    The Information Commissioner’s Office said TikTok improperly allowed up to 1.4 million children under the age of 13 to use the service in 2020, in breach of UK data protection rules requiring organizations to obtain parental consent before using children’s personal information. TikTok failed to get that permission, regulators said, even though it should have known younger children were using the service.

    The UK investigation found that the video-sharing app was not doing enough to identify underage users or remove them from the platform, even though TikTok had rules preventing children under 13 from creating an account. TikTok failed to take adequate action, regulators said, even after some senior employees of the video-sharing platform internally raised concerns about underage children using the app.

    TikTok, owned by Chinese internet giant ByteDance, is also under fire in the United States. Last month, members of Congress questioned the CEO, Shou Chew, about the platform’s potential national security risks.

    The TikTok privacy fine underlines the public’s growing concern about the mental health and safety risks that popular social networks can pose to some children and adolescents. Last year, researchers reported that TikTok began recommending content related to eating disorders and self-harm to 13-year-old users within 30 minutes of their joining the platform.

    In a statement, John Edwards, Britain’s information commissioner, said TikTok’s practices could put children at risk.

    “An estimated one million young people under the age of 13 improperly accessed the platform, with TikTok collecting and using their personal data,” Mr Edwards said in the statement. “That means their data may have been used to track and profile them, potentially delivering harmful, inappropriate content on their very next scroll.”

    In a statement, TikTok said it disagreed with the regulators’ findings and was reviewing the matter and considering next steps.

    “TikTok is a platform for users ages 13 and older,” the company said in the statement. “We invest heavily to keep children under 13 off the platform, and our 40,000-strong security team works around the clock to keep the platform safe for our community.”

    This isn’t the first time regulators have cited the popular video-sharing app on children’s privacy issues. In 2019, Musical.ly, the operator of the platform now known as TikTok, agreed to pay $5.7 million to settle the Federal Trade Commission’s suit alleging violation of child online privacy protection rules in the United States.

    Since then, lawmakers in the United States and Europe have introduced new rules to strengthen the protection of children online.

    In March, Utah passed a sweeping law that would bar social media platforms like TikTok and Instagram from allowing minors in the state to hold accounts without parental consent. Last fall, California passed a law requiring many social media, video game and other apps to enable the highest privacy settings by default for minors — and to disable potentially risky features, like friend finders that allow adult strangers to contact children.