Does the App TikTok Violate Child Privacy Laws?

If you are like a lot of parents, you probably have concerns about your child’s personal data being on the Internet.

The concerns get even more serious when your child is old enough (or thinks they are old enough) to start using social media on their own, and you are no longer the gatekeeper. In fact, your tween might know more about social media than you do, and while you probably "get" Facebook and Twitter, newer social media platforms such as TikTok may leave you mystified.

What is TikTok?

TikTok is a social media app that has taken the world by storm, especially among young people. The app, which allows users to create and share short, creative videos, is most popular with Generation Z: close to 70% of TikTok's audience is between 16 and 24 years old.

Owned by Beijing-based technology company ByteDance, TikTok is a global platform with more than 1.5 billion users worldwide. Most TikTok videos are short, between 3 and 60 seconds, and many involve lip-syncing to popular music. TikTok actually started as musical.ly, an app used to produce short lip-sync videos.

Is TikTok Safe?

TikTok has been in the headlines a lot over the past year because of red flags involving security and censorship. In January 2020, the cybersecurity research group Check Point Research raised concerns over several security risks, including vulnerabilities that could let hackers steal users' personal information, delete their videos, and post videos on their behalf. TikTok claims that the issues have all been addressed and that its platform is safe for users.

Lawmakers, too, have raised concerns after TikTok allegedly censored topics considered sensitive by the Chinese Communist Party. In late 2019, U.S. senators called for an investigation into TikTok's censorship practices as well as potential national security risks posed by the app. TikTok has insisted that the Chinese government does not have a say in the app's content and that it intends to cooperate with regulators.

Even so, the U.S. Army banned soldiers from using TikTok in December 2019.

TikTok Faced a Class Action Over Child Privacy

Beyond security flaws, which are commonplace across social media platforms, and censorship concerns, TikTok has also been accused of violating child privacy laws. In late 2019, a group of parents filed a class action lawsuit against TikTok claiming that the app violated children’s privacy laws by collecting and exposing the personal data of minors.

The lawsuit, which TikTok settled the day after it was filed, alleged that the app did not put in place the proper precautions to prevent children from using it. Specifically, the app asked minors under the age of 13 who created accounts to provide personally identifying information such as name, phone number, email address, photo, and bio, and that information was made publicly available.

The lawsuit also alleged that the app collected location data from users, including minors.

Under the Children’s Online Privacy Protection Act (COPPA), social media companies cannot collect the data of children under 13 years of age without the express consent of their parents or guardians.

Also in 2019, TikTok reached a $5.7 million settlement with the U.S. Federal Trade Commission, which alleged similar COPPA violations.

Laws Protecting Children Online

To say the internet has changed a lot since COPPA was passed in 1998 is an understatement. Some believe it has changed so much that the law itself requires an update. Recently, a U.S. representative introduced a new bill aimed at doing just that.

The proposed bill, the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act, would extend privacy protections to all young people under the age of 18 and ban targeted advertising to kids under 13. The bill would also remove COPPA's safe harbor provisions, which allow social media companies to self-regulate on these matters, and add provisions giving the FTC more enforcement power.

Another bill aimed at updating COPPA is the Preventing Real Online Threats Endangering Children Today (PROTECT) Kids Act, which was introduced by bipartisan lawmakers in January 2020.

The bill would raise the age of protection under COPPA to 16, give parents the right to delete any personal information that websites have collected about their children, and expand the types of data protected by COPPA.

How to Protect Your Child on TikTok

Common Sense Media, a nonpartisan nonprofit organization that provides advice for parents on safe technology and media use for children, recommends that kids be age 16 or older to use TikTok, citing privacy issues and mature content.

TikTok allows users as young as 13 to use the app, and it requires users under the age of 18 to have the approval of a parent or guardian. There is also a section of the app meant for kids under 13 that restricts access to adult content. However, it is easy for kids to enter a false birthdate and get past these safety measures.

For these reasons, Common Sense Media recommends that if parents want to allow tweens or young kids to use the app, the parent should own and run the account so they can monitor what their kids are viewing and sharing.

TikTok also has safety features that parents can set up by tapping the three dots at the top right corner of the user profile. Parents can use TikTok's "Digital Wellbeing" features to limit time spent on the app (Screen Time Management) and limit video content that may be inappropriate (Restricted Mode). These features are password protected, so kids can't just turn them off.

However, Common Sense Media says TikTok’s Restricted Mode is not foolproof. Age-inappropriate videos can sometimes slip through, so parental monitoring is the safest bet.
