Popular video-sharing platform TikTok is facing legal action from the U.S. Department of Justice (DoJ) and the Federal Trade Commission (FTC) for alleged violations of children’s privacy laws. The government agencies claim the company knowingly allowed children to create accounts and interact with adults on the platform, while collecting and retaining personal information without parental consent.
These practices, according to the lawsuit, contravene the Children’s Online Privacy Protection Act (COPPA), which requires online platforms to obtain parental consent before collecting data from children under 13. TikTok is also accused of disregarding a 2019 settlement with the FTC under which it agreed to implement stricter child protection measures.
The complaint details how TikTok allegedly collected extensive personal information from millions of children, facilitating targeted advertising and exposing minors to adult content. The company is also criticized for weak age verification, which allowed underage children to bypass age restrictions easily.
TikTok has faced similar accusations in other countries. The European Union and the United Kingdom have imposed substantial fines on the company for data privacy breaches affecting children. In response to the latest lawsuit, TikTok has denied the allegations, asserting its commitment to child safety and privacy.
This legal battle comes as global regulators increase scrutiny of social media platforms and their handling of children’s data. The U.K.’s Information Commissioner’s Office has issued warnings to multiple social media and video-sharing platforms to improve their child protection practices.
The outcome of the lawsuit could have significant implications for TikTok’s operations in the United States and set a precedent for how tech companies handle children’s data worldwide.
TikTok’s Response
TikTok has vehemently denied the allegations brought by the DoJ and FTC. The company maintains that it offers age-appropriate experiences with robust safeguards in place. It also says it proactively removes suspected underage users and has voluntarily introduced features such as screen-time limits, parental controls, and enhanced privacy settings for minors.
Despite these claims, the mounting legal pressures and public scrutiny highlight the challenges faced by social media platforms in balancing user growth with child safety responsibilities.