
2019 – Another Historic Year for Children’s Privacy

Written by John Falzone, VP Privacy Certified
February 24, 2020

2019 was truly a historic year for children’s privacy. Regulatory enforcement activity in the United States hit an all-time high. The Federal Trade Commission (FTC) surprised industry and the privacy community by embarking on a review of its Children’s Online Privacy Protection (COPPA) Rule three years ahead of schedule. U.S. lawmakers introduced legislation that, if passed, would reshape COPPA. Outside the United States, the United Kingdom’s Information Commissioner’s Office (ICO) has been driving the conversation with its game-changing Age Appropriate Design Code (Code), which is awaiting Parliament’s approval. And the world’s largest mobile storefronts—Apple and Google Play—have taken steps to be more protective of children’s privacy.

FTC Sets COPPA Record, Then Breaks It
The FTC remains the world’s top cop when it comes to children’s privacy. It began 2019 by breaking the record for the largest monetary penalty in a COPPA case when it agreed to a $5.7 million settlement with the operators of TikTok (f/k/a Musical.ly). The TikTok case signaled a more aggressive approach by the FTC, both in how it defines an online service “directed to children” and in how it applies COPPA’s “actual knowledge” standard.

The record set in TikTok, however, did not last long. In September 2019, the FTC and the New York Attorney General announced a COPPA settlement with Google and YouTube, which included a $136 million penalty paid to the FTC and a $34 million penalty paid to New York—either of which on its own would have been the largest-ever monetary penalty in a COPPA case by a significant margin.

More significantly, the settlement required YouTube to materially change its business. Going forward, all YouTube channel owners must specify whether their channels are directed to children. If a channel is not child-directed, the channel owner must still identify individual videos that are child-directed. When content is identified as child-directed, YouTube turns off several features of the platform, including (i) the collection of personal data for behavioral advertising, and (ii) the ability for users to leave public comments.

The FTC also settled a third COPPA case in 2019 against the operators of i-Dressup.com, a dress-up website that had already been shut down. While that case settled for the relatively modest sum of $35,000, it has important symbolic value insofar as it shows even small companies can land on the FTC’s radar.

FTC Solicits COPPA Comments
In addition to its enforcement activities, the FTC was extremely busy in 2019 soliciting public comments on its application of the COPPA Rule and hosting a COPPA workshop. The request for comments, which was published in July 2019, surprised the privacy community and industry because it occurred three years before the FTC’s typical 10-year cycle. The FTC received over 175,000 submissions by the December deadline, including one submission from us.

U.S. Lawmakers Seek to Update COPPA
In March 2019, Senators Markey and Hawley introduced what became known as COPPA 2.0. The bill would, among other things:

    • Extend COPPA protections to minors 13 to 15 years old;
    • Extend COPPA to operators of online services that have “constructive knowledge” they are collecting personal information from children or minors; and
    • Prohibit the use of children’s personal information for targeted marketing and place limits on the use of minors’ personal information for that purpose.

(Bonus material: In January 2020, lawmakers introduced two bills that would amend COPPA. The Protect Kids Act, introduced by Congressmen Tim Walberg and Bobby Rush, would, among other things, extend COPPA’s protections to all children under 16 years old. The PRIVCY Act, introduced by Congresswoman Kathy Castor, would essentially rewrite COPPA. Among other things, it would extend protections to children under 18 years old and remove the concept of “child-directed” online services in favor of an “actual and constructive knowledge” standard.)

Outside the U.S., the ICO Is Trying to Redefine How Online Services Approach Children’s Privacy
In 2019, the ICO released its long-awaited proposal for an Age Appropriate Design Code—a set of 15 “standards” aimed at placing the best interests of the child above all other considerations for online services “likely to be accessed” by children under 18 years old. The standards would require, among other things, high-privacy default settings, privacy disclosures and choices communicated in ways appropriate to the ages of the children likely to access the online service, and the elimination of data uses detrimental to children’s wellbeing.

Following the initial consultation period, in November 2019, the ICO revised the Code and submitted the final version to the Secretary of State. The Code now awaits approval by Parliament.

Storefronts Take Steps to Strengthen Children’s Privacy
Apple and Google Play also took steps to strengthen children’s privacy. In May 2019, for example, the FTC announced that Apple and Google Play had removed three dating apps from their storefronts after the FTC warned that the apps were violating COPPA.

In addition, both Apple and Google Play revised their developer policies. On Google Play, developers must now identify the target audience for each of their apps. Apps targeted to children are subject to certain restrictions, including on the types of ads served and the ad networks used. For its part, Apple has placed restrictions on the use of third-party analytics and advertising in apps directed to children.

Have more questions about recent developments in the area of children’s privacy? Feel free to reach out to us through our Contact page to learn more about our program. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.
