Wrapping Up 2022 with A Huge (Epic) Fortnite Privacy Case

Written by Stacy Feuer, Sr. Vice President, Privacy Certified
December 21, 2022

With 2022 almost behind us, we’d planned on easing out of work mode and into festive celebrations this week for the end of this hectic and challenging privacy year. But Stacy’s former employer, the Federal Trade Commission (FTC), had other ideas. So, instead of wrapping presents, we’re wrapping up the year with an analysis of the FTC’s record-breaking $520 million settlements with Epic Games (Epic) for privacy and consumer protection violations in its wildly popular Fortnite video game.

The “s” in settlements is not a typo: On Monday, the FTC announced two separate enforcement actions against Epic. Consistent with ESRB Privacy Certified’s focus on privacy compliance, though, we’ll limit our analysis to the FTC’s privacy-related case. In short, the FTC (represented by the Department of Justice) filed a two-count Complaint and a Stipulated Order in federal court alleging that Epic violated the Children’s Online Privacy Protection Act (COPPA) and the related COPPA Rule. COPPA protects the personal information of children under the age of 13. The FTC asserted that Epic knew that Fortnite was “directed to children” and unlawfully collected personal data from them without verifiable parental consent (VPC).

The FTC also charged Epic with violating the FTC Act, which prohibits unfair and deceptive practices, by using unfair “on by default” voice and text chat settings in Fortnite that led to children and teens being bullied, threatened, and harassed within the game, including sexually. It charged that Epic’s privacy and parental controls did not meaningfully alleviate these harms or empower players to avoid them. If approved, this settlement will require Epic to pay $275 million in civil penalties. (The other $245 million is for the other case and is allotted for consumer refunds.)

Apart from the epic fine, the Fortnite action provides insight into the FTC’s thinking on children’s and teens’ privacy. Here are seven takeaways from a case that will likely reverberate far past the New Year:

  1. Declaring that your services are not directed to children is not enough: The FTC’s action makes clear that you can’t disclaim COPPA. In a paragraph that appeared on the next-to-last page of Epic’s lengthy global privacy policy, the company stated that it does not direct its websites, games, game engines, or applications to children or intentionally collect personal information from them. Although many companies make this claim in their privacy policies, it won’t help you if the facts show that your product is, in fact, child-directed. (Remember, a mixed-audience product is one that targets children but not as the primary audience.)
  2. COPPA’s “actual knowledge” standard doesn’t allow you to ignore evidence that children are using your services – especially internal and empirical evidence: While many advocates and lawmakers have criticized COPPA’s “actual knowledge” standard, seeking to replace it with “constructive knowledge,” the Fortnite action shows the FTC will construe the standard broadly. The agency cited several of the standard COPPA Rule factors – subject matter, use of animation, child-oriented activities and language, music content, evidence of intended audience, and empirical evidence about the game’s player demographics – to determine that Fortnite is directed to children. The key evidence, though, came from empirical data and Epic’s own internal documents, including:

    • Demographic data: The FTC provided examples of public survey data, which Epic had reviewed, to demonstrate that Epic knew a considerable portion of Fortnite players were under the age of 13. It pointed to publicly available survey results from a 2019 report showing that 53% of U.S. children aged 10-12 played Fortnite weekly, compared to 33% of U.S. teens aged 13-17, and 19% of the U.S. population aged 18-24. The agency alleged that these results also matched Epic’s internal data.
    • Advertising and marketing: The FTC homed in on Epic’s product licensing deals with a wide variety of companies for Fortnite-branded costumes, toys, books, youth-sized apparel, and “back to school” merchandise, many of which were targeted to the under-13 crowd. As in the FTC’s previous record-breaking COPPA matter, Google/YouTube ($170 million fine), the agency cited numerous internal statements and documentation that Epic had generated for potential advertising and marketing partners to emphasize Fortnite’s appeal to children.
    • Internal statements and events: The FTC also cited “ordinary course of business” communications, such as consumer complaints and conversations among Epic employees, that explicitly acknowledged that many of its users skewed younger. The FTC strung a number of them together (perhaps unfairly), but the phrases – “a large portion of our player base” consists of “underage kids,” / “high penetration among tweens/teens,” / “Fortnite is enjoyed by a very young audience at home and abroad” – convey, unmistakably, that Epic knew that it had a large user base of tweens and younger kids.
  3. Implement VPC and age gates from the get-go or make sure you apply them retroactively: The FTC faulted Epic for failing to obtain VPC for the personal information it collected from child users. In addition to data like name and email, the agency pointed to Epic’s broadcast of “display names” that put children and teens in direct, real-time contact with others through voice and text communication, as personal information that required parental consent. It also charged that even after Epic deployed age gates, it failed to deploy them retroactively to most of the hundreds of millions of Fortnite players who already had accounts. This is pretty much the same conduct that got TikTok (then Musical.ly) in trouble in an earlier FTC COPPA case. (The $5.7 million civil penalty there was the largest-ever COPPA fine at the time the case settled in 2019.) Like TikTok, Epic didn’t go back and request age information for people who already had accounts and adjust their default social features and privacy controls to comply with COPPA.
  4. Privacy by default is not just a catchphrase: Although the FTC has long emphasized privacy by design, it hadn’t previously focused on “privacy-protective” default settings in games and other online services. Now it has. The FTC alleged that Epic’s default settings, which enabled live text and voice communications for all users – including children and teens – constituted an unfair practice that led kids and teens to be bullied, threatened, and harassed, including sexually, through Fortnite. Moreover, the agency, citing evidence from Epic’s own employees, alleged that Epic’s parental controls were insufficient. Even when Epic eventually added a button allowing users to turn voice chat off, the company made it difficult for users to find, according to the FTC.
  5. Injunctive relief can be tough – and retroactive: In addition to the whopping $275 million civil penalty, the proposed Stipulated Order sets out the standard injunctive relief the FTC has long obtained in privacy cases – requirements for FTC monitoring, reports, a comprehensive privacy plan, and regular, independent audits. The Order also requires Epic to implement privacy-protective default settings for children and teens. Following the agency’s newer trend of using injunctions to remedy past harms, the Order requires Epic to delete personal information previously collected from Fortnite users in violation of the COPPA Rule’s parental notice and consent requirements unless the company obtains parental consent to retain such data or the user identifies as 13 or older through a neutral age gate.
  6. Real-world harms matter a lot: Commissioner Christine Wilson, the only Republican currently on the Commission, issued a concurring statement supporting the agency’s action. Although she has cautioned the agency’s majority against overly expansive uses of the FTC’s unfairness authority, Commissioner Wilson noted that the “elements of the unfairness test are clearly satisfied — because Epic Games allegedly opted children into voice and text communications with players around the world, children were exposed to bullying, threats, and harassment, and were enticed or coerced into sharing sexually explicit images and meeting offline for sexual activity.” Wilson also approved of the “novel injunctive mechanisms, which require Epic Games to implement heightened privacy default settings” for children and teens because they “directly address the privacy harms fostered by the company’s alleged business practices.”
  7. Failing to comply with COPPA can be expensive: There’s a clear upward trajectory from the $5.7 million civil penalty in the FTC’s TikTok/Musical.ly action to the $170 million fine in Google/YouTube to the $275 million civil penalty that Epic will pay to resolve the FTC’s charges. That’s definitely something to remember as you make your plans for the New Year!
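For developers, takeaways 3 through 5 translate into concrete engineering requirements: a neutral age gate (one that doesn’t steer players toward an age that unlocks features), privacy-protective defaults keyed to the declared age, and applying both retroactively to existing accounts. The Python sketch below illustrates one way those pieces might fit together; the function and field names are our own hypothetical illustration, not Epic’s implementation or anything prescribed by the Order.

```python
from dataclasses import dataclass

COPPA_AGE = 13  # COPPA protects children under 13
ADULT_AGE = 18  # the Order extends protective defaults to teens


@dataclass
class AccountSettings:
    voice_chat_enabled: bool
    text_chat_enabled: bool
    needs_parental_consent: bool  # verifiable parental consent (VPC)


def apply_defaults(declared_age: int) -> AccountSettings:
    """Derive privacy defaults from a neutral age gate's answer.

    A neutral age gate asks for an age or birthdate without signaling
    which answers unlock features, and must run for existing accounts
    as well as new sign-ups.
    """
    if declared_age < COPPA_AGE:
        # Under 13: live chat stays off and VPC is required before
        # collecting or retaining personal information.
        return AccountSettings(False, False, needs_parental_consent=True)
    if declared_age < ADULT_AGE:
        # Teens: chat off by default; the player can opt in.
        return AccountSettings(False, False, needs_parental_consent=False)
    return AccountSettings(True, True, needs_parental_consent=False)
```

Under the Stipulated Order’s retroactive reach, logic like this would also have to sweep the existing account base: players who never passed a neutral age gate can’t simply keep their old “on by default” settings.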

Following the FTC’s announcement, Epic explained that it had accepted the settlement agreements “because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.” It set out – as a “helpful guide” to the industry – principles, policies, and recommendations that the company has instituted over the past few years to protect its players and meet regulators’ expectations globally. On the children’s privacy front, Epic recommended that game developers “proactively create age-appropriate ways for players to enjoy their games” – advice that mirrors our own. Maybe we can tie that up with a ribbon!

* * * * *

Wishing you and your loved ones a joyful and relaxing holiday season without any more blockbuster FTC announcements until 2023!

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.