
Privacy Madness: NetChoice’s Challenge to the California AADC Goes to Overtime

Written by Stacy Feuer, Sr. Vice President, Privacy Certified
March 20, 2026

It’s March, which means the National Collegiate Athletic Association’s (NCAA) annual basketball tournament is underway. Over two weeks, 68 teams divided into four regions (East, West, Midwest, and South) compete in up to seven games through single-elimination rounds for the national championship title. The games are usually exciting, with lots of fancy dribbling and passing, momentum swings, and unexpected lead changes, giving the tournament its own trademarked moniker – March Madness.

Privacy Madness: March Madness isn’t just happening on the basketball court this year. The privacy world has had its own share of relentless action, so much so that I’m calling it “Privacy Madness.” (Honestly, it seems like more months than not could earn that title lately.) In the U.S. Congress, youth privacy and online safety bills are racing down the court, with both chambers scrambling to score wins. At the state level, Alabama, Kentucky, and Oklahoma are among the states advancing comprehensive privacy laws to the next round. Meanwhile, privacy regulators around the world are running a full-court press, implementing and enforcing sweeping new children’s online privacy and safety laws that are reshaping business practices. Brazil is in the top position this month, with its updated law to protect the privacy and safety of children and teens online, known as the Digital ECA, entering into force earlier this week.

The West – California: But right now, all eyes are on the Western region, where California, as usual, is dominating its Privacy Madness bracket. It’s now requiring data brokers to register and to honor consumers’ deletion requests through a centralized DROP (Delete Request and Opt-Out Platform) mechanism. The Attorney General and California’s dedicated privacy agency, CalPrivacy, have stepped up the pace of enforcement of consumer privacy opt-out rights (settling recently with Disney, youth platform PlayOn Sports, and Ford), flexing their regulatory muscle in ways that are putting businesses on notice nationwide. The California team is, in many ways, setting the tempo for the entire privacy game.

NetChoice v. Bonta: And then last week, the U.S. Court of Appeals for the Ninth Circuit issued its second major ruling in NetChoice’s challenge to the California Age Appropriate Design Code Act (CAADC), NetChoice v. Bonta. The CAADC is California’s landmark online privacy and safety law, enacted in 2022, that requires businesses to prioritize the “best interests of the child” when designing, developing, and providing online services likely to be accessed by minors under the age of 18. NetChoice, the tech trade group challenging the law, and the California Attorney General, who is defending the law, have already been through several rounds of litigation over the constitutionality of the CAADC.

In the latest round, the Ninth Circuit put points on the board for both sides and remanded the case to the district court for further proceedings. NetChoice and California both claimed victory, but there’s still no final score. Let’s take a look at the state of play for each team:

  • Points for NetChoice: The Court agreed with NetChoice that the data use provisions and the “dark patterns” sections of the CAADC were “void for vagueness” and upheld the lower court’s injunction against enforcement of those sections of the law. Interestingly, the Court did not focus on the substance of the restrictions in either of those sections, instead finding that the provisions’ reliance on key undefined terms such as “material detriment,” “well-being,” and “best interests of children” did not give businesses clear notice of what conduct is prohibited. The Court also affirmed its earlier ruling that the DPIA (data protection impact assessment) provision, which requires companies to assess and “mitigate” harms to children, violates the First Amendment’s prohibition against compelled speech.
    • Possible redraft?: The Court’s approach raises the question whether the prohibitions on data practices or dark patterns in the CAADC could withstand challenge if uncoupled from such terms. For example, the CAADC provision on dark patterns could be read in the disjunctive. It prohibits dark patterns that “lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature,” or that lead children to “forego privacy protections or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or well-being.” Reading the provision this way, by severing it at the first “or” and discarding the rest, is consistent with many existing privacy laws. Of course, that will ultimately be a task for the legislature.
  • Points for California: The Ninth Circuit vacated the district court’s injunction on the remaining provisions, ruling that NetChoice had failed to meet its burden of proving that the CAADC, on its face, violated the First Amendment. Emphasizing the “high bar” of a facial challenge (as opposed to an “as applied” challenge, where a party challenges the law based on a specific application), the Court ruled that NetChoice had not adequately shown it was likely to succeed on a facial First Amendment challenge to the coverage definition and the age estimation requirement because it focused only on companies that publish content, a subset of covered businesses.
    • Coverage definition: The Ninth Circuit held that NetChoice was unlikely to succeed on its argument that the coverage definition in the law – “likely to be accessed by a child” – is a content-based regulation of speech that invalidates the law in full. The law lists six indicators for making this determination, including whether the service is directed at children under COPPA, whether it features content like cartoons or games that appeal to children, or whether internal company research shows a significant child audience. The Ninth Circuit rejected NetChoice’s argument that applying these factors would require content evaluation in every case, reasoning that the statute does not require application of all factors and that some factors, such as determinations based on audience age, are entirely content-neutral. Although the law may have some content-based implications as applied to certain services, the Court found that NetChoice had not identified enough real-world examples across the full range of businesses and activities the law could cover to demonstrate that the coverage definition invalidates the entire statute.
    • Age estimation: The Ninth Circuit vacated the lower court’s preliminary injunction against the age estimation requirement, finding that NetChoice had again failed to develop a sufficient record to demonstrate how the requirement to screen users by age would prevent access to content, thereby imposing an unconstitutional burden on speech. The Court rejected the challenge not on the merits, but because NetChoice had not built enough of a factual record to prevail on a facial challenge. Critically, the Court also noted that because California had not yet issued regulations defining what NetChoice members must do to verify users’ ages, it is too early for a court to assess how burdensome the requirement actually is in practice. The Ninth Circuit wrote: “Absent a developed record on the various applications of the age estimation requirement – including applications that do not prevent access to content or require data collection for compliance – NetChoice cannot succeed on this facial attack.”

Overtime: The CAADC game is far from over. Expect further district court proceedings and a likely third appeal. And maybe even a trip to the Supreme Court.  In the meantime, businesses should be aware that several key provisions of the CAADC are now in effect and can be enforced. They include (see cites to the enacted legislation):

  • Age estimation: Estimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business, or apply the privacy and data protections afforded to children to all consumers. (§1798.99.31(a)(5))
  • Default settings: Configure all default privacy settings provided to children by the online service, product, or feature to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children. (Note: NetChoice is arguing this remains enjoined due to the “best interests” language in the provision.) (§1798.99.31(a)(6))
  • Geolocation prohibition: Don’t collect, sell, or share any precise geolocation information of children by default unless the collection of that precise geolocation information is strictly necessary for the business to provide the service, product, or feature requested, and then only for the limited time that the collection of precise geolocation information is necessary to provide the service, product, or feature. (§1798.99.31(b)(5))
  • Privacy information: Provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature. (§1798.99.31(a)(7))

The upshot: For now, the shot clock’s still running on the CAADC. If kids and teens are using your services, now’s not the time to sit on the bench. Even if the CAADC doesn’t make it to the finals, there’s COPPA on the federal front and a lot of laws at the state level that contain similar requirements. So, stay tuned for more Privacy Madness in the months ahead.

Stacy Feuer, ESRB Sr. Vice President, Privacy Certified

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She is an IAPP Fellow of Information Privacy, holding its CIPP/US, CIPP/E, and AIGP (Artificial Intelligence Governance Professional) certifications.
