
The ABCs of the 2025 Privacy Playground: Age Assurance, Bots, and COPPA

Written by Stacy Feuer, Sr. Vice President, Privacy Certified
December 17, 2025
Photo by Loegunn Lai on Unsplash

Video games and toys are all about play, and while play remains wildly popular, the privacy playground feels topsy-turvy as we close out 2025. Since last January, the pace of privacy developments has been a whirlwind, leaving us feeling like we’re on a fast-moving spinner-thingy that no one can quite stop. The rules of the game keep shifting, making it hard to stay upright, let alone keep up.

To help regain some balance, here are a few highlights, focused mainly on youth privacy, organized by playground-appropriate ABCs: A for Age Assurance, B for Bots, and C for COPPA and Children’s Privacy. For each, we’ll start with key developments and then look ahead to how they might play out over the year ahead.

Now, let’s play!

A
Age Assurance – State of Play: This year saw a surge in age-assurance requirements. Although age assurance laws are generally separate from privacy laws, they complement them by providing companies with age information that triggers stronger protections for minors’ data.

Global: Online age verification mandates for kids and teens exploded around the globe in 2025. The European Commission issued a blueprint for minors’ age verification, while the UK’s Online Safety Act rules requiring platforms to implement “highly effective” checks to block minors from accessing pornography, self-harm, and eating disorder content came into effect in July. Canada also joined the age assurance movement, with the Office of the Privacy Commissioner (OPC) releasing a report in March outlining principles for proportionate, privacy-protective age verification. Although the OPC hasn’t yet issued formal guidance, in September it emphasized the need for effective age assurance in an opinion stemming from its joint investigation of TikTok with provincial authorities, finding that the platform’s voluntary age gate was insufficient. The OPC’s opinion required TikTok to implement three enhanced mechanisms – visual signals, behavioral data, and natural language processing – combined with a privacy impact assessment, to effectively keep underage users off the platform.

o Australia Focus: Just last week, Australia’s landmark “social media minimum age” amendment to its Online Safety Act – commonly referred to as the social media ban – took effect. The law requires designated social media platforms to take reasonable steps to prevent minors under 16 from creating accounts, effectively mandating age assurance. To support implementation, Australia’s eSafety Commissioner oversaw a massive trial of age assurance technologies, conducted by the UK’s Age Check Certification Scheme, involving 48 vendors and over 60 distinct technologies across sectors including social media, gaming, adult content, and online retail. The trial report concluded that age assurance systems can be private, robust, and effective. Australia’s privacy authority published related guidance for platforms on applying privacy-by-design principles, including minimizing the collection, use, and retention of personal and sensitive data for age assurance purposes.

U.S. Laws: Several U.S. states joined this trend, with Texas, Utah, and Louisiana enacting “App Store Accountability Acts” that require app stores (e.g., Apple, Google) to verify user ages and obtain parental approval before minors under the age of 18 can set up accounts and make in-app purchases. Although there have been questions about whether such laws will survive constitutional challenge, the Supreme Court’s June decision in Free Speech Coalition v. Paxton, upholding a Texas law requiring adult websites to implement age verification for sexually explicit content that is “harmful to minors,” gave such laws a boost. Paxton focused on porn, but, as we observed at the time, the “decision is likely to encourage states that have passed (or are considering passing) laws requiring minors to obtain parental consent or verify age to access social media platforms or other online content so long as those laws don’t fully ban access for adults.”

o Texas: The Texas App Store Accountability Act is scheduled to take effect on January 1, 2026; however, both the Computer & Communications Industry Association and a student group have challenged the law as unconstitutional and are seeking a preliminary injunction. The district court held a preliminary injunction hearing on December 6 but has not yet ruled; a decision is expected soon. In the meantime, both Apple and Google have released APIs for compliance with the law. (Our friends at Frankfurt Kurnit have a helpful operational guide.)

o California: California, like the kid on the playground who always makes their own rules, enacted a different law, the Digital Age Assurance Act (AB 1043). Beginning in January 2027, the law creates a framework requiring operating system providers, broadly defined as entities that develop, license, or control operating system software, to transmit age-based signals in four brackets (under 13, 13-16, 16-18, 18+). Computer and mobile app developers, in turn, are required to request those signals from the operating system provider or a covered application store when their application is downloaded and launched. The law permits a variety of ways to verify a user’s age, through methods like ID checks, credit card verification, or trusted third-party services, before granting access to age-restricted content. Although the California law on its face doesn’t cover the operating systems of gaming consoles, it applies to other parts of the industry. When California’s Governor Newsom signed the bill into law, he issued a signing statement urging the legislature to amend the law to address concerns from streaming services and video game developers about its impact on their existing age verification systems, including the “complexities such as multi-user accounts shared by a family member and user profiles utilized across multiple devices.”

FTC Action: The Federal Trade Commission (FTC) also joined the action, using its September settlement with Disney for alleged violations of the Children’s Online Privacy Protection Act (COPPA) Rule as an opportunity to flag its support for age assurance. (COPPA, of course, only requires companies to verify that the person providing consent is the child’s parent, not to verify the child’s age.) Identifying age assurance technology as the future for protecting kids online, the FTC announced that “Effective age assurance technologies that reliably identify users’ ages can ease the burden on parents, allow kids to have an age-appropriate experience online, and protect kids from harmful content online.” Now, the FTC is poised to do more. It recently announced a workshop on age assurance technologies scheduled for January 28, 2026. Topics will include navigating the regulatory contours of age verification, how to deploy age verification more widely, and the interplay between COPPA and age verification technologies.

2026 Predictions: Setting aside constitutional concerns, expect age assurance to become a frontline obligation, as legislators and regulators increasingly demand that users provide their age and companies voluntarily adopt age assurance to meet global expectations and address reputational and liability concerns. In the U.S., the federal App Store Accountability Act, introduced in both houses of Congress, will likely move forward early in the year. Regardless of what happens on the legislative front, companies will need to navigate a maze of verification options, each with its own privacy trade-offs – from low-friction signals and probabilistic checks to high-assurance methods that raise concerns about data collection, accuracy, bias, and exclusion. Experiences abroad, especially in Australia and the U.K., will provide lots of takeaways for implementation in the U.S.

B
Bots – State of Play: We couldn’t walk across the 2025 privacy playground without bumping into bots, specifically AI chatbots. The intersection of artificial intelligence and consumer privacy and safety is an emerging policy and regulatory frontier.

Concerns: As more people turn to AI chatbots for everything from information to entertainment to companionship – including roughly two-thirds of American teens (64%) (Pew Research Center) and even many younger children – concerns about their potential harms are on the rise. To date, legislative and regulatory attention has focused more on general consumer deception risks and on physical safety and mental health harms from sycophantic and delusional chatbot outputs than on privacy issues. That may be starting to change.

Developments over the past year, including the Italian data protection authority’s €5 million fine and ongoing investigation into Replika AI’s data practices, announced in May, have brought the privacy risks for consumers into sharper focus. A study released this fall by Stanford University’s Institute for Human-Centered AI, which analyzed the privacy policies of six major chatbot developers, found that many collect and use personal information disclosed in chats by default to train their systems. This includes sensitive personal information such as biometric and health data as well as children’s chat data, with some companies retaining that data indefinitely. Reflecting this concern, the FTC included questions about companies’ data collection, usage, and retention practices as part of its industry-wide investigation into the impact of companion chatbots on children and teens, which seeks, among other things, to assess chatbot developers’ compliance with the COPPA Rule (see in particular Q.22 of the agency’s Section 6(b) Order).

2026 Predictions: Looking ahead, expect privacy and security concerns to move front and center as consumers realize that personal information shared in what they thought was an intimate “private” conversation with a chatbot may be mined and used for targeted advertising, cross-platform profiling, and other unrelated commercial purposes. Meta’s October announcement that it will start using data from consumers’ conversations with its AI (along with data from its other services) to personalize content, ads, and other recommendations – and that consumers will not be allowed to opt out – has already sparked backlash and prompted a complaint to the FTC from privacy advocates. Meta has said that it will exclude data from conversations about consumers’ religious views, sexual orientation, political views, health, racial or ethnic origin, philosophical beliefs, or trade union membership from this initiative, but the episode underscores how quickly chatbot privacy concerns are likely to escalate in the coming year.

C
COPPA and Children’s Privacy – State of Play: Despite its advancing age, and competition from more modern state laws protecting the data of minors up to the age of 18, COPPA is still “king of the hill” in the U.S. when it comes to protecting kids and their personal information.

Amended COPPA Rule: The FTC started the year by announcing its long-awaited changes to the Children’s Online Privacy Protection Rule (COPPA Rule) in January 2025. As we detailed in our Top 5 Impacts article published by the IAPP in February, the amended Rule makes several consequential changes. First, it requires separate verifiable parental consent for “non-integral” purposes, which the FTC made clear includes behavioral advertising, profiling, and the development and training of artificial intelligence. It also requires companies to make granular disclosures about third-party data recipients and to specify the precise internal operations purpose for which a company has collected a persistent identifier. Significantly, the new Rule imposes heightened, specific data security and data retention obligations on companies subject to COPPA, as well as extensive oversight obligations on Safe Harbor programs like ESRB Privacy Certified, including new reporting requirements, to ensure their members’ compliance with these new provisions. The amended Rule, which was published in the Federal Register in April, came into force in June 2025, with compliance expected by April 2026.

COPPA Enforcement: The FTC also was active on the enforcement front this year, announcing four new COPPA actions. The one with the most implications for how video game companies engage with children and teens was the first one of the year. In January, the FTC announced a settlement with HoYoverse, the developer of the wildly popular Genshin Impact video game, resolving alleged violations of both COPPA and the FTC Act. The settlement required HoYoverse to pay a $20 million fine and to delete all data of players under the age of 13 that it previously collected unless it obtained the verifiable parental consent required by the COPPA Rule. It also banned the company from selling loot boxes to children under the age of 16 without parental consent.

While the FTC didn’t bring any new COPPA actions for the next nine months, it ended the drought in September by announcing two new COPPA settlements – one alleging that Disney mislabeled child-directed YouTube videos and the other claiming that a Chinese-owned company, Apitor, which sells programmable robots intended for kids ages six to 14 through online retailers such as Amazon, collected children’s personal information without verifiable parental consent and sold it through a third-party software development kit (SDK) to advertisers. At the end of that month, the FTC also filed a case against Iconic Hearts/Sendit, an anonymous question app popular with Gen Z and younger kids, alleging that the app collected children’s personal information without consent, deceived kids about the sender of the messages, and tricked them into buying premium memberships.

o States: The FTC wasn’t alone in pursuing COPPA claims, which can also be enforced by state Attorneys General. In 2025, several states filed wide-ranging complaints alleging a variety of online harms to children under state and federal causes of action, including alleged COPPA violations. Examples include New Jersey’s suit against Discord and Michigan’s case against Roku. Other states, such as California, Florida, and Texas, filed cases alleging violations of minors’ privacy but didn’t assert COPPA causes of action. For example, in November, California’s Attorney General announced a $1.4 million settlement with mobile video game developer Jam City resolving allegations that, among other things, Jam City collected personal information for advertising from teens who were at least 13 and less than 16 years old (outside the COPPA demographic) without their affirmative opt-in consent. The sizeable penalty and COPPA-like conduct relief underscore that, for game developers and other companies, COPPA compliance is critical but not enough in today’s privacy landscape.

Children’s (and Teen) Legislation: A new wave of proposed and enacted COPPA-style bills and other kids’ laws and regulations piled onto the field this year.

o Federal: Congress has been mostly quiet this year on comprehensive privacy legislation, with a Republican-led House data privacy study group yet to announce any deliverables. (The IAPP has an extremely helpful summary of stakeholder comments to that group.) It’s continued to push forward, though, on youth-focused online privacy and safety, holding several hearings focusing on digital dangers to children and teens. On the Senate side, Senate Democrats reintroduced COPPA 2.0 at the beginning of March, and, on the House side, the Committee on Energy and Commerce last week marked up a slew (18!) of online safety and privacy bills for kids and teens, including (yet again) the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0).

o COPPA 2.0 Focus: This year’s version of COPPA 2.0, H.R. 6291, differs from the Senate bill and last year’s bipartisan House bill in several ways. It contains a broad preemption provision that would preempt any state law that “relates to the provisions of this Act,” not just those that directly conflict with COPPA. Plus, it does not contain a modified knowledge standard, such as “willful disregard,” for companies other than large social media companies. Democrats on the committee were opposed to the bill, signaling that even if it passes the full House, it may die in the Senate unless House leadership makes some changes to restore the earlier bipartisan consensus. (Tech Policy Press has a “lightly edited” transcript of the markup.)

o State: Although no state passed a new comprehensive data privacy law this year, several states, including Arkansas, Vermont, and Nebraska, passed new laws centered on kids’ privacy. The wildly disparate bills (Arkansas’s is styled as a “COPPA 2.0” law while Vermont’s is a “design code”) extend a range of privacy protections to teens, some up to the age of 16 and others up to the age of 18. Other states, including California (as part of its automated decision-making/cybersecurity regulatory update), Connecticut, Colorado, Montana, and Oregon, amended their laws to provide additional protections for teen data. Of note, Colorado clarified its knowledge standard, setting out factors for determining when a company willfully disregards information that a consumer is a minor (e.g., a credible report from a parent or information in the consumer’s bio).

o Court Challenges: Despite all the legislative activity on the kids front, the fate of even existing children’s privacy laws – at least the California Age Appropriate Design Code (CAADC) and laws modeled on it – is unclear. Like a game of red light, green light, courts keep enjoining kids’ legislation and legislators keep passing new laws. In March, the California federal court overseeing NetChoice’s challenge to the CAADC once again issued a preliminary injunction. This time, the Court enjoined the entire CAADC from taking effect, finding that NetChoice is likely to succeed on its First Amendment challenge to the Act as a whole. California’s appeal of this second preliminary injunction is currently pending before the Ninth Circuit. And in late November, a federal district court in Maryland ruled that NetChoice’s First Amendment challenge to the Maryland Kids Code could move forward.

2026 Predictions: Whether or not Congress passes COPPA 2.0 legislation in 2026, expect to see lots more COPPA enforcement from the FTC, especially on the new provisions of the amended Rule. The FTC’s current leadership has stated again and again that protecting kids from online harms, including privacy and data security harms, is its biggest priority, and that it will take on the “biggest and most powerful companies to protect kids’ privacy by vigorously enforcing COPPA.” And count on the states to continue to pass new laws and to use a variety of legal theories, including COPPA in some contexts, to bring suits to protect the privacy and safety of kids in their jurisdictions.

* * * * *

That wraps up our ABC review. Of course, we can’t capture everything that happened or will happen in privacy this year using just the first three letters of the alphabet — you’d probably stop reading if we added developments from D to Z. Even as the bell rings for holiday recess, the playground is far from quiet. The days ahead will likely bring even more privacy news. And with no end in sight to the privacy hopscotch, we’ll be back in 2026, ready to keep playing and to help make sense of the next set of playground moves.


As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule. She holds the IAPP’s CIPP/US and CIPP/E certifications.
