
P.S.R. Reinforces Fundamental Privacy Principles in a Changing World

Written by Meghan Ventura
October 20, 2022

After a busy few days in Austin, I’ve pulled together my key takeaways from last week’s International Association of Privacy Professionals’ (IAPP) Privacy. Security. Risk. 2022 conference (P.S.R.). P.S.R. is a one-of-a-kind conference focused on the intersection of privacy and technology. And there was certainly plenty of tech, from content dealing with bias in AI to privacy engineering. But given the location in Texas, one of many states that now place significant restrictions on women’s reproductive rights, the effect of the U.S. Supreme Court’s recent decision in the Dobbs case on the constitutional right to privacy was a strong undercurrent throughout the event.

Starting with the keynote session (which you can watch here, if you’re an IAPP member) and going through sessions on geolocation, cybersecurity, and advertising, many speakers grappled with new privacy challenges arising from Dobbs. Much of the conversation, though, focused on applying privacy basics to new and emerging technologies. This year’s P.S.R. highlighted that it’s an important time for companies to be good and responsible stewards of data. Here are more details on three topics that came up repeatedly at the conference: (1) Kids and Teens; (2) Data Minimization; and (3) Deidentification.

Kids and Teens
It’s clear that the UK Children’s Code and its offshoot, the recently passed California Age Appropriate Design Code (CA AADC), are top of mind. Companies are looking for more guidance and best practices from regulators on how to best comply. Both the UK and California codes feature similar concepts, such as “the best interests of the child” and privacy by default, and both prohibit behavioral ads/profiling. There are some differences, of course, but they are more technical than conceptual. If you’re looking for further analysis, we recommend checking out our post on the CA AADC and reading through the Future of Privacy Forum’s excellent analysis here.

During the keynote session featuring Federal Trade Commission (FTC) Commissioner Rebecca Kelly Slaughter, the IAPP’s Chief Knowledge Officer, Caitlin Fennessy, asked her if there are questions from the FTC’s 95-question Advance Notice of Proposed Rulemaking (ANPR) on commercial surveillance and data security that people should focus on when submitting comments. Commissioner Slaughter mentioned issues of tech addiction and psychological harms to teens that traditionally aren’t thought of as privacy problems, but stem from the same data sets. While the Commissioner did not have any updates to share on the FTC’s review of the Children’s Online Privacy Protection Act (COPPA) Rule, she strongly encouraged the public to submit comments on the ANPR. Many attendees interpreted the Commissioner’s COPPA comment as yet another signal that the FTC has effectively abandoned the COPPA Rule Review in favor of the ANPR. The FTC just extended the comment period, so you have plenty of time to file your comment.

Sensitive Data and Data Minimization
With five new state privacy laws (California, Virginia, Colorado, Utah, Connecticut) coming into effect next year, there was a lot of discussion about privacy law basics. It’s no surprise, then, that the panels focused on defining personal data. In particular, sensitive data came up at nearly every session.

The state laws have similar definitions of sensitive data, but there are some key differences privacy professionals must pay attention to. For example, all states consider special category data like ethnic origin, religious beliefs, and sexual orientation to be sensitive data. Virginia, Colorado, and Connecticut all consider personal data collected from a known child to be sensitive information. Each of the state laws specifies precise geolocation as sensitive data, except for Colorado. Instead, Colorado is planning to cover geolocation information under its proposed rules for processing “sensitive data inferences.” Sensitive data inferences are “inferences made by a [c]ontroller based on [p]ersonal [d]ata, alone or in combination with other data, which indicate an individual’s racial or ethnic origin; religious beliefs; mental or physical health condition or diagnosis; sex life or sexual orientation; or citizenship or citizenship status.”

And just about every time someone spoke about sensitive data, they stressed the importance of data minimization. This concept goes back to the Fair Information Practice Principles (FIPPs), first developed in the 1970s, which include the collection limitation principle, designed to prevent the overcollection of information. As many speakers made clear (referring in part to the Dobbs decision and fears about the use of reproductive data), data can’t be breached, hacked, or turned over to law enforcement if it’s not collected in the first place.

Deidentification
The issue of deidentification also came up frequently, often in relation to data minimization. Deidentification refers to actions that organizations can take to remove identifying characteristics from their data.
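To make the concept a bit more concrete, here is a minimal, hypothetical sketch of what removing identifying characteristics from a dataset might look like in practice. The column names ("name", "email", "user_id", "birth_date", "zip_code") and the salt value are invented for illustration, and real deidentification programs involve far more rigorous statistical and legal analysis than this:

```python
# Hypothetical illustration of basic deidentification steps (not a compliance tool).
# Column names are made up for this example.
import hashlib
import pandas as pd

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()

    # 1. Drop direct identifiers outright.
    out = out.drop(columns=["name", "email", "phone"], errors="ignore")

    # 2. Replace a persistent ID with a salted one-way hash so records can
    #    still be linked internally without exposing the raw identifier.
    salt = "replace-with-a-secret-salt"  # assumption: stored securely elsewhere
    out["user_id"] = out["user_id"].astype(str).map(
        lambda v: hashlib.sha256((salt + v).encode()).hexdigest()
    )

    # 3. Generalize quasi-identifiers: keep only birth year, truncate ZIP to 3 digits.
    out["birth_year"] = pd.to_datetime(out["birth_date"]).dt.year
    out["zip3"] = out["zip_code"].astype(str).str[:3]
    out = out.drop(columns=["birth_date", "zip_code"])

    return out
```

As the FTC’s three-part test quoted below makes clear, technical steps like these are only one piece of the picture; public commitments not to reidentify and downstream contractual restrictions matter too.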

Where can you look for deidentification standards? P.S.R. panelists mentioned governmental sources, such as the Health Insurance Portability and Accountability Act’s (HIPAA) deidentification standards in the medical privacy context and the FTC’s three-part test for deidentified data (pasted below from page 10 of this report) as good starting points. The FTC standard states that deidentified data is not:

“reasonably linkable” to the extent that a company: (1) takes reasonable measures to ensure that the data is de-identified; (2) publicly commits not to try to reidentify the data; and (3) contractually prohibits downstream recipients from trying to re-identify the data.

(The California Privacy Rights Act, which comes into effect in January 2023, uses a similar standard.) That said, deidentification may not last long as a privacy-enhancing tool. As one speaker noted, some data scientists predict that technological advances will allow most data sets to be reidentified within three to five years. Our takeaway: It’s best to err on the side of minimizing the data you collect, use, and share from the outset. This is a principle we’ve long preached to members of the ESRB Privacy Certified program.

* * *

Although P.S.R. explored newer technologies from biometrics to data clean rooms, much of the conference focused on core privacy practices: Have you done your risk assessments and data protection impact assessments, and implemented mitigation measures? Do you apply best practices for cybersecurity and have documentation for how and why you might deviate from those best practices and standards? Are you keeping the FIPPs in mind? These, of course, are the types of questions we think about all the time at ESRB Privacy Certified. Amidst all the changing laws and technologies, it’s reassuring to know that sticking to privacy fundamentals can boost your compliance efforts. And don’t forget, we’re here to help our members with the issues I summarized above – child and teen privacy, sensitive data and data minimization, deidentification – and more.

Photo credit: Meghan Ventura
