
Ready Player Go: Getting Your Privacy Program Metaverse-Ready in 2023

Written by Stacy Feuer, Sr. Vice President, Privacy Certified
January 26, 2023

As ESRB Privacy Certified celebrates Data Privacy Week 2023, the hype about the metaverse – a single, immersive, persistent and three-dimensional space that enables people to socialize, play, shop, and work in ways that transcend the limits of the physical world – is reaching a crescendo. It’s easy to dismiss some of it as baseless buzz. But, from a practical standpoint, the video game industry has long created metaverse-like experiences, building expansive virtual worlds where players use custom-designed digital avatars to connect, socialize, and play with one another. Some companies are experimenting with Web 3.0 features like extended reality (XR), an umbrella term for virtual (VR), augmented (AR), and mixed reality (MR). Others are considering using blockchain technology to prove “ownership” of virtual goods and property, and non-fungible tokens (NFTs) to enable purchases. So, even though there’s no consensus on exactly what the metaverse is, or how, when, and whether it will transform our lives, now’s the time – from our privacy compliance perspective – for companies and consumers alike to get ready.

How much time we have is up for debate. A recent study by Pew Research Center and Elon University’s Imagining the Internet Center surveyed over 600 experts about the trajectory and impact of the metaverse. More than half of the experts (54%) predicted that the metaverse will be part of daily life for a half billion people globally by 2040. Slightly less than half (46%) disagreed. They predicted that even though more people will embrace XR tools and experiences by then, the fully immersive world that people imagine as “the metaverse” will take more than 20 years to come to fruition.

Whichever group is right, it’s certain that privacy (and data security) issues will loom large. The array of XR technologies that enable the metaverse will create vast new troves of digital data and real-world privacy concerns. Companies will be able to collect enormous amounts of biometric data, such as users’ eye movements and hand positions, and bodily data, such as blood pressure and respiration rates. Through the emerging area of inferential biometrics, XR technologies, combined with AI, could be used to make inferences about users’ emotions, mental and physical health, and personality traits, among other things.

Even if users create virtual identities without providing real-world personal information or reflecting personal characteristics such as gender, race, or age in their avatars, they will likely share information with other digital avatars. As with today’s smartphones and IoT devices, this may allow others to piece together users’ real-world identities or obtain sensitive information from them. Companies may also sweep up data from bystanders who happen to be within range of an XR user’s sensors. And if companies incorporate blockchain technologies and NFTs into their metaverse plans, those technologies will present privacy and security challenges of their own.

It’s critical for companies in the video game industry and beyond to start addressing these challenges now. In a KPMG survey of 1,000 U.S. adults conducted last fall, 80% of respondents said that privacy is their top concern in the metaverse, while 79% said that the security of their personal information is their biggest worry. So, while we’re waiting for the real metaverse to take shape, you can make sure your company is using today’s XR technologies in privacy-protective ways and getting ready for the next iteration(s) of the metaverse.


Here are three ways to start:

  1. Incorporate global laws and “best practices” into your current privacy compliance strategy: The metaverse is likely to be even more global than today’s internet. This makes it unlikely that any one data privacy regime will clearly apply to metaverse platforms or to companies that operate on them. There aren’t any metaverse-specific privacy rules or standards, and there likely won’t be for a long time. Companies should therefore prepare by analyzing and adopting responsible and transparent “best practices” from existing data protection and privacy frameworks. Instead of complying only with the current law in any one jurisdiction or trying to avoid other laws through a “choice of law” clause in your terms of use, you should look to a variety of laws, international standards, and global best practices to provide a high level of privacy protection for your metaverse users. (You can’t just choose your favorite law, though: You’ll need to continue complying with privacy laws that do exist in your jurisdiction.)

    Informational privacy principles, contained in global guidelines such as the OECD (Organisation for Economic Co-operation and Development) Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, can form the core of your metaverse data protection strategy. These concepts, such as data minimization, purpose limitation, use limitation, data security, individual participation, and accountability, can guide your implementation in a specific metaverse application or environment. Indeed, most modern data protection laws, such as the European Union’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the bipartisan American Data Privacy and Protection Act (ADPPA) proposed in the last Congress, incorporate these concepts. Of course, there may be situations where laws conflict or these principles are simply inadequate to deal with new technological developments. By considering how you can use laws, standards, and best practices in your privacy program now, though, you’ll have a head start on compliance.
  2. Do the “Ds” – Deploy Privacy by Design and Default Principles and Data Protection Impact Assessments: The concept of the metaverse as an open, multi-layered universe means that existing methods of privacy protection that rely on privacy disclosures and user consent may be difficult to deploy. But privacy by design – the idea that companies should design products, processes, and policies to proactively manage and avoid privacy risks – seems tailor-made for this new medium. (And it’s long been a core part of our program’s compliance approach.) Privacy by default, a closely related concept, may be even more salient. It requires companies to protect their users’ personal data automatically, embedding privacy into technology and systems from the beginning, not after the fact; a brief illustrative sketch of what privacy-protective defaults could look like appears after this list. (The UK Information Commissioner’s Office has helpful guidance and a checklist that address these principles in the context of the GDPR.)

    An important piece of privacy by design and default is assessment. Many modern data protection laws, such as the GDPR and the California Age-Appropriate Design Code Act, require companies to conduct data protection impact assessments (DPIAs) to identify, analyze, and minimize privacy risks. Even if you’re not required to conduct DPIAs now, you should start doing them (if you aren’t already) for technologies like XR and features like NFTs that may be part of your metaverse offerings. (The International Association of Privacy Professionals (IAPP) maintains an extremely useful resource page on DPIAs.)
  3. Don’t forget children and teens: As complex as data privacy in the metaverse will be for adults, the challenge of protecting the privacy of kids and teens will be even greater. Companies will need to follow a mélange of rules and laws, such as the Children’s Online Privacy Protection Act (COPPA) in the U.S. and the newer Age-Appropriate Design Codes in the UK, Ireland, and California. They will also need to follow related laws and rules on safety, advertising and marketing, and digital wellness to protect children and teens from real and perceived risks in the metaverse. As the Federal Trade Commission’s recent settlement with Epic Games over COPPA and other privacy violations involving Fortnite makes clear, poor privacy practices in virtual worlds can lead to real-life harms. One of the FTC commissioners explained how the company’s alleged practices, such as opting children into voice and text communications with players around the world, exposed children to bullying, threats, and harassment, and even to being coerced or enticed into sharing sexually explicit images and meeting offline for sexual activity.

Companies must double down on privacy by design and default for children and teens, build sophisticated privacy and parental controls, implement multi-layered age verification methods, and develop mechanisms to obtain parental consent (when required). Some companies may want to build out child- and teen-friendly metaverse spaces and experiences. Given the complexities of doing so, it’s a good thing that a Ready Player One-like universe that crosses over physical and digital realms doesn’t really exist. Yet.
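To make the privacy-by-default idea from item 2 above concrete, here is a minimal, hypothetical sketch in TypeScript of how a game client might structure account settings so that the most protective option is always the starting point. The setting names and types are illustrative assumptions, not any particular platform’s API:

    // Hypothetical privacy settings for a metaverse-style game client.
    // "Privacy by default" means every field starts in its most protective
    // state; players must affirmatively opt in to broader sharing.
    interface PrivacySettings {
      voiceChatEnabled: boolean;        // real-time voice with other players
      textChatScope: "none" | "friends" | "everyone";
      shareBiometricTelemetry: boolean; // e.g., eye-tracking or motion data from XR hardware
      personalizedAds: boolean;
      profileVisibility: "private" | "friends" | "public";
    }

    // The baseline is the most restrictive choice for each setting.
    const DEFAULT_PRIVACY_SETTINGS: PrivacySettings = {
      voiceChatEnabled: false,
      textChatScope: "none",
      shareBiometricTelemetry: false,
      personalizedAds: false,
      profileVisibility: "private",
    };

    // New accounts start from the protective baseline; only explicit,
    // user-initiated changes relax it.
    function createAccountSettings(overrides: Partial<PrivacySettings> = {}): PrivacySettings {
      return { ...DEFAULT_PRIVACY_SETTINGS, ...overrides };
    }

The design point is simply that broader data collection and sharing require an explicit, user-initiated opt-in rather than an opt-out buried in a settings menu.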

• • •

If you have more questions about kids’ privacy in the metaverse or you want to learn more about our program, please reach out to us through our contact page. Be sure to follow us on LinkedIn for more privacy-related updates.

• • •

As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.


