Safe and Secure VR: Policy Issues Impacting Kids’ Use of Immersive Tech

After Oculus Quest commercials blanketed the airwaves before the holidays, a number of folks at Common Sense Media raised concerns about Facebook’s take on virtual reality. I decided to seize on this interest to offer up some thoughts on how to improve virtual reality for kids, putting out a short paper: Safe and Secure VR: Policy Issues Impacting Kids’ Use of Immersive Tech.

To guide tech companies’ decisions as they create immersive content aimed at kids, I suggest several ways to ensure kids experience these technologies in a safe, secure, and responsible environment, including:

  1. Parental controls should be effective and account for the unique features of VR games, such as their immersive nature. For example, platforms should provide clear time-limit mechanisms to prevent overuse.
  2. VR platforms must create safer virtual environments. We need a strong set of standards for rating and moderating VR experiences so families can choose what is appropriate for their children.
  3. Companies must step up their protection of kids’ data, especially because immersive tech like VR requires the collection of so much sensitive behavioral information.

A number of colleagues and VR enthusiasts reviewed the paper, and I remain thankful to Lindsey Barrett, Mary Berk, Jon Brescia, Jeff Haynes, Girard Kelly, Joe Newman, and Jenny Radesky for their thoughtful feedback and willingness to read a draft.

// Download the full paper here

ITIF: How to Balance Privacy and Innovation in Augmented and Virtual Reality

In the wake of my white paper on the implications of virtual reality for kids, I joined a panel hosted by the Information Technology & Innovation Foundation to discuss the privacy risks posed by AR/VR, what XR companies can do to mitigate these concerns, and how existing law and regulation impact immersive technologies. I've been a longtime fan of Brian Wassom and Brittan Heller, and it was a lot of fun to talk immersive tech with them:

Privacy Nuts and Bolts: How Washington lawmakers can protect our digital privacy

For the past two years, I have been personally and professionally invested in the Washington Privacy Act. The 2021 bill, SB 5062, is the third iteration of the legislation. Tough questions remain about the scope of information that should be protected, how individuals should consent to data practices, whether companies should minimize how they use information, and, ultimately, how these protections will be policed.

I recently hosted a webinar for Washington lawmakers with an array of privacy academics, advocates, and experts. In this hour-long session, I moderate a conversation with Prof. Ryan Calo, University of Washington School of Law; Stacey Gray, Senior Counsel at the Future of Privacy Forum (FPF); Jennifer Lee, Technology & Liberty Project Manager at ACLU-WA; and Maureen Mahoney, Policy Analyst at Consumer Reports.

Voices of VR, Episode #951

Several weeks ago, I was invited to join Kent Bye’s podcast to discuss the state of U.S. privacy law. Kent’s Voices of VR Podcast is mandatory listening if one is interested in virtual reality and immersive technologies, and I’m a huge fan. Getting to geek out with Kent for a few hours was a personal treat.

We cover a lot of ground, from the EU's General Data Protection Regulation to the history of U.S. privacy law and the ongoing state and federal debates over comprehensive privacy legislation. I occasionally try to inject my own asides about the importance of biometrics laws and my concerns about invoking surveillance capitalism. The full episode is available here, and a citation-filled transcript is also available to download here.

Project Aria and Mapping Augmented Reality

On the heels of Facebook's announcement that Reality Labs would be deploying smart glasses both to assist in mapping and to create a "digital twin" of the real world, I wrote a piece for Slate on the power of maps:

Maps hold tremendous power. They not only help people navigate the world, but they also establish boundaries and shape our perceptions. Mapping technology is equally important. Global navigation systems are military assets, and Apple publicly apologized for the shaky launch of its mapping app in 2012. We have gotten used to mapping roads, but AR changes the game by encouraging us to map every square foot of space on the planet.

// Read the piece at Slate here

Some Initial Ideas on Improving Privacy in AR, VR, and XR

The time to begin developing XR privacy guidelines and controls is now. Growing numbers of consumers are worried about how data collected via VR headsets and AR apps are used, and privacy compliance has emerged as the top legal risk impacting XR companies. XR industry surveys have found that companies are more concerned with consumer privacy and data security than product liability, health and safety, or intellectual property.

In this post for IAPP’s Privacy Perspectives, I offer some initial areas that should be top of mind. As a privacy advocate and XR enthusiast, I suggest there’s a real need for AR/VR platforms and developers to (1) improve transparency and begin making XR-specific data disclosures, (2) embrace transparency reporting and technical solutions to restrain data sharing, and (3) commit to diversity and inclusion.

Discussing Location Data in Kotaku

There's a quick blurb from me in a Kotaku feature story about the implications of Pokémon Go developer Niantic's vast collection of geolocation data for its Real World Platform. It's an interesting read on location-based apps, augmented reality, and gaming as a sort of gatekeeper to what comes next:

“Any time an app collects location data, you have to hope the app developer has given some thought to the risks involved, but this is hard to tell from a privacy policy alone,” said Joseph Jerome, a privacy consultant. “A lot of companies will say that IP addresses and other technical information is not personal. Niantic is not making [these claims], which is a good thing.”

Federalist Society Podcast: California Consumer Privacy Act

In this episode of the Federalist Society's Tech Roundup, I join the Mercatus Center's Adam Thierer and TechFreedom's Ian Adams to bring a privacy advocate's perspective to the looming California Consumer Privacy Act. It's a good discussion of the relative merits of privacy laws at the state versus federal level, and I only interrupt Ian a couple of times:

The podcast features a fascinating back-and-forth on the implications of the new amendments to the CCPA. Is California setting the law of the land? How will the FTC respond? What will this mean for interstate online commerce? These and other questions are explored in the episode.

Privacy and Private Rights of Action

As Congress continues to slog through the process of crafting a comprehensive federal privacy framework, two intractable issues have emerged: federal preemption and private rights of action. These issues are intertwined because they get at the core of how privacy rights and obligations should be enforced. While preemption has received most of the attention, a carefully constructed private right of action could also play an important role in advancing privacy rights at the national level. Yet any inclusion of a private right of action has instead been treated as an all-or-nothing proposition.

Privacy advocates recommend individuals be permitted to privately enforce federal privacy protections through a statutory private right of action without any showing of harm. Meanwhile, industry-friendly proposals treat private rights of action as a non-starter. Both sides are locked into absolutist positions, and lawmakers’ efforts to craft an impactful privacy law have been hurt in the process.

In this post for IAPP’s Privacy Perspectives, I get into the nuance of private enforcement and offer up several ideas for how lawmakers could incorporate private rights of action into a national privacy law.

A Few Resources Regarding VPNs…

After filing a complaint with the FTC about the data handling practices of a VPN provider in 2017, I led a project to work with VPN providers to improve trust in their products. Together, we created a questionnaire to advance transparency while encouraging providers to undergo independent audits of their practices in mid-2018. Ever since, I've been fascinated by how this industry has evolved, whether it's launching prominent television commercials or engaging in substantive security audits.

I’ve also managed to find myself with a front row seat to journalists’ efforts to identify what makes a good VPN:

  1. The New York Times’ Wirecutter recently updated its VPN guide, reviewing and building upon some of the work of the VPN accountability project I started at CDT with several VPN providers. It’s an incredibly detailed overview of what to consider when purchasing a VPN.
  2. TIME, on the other hand, assessed whether using a VPN was even worthwhile, highlighting that VPNs ultimately “shift your risk.”
  3. For one of his final pieces with Slate, Will Oremus explored the dynamics of the VPN industry and got me to give him a pretty stark description of the ecosystem: “It is fascinating the amount of sniping that goes on” between VPN companies . . . “They are very quick to pull out knives and shiv each other.”

// Finally, my original primer on VPNs is available here, and a summary of considerations I wrote up to mark Data Privacy Day is here

Joining PBS NewsHour to Discuss Facial Recognition Apps

An invitation to join PBS NewsHour to discuss the privacy implications of facial recognition apps was enough for me to push aside a lingering illness. I spoke with Amna Nawaz about public concerns over FaceApp, a photo filter app that allows users to transform their features by adding or removing wrinkles. I address questions about how images of people's faces could be used, discuss the (non-)implications of the app being based in Russia, and encourage viewers to reach out to allegedly concerned lawmakers to push for federal privacy legislation.

New York State Public Hearing on Online Privacy

I joined fellow privacy experts Prof. Ari Waldman, IPR's Lindsey Barrett, and CCPA author Mary Stone Ross to testify in support of the New York Privacy Act at a hearing on Tuesday, June 4th. Our panel, which starts about 1:15 into the video, forcefully responded to a line-up of industry representatives. My prepared testimony is available here, and a complete transcript of the event is available here.
