Privacy in Virtual and Augmented Reality
Devin Forbush, Christopher Guay, & Maggie Shields
A. Introduction
In this paper, we set out the basics of augmented and virtual reality. First, we discuss how the technology works and how data is collected. Second, we analyze the privacy issues that arise, commenting specifically on the gravity of privacy concerns not contemplated by current laws given the velocity and volume of data collected by this technology. Third, the final section of this paper analyzes how to mitigate these privacy concerns and what regulation of this technology would ideally look like. Over the past decade, the advent of augmented reality (AR), mixed reality (MR), and virtual reality (VR) has ushered in a new era of human-computer interactivity. Although the functions of each reality platform vary, the umbrella term “XR” will be used throughout to address concerns covering all of these emerging technologies.[1] The gaming community may have initially popularized XR, but broad industries and economic sectors now seek to deploy the new technologies in a variety of contexts: education, healthcare, the workplace, and even fitness.[2]
B. Augmented and Virtual Reality Background
Augmented Reality is “an interface that layers digital content on a user’s visual plane.”[3] It works by overlaying images and objects within the user’s current environment.[4] This digital layering superimposes images and objects onto the user’s real-world surroundings.[5] Software developers create AR smartphone applications or products to be worn by users, such as headsets or AR glasses.[6] In contrast, Virtual Reality seeks to immerse users within an “interactive virtual environment.”[7] VR transports the user into a completely new digital environment, or reality, in which users can interact with and move within the virtual world, behaving as they would in the real world.[8] To enter VR, a user wears a head-mounted device (HMD) which displays a “three-dimensional computer-generated environment.”[9] Within that environment, the HMD uses a variety of sensors, cameras, and controls to track a user’s input and respond with sights, sounds, and haptic feedback.[10] Mixed reality offers a combination of virtual reality and augmented reality.[11] In function, mixed reality superimposes virtual objects onto the real world that behave as if they were real objects.[12]
C. Data Collection Practices
To create immersive experiences, XR software necessarily collects massive amounts of aggregate information and sensitive data in order to understand and respond to its user.[13] Examples of data collected by XR software include:
- Immediate physical environment and physical movements of the user;
- Audio information from devices such as microphones that can capture a user’s voice;
- Sound from the device’s surroundings;
- Biometrically-derived information (such as eye tracking);
- Approximate location and precise geolocation information; and
- Information about the apps used and purchased on XR platforms.[14]
XR technology relies on biometric information about its user, as well as the user’s precise location in space relative to its surroundings and other XR devices, and records the external world, potentially capturing other people and places.[15] A VR headset tracks biometrically derived data including “posture, eye gaze, gestures, facial expression, and interpersonal distance” up to ninety times per second.[16] As a user moves about their physical environment, XR “use[s] sensors to track their motion, position them in space, and refresh their perspective.”[17] The sensors on the hardware can calibrate and measure a user’s head and eye movements and respond accordingly with a change in the user’s point of view.[18] For example, the HMDs “measure head position and rotation via spatial mapping techniques.”[19] XR tracks a user’s movement along six axes of motion: forward and backward, up and down, and side to side, plus the tilt and rotation of the user’s head, using X, Y, and Z axes to determine the user’s position and direction.[20] Using both optical and non-optical sensors, the XR technology positions the body in space and helps prevent the user from tripping or bumping into objects while in an immersive experience.[21] A twenty-minute virtual reality experience can generate up to “two million data points and unique recordings of body language.”[22]
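The tracking and sampling figures above can be made concrete with a short sketch. The following is a minimal, hypothetical illustration, not drawn from any actual XR SDK: the class name, field layout, and per-sample signal count are assumptions chosen only to show the structure of a six-axis pose sample and the back-of-the-envelope data volume implied by the cited figures.

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    """One hypothetical head-tracking sample: three axes of
    translation and three of head orientation."""
    x: float      # position along X, in meters
    y: float      # position along Y, in meters
    z: float      # position along Z, in meters
    pitch: float  # head tilt, in degrees
    yaw: float    # head rotation, in degrees
    roll: float   # head roll, in degrees

# Rough arithmetic from the figures cited above: tracking at up to
# ninety samples per second over a twenty-minute session.
samples = 90 * 60 * 20  # 108,000 pose samples per session
print(samples)

# With many tracked signals per sample (eyes, hands, facial
# expression), the "two million data points" figure is plausible.
# The signal count here is a hypothetical placeholder.
signals_per_sample = 20
print(samples * signals_per_sample)
```

At ninety samples per second, a twenty-minute session alone yields over a hundred thousand pose records before any eye, hand, or facial signals are counted, which is consistent with the scale described in the literature.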
Among the various data points collected, eye tracking is especially sensitive. Eye tracking in XR can use “an infrared camera to monitor the user’s eyes and the direction of the user’s gaze inside the HMD.”[23] The resulting sensor data “translates into more precise computer reactions in VR content and a more realistic point of view.”[24] Eye tracking allows the XR application to direct and control content to users by collecting information about the user’s real identity combined with their reactions to particular stimuli.[25]
XR platforms can generate data about an individual’s likes, dislikes, preferences, and interests by tracking pupil dilation through eye tracking.[26] Pupil tracking can also generate sensitive health data.[27] Research has shown that irregular eye motion patterns can be an indication of autism.[28] Other conditions such as schizophrenia, Parkinson’s disease, ADHD, and concussions can also be diagnosed through eye tracking.[29] The collection of sensitive health data becomes an issue if this data can be freely transferred to and purchased by third parties, such as insurers, advertisers, and government agencies, absent any limitations.[30] XR technology allows companies to provide personalization that is nearly unnoticeable to its users.[31] XR developers identify and respond to a user’s subconscious needs in real time, relying on the data they collect from the user’s subconscious reactions to certain stimuli.[32] Developers then use that data to market to the user within the XR platform without the user “being aware of the influence from the personalization.”[33]
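To make the pupil-dilation inference concrete, consider a deliberately simplified, hypothetical sketch. The function name, baseline value, and readings below are all invented for illustration; the point is only to show how trivially a platform could rank in-world stimuli by a user’s involuntary pupil response.

```python
def engagement_score(pupil_mm: float, baseline_mm: float = 3.5) -> float:
    """Hypothetical illustration: score a stimulus by how far the
    user's pupil diameter deviates above a resting baseline."""
    return max(0.0, (pupil_mm - baseline_mm) / baseline_mm)

# Simulated pupil diameters (in mm) recorded while three in-world
# "stimuli" (hypothetical product placements) were on screen.
readings = {"stimulus_a": 3.6, "stimulus_b": 4.4, "stimulus_c": 3.3}

# Rank the stimuli by inferred engagement.
scores = {name: engagement_score(mm) for name, mm in readings.items()}
most_engaging = max(scores, key=scores.get)
print(most_engaging)
```

Even this toy ranking is derived entirely from an involuntary physiological signal the user never consciously chose to share, which is precisely why the literature treats pupil data as so sensitive.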
D. XR Privacy Concerns
Not only is highly sensitive data being collected by XR technology, but privacy concerns also stem from the enormous amounts of highly sensitive data generated in a matter of seconds.[34] Given the “volume, variety, veracity, and velocity” of data collection through XR technology, virtual reality generates large and complex data sets.[35] Those data sets can then be analyzed in the aggregate to provide valuable insights for businesses and organizations at the expense of individual privacy.[36] Given the highly sensitive nature of the data collected, the amount of data generated, and the value of that data, many of the current laws regulating XR are inadequate to protect against the grave consumer privacy risks involved.[37] As a practical matter, it is extremely difficult to limit the data collected by XR technology because creating immersive, indistinguishable environments requires companies to constantly collect and process sensitive data.[38] Accordingly, one of the most pressing issues for big data collection in the XR space “is its analytic power to psychologically persuade people.”[39]
In XR, data is collected and aggregated in real-time.[40] In contrast with more traditional forms of targeted advertising, where inferences based on consumer profiles tend to be delayed and can result in inaccuracies in psychological traits over time, XR gathers real-time information with the latest psychological indicators based on the user.[41] These inferences, along with real-time responses to stimuli in the environment “can reveal both a person’s physical, mental, and emotional state, and the stimuli that caused him or her to enter that state.”[42] This allows XR developers to create profiles that help them understand users’ buying preferences over time, how stimuli will affect those preferences, and provide actionable insights in real-time to XR providers.[43]
XR has revolutionized targeted advertising. Before XR, social media targeting techniques were limited to broad interest areas or general characteristics of a user, or to specific products or items they might be interested in.[44] In contrast, the rise of XR and aggregate data collection has allowed companies to create “sophisticated and subtle targeted advertising” in real time.[45] Notably, physiological and biometric data can be analyzed at the same time it is collected to detect users’ moods and states, respond with advertisements that capitalize on those vulnerabilities, and even manufacture moods and stimuli within the XR environment to persuade users.[46] This makes XR particularly suited to subtly persuading consumers without them even realizing that they have been influenced.[47]
E. Proposed Solutions
The United States has two potential approaches to solving the privacy issues involved with XR. The first is a broad-based approach, similar to laws like the European Union’s GDPR and California’s CCPA/CPRA, which give individuals broad (depending upon the exact law) privacy rights across many sectors of the economy.[48] The second relies upon sectoral regulations specific to XR.[49] This could take the form of self-regulation or regulations targeted at the privacy dangers associated with XR, such as those that have been implemented in areas like healthcare.[50] Neither method alone would effectively curtail the burgeoning issues associated with XR. Instead, the United States should promulgate both broad-based standards and specific sectoral standards to ensure that companies protect the privacy rights of individuals. This analysis will not go into the feasibility of any legislative changes.[51] Nor will it encompass every requirement for a proper regulatory scheme, due to the constrained nature of this assignment.
Of course, new policy initiatives are only necessary if existing laws and policies do not cover the issues noted above. The GDPR is one regulation that might apply to XR, but it is unclear whether the data that companies like Meta collect while individuals use XR products constitutes biometric data under the GDPR.[52] Even if the data that XR companies collect falls under the umbrella of biometric data (thus requiring informed consent), the standard for this type of consent might be fulfilled by simply checking a box when setting up the product.[53] This indicates that simply obtaining informed consent does not do enough to ameliorate privacy issues.[54] In addition, the GDPR requires that processors minimize the amount of data used and process it only for specified purposes.[55] It is quite possible that Meta (formerly Oculus) does not fulfill this requirement.[56] Even if Meta and other XR companies currently comply with the GDPR, relying on written consent comes with significant issues.[57]
It is rare that a user ever fully reads or even understands the terms to which they are agreeing.[58] The current consent-based regulatory frameworks available to solve this problem, such as the GDPR, have proved inadequate.[59] Although the GDPR requires explicit consent prior to the processing of sensitive or biometric data, there remains a large gap between a user’s understanding of the consent and the larger repercussions of actually consenting.[60] Further, consent in XR is markedly different from consent in the real world: because users can switch environments quickly, the data collected and the consent required in one environment may be vastly different in another.[61] Given the complexity of data collection, XR platforms may not even provide terms a user can understand well enough to control their data, or the “appropriate context for which an individual can make an informed decision.”[62]
Given these broad gaps in understanding, along with how quickly the relevant data and consent can change from one environment to another, the onus should be placed on XR organizations and platforms to create true informed consent, which establishes “a level of genuine understanding by the users about how their data is collected, applied, stored, and brokered.”[63] Yet consent remains problematic in the XR space, where data collection is necessary for the proper functioning of the technology.[64] In sum, due to such complexities, many privacy advocates strongly urge opt-out processes, in contrast to the opt-in settings currently in place.[65]
If the GDPR fails to provide sufficient privacy protections for XR users, other solutions become necessary. Both sectoral and broad-based regulatory schemes are needed to capture the unique issues associated with XR.[66] Starting with self-regulation, the primary stakeholders in the XR industry have signed on to the XR Association (XRA) as an advocacy and trade group.[67] The XRA’s suggestions include that XR manufacturers “work toward privacy protective defaults”; that XR companies should provide information on why data “is needed to create the immersive experience”; that “giving users control over XR sensors will be important”; that privacy policies should be laid out to allow “user[s] to decide what level of data collection they are comfortable with”; that XR companies should give bystanders of those using XR a way “to learn XR technologies are operating around them”; and that XR companies need to empower parents and guardians of children to allow them to “use…XR technology for safe, positive, and age-appropriate experiences.”[68]
The XR Association is a good starting point for a sectoral approach to the regulation of XR, but it does not do enough to ensure the privacy of users. Other potential ameliorative policies include physical switches on XR devices to cut them off from the internet, and requiring that the processing of personal data be restricted to local devices as much as possible.[69] In addition, as marketplaces for applications and other uses of XR emerge, XR companies will need to adopt policies that ensure third parties do not abuse access to private data.[70]
In combination with sectoral regulation, broad-based regulation would ensure that privacy is protected while using XR. Representative Suzan DelBene introduced the Information Transparency and Personal Data Control Act (Data Control Act), which contains significant guidance on how to implement broad-based privacy controls.[71] It requires the Federal Trade Commission (FTC) to promulgate regulations that give users more control over their data.[72] Furthermore, the bill would establish six individual privacy rights.[73] One of the main pillars of the regulation would require explicit opt-in consent to the use and collection of private data.[74] As noted previously, this might prove problematic due to the unique nature of XR and the way that data is collected, but the bill does provide an option that solves some of the issues associated with upfront consent.[75] It does this by giving individuals the right to “opt out at any time” of “any collection, transmission, storage, processing, selling, sharing, or other use of non-sensitive personal information.”[76] The Data Control Act offers some accommodations to XR companies, as it allows the use of “non-sensitive personal information” for necessary purposes like preventing fraud or serving the public interest.[77] Under the proposed law, the FTC and state attorneys general would treat violations of the bill as “unfair or deceptive acts or practices.”[78] This bill, in conjunction with the sectoral proposals, would go a long way toward giving individuals more control over, and understanding of, how and why their data is used.
F. Conclusion
XR is one of the most exciting and rapidly advancing technologies in the world today. As an emerging sector of the economy that requires enormous amounts of data, XR presents unique privacy concerns.[79] Solving these issues requires both specific sectoral solutions and broader-based solutions to ensure that individuals’ privacy rights are protected. If these solutions are implemented, then XR stands a good chance of protecting the privacy rights of individuals.
[1] XR Safety Initiative, The XRSI Definitions of Extended Reality, xrsi.org 8, https://xrsi.org/wp-content/uploads/2020/03/XRSI-Standard-XR-Definitions-XR-001v002.pdf (last visited Nov. 20, 2022).
[2] See Sophie Thompson, VR Applications: 21 Industries Already Using Virtual Reality, Virtual Speech (Mar. 1, 2022).
[3] Brittan Heller, Watching Androids Dream of Electric Sheep: Immersive Technology, Biometric Psychography, and the Law, 23 Vand. J. Ent. & Tech. L. 1, 5 (2020).
[4] Id.
[5] See XR Safety Initiative, supra note 1, at 10.
[6] Heller, supra note 3, at 5.
[7] Id. at 4-5.
[8] Yeji Kim, Note, Virtual Reality Data and Its Privacy Regulatory Challenges: A Call to Move Beyond Text-Based Informed Consent, 110 Calif. L. Rev. 225, 233 (2022); Mark A. Lemley & Eugene Volokh, Law, Virtual Reality, and Augmented Reality, 166 U. Pa. L. Rev. 1051, 1055 (2018).
[9] Mary Anne Franks, The Desert of the Unreal: Inequality in Virtual and Augmented Reality, 51 U.C. Davis L. Rev. 500, 504 (quoting Crystal Nwaneri, Note, Ready Lawyer One: Legal Issues in the Innovation of Virtual Reality, 30 Harv. J.L. & Tech. 601, 604 (2017)).
[10] Id. at 504; Heller, supra note 3, at 14-15.
[11] XR Safety Initiative, supra note 1, at 11.
[12] Id.
[13] Joseph Jerome & Jeremy Greenberg, Augmented Reality + Virtual Reality: Privacy & Autonomy Considerations in Emerging, Immersive Digital Worlds, Future of Priv. F. 2 (April 2021), https://fpf.org/wp-content/uploads/2021/04/FPF-ARVR-Report-4.16.21-Digital.pdf.; Kim, supra note 8, at 334.
[14] Jerome & Greenberg, supra note 13, at 16.
[15] Id. at 4.
[16] Id.
[17] Heller, supra note 3, at 14.
[18] Id.
[19] Id.
[20] Id. at 16.
[21] Id.
[22] Jerome & Greenberg, supra note 13, at 16.
[23] Heller, supra note 3, at 16.
[24] Id.
[25] Id. at 28; Jerome & Greenberg, supra note 13, at 16.
[26] Jerome & Greenberg, supra note 13, at 17; Heller, supra note 3, at 27.
[27] Heller, supra note 3, at 32.
[28] Id.
[29] Id.
[30] Id. at 32-33.
[31] Kim, supra note 8, at 229-30.
[32] Id.
[33] Id.
[34] See Kim, supra note 8, at 227.
[35] Id. at 231.
[36] See id.
[37] See Jerome & Greenberg, supra note 13, at 22.
[38] Id. at 24.
[39] Kim, supra note 8, at 239.
[40] Id. at 240.
[41] Id.
[42] Heller, supra note 3, at 27.
[43] See id. at 28; Kim, supra note 8, at 239.
[44] Kim, supra note 8, at 240.
[45] Id.
[46] Id. at 241.
[47] Id. at 240-41.
[48] Commission Regulation 2016/679 of May 25, 2018, General Data Protection Regulation (GDPR); Cal. Civ. Code §§ 1798.100-1798.199.100 (CCPA amended by CPRA ballot initiative).
[49] See e.g., 15 U.S.C. §§ 6501-6506 (Children’s Online Privacy Protection Act); 20 U.S.C. § 1232g; 34 C.F.R. 99 (The Family Educational Rights and Privacy Act); 5 U.S.C. § 552a (The Privacy Act of 1974).
[50] See Health Insurance Portability and Accountability Act, Pub. Law 104-191 (1996); 45 C.F.R. 160 (A), (E), 164 (2022).
[51] See Keith Krehbiel, Pivotal Politics: A Theory of U.S. Lawmaking (1st ed. 1998) (suggesting that gridlock is the natural starting point for U.S. politics and that legislative action only occurs when the policy is in accordance with the supermajority of politicians).
[52] Kim, supra note 8, at 228 (suggesting that Meta currently has fulfilled acquired consent for data processing, and potentially acquired informed consent based upon checking an agreement to have data processed when setting up the product).
[53] Id. at 243.
[54] Id. at 244.
[55] Id. at 244; GDPR, supra note 48, Art. 5.
[56] Kim, supra note 8, at 244 (suggesting that Oculus’ privacy policy did not have enough specificity when defining its use and processing of data). But see Meta, Supplemental Meta Platforms Technologies Privacy Policy, https://www.meta.com/legal/quest/privacy-policy/?utm_source=l.facebook.com&utm_medium=oculusredirect (last visited Nov. 16, 2022) (Meta’s current privacy policy regarding the use and processing of personal data appears to offer much more specificity than was noted in the Kim article, and might now fulfill the requirement).
[57] Kim, supra note 8, at 251-52 (suggesting that using written consent is fundamentally flawed as a method of ameliorating privacy concerns).
[58] Joshua Tooker, Privacy in the Era of Constant Reality Capture: Informed Consent in Extended Reality (XR), 5 (April 2021) (MBA/MSI thesis, University of Michigan) (on file with the University of Michigan Library system). https://deepblue.lib.umich.edu/bitstream/handle/2027.42/168561/20210501_Tooker%2cJoshua_Final_MTOP_Thesis.pdf?sequence=1&isAllowed=y
[59] See Kim, supra note 8, at 239.
[60] GDPR, supra note 48, Art. 9; Kim, supra note 8, at 239.
[61] Lemley & Volokh, supra note 8, at 1100 (explaining that consent in the physical world already presents significant legal issues; given that consent changes with the environment, and people may switch virtual environments more often than physical environments, legal issues around consent are highly prevalent).
[62] Tooker, supra note 58, at 26.
[63] Heller, supra note 3, at 43.
[64] Id.
[65] Id.
[66] Ellysse Dick, Balancing User Privacy and Innovation in Augmented and Virtual Reality, Information Technology & Innovation Foundation, (Mar. 4, 2021).
[67] XR Association, About Page, https://xra.org/about/ (last visited Nov. 11, 2022) (founders of XRA include Microsoft, HTC Vive, Sony, Meta, and Google among others).
[68] XR Association, Privacy, https://xra.org/public-policy/privacy/# (last visited Nov. 11, 2022) (these standards are not binding upon the members, but rather a set of basic beliefs set out by the XR Association).
[69] Jerome & Greenberg, supra note 13, at 24.
[70] Id. at 27 (suggesting that it is necessary for XR companies to police their online marketplaces to ensure no bad actors, and that these companies need to engage in “responsive feedback mechanisms,” to ensure that these policies continue to be followed).
[71] XR Association, Public Policy, https://xra.org/public-policy/rep-delbene-introduces-data-privacy-bill/ (last visited Nov. 11, 2022) (not directly endorsing the bill, but giving some support to the idea of creating a federal privacy law); Information Transparency & Personal Data Control Act, H.R. 1816, 117th Cong. (referred to the House Energy and Commerce Subcommittee on Consumer Protection and Commerce in Mar. 2021) (no further action has been taken, but Rep. DelBene was reelected to the House).
[72] H.R. 1816 § 2.
[73] Id. at § 2(6)(A-F) (1. “[C]ontrol” how “companies collect [personal data],” and “how they use it,” 2. Easily readable and understandable “privacy and security practices,” 3. Collection of data “consistent with the context in which consumers provide the data,” 4. Ensure the security of “sensitive personal information,” 5. Right to “access and correct” personal data, and 6. “Reasonable limits on the personal data that companies collect and retain.”).
[74] Id. at § 3(a)(1)(A).
[75] Kim, supra note 8, at 251-52.
[76] H.R. 1816 § 3(a)(4)(A).
[77] Id. at § 3(b)(1).
[78] Id. at § 4(b).
[79] See Kim, supra note 8, at 227.