Written by Chris McGhee, Jason Meuse, Christa Vo, & Kelsey Kenny as a final paper for the Fall 2021 session of Information Privacy Law
A. Introduction
The Internet of Things is a term that describes the network of sensors embedded in otherwise mundane consumer products which collect data and connect to the internet.1 The capabilities of this technology range from offering simple conveniences to consumers, such as a light that turns on or off at the sound of clapping, to life-saving services like real-time blood sugar monitoring and insulin administration for diabetics or instant notice to emergency responders when a car accident occurs.2 To function properly, these devices rely on constant data input and network connectivity to trigger a mechanical function or to teach an algorithm how to tailor its functions more precisely to the needs of the individual consumer.3 Whether we are aware of it or not, the use of connected devices in our day-to-day lives is growing increasingly common; several sources predict that the world will have approximately 64 billion Internet of Things (“IoT”) devices by 2025.4 The era of digital commerce has incentivized the rapid development of invasive hardware and data analytics tools because, in the world of IoT, aggregate data is incredibly valuable,5 and consumers are “data goldmine[s].”6
The discreet infiltration of IoT technology presents myriad consumer data privacy concerns.7 From a policy perspective, perhaps the greatest concern is that the more commonplace IoT devices become, the less aware consumers will be of the true extent to which they are enmeshed in the “ecosystem of ubiquitous computing.”8 Practically speaking, IoT products present the same dilemmas as any device capable of an internet connection.9 What makes IoT peculiarly vulnerable to internet-related privacy harms is the interplay of highly sensitive data collection with the potential to directly affect the user’s physical well-being.10
In December 2020, the “Internet of Things Cybersecurity Improvement Act” was signed into law.11 The Act requires the National Institute of Standards and Technology (“NIST”) to develop and publish guidelines for the federal government for the management of information privacy in IoT devices.12 The guidance authored by NIST is not binding; rather, it serves to establish best practices.13 Consumers’ privacy interests would be better served by binding federal legislation.
IoT privacy concerns predictably vary enormously by product and therefore must be addressed individually to afford adequate consumer protection. The following sections explore the technology, unique privacy concerns, and current regulations at play in IoT devices in the healthcare, smart home, wearables, and smart car industries. Each section briefly recommends improved privacy regulation for its field.
B. Healthcare
The health industry, and medical technology in general, have come a long way since the archaic practice of trepanation in the 15th century.14
However, the close connection between emerging technology and healthcare presents new risks associated with treatment, one of which is patient privacy. Medical IoT devices, also referred to as the Internet of Bodies (“IoB”),15 are as diverse and complex as our own body’s systems and functions. These devices range from “smart” CPAP machines that monitor sleep apnea symptoms,16 to technological ecosystems colloquially termed the “artificial pancreas.”17
On the surface, the privacy risks associated with connecting our medical information to our mobile devices, the internet at large, and likely unknown third parties are concerning. However, the risk to consumer privacy is worse than it seems at first blush. Health monitoring systems are constantly recording, storing, and transmitting patient health data through IoB devices. From a technical standpoint, IoB has the potential to create a trove of exploitable vulnerabilities. With ransomware attacks ever on the rise,18 users of IoB devices may find their health information exploited for economic gain. Beyond technical privacy risks, the sensitivity of health information creates an inherently heightened interest in patient privacy that will require strong regulatory and technical protections.
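By way of illustration only, the sketch below shows one baseline technical protection of the sort such requirements might contemplate: encrypting a health reading before it is stored or transmitted. It is written in Python against the open-source “cryptography” package; the device, the reading format, and the key handling are hypothetical, not any vendor’s actual system.

```python
# Illustrative sketch (hypothetical device, not a real vendor API):
# encrypt a blood-glucose reading before storage or transmission.
import json
import time

from cryptography.fernet import Fernet  # pip install cryptography

# Demonstration only; a real IoB device would keep this key in a
# hardware secure element, never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

def package_reading(mg_per_dl: float) -> bytes:
    """Serialize and encrypt a single blood-glucose reading."""
    record = {"ts": time.time(), "glucose_mg_dl": mg_per_dl}
    return cipher.encrypt(json.dumps(record).encode("utf-8"))

ciphertext = package_reading(112.0)    # safe to store or transmit
original = cipher.decrypt(ciphertext)  # readable only by the key holder
```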
As a threshold matter, the current regulatory regime in the United States distinguishes IoB devices based on use. The law distinguishes between IoB devices used for health and lifestyle purposes and those put toward medical purposes.19 The former are governed by the Federal Trade Commission (“FTC”), which focuses on whether claims made by the manufacturer in marketing or advertising are unfair or deceptive.20 Devices for medical use are governed by the Food and Drug Administration (“FDA”). The line between lifestyle and medical treatment is blurry. The heart rate and sleep monitoring functions of your smartwatch may shift between serving a lifestyle and a medical purpose depending on the day or the wearer’s general physical condition. Due to the different regulatory focuses of the FDA and the FTC, both companies and users are subjected to different legal standards.21 This difference creates gaps in oversight. A regulatory architecture predicated on the intended use of data, rather than on the fact of collection alone, is at the heart22 of this issue.
To ameliorate this problem, the FDA should assume all regulatory authority for wearables reasonably related to medical services. This would serve two purposes: first, companies that produce IoB devices would be on notice that the FDA’s guidelines for safety and efficacy apply to them. Second, the FDA would be a central authority to which consumers could elevate concerns. This would enable more robust public feedback unencumbered by bureaucratic stovepiping.
In the long term, the United States should shift its privacy regulation away from the current patchwork structure that regulates privacy by industry and intended usage and instead create a privacy-focused regulatory structure in which associated industries are regulated based on the “fact of collection.” The European Union’s (“EU”) General Data Protection Regulation (“GDPR”) offers a strong template for a data- and privacy-focused regulatory architecture in this field.
C. Smart Homes
Virtual assistants like Amazon’s Alexa and Google Home have emerged to modernize the home. These devices assemble an infrastructure that can set alarms, monitor temperature, enable a security system, and much more. For these devices to work, they require data. The data collected are sensitive and may include voice recordings, users’ daily routines, and their TV viewing habits. In aggregate, the sheer volume of data companies collect effectively transforms “trivial data” into a “mosaic of intimate information.”23
It is estimated that IoT devices collect over 400 trillion gigabytes of data annually.24 By collecting such a vast amount of data, these devices become intimately familiar with their users. Often a device may act as a companion of sorts, to which users confide “thoughts of suicide, experiences of abuse, and other private information.”25 Given the volume and sensitivity of the personal information these devices collect, the primary concerns are manufacturers’ implementation of “reasonable security features” and the role of consumer privacy in the context of the Fourth Amendment.
Any collection, transfer, or storage of data implicates data security concerns. These concerns are amplified by the high volume of data that smart home devices collect.26 Companies often collect this information without the consumer being fully aware of what is being collected. They then store the information on a server or in the cloud, deleting it only upon user request. Companies may also sell or otherwise disclose information to third parties. Such robust use of data raises the prospect of a data breach at every stage in the data’s life cycle. Bad actors can steal information about users or even gain control of smart home devices.
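To make that life cycle concrete, the minimal Python sketch below models the three stages just described: collection, disclosure to a third party, and deletion only upon user request. The class and method names are hypothetical illustrations; the point is that each stage is another surface a breach could reach.

```python
# Minimal sketch (hypothetical names) of the smart home data life cycle:
# collection, third-party disclosure, and deletion only upon user request.
from collections import defaultdict

class SmartHomeDataStore:
    def __init__(self) -> None:
        self._records: dict[str, list[dict]] = defaultdict(list)

    def collect(self, user_id: str, event: dict) -> None:
        """Stage 1: the device records an interaction."""
        self._records[user_id].append(event)

    def share_with_partner(self, user_id: str) -> list[dict]:
        """Stage 2: disclosure to a third party widens the breach surface."""
        return list(self._records.get(user_id, []))

    def delete_on_request(self, user_id: str) -> int:
        """Stage 3: data persists indefinitely unless the user asks for deletion."""
        return len(self._records.pop(user_id, []))

store = SmartHomeDataStore()
store.collect("user-1", {"utterance": "play jazz", "hour": 22})
assert store.delete_on_request("user-1") == 1  # one record removed
```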
To combat this, the FTC and state governments have focused on device manufacturers integrating security-by-design practices during the development of their products. The FTC encourages privacy-by-design (a proactive, risk-based approach to development and persistent monitoring of security programs). This method promotes data minimization practices as well as access limitations and clear communication with consumers about data practices, especially concerning devices, such as Amazon’s Echo, that may not have an interface through which to easily present a privacy policy.27 The FTC primarily relies on its Section 5 authority to regulate unfair and deceptive trade practices, particularly concerning misleading statements in advertisements and privacy policies.28
In addition, California and Oregon have enacted statutes regulating “connected devices.”29 Connected devices are personal, family, or household devices that connect to the Internet and have an associated address that facilitates short-range wireless connection, a definition that clearly encompasses all smart home devices.30 Both statutes require manufacturers to equip devices with unique preprogrammed passwords or to require users to change the means of authentication before first access. However, the current Notice and Consent regime regarding data usage is inadequate for the smart home market. One author argues that because these devices lack interfaces through which to access privacy policies, and because those policies are often ambiguous or verbose, consumers cannot be adequately informed of how their data are collected, stored, and used.31
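For illustration, the Python sketch below captures the two alternative “reasonable security features” the statutes describe: a password unique to each manufactured unit, or a device that refuses access until the owner replaces the default credential. The function and field names are hypothetical, drawn from neither statute nor any real firmware.

```python
# Illustrative sketch of the statutes' two alternatives (hypothetical names).
import secrets

FACTORY_DEFAULT = "admin"  # the shared default password the statutes target

def provision_unique_password(serial_number: str) -> str:
    """Alternative 1: preprogram a password unique to each device."""
    return f"{serial_number}-{secrets.token_urlsafe(8)}"

def authenticate(stored_password: str, supplied_password: str) -> bool:
    """Alternative 2: block access until the default credential is replaced."""
    if stored_password == FACTORY_DEFAULT:
        raise PermissionError("Set a new password before first use.")
    return secrets.compare_digest(stored_password, supplied_password)
```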
The smart home also revives concerns about the Fourth Amendment and Third Party Doctrine. Law enforcement agencies have already begun acquiring data such as video footage from Google and voice recordings from Amazon.32 In response, there is a push for the Supreme Court to modernize the Third Party Doctrine in order to protect consumers in their homes.
The Supreme Court has determined that warrantless searches of data voluntarily disclosed to third parties are permitted, but recent cases such as Carpenter v. United States and Riley v. California have shown the Supreme Court’s willingness to protect vast amounts of personal information, even if smart home devices are not implicated in their narrow holdings.33 The obsolescence of the Third Party Doctrine is of particular concern both because the use of smart home technology is increasing (it is estimated that 20% of Americans will have such devices by 2022) and because users have “diminishing meaningful choice when it comes to voluntarily turning over this data.”34
The current regulations on manufacturers and the emphasis on privacy-by-design practices are a good first step toward effective regulation of smart homes. However, the industry needs to make privacy policies more accessible to consumers to inform them of how their data are being used and what risks are associated with the product. In addition, Fourth Amendment protection should be extended into the smart home, based on current jurisprudence protecting people’s homes from law enforcement and on the lack of user agency in disclosing data to smart devices.
D. Wearables
Among the latest iterations of smart objects, many are designed to be worn, by both people and animals, to perform functions such as tracking location and recording information about the bodies on which they are worn. These wearable smart devices share the same risks as other types of connected objects, yet their proximity to our bodies and the nature of the technology make the associated privacy risks deeply personal and deserving of heightened legal protections.35
In the context of fashion, legal scholars have noted that increased regulation of the various applications of smart clothing would benefit consumers in several ways. The suggested regulations concern primarily marketing and consumer protection, such as ensuring that all employees have adequate training to recognize data breaches.36 Some jurisdictions, such as the European Union, already require transparent disclosure of the processing of personal data. Once the fashion industry begins to use smart clothing more widely, it will make sense to extend such protections to anyone who might purchase such items, regardless of geographic location. Additionally, without “processing transparency” requirements, consumers are less able to make informed choices between similar products.
Another suggestion in the fashion industry has been to build a culture of accountability and privacy protection as these products are developed: at every level of production, privacy would be considered, and all staff would be trained in its importance and the associated risks.37
Smart devices worn by children present another layer of privacy concerns that regulation should address, especially considering additional issues such as shared custody of children, the potential breach of a child’s personal information by an abuser or predator, and the vulnerable nature of children in general, which is already widely recognized by the heightened protection for children’s personal data in both the US and the EU.39 Some smart devices, such as articles of clothing, might be worn by a child who is unaware of the potential risk to personal data, and an unsophisticated adult consumer may be similarly unaware; part of the culture of data protection discussed above should therefore include consumer education.
There are applications for smart wearables in the workplace where safety is the primary concern, both in requiring the items to be worn and in forbidding their use in certain situations. Smart glasses have been developed for settings, such as warehouses, where a worker needs free hands but also needs access to information on a screen. Some countries have already signaled that more regulation will be needed to protect workers once wearable smart devices make their way into the workplace; Germany, for example, has enacted an ordinance that restricts the use of smart glasses to short periods of time and to situations where their use is necessary.40 In the US, it seems likely that OSHA will step in to regulate in a similar manner, potentially recommending certain smart devices that increase safety while also limiting the use of connected devices.41 The US Department of Defense has already banned some types of wearable smart devices from being worn by deployed personnel.42
E. Smart Cars
Buyers of new vehicles today will find their cars equipped with network connectivity and data collection technology.43 Modern cars are built with a variety of sensors and software programs that contribute to an ecosystem of data flowing between “carmakers, vendors and others to support individual’s safety, logistics, infotainment, and security needs.”44 This information exchange equips drivers with a variety of consumer comforts, such as the ability to play music, make phone calls, engage with smartphone apps on the dashboard screen, contact roadside assistance, use voice commands to control settings, or diagnose mechanical issues.45
Beyond convenience, connected cars also provide the potential for significant safety benefits through autonomous or semi-autonomous driving functions.46 Like all IoT devices, autonomous vehicles (“AVs”) require a constant flow of enormous amounts of data to operate effectively.47 Autonomous vehicles can generate as much as 40 terabytes of data an hour from embedded sensors.48 The constant and comprehensive nature of data collection in smart cars is unprecedented from the perspective of consumer privacy.49 Consumer privacy in the context of developing technology in the United States is at a critical juncture: the risks are numerous, and consumers would benefit from timely federal regulation.
One concern is that, currently, the government may obtain a search warrant to access records of a smart car’s precise location from any point in time since purchase.50 The mere availability and ease of access to such precise information puts consumers at risk of invasive government investigations in the future. Another concern is that the complexity of the connected car data ecosystem makes it easier for nefarious actors to access the data and “manipulate [the] system[] in order to gain control” of the vehicle’s functioning.51 If a vehicle has autonomous or semi-autonomous functions, hackers could potentially override the operating system and control the vehicle remotely.52 Usurpation of control is one risk among many. Hackers may access everything from GPS location to voiceprint to music preferences; this undermines the consumer’s informational autonomy and can be used to commit fraud or identity theft.53
There is no comprehensive federal law regulating the use or collection of data in smart cars. The National Highway Traffic Safety Administration (“NHTSA”), the agency entrusted with regulating safety in transportation, addresses consumer privacy in connected cars on its website. It explains that the smart car industry’s privacy regulation flows from the Federal Trade Commission’s Section 5 authority and the NHTSA’s own voluntary guidance on best practices.54 The NHTSA’s approach centers on providing data privacy protections in order to build consumer trust in the industry and thereby encourage engagement with networked vehicle technology.55
In truth, incentivizing consumer involvement in the smart car industry is a moot goal; in the future, consumers will have no choice but to buy networked vehicles. The best policy moving forward, then, is a regulation imposing affirmative duties of good privacy hygiene on the companies that will act as “data stewards.”56 Companies involved in smart car data collection should be transparent, seek affirmative consent before sharing sensitive data57 with third parties, and limit sharing of data with the government and law enforcement.58 Though imperfect, a federal law imposing affirmative “data stewardship” obligations on companies that handle consumers’ smart car data would help steer the industry in a direction more favorable to consumer privacy.
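As a purely illustrative sketch of the consent duty proposed above, the Python snippet below gates third-party sharing of the sensitive categories listed in footnote 57 behind affirmative opt-in. The category labels and function are hypothetical, drawn from no statute or industry API.

```python
# Hypothetical consent gate for a "data steward" (illustrative only).
SENSITIVE_CATEGORIES = {
    "precise_geolocation",     # categories tracking footnote 57
    "biometric_identifiers",
    "driver_behavior",
}

def may_share(category: str, opt_ins: set[str]) -> bool:
    """Permit third-party sharing of sensitive data only with affirmative consent."""
    if category in SENSITIVE_CATEGORIES:
        return category in opt_ins  # affirmative, category-specific opt-in
    return True                     # non-sensitive data: notice suffices

assert not may_share("precise_geolocation", set())
assert may_share("precise_geolocation", {"precise_geolocation"})
```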
F. Conclusion
Overall, although smart devices and the Internet of Things are beginning to be more heavily regulated in the U.S., further regulation at the federal level will be required to ensure that people’s personal data are adequately protected. In addition, fostering an overall culture that values data privacy will help ensure that consumer products are developed with privacy at the forefront.
1 FED. TRADE COMM’N, Careful Connections: Building Security in the Internet of Things (Jan. 2015), https://www.bulkorder.ftc.gov/system/files/publications/pdf0199-carefulconnections-buildingsecurityinternetofthings.pdf.
2 Id.
3 Id.
4 INSIDER INTELLIGENCE, Security & Privacy Issues That Come With the Internet of Things, INSIDER (Jan. 6, 2020), https://www.businessinsider.com/iot-security-privacy.
5 Joris Toonders, Data is the New Oil of the Digital Economy, WIRED (2014), https://www.wired.com/insights/2014/07/data-new-oil-digital-economy/.
6 Matt Burgess, Europe Makes the Case to Ban Biometric Surveillance, WIRED (July 9, 2021), https://www.wired.com/story/europe-ban-biometric-surveillance/.
7 See, e.g., FED. TRADE COMM’N, supra note 1.
8 Id.
9 Id. This includes privacy harms such as fraud, identity theft, cyber harassment, or the generalized loss of integrity stemming from inability to control the information about one’s self.
10 FED. TRADE COMM’N, Internet of Things: Privacy & Security in a Connected World (Staff Report 2015).
11 15 U.S.C.A. § 278g-3b (West 2020).
12 Id.
13 NAT’L INST. OF STANDARDS & TECH. (“NIST”), NIST Releases Draft Guidance on Internet of Things Device Cybersecurity (Dec. 15, 2020), https://www.nist.gov/news-events/news/2020/12/nist-releases-draft-guidance-internet-things-device-cybersecurity.
14 “Trepanation” was the practice of boring into a patient’s skull to release demons. Alina Bradford, What is Trepanation?, LIVE SCIENCE, https://www.livescience.com/62591-trepanation-explained.html (last visited Nov. 17, 2021).
15 See generally Andrea Matwyshyn, The Internet of Bodies, 61 WM. & MARY L. REV. 81 (2019).
16 “Smart” CPAP Machines for Better Sleep Apnea Treatment, THE SLEEP BLOG, https://www.sleepdr.com/the-sleep-blog/smart-cpap-machines/ (last visited Nov. 15, 2021).
17 Connor Hale, Insulet’s upcoming artificial pancreas system shows gains at all ages in Type 1 diabetes, FIERCE BIOTECH (Mar. 22, 2021), fiercebiotech.com/medtech/insulet-s-upcoming-artificial-pancreas-system-shows-gains-at-all-ages-type-1-diabetes (last visited Nov. 14, 2021).
18 Episode 42: Cybersecurity with Christa Vo, 20 YEARS DONE PODCAST (May 18, 2021) (downloaded using Spotify).
19 Matwyshyn, supra note 15.
20 Id.
21 The FDA is predominantly focused on the safety and efficacy of a medical device or treatment; whereas the FTC is more focused on the veracity of claims made by device manufacturers about the health benefits of a device.
22 Pun intended.
23 Gabriel Bronshteyn, Searching the Smart Home, 72 STAN. L. REV. 455, 465 (2020).
24 Richard Kam, Internet of Things makes big data even bigger (and riskier), IAPP (Apr. 25, 2016), https://iapp.org/news/a/internet-of-things-makes-big-data-even-bigger-and-riskier/.
25 Kathryn McMahon, Tell the Smart House to Mind its Own Business!: Maintaining Privacy and Security in the Era of Smart Devices, 72 STAN. L. REV. 455, 465 (2020).
26 86 FORDHAM L. REV. 2511, 2520 (2018).
27 FED. TRADE COMM’N, CAREFUL CONNECTIONS: KEEPING THE INTERNET OF THINGS SECURE (2020) https://www.ftc.gov/system/files/documents/plainlanguage/913a_careful_connections.pdf.
28 FED. TRADE COMM’N, Canadian Maker of Smart Locks Settles FTC Allegations That it Deceived Consumers about its Security Practices (2020), https://www.ftc.gov/news-events/press-releases/2020/04/canadian-maker-smart-locks-settles-ftc-allegations-it-deceived.
29 CAL. CIV. CODE §§ 1798.91.04-.05; OR. REV. STAT. ANN. § 646A.813.
30 OR. REV. STAT. ANN. § 646A.813(1)(a).
31 86 FORDHAM L. REV. 2511, 2529-33 (2018).
32 Sylvia Sui, State v. Bates: Amazon Argues that the First Amendment Protects Its Alexa Voice Service (Mar. 25, 2017), https://jolt.law.harvard.edu/digest/amazon-first-amendment.
33 If These Walls Could Talk: The Smart Home and the Fourth Amendment Limits of the Third Party Doctrine, 130 HARV. L. REV. 1924 (2017); see Carpenter v. United States, 138 S. Ct. 2206 (2018); Riley v. California, 573 U.S. 373 (2014).
34 72 STAN. L. REV. 455, 477 (2020); 130 HARV. L. REV. 1924 (2017).
35 Sarah Pearce, Wearable Tech and Data Privacy: What You Need to Know, UKTN (Oct. 25, 2016), https://www.uktech.news/news/wearable-tech-privacy-issue-20161025.
36 G. Ziccardi, Wearable Technologies and Smart Clothes in the Fashion Business: Some Issues Concerning Cybersecurity and Data Protection, 9 LAWS 12 (2020), https://doi.org/10.3390/laws9020012.
37 Id.
38 Id.
39 See generally 15 U.S.C. §§ 6501–6506 (Children’s Online Privacy Protection Act); Regulation (EU) 2016/679 (General Data Protection Regulation).
40 Dr. Michael Bretschneider-Hagemes, Smart glasses are entering the world of work: regulation and standardization are needed, KANBRIEF 1/18 (Kommission Arbeitsschutz und Normung), https://www.kan.de/en/publications/kanbrief/inclusion-today/smart-glasses-are-entering-the-world-of-work-regulation-and-standardization-are-needed.
41 Melissa Alvarez, Wearable Technology is the Future of Workplace Safety, RAPIDSOS (Feb. 4, 2020), https://rapidsos.com/our-latest/wearable-technology-is-the-future-of-workplace-safety/.
42 Jim Garamone, New Policy Prohibits GPS Tracking in Deployed Settings, DOD NEWS (Aug. 6, 2018), https://www.defense.gov/News/News-Stories/Article/Article/1594486/new-policy-prohibits-gps-tracking-in-deployed-settings/.
43 Andrew Meola, How 5G & IoT Technologies are Driving the Connected Smart Vehicle Industry, INSIDER (Mar. 10, 2021), https://www.businessinsider.com/iot-connected-smart-cars.
44 FUTURE OF PRIVACY FORUM, Data and the Connected Car (June 29, 2017), https://fpf.org/blog/infographic-data-connected-car-version-1-0/.
45 BOOZ ALLEN HAMILTON, DRIVING AWAY WITH YOUR DATA 3 (2019), https://iapp.org/media/pdf/resource_center/Privacy_and_Connected_Vehicles.pdf [hereinafter IAPP white paper].
46 NAT’L HIGHWAY TRAFFIC SAFETY ADMIN. (“NHTSA”), Automated Vehicles for Safety, https://www.nhtsa.gov/technology-innovation/vehicle-data-privacy.
47 Clara Bloom et al., Self-Driving Cars and Data Collection: Privacy Perceptions of Networked Autonomous Vehicles, in Thirteenth Symposium on Usable Privacy and Security (July 12-14, 2017), https://www.ftc.gov/system/files/documents/public_comments/2017/11/00046-141905.pdf. Some of the data collection sensors used in AVs include GPS, wheel encoders, radar on the front and rear bumpers, and cameras near the rear-view mirror. Id.
48 Geoffrey A. Fowler, What Does Your Car Know About You? We Hacked a Chevy to Find Out, THE WASHINGTON POST (Dec. 17, 2019), https://www.washingtonpost.com/technology/2019/12/17/what-does-your-car-know-about-you-we-hacked-chevy-find-out/.
49 Id.
50 See generally Carpenter v. United States, 138 S. Ct. 2206 (2018); see also United States v. Jones, 565 U.S. 400 (2012).
51 Pierluigi Paganini, Your New Smart Car is an IoT Device that Can be Hacked, CYBERNEWS, (Dec. 28, 2020), https://cybernews.com/security/your-new-smart-car-is-an-iot-device-that-can-be-hacked/.
52 Id.
53 Id.
54 NAT’L HIGHWAY TRAFFIC SAFETY ADMIN. (“NHTSA”), Vehicle Data Privacy, https://www.nhtsa.gov/technology-innovation/vehicle-data-privacy.
55 “In the context of advanced and automated safety technologies, consumer acceptance is critical to effective and timely deployment.” Id. The NHTSA likely wants to promote market saturation of connected cars because of the potential safety benefits that are available when “human error” is taken out of the equation.
56 See id.
57 Sensitive data requiring affirmative consent before disclosure should include: (1) precise geolocation; (2) biometric identifiers; and (3) driver behavior data. Id. at 6.
58 Id.