Say “Bonjour” to New Blanket Privacy Regulations?

The FTC Considers Tightening the Leash on the Commercial Data Free-for-All and Loose Data Security Practices in an Effort to Advance Toward a Framework More Akin to the GDPR

By Hannah Grace Babinski, class of 2024

On August 11, 2022, the Federal Trade Commission (FTC) issued an Advance Notice of Proposed Rulemaking (ANPR) concerning possible rulemaking surrounding “commercial surveillance” and “lax data security practices”[1] and established a public forum date of September 8, 2022.[2] The FTC’s specific objective for issuing this ANPR is to obtain public input concerning “whether [the FTC] should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.”[3]

It is no secret that the loosely knit tapestry of sector-specific privacy regulations, federal and state-level prohibitions on unfair and deceptive practices, constitutional interpretations drawn from case law, and piecemeal data-specific legislation implemented by certain states, such as California, Connecticut, and Colorado, leaves much to be desired in the realm of privacy protection. Because the US's current privacy framework safeguards only certain people in certain situations, or certain people residing within certain states, many Americans remain vulnerable to privacy trespasses in a time marked by the constant flow of data between consumers and corporations.

As evidenced by this latest ANPR, the FTC is not oblivious to the pitfalls of the current system. In a nearly fifty-page document published alongside the ANPR, the FTC establishes a persuasive argument in favor of adopting new domestic privacy regulations that would implement stronger privacy safeguards and data abuse limitations. These new regulations, in theory, would serve as a layer of blanket protection for consumers, including enhanced protection for vulnerable groups, in a style akin to the European Union's (EU) General Data Protection Regulation (GDPR). These regulations would also limit the detrimental effects of the free flow of data. In establishing a case in favor of new regulations, the FTC addresses several deficiencies in the current legal framework as it pertains to consumer data protection, corporate collection, and enforcement of existing safeguards: (1) a lack of consumer consent, (2) an increased possibility of harm to all individuals, and specifically to vulnerable factions of the population, and (3) inconsistent and unpredictable enforceability. Taking each highlighted deficiency in turn, this paper will examine and summarize the key points of the FTC's reasoning for new privacy regulations, which, when considered together, echo the concerns of privacy professionals across the United States and underscore the necessity of enhanced privacy regulations addressing both the minimal safeguards currently employed and the commercial surveillance of the American consumer.

I. Lack of Consumer Consent

According to the FTC, one of the primary issues plaguing the US's current privacy framework is a startling lack of consumer consent with regard to the collection, use, protection, and monetization of personal data. Indeed, under the current framework, large corporations rule the market with few restraints. The concerns and desires of consumers, whose data is collected and monetized en masse, are often sacrificed in the name of business, leaving the American consumer in a particularly vulnerable position. This lopsided power dynamic means that many companies regularly collect and use personal data with limited or no user consent at all. The FTC acknowledges the terrifying truth that “as networked devices and online services become essential to navigating daily life, consumers may have little choice but to accept the terms that firms offer. . . .”[4] Even where consumers personally object to certain uses of their data, the FTC emphasizes, “consumers have become resigned to the ways in which companies collect and monetize their information, largely because consumers have little to no actual control over what happens to their information once companies collect it.”[5]

What’s more, beyond the scope of express “consent” and resignation to the existing reality of data exchanges in the US, there are scores of consumers who “do not generally understand the market for consumer data . . .” and the associated risks of data sharing and collection.[6] In fact, the FTC finds that “[m]any privacy notices that acknowledge such risks are reportedly not readable to the average consumer,”[7] leaving countless American consumers incapable of making “informed decisions about the costs and benefits of using different services” and resulting in an “information asymmetry between companies and consumer[s].”[8] In addition to consumers’ lack of understanding of the way in which their data is utilized, the FTC notes that companies often “use the information that they collect to direct consumers’ online experiences in ways that are rarely apparent—and in ways that go well beyond merely providing the products or services for which consumers believe they sign up.”[9] This includes the use of manipulative dark patterns, which can misrepresent or fail to effectively communicate how data is used, meaning that “consumer consent may be irrelevant.”[10] Furthermore, some companies go as far as to “reportedly claim to collect consumer data for one stated purpose but then also use it for other purposes,” such as selling the data to advertisers, third parties, and data brokers for profit and without consumer knowledge.[11]

With this in mind, the question becomes: if American consumers have no control over their data and their consent is immaterial if requested at all, at what point does the interaction between consumers and companies become unconscionable to the point of unfairness?

II. Increased Possibility of Harms

The power imbalance and the non-consensual nature of data exchanges between companies and consumers allowed, if not spurred, by the US’s current regulatory framework inevitably result in harms that extend well beyond mere indignance. The second key deficiency the FTC highlights in its analysis of the standing American privacy system is the set of apparent harms stemming from the free flow and marketing of personal data associated with commercial surveillance practices. Likewise, companies’ “lax data security practices” pose additional risks to consumers, including certain personal harms such as the increased risk of “cyberattacks by hackers, data thieves, and bad actors.”[12] In terms of financial harm, individual fraud and identity theft related to continued and unregulated commercial surveillance and inadequate data security practices are increasing and “cost both businesses and consumers billions of dollars.”[13] Furthermore, beyond the harms to individuals, shortcomings in companies’ data security practices have opened the door to cyberattacks on areas of critical American infrastructure.[14]

As if these harms were not sufficiently problematic, the FTC discusses the specific and particularized harms suffered by some of the nation’s most vulnerable populations: children and teenagers—both of whom are subject to an increased risk of data misuse.[15] From “sophisticated digital advertising systems [that] reportedly automate the targeting of fraudulent products and services to the most vulnerable consumers” to “stalking apps” and social media, which open the door for “cyber bullying, cyberstalking, and the distribution of child sexual abuse material,”[16] it is clear that the “collection and use of data have significant consequences for consumers’ wallets, safety, and mental health,”[17] and children and teenagers are particularly susceptible.

In addition to harming children and teenagers specifically, the unregulated commercial surveillance and poor data security practices implemented by companies under the current framework open the door for discriminatory practices that impact women and minority groups. For instance, “‘companies’ growing reliance on automated systems is creating new forms and mechanisms for discrimination based on statutorily protected categories, including in critical areas such as housing, employment, and healthcare.”[18] In this vein, it is reported that “‘some employers’ automated systems have reportedly learned to prefer men over women” and “‘a recent investigation suggested that lenders’ use of educational attainment in credit underwriting might disadvantage students who attended historically Black colleges and universities.”[19] To make matters worse, the FTC notes that “these kinds of disparate outcomes may arise even when automated systems consider only unprotected consumer traits,”[20] thus demonstrating that this unregulated practice is by nature flawed, discriminatory, and, above all, harmful.

III. Inconsistent and Unpredictable Enforceability

Though some of the issues arising from the US’s existing legal framework for privacy are actionable under select legal mechanisms—for example, a company collecting data for one purpose and utilizing it for another is a deceptive practice that could be pursued under the FTC Act’s Section 5 authority to address unfair and deceptive practices in the course of business or trade—the current framework is insufficient to effectively address all of the deficiencies highlighted by the FTC, as recent “Commission actions, news reporting, and public research suggest that harmful commercial surveillance and lax data security practices may be prevalent and increasingly unavoidable.”[21] Under the current framework, the FTC works to protect the American consumer by enforcing its Section 5 authority on a case-by-case basis,[22] enforcing a number of sector-specific laws,[23] and engaging in “broader policy work concerning data privacy and security.”[24] However, “enforcement alone without rulemaking may be insufficient to protect consumers from significant harms.”[25]

The FTC acknowledges its limited capacity to address the emerging harms sufficiently in several ways. First, “the FTC Act limits the remedies that the Commission may impose in enforcement actions on companies for violations of Section 5,”[26] and “the statute generally does not allow the Commission to seek civil penalties for first-time violations of that provision.”[27] Second, “while the Commission can enjoin conduct that violates Section 5, as a matter of law and policy enforcement, such relief may be inadequate in the context of commercial surveillance and lax data security practices.”[28] Third, “even in those instances in which the Commission can obtain monetary relief for violations of Section 5, such relief may be difficult to apply to some harmful commercial surveillance or lax data security practices that may not cause direct financial injury or, in any given individual case, do not lend themselves to broadly accepted ways of quantifying harm.”[29] Fourth, “the Commission’s limited resources today can make it challenging to investigate and act on the extensive public reporting on data security practices that may violate Section 5, especially given how digitized and networked all aspects of the economy are becoming.”[30]

The result of these limitations is inconsistent and unpredictable enforcement of privacy protections that does not meet the needs of the existing US market and harms the American consumer. The FTC notes that “[t]he fact that the Commission does not have authority to seek penalties for first-time violators may insufficiently deter future law violations” and “may put firms that are careful to follow the law, including those that implement reasonable privacy-protective measures, at a competitive disadvantage.”[31] Likewise, limited means of sufficient redress for harms and a lack of resources to investigate companies for privacy missteps means that many harms will go insufficiently addressed, if addressed at all.

By contrast, new regulations could address these issues by setting “clear legal requirements or benchmarks by which to evaluate covered companies,”[32] “incentiviz[ing] all companies to invest in compliance more consistently,”[33] and “provid[ing] clarity and predictability about [Section 5’s] application to existing and emergent commercial surveillance and data security practices that, given institutional constraints, may be hard to equal or keep up with, case-by-case.”[34]

IV. Conclusion

The FTC’s issuance of this latest ANPR, though merely an exploration of potential rulemaking possibilities, and its acknowledgement of the key deficiencies of the current privacy framework[35] mark a colossal shift in attitude concerning the privacy of the American consumer. Given the US’s history of prioritizing a business-centric free market in the name of financial development, often at the expense of the individual, the FTC’s attention to the growing need for advanced privacy regulations demonstrates a heightened sense of consumer-focused protection in the modern technological age, offering the American public a life raft amid companies’ repeated trespasses in their pursuit of personal data. Following the lead of the GDPR, can the American public soon say “bonjour” to new blanket privacy regulations that would require companies to obtain user consent, abide by data minimization and limitation principles, and implement stricter data security practices?

[1] Sheila A. Millar & Tracy P. Marshall, FTC Issues Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security, Keller & Heckman (Oct. 7, 2022, 2:30 PM), https://www.consumerprotectioncxn.com/2022/09/ftc-issues-advance-notice-of-proposed-rulemaking-on-commercial-surveillance-and-data-security/.

[2] Fed. Trade Comm’n, Trade Regulation Rule on Commercial Surveillance and Data Security 1 (Aug. 11, 2022).

[3] Id.

[4] Id. at 4-5.

[5] Id. at 5.

[6] Id.

[7] Id.

[8] Id. at 6.

[9] Id.

[10] Id.

[11] Id. at 4.

[12] Id. at 7.

[13] Id.

[14] Id.

[15] Id.

[16] Id.

[17] Id.

[18] Id. at 8.

[19] Id.

[20] Id.

[21] Id. at 10.

[22] Id. at 16.

[23] Id. at 15.

[24] Id. at 20.

[25] Id. at 22.

[26] Id.

[27] Id.

[28] Id.

[29] Id. at 23.

[30] Id.

[31] Id. at 22.

[32] Id.

[33] Id.

[34] Id. at 23.

[35] Namely, a lack of consumer consent, an increased possibility of harms, and inconsistent and unpredictable enforcement of privacy regulations.