A.I., Facial Recognition, and the New Frontier of Housing Inequality

By: Caroline Aiello


Introduction

“As soon as Ms. Pondexter-Moore steps outside her home, she knows she is being watched.”[1] Schyla Pondexter-Moore is a D.C. resident and has been living in public housing for over a decade.[2] In 2022, she sued the D.C. Housing Authority for violating her right to privacy when it forcibly installed advanced surveillance systems in her neighborhood, denied her access to information about the systems, and jailed her overnight while cameras capable of peering into her living room and bedroom were mounted.[3] Ms. Pondexter-Moore is one of over a million public housing residents in the United States.[4] To meet their security obligations at these complexes, resource-strapped landlords are adopting “landlord tech.”[5] Concern for the safety of public housing residents is legitimate and pressing. However, advanced surveillance systems using new features like artificial intelligence are over-surveilling and under-protecting the people they monitor.[6] As jurisdictions in the U.S. and internationally evaluate these systems, key questions emerge about how to balance technological innovation with fundamental principles of respect, dignity, and equity in housing access.


Public Housing Residents and Their Reasonable Expectation of Privacy

In 1937, Congress passed the U.S. Housing Act as part of FDR’s New Deal.[7] The Act responded to the dire housing conditions and economic insecurity that plagued Americans during the Great Depression.[8] It created public housing agencies (PHAs) responsible for creating and managing local public housing complexes.[9] Currently, around 1.5 million people live in public housing.[10] Although public housing originally served primarily white, median-income households, most units are now occupied by low-income households and people of color.[11]

Security is a compelling reason to monitor public housing. PHAs must maintain plans for safety and crime prevention for inspection by the Department of Housing and Urban Development.[12] To effectuate these plans, PHAs must collaborate with local police departments.[13] Yet despite being subject to disproportionate levels of surveillance, public housing residents are not afforded the same privacy rights as private residents.

Where privacy expectations and technology intersect, the Supreme Court has limited the legality of long-term, highly detailed monitoring.[14] The Court found that the public does not expect police to “secretly monitor and catalogue every single movement of an individual[],” making such a practice violative of an individual’s Fourth Amendment rights.[15] In another case, the Fifth Circuit held that continuous video surveillance of a home’s backyard, conducted from a camera mounted on a utility pole, intruded on the resident’s reasonable expectation of privacy and therefore constituted a Fourth Amendment search.[16]

Early accessible surveillance technology consisted mostly of CCTV cameras.[17] These cameras were expensive, conspicuous, and limited in their capacity to record and store video.[18] Today’s cameras are capable of far more. They identify and label individuals,[19] decide whether someone’s behavior is “normal,”[20] and alert landlords and security personnel immediately when they detect something suspicious.[21] In private housing, constitutional protections place guardrails around surveillance; in public housing, residents’ everyday behavior is scrutinized by programs and algorithms that capture and report what is going on around them.[22]


How Over-Surveillance Disproportionately Harms Subsets of the Public Housing Population

In her 2023 book, Your Face Belongs to Us, Kashmir Hill popularized a dystopian account of Clearview AI, a facial recognition tool used by law enforcement agencies and private companies to discover everything about a person with just a snapshot of their face.[23] The analysis relies on a database of billions of images and “faceprints” collected from the internet without the subjects’ knowledge or consent.[24] European data protection authorities fined Clearview AI tens of millions of euros for its nonconsensual collection and abuse of European citizens’ data.[25] The ACLU sued the company on similar grounds.[26] The Circuit Court of Cook County subsequently banned Clearview’s practice of “covert” and “surreptitious” monitoring of Illinois residents for five years.[27] Additionally, no law enforcement agency in Illinois is allowed to use Clearview because of the harms the system created.[28]

From 2012 to 2020, Rite Aid deployed facial recognition technology (FRT) in many of its locations to identify potential shoplifters.[29] In 2023, the Federal Trade Commission concluded an enforcement action against the pharmacy for misuse of the technology. Now Rite Aid, too, is banned from using its FRT systems for five years.[30] The FTC cited Rite Aid’s “reckless use of facial surveillance systems” that caused “humiliation . . . other harms, and . . . put consumers’ sensitive information at risk.”[31]

Surveillance technologies like Clearview AI’s and Rite Aid’s are cheaper and more accessible than ever, and they have made their way into the landlord technology industry. “Landlord tech” refers to “technical products and platforms that have facilitated the merging of technology and real estate industries in novel ways.”[32] These technologies, including facial recognition and behavioral tracking, exist to monitor residents and to track and categorize their everyday behavior.

Artificial intelligence systems rely on enormous sets of data to teach them to make decisions. When a decision maker is trained on incomplete, incorrect, or misrepresentative data, it learns to make bad decisions. In the Rite Aid case, for example, the company’s systems were trained on low-quality, biased data, causing thousands of false positives.[33] Most of the time, these false positives were generated in predominantly Black and Asian communities, disproportionately harming those communities.[34]
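To make that mechanism concrete, the sketch below trains a toy face-match classifier on data that over-represents one group and under-represents another, then measures each group’s false-positive rate. Everything here is hypothetical and illustrative: the data, model, and numbers are invented for this example and do not reproduce Rite Aid’s actual system.

```python
# A minimal, hypothetical sketch (not Rite Aid's actual system): train a toy
# face-match classifier on data that over-represents group A and
# under-represents group B, then compare false-positive rates per group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, noise):
    """Simulate match scores: label 1 = true match, 0 = non-match.
    Higher `noise` stands in for lower-quality images for that group."""
    y = rng.integers(0, 2, n)
    x = y + rng.normal(0, noise, n)   # noisier features -> weaker signal
    return x.reshape(-1, 1), y

# Training data: 5,000 samples from group A, only 500 from group B.
Xa, ya = make_group(5000, noise=0.4)
Xb, yb = make_group(500, noise=0.9)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# A false positive flags an innocent non-match as a match.
for name, noise in [("A", 0.4), ("B", 0.9)]:
    X, y = make_group(10_000, noise)
    fpr = np.mean(model.predict(X)[y == 0] == 1)
    print(f"group {name} false-positive rate: {fpr:.1%}")
```

On this synthetic data, the under-represented, noisier group shows a markedly higher false-positive rate even though a single “neutral” model scores everyone, which mirrors the FTC’s finding that Rite Aid’s errors clustered in particular communities.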

The FTC is not the only entity to recognize the risks these systems create. Colorado lawmakers created a task force to investigate and understand the dangers of FRT and AI.[35] San Francisco banned police use of FRT, with several other cities following suit.[36] In the European Union, FRT is considered “high-risk,” requiring operators to test rigorously for bias and accuracy.[37]

In many more cities, however, public housing residents are left unprotected from constant, biased, and automated surveillance. The Washington, D.C. complex where Ms. Pondexter-Moore lives operates eighty surveillance cameras.[38] In New York, the state with the highest number of public housing residents, there is approximately one camera installed per nineteen residents, a higher camera-to-person ratio than the Louvre, Wrigley Field, or Los Angeles International Airport.[39] In a particularly concerning case, a housing complex in Rolette, North Dakota, has one camera installed for every resident, just shy of the number of cameras per capita used on Rikers Island.[40]

The proliferation of surveillance technology in public housing represents a concerning shift in how we monitor and control vulnerable populations. While security concerns in public housing are legitimate, the current implementation of “landlord tech” creates a two-tiered system of privacy rights that disproportionately affects low-income residents and people of color. The cases of Clearview AI and Rite Aid demonstrate how facial recognition and A.I. systems, when improperly deployed, can create more harm than protection. These harms are amplified in public housing contexts, where residents have limited ability to opt out of or challenge such systems. We must critically examine whether these tools truly serve their stated security purposes or merely extend systems of control over already marginalized communities. The experiences of residents like Ms. Pondexter-Moore highlight the urgent need for balanced approaches that respect both security needs and fundamental privacy rights.


References

[1] Complaint at 1, Pondexter-Moore v. D.C. Hous. Auth., No. 1:22-cv-03706 (D.D.C. Dec. 12, 2022).

[2] Id. at 2.

[3] Id. at 1.

[4] U.S. Dep’t of Hous. & Urb. Dev., Public Housing (PH) Data Dashboard, https://www.hud.gov/program_offices/public_indian_housing/programs/ph/PH_Dashboard (last visited Feb. 14, 2025).

[5] Alicia Frazier, 20 Stats Why Tenant Experience Tech is Top Priority for CRE Now, Building Engines (July 7, 2022), https://www.buildingengines.com/blog/commercial-real-estate-tenant-experience-stats/.

[6] Sarah Miller, Reconceptualizing Public Housing: Not as a Policed Site of Control, but as a System of Support, 28 Geo. J. on Poverty L. & Pol’y 95, 110 (2020).

[7] FDR Library, FDR & Housing Legislation, Nat’l Archives, https://www.fdrlibrary.org/housing (last visited Feb. 12, 2025).

[8] Id.

[9] Nat’l Low Income Hous. Coal., A Brief Historical Overview of Affordable Rental Housing 1 (2015), https://nlihc.org/sites/default/files/Sec1.03_Historical-Overview_2015.pdf.

[10] HUD, supra note 4.

[11] Jennifer Schwartz, HUD Publishes Data on 2021 Housing Credit Tenant Characteristics, NCSHA (Aug. 10, 2023), https://www.ncsha.org/blog/hud-publishes-data-on-2021-housing-credit-tenant-characteristics/; Terry Gross, A ‘Forgotten History’ of How the U.S. Government Segregated America, NPR (May 3, 2017), https://www.npr.org/2017/05/03/526655831/a-forgotten-history-of-how-the-u-s-government-segregated-america.

[12] 24 C.F.R. § 903.7(m).

[13] Id.

[14] See United States v. Jones, 565 U.S. 400, 404–405 (2012).

[15] Id. at 430.

[16] United States v. Cuevas-Sanchez, 821 F.2d 248, 251 (5th Cir. 1987).

[17] Bryan Johnston, A Brief History of Surveillance Cameras, Deep Sentinel (July 18, 2022), https://www.deepsentinel.com/blogs/home-security/history-of-surveillance-cameras/.

[18] Id.

[19] AWS, What is Facial Recognition?, Amazon, https://aws.amazon.com/what-is/facial-recognition/ (last visited Feb. 14, 2025).

[20] How Viisights Detects Suspicious Activity, Viisights, https://www.viisights.com/products/wise/suspicious-activity/ (last visited Feb. 15, 2025).

[21] AI Analytics, Rhombus, www.rhombus.com/ai-analytics/ (last visited Feb. 14, 2025).

[22] Complaint at 3, Pondexter-Moore v. D.C. Hous. Auth., No. 1:22-cv-03706 (D.D.C. Dec. 12, 2022).

[23] Your Face Belongs to Us, Kirkus Reviews, https://www.kirkusreviews.com/book-reviews/kashmir-hill/your-face-belongs-to-us/ (last visited Feb. 15, 2025).

[24] Terence Liu, How We Store and Search 30 Billion Faces, Clearview AI (Apr. 18, 2023),  https://www.clearview.ai/post/how-we-store-and-search-30-billion-faces.

[25] The French SA Fines Clearview AI EUR 20 Million, Eur. Data Prot. Bd. (Sept. 3, 2024), https://www.edpb.europa.eu/news/national-news/2022/french-sa-fines-clearview-ai-eur-20-million_en; Dutch Supervisory Authority Imposes a Fine on Clearview Because of Illegal Data Collection for Facial Recognition, Eur. Data Prot. Bd. (Sept. 3, 2024), https://www.edpb.europa.eu/news/national-news/2024/dutch-supervisory-authority-imposes-fine-clearview-because-illegal-data_en.

[26] See generally Complaint, ACLU, et al. v. Clearview A.I., No. 2020CH04353 (Ill. Cir. Ct. 2020).

[27] Consent Order, ACLU, et al. v. Clearview A.I., No. 2020CH04353 (Ill. Cir. Ct. 2020).

[28] Id. at 3.

[29] Fed. Trade Comm’n, Rite Aid Banned from Using AI Facial Recognition After FTC Says Retailer Deployed Technology without Reasonable Safeguards (Dec. 19, 2023), https://www.ftc.gov/news-events/news/press-releases/2023/12/rite-aid-banned-using-ai-facial-recognition-after-ftc-says-retailer-deployed-technology-without.

[30] Id.

[31] Id.

[32] Erin McElroy, et al., COVID-19 Crisis Capitalism Comes to Real Estate, Boston Rev. (May 7, 2020), https://www.bostonreview.net/articles/erin-mcelroy-meredith-whittaker-genevieve-fried-covid-19-and-tech/.

[33] Fed. Trade Comm’n, supra note 29.

[34] Id.

[35] Colo. Rev. Stat. § 2-3-1707(3)(j).

[36] Shannon Flynn, 13 Cities Where Police are Banned from Using Facial Recognition Tech, Innovation & Tech Today, https://innotechtoday.com/13-cities-where-police-are-banned-from-using-facial-recognition-tech/ (last visited Feb. 15, 2025).

[37] Eur. Parliament, Regulating facial recognition in the EU 25 (Sept. 2021), https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/698021/EPRS_IDA(2021)698021_EN.pdf.

[38] Complaint, supra note 1, at 2.

[39] Douglas MacMillan, Eyes on the poor: Cameras, facial recognition, watch over public housing, Wash. Post (May 16, 2023), https://www.washingtonpost.com/business/2023/05/16/surveillance-cameras-public-housing/.

[40] Id.
