AI Tracking in Small Town Maine?: Real Life Optimization and Our Expectation of Privacy

Viv Daniel

 

I. Introduction

Increasingly, the intangible world of the internet has been likened to physical space – the concept of the “digital town square,” the term “online space,” and the short-lived promise of the metaverse all come to mind – but recent developments raise the question: Are our physical spaces starting to resemble digital life?

This year, Old Town, Maine became the latest Bangor-area community to sign up for Placer.ai’s services through the Greater Bangor Recreation Economy for Rural Communities group, which is part of Eastern Maine Development Corporation.[1] The AI service collects location data from the smartphones of people moving in and out of these communities, alongside information about where these phones were immediately before and after moving through the monitored area.[2] The AI also collects personal data about the smartphone’s owner, including income level and other demographic information.[3]

In 2025, many Americans might expect that their movements from site to site online are being tracked, and their data collected along the way. Even in their physical lives, most Americans put up with a certain degree of tracking and data collection in the form of surveillance cameras, cell-site location information (CSLI), and the like.[4] Still, many people would likely be surprised to find that their local government (or that of their vacation destination) had contracted with a private company to track their movements and income. So, why would a city or town sign up for such a tracking program?

When interviewed by Government Technology, Old Town director of economic and community development, E.J. Roach, cited the following advantages of using Placer.ai’s services: “[C]reat[ing] more personalized marketing,” obtaining for the city “more information about what people are interested in downtown,” and encouraging economic growth because prospective businesses will have a better idea of how to thrive in Old Town and feel more assured in the decision to open there.[5] The article also notes that businesses already open in Old Town will have access to Placer.ai’s data and will be able to use it to fine-tune their marketing.[6]

“More personalized marketing” has become a staple of the online world. Internet-era data collection systems involving cookies, and sometimes even microphone access that turns household devices into listening posts, have become commonplace.[7] However, it is worth asking whether most people would expect that their physical environment has learned to respond directly to their shopping and leisure habits. While responsive targeted advertising based on aggregated data profiles of internet users has made it much more efficient to be connected to your ideal wedding guest dress, it has also – alongside other forms of user targeting – opened up numerous Pandora’s boxes involving the ethics of online tracking, data retention and sharing, user manipulation, and political advertising, just to name a few.[8]

Certainly, a physical environment more sensitive to the behavior of those interacting with it and thus more capable of producing personalized marketing is not directly equivalent to the online regime of targeted advertising. For instance, no matter how personalized the city of Old Town hopes the future advertising of its businesses to be, an American main street is necessarily more responsive to the aggregate whims of the population than it could be to the specific desires of a single shopper or resident. A local boutique, even armed with troves of data about who shopped there, for how long, and where else those customers shopped, would struggle to match the efficiency of an Instagram ad pushing the exact pair of tongs a user googled twelve hours ago. Still, it is worth wondering what kinds of Pandora’s boxes will be opened if cities and towns continue to treat their commercial districts more and more like the internet. For instance, what would be the effect of such a commerce-based tracking system on the expectation of privacy in our society?

II. Location tracking and targeted advertising in the real world

Placer.ai is an Israeli AI company founded to apply artificial intelligence to the aggregation of physical location data.[9] In the years leading up to the Eastern Maine Development Corporation’s agreement with the company, Placer.ai began entering into similar agreements, both directly and indirectly, with cities and towns around the country, Philadelphia and Denver being two of the largest.[10] A quick Google search for “Placer.ai” and whatever city one might want to know about turns up the company’s own blog, which gives an idea of how this technology is being used in practice. For example, information gleaned by Placer.ai about Philadelphia includes data about the local collegiate population’s patronage of Wawa convenience stores.[11] On the company’s website, one can find articles aggregating data to discuss weekly differentials in 2025 US Open attendance, traffic patterns at Costco openings, which parts of cities receive the most foot traffic, and year-to-year trends in grocery store visits, among many other topics.[12]

Placer.ai’s website indicates the extent to which this AI tracking technology is already being implemented. Perhaps the fact that this technology can exist at such a scale without many people even being aware of it suggests that it is unobtrusive and does not merit concern. However, an incident from before the company even existed illustrates the model’s potential dangers. In 2012, a teenage girl’s parents were alerted to her pregnancy, before she herself had told them, because of purchases she had made at Target.[13] Target had a practice of building customer profiles from card information and purchase histories – something it took particular advantage of for expectant parents – and mailing those customers highly targeted coupons.[14] It is worth noting that this unfortunate incident, while prompting some rethinking of Target’s coupon-mailing strategies, grew out of a single company’s proprietary data.[15] Now imagine what could be inferred from all of the location and demographic data accessible to businesses and local governments through Placer.ai.

III. Placer.ai and the “reasonable expectation” problem

In his dissent in the seminal 2018 Fourth Amendment case Carpenter v. United States, Justice Kennedy references a pre-AI-rush practice similar to Placer.ai’s. Discussing the CSLI data at issue in the case, he writes, “[a]mong other things, providers aggregate the records and sell them to third parties along with other information gleaned from cell phone usage. This data can be used, for example, to help a department store determine which of various prospective store locations is likely to get more foot traffic from middle-aged women who live in affluent zip codes.”[16] Under the majority’s holding, providers are well within their rights to continue this practice – and the government may still obtain the data from them, but only with a valid warrant.[17]

The decision to require a warrant for the government to access retrospective aggregate CSLI data was based on a person’s reasonable expectation of privacy in their “near perfect” location history, and on the majority’s concern about allowing the government easy access to all that can be inferred from it.[18] Additionally, the majority points to the fact that CSLI data is produced automatically and that one cannot reasonably opt out of carrying a cellphone in modern society.[19] Still, the “reasonable expectation of privacy” test originating in Katz v. United States is a malleable one, requiring both subjective and objective elements.[20] In 2018, a majority of justices found that a person retains a reasonable expectation of privacy in their movement history, despite the prevalence of CSLI data.[21] Seven years later, however, even more technologies and services are weighing down that reasonable expectation of privacy – Placer.ai included.

Cellphone users subject to Placer.ai’s tracking are assured that they “can opt out of sharing their data by limiting location access from their phone and restricting apps from using their location.”[22] Is this a reasonable opt-out, given that surely not everyone in Philadelphia even knows the company exists? There is also a notable difference between the scenario in Carpenter, where the government must access third-party data through a warrant, and the increased governmental access to data potentially afforded by companies like Placer.ai contracting directly with local governments.[23]

Finally, there is no doubt that the expectations of privacy people have online differ from those they have in the real world. For years, memes have circulated in which people reference the personally assigned FBI agents they imagine to be watching them through their phone or laptop cameras, yet there is no equivalent joke outside the internet.[24] When reminders of online life do encroach on physical reality, they tend also to encroach on people’s sense of privacy. Take, for example, the growing backlash to the practice of filming strangers in public and uploading the footage to social media.[25] This backlash usually rests on the grounds that such filming makes real life less rich and spontaneous because everyone is too afraid of being singled out for online ridicule without their knowledge – essentially an argument about chilled expression.[26] This concern is not dissimilar from legal scholarship and advocacy regarding real-world surveillance.[27]

IV. Conclusion

Placer.ai and services like it – those which bring the functions of internet surveillance closer to physical reality – run the risk of bringing our expectations of privacy in daily life more in line with our weakened sense of privacy online. In 2018, the Court validated Carpenter’s genuine and reasonable expectation of privacy in cell-site location information. But how reasonable can an expectation of privacy remain when the forces of surveillance in the name of commercial optimization bring the same jadedness we feel online into the shopping district of Old Town, Maine?

 

[1] Kasey Turman, Old Town, Maine, to Track Residents, Visitors With AI, Government Technology (Aug. 28, 2025), https://www.govtech.com/artificial-intelligence/old-town-maine-to-track-residents-visitors-with-ai.

[2] Id.

[3] Id.

[4] See, e.g., Carpenter v. United States, 585 U.S. 296 (2018).

[5] Turman, supra note 1.

[6] Id.

[7] See, e.g., In re Google Assistant Privacy Litigation, 546 F.Supp.3d 945 (N.D. Cal. 2021).

[8] See, e.g., Katie Harbath & Collier Fernekes, History of the Cambridge Analytica Controversy, Bipartisan Policy Center (Mar. 16, 2023), https://bipartisanpolicy.org/blog/cambridge-analytica-controversy/; Isabella Kwai, Citing New Rules, Meta Says It Will End Political Ads in E.U., The New York Times (July 25, 2025), https://www.nytimes.com/2025/07/25/business/meta-eu-political-ad-ban.html; Elise Hu, Facebook Manipulates Our Moods For Science And Commerce: A Roundup, NPR (Jun. 30, 2014), https://www.npr.org/sections/alltechconsidered/2014/06/30/326929138/facebook-manipulates-our-moods-for-science-and-commerce-a-roundup.

[9] Catherine Shu, Location data analytics startup Placer.ai raises $50M Series B, TechCrunch (Apr. 27, 2021), https://techcrunch.com/2021/04/27/location-data-analytics-startup-placer-ai-raises-50m-series-b/.

[10] Turman, supra note 1.

[11] Lila Margalit, Dining in University City, Philadelphia: A Collegiate Vibe, The Anchor (Jan. 23, 2025), https://www.placer.ai/anchor/articles/dining-in-university-city-philadelphia-a-collegiate-vibe.

[12] The Anchor, https://www.placer.ai/anchor (last visited Sept. 21, 2025).

[13] Kashmir Hill, How Target Figured Out a Teen Girl Was Pregnant Before Her Father Did, Forbes (Feb. 16, 2012), https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/.

[14] Id.

[15] Id.

[16] Carpenter, 585 U.S. at 324 (Kennedy, J., dissenting).

[17] See id.

[18] Id. at 312.

[19] Id. at 298.

[20] Katz v. United States, 389 U.S. 347, 360-61 (1967) (Harlan, J., concurring).

[21] See supra note 4.

[22] Turman, supra note 1.

[23] See id.

[24] See Jasmine Erdener, Surveillant Companionship and the FBI Agent Meme, 22 Surveillance & Society 227 (2024).

[25] See, e.g., Jason Okundaye, Why It’s Time to Stop Filming Strangers in Public for Social Media Thrills, The Guardian (Jan. 25, 2023), https://www.theguardian.com/commentisfree/2023/jan/25/filming-strangers-public-social-media.

[26] Id.

[27] See Brief for Electronic Frontier Foundation et al. as Amici Curiae Supporting Petitioner at 16, Carpenter v. United States, 585 U.S. 296 (2018) (No. 16-402) (“Such extensive monitoring [via CSLI] reveals a wealth of information about a person’s expressive and associational activities protected by the First Amendment…”).