Implications of New School Surveillance Methods on Student Data Privacy, National Security, Electronic Surveillance, and the Fourth Amendment

By Amanda Peskin, University of Maryland, Francis King Carey School of Law, Class of 2024

Abstract

Since the COVID-19 pandemic, schools have escalated their use of educational technology to improve students’ in-school and at-home learning. Although educational technology offers many benefits for students, it has serious implications for students’ data privacy rights. Not only does using technology for educational practices allow schools to surveil their students, but it also exposes students to data collection by the educational technology companies. This paper discusses the legal background of surveilling and monitoring student activity, examines the implications surveillance has for technology, equity, and self-expression, and offers several policy-based improvements to better protect students’ data privacy.

Introduction

Prior to the COVID-19 pandemic, public schools were in the process of expanding their use of technology in educational practices, and almost half of all schools in the United States had a computer for each student.[1] Since the pandemic, this use of educational technology in the classroom has escalated.[2] Now, schools are increasingly providing laptops, tablets, or other mobile electronic devices to their students for in-school and at-home use.[3]

However, even though educational technology is positively perceived by educators and administrators, “it has serious implications on student data privacy. Not only does using technology in educational practices permit schools to surveil their students, often without the students’ knowledge, but doing so also creates many opportunities for educational technology (“EdTech”) companies to collect student data.”[4] Ultimately, the desire to monitor students to purportedly protect school safety, combined with the fact that the existing student data privacy protection laws are insufficient, results in inappropriate reliance on surveillance technology in the classroom. Section I of this paper focuses on the legal background of surveilling and monitoring student activity through relevant case law and discusses the relevant federal privacy protection laws for students. Section II analyzes the rationales for student surveillance, the possible legal implications of using surveillance technology on students, and the social policy and student self-expression concerns such surveillance raises. Section III offers policy-based suggestions for addressing the data privacy concerns associated with surveilling students. Finally, Section IV provides a summarizing conclusion on the problematic nature of school surveillance.

I. Legal Background

a. Case Law

The Fourth Amendment plays an important role when analyzing the constitutionality of school surveillance. Specifically, the Fourth Amendment protects people against unreasonable searches and seizures.[5] In Katz v. United States, the Supreme Court established that violations of privacy can occur even without any physical intrusion.[6] Famously, Justice Harlan’s concurrence in Katz established that the Fourth Amendment protects an individual’s reasonable expectation of privacy, which is determined by assessing (1) whether an individual has a subjective expectation of privacy, and (2) whether society would agree that their expectation of privacy is reasonable.[7] Later, the Court heard New Jersey v. T.L.O., which examined how the Fourth Amendment applies in public schools.[8] Specifically, the Court considered whether the prohibition on unreasonable searches and seizures applies to searches conducted by public school officials.[9] The Court held that to determine whether a search in a public school complies with the Fourth Amendment, the search must (1) be justified at its inception and (2) be reasonable in scope.[10] Under T.L.O., a search would be permissible “when the measures adopted are reasonably related to the objectives of the search and the nature of the infraction.”[11] Interestingly, the Court in T.L.O. stated that such searches do not require probable cause because the reasonableness standard is sufficient and does not create an undue burden on school authorities seeking to maintain order.[12] The Court also stated that the reasonableness standard would not lead to unrestrained intrusions on students’ privacy.[13]

The Supreme Court’s later decision in Vernonia School District 47J v. Acton continued to lower the standard for a student’s expectation of privacy in school.[14] In that case, which involved drug testing in schools, the Court determined that students ultimately have a lowered reasonable expectation of privacy.[15] Together, the Vernonia and T.L.O. frameworks substantially diminish students’ expectation of privacy in schools. Under this framework, one could infer that if a school has what it believes to be a legitimate reason to monitor student activity, it would have the authority to conduct potentially invasive searches.

Specifically related to surveillance matters, the Court decided Clapper v. Amnesty International USA.[16] There, the Plaintiffs, people whose work “require[d] them to engage in sensitive international communications with individuals who they believe[d were] likely targets of surveillance,” alleged that communicating with their clients would inherently subject them to government surveillance, leading them to cease all communications with their clients in order to protect their clients’ privacy and the confidentiality of their communications.[17] The Supreme Court rejected the Plaintiffs’ argument, finding that the feared surveillance was not “certainly impending.”[18] Even though Clapper focused on foreign surveillance, it could have implications for school surveillance of students.[19]

In 2008, the Sixth Circuit’s holding in Brannum v. Overton County School Board limited the extent to which schools may surveil their students.[20] In this case, the Sixth Circuit had to decide whether school officials who installed video cameras throughout the school, including in the locker rooms, violated students’ reasonable expectation of privacy.[21] There, the Court held that the school was justified in setting up video surveillance around the school, but that it overreached by extending the surveillance to locker rooms, where students have a significant privacy interest.[22] The Sixth Circuit later continued to establish a precedent of protecting student privacy interests in G.C. v. Owensboro Public Schools, where it decided whether a school official violated the Fourth Amendment when the official confiscated a student’s cell phone and read the student’s text messages.[23] The Court ultimately held that using a cell phone on school grounds does not automatically grant school officials an unlimited right to search the contents of the phone for material that may or may not be related to an infraction.[24] Further, the Court reasoned that searching a cell phone allows officials to find information they otherwise would not have discovered without access to such technology, thus enhancing student privacy protections.[25]

Most recently, the Supreme Court’s 2021 decision in Mahanoy Area School District v. B.L. analyzed a school’s ability to regulate student speech in the context of the First Amendment.[26] In that case, a student sent vulgar messages and gestures over Snapchat to her friends criticizing her school and its cheerleading team.[27] Even though she sent these messages outside of school hours and off campus, the school suspended the student from the cheerleading team for one year because she violated school rules by using profanity connected to an extracurricular activity.[28] Ultimately, the Court stated that the First Amendment imposes broad limits on public schools’ ability to regulate off-campus speech delivered on social media.[29] However, the Court left the door open for educators to regulate messages considered highly disruptive to school operations.[30] Although Mahanoy was a First Amendment case, it poses interesting implications for the extent of schools’ surveillance over their students and what constitutes a “highly disruptive” message capable of being surveilled and regulated.[31]

b. Federal Laws

Few federal laws currently exist that protect child and student digital privacy. One of the few federal laws that does focus on protecting children’s privacy is the Children’s Online Privacy Protection Act (“COPPA”), which establishes requirements for website operators and online services geared towards children under the age of thirteen.[32] Specifically, it prevents data collection from individuals under the age of thirteen without parental consent and requires websites directed at children to publish their privacy policies.[33] Further, COPPA explicitly applies only to online platforms that have actual knowledge that they are collecting the personal information of a child, as defined by COPPA.[34]

Regarding student privacy, the primary student privacy law is the Family Educational Rights and Privacy Act (“FERPA”). FERPA functions to safeguard students’ records by preventing schools from sharing students’ personally identifiable information (“PII”).[35] However, in 2008, the federal rulemaking process weakened FERPA’s protections by allowing schools to “authorize the disclosure of education records without consent to contractors, consultants, volunteers, and other outside parties to whom an educational agency or institution has outsourced services or functions.”[36]

Under FERPA, schools may disclose certain student information as long as the disclosure meets either the health or safety exception or the school officials exception.[37] The health or safety exception gives schools more leeway to disclose information to law enforcement for the purpose of protecting a student’s or the community’s health or safety.[38] The school officials exception provides that any school official can be considered law enforcement for the purpose of information sharing.[39] Further, a “school official” for the purpose of FERPA is defined as any party with a “legitimate educational interest,”[40] but the Department of Education has yet to release an exact definition of what constitutes a “legitimate educational interest,” giving schools broad discretion to interpret which parties have such an interest.[41] Similarly, FERPA also allows third parties to be considered school officials if they “perform an institutional service or function,” which enhances a school’s discretion over who may have access to student information.[42]

Finally, the Children’s Internet Protection Act (“CIPA”) and the Protection of Pupil Rights Amendment (“PPRA”) establish parameters surrounding children’s internet use and privacy. First, CIPA requires that schools that receive federal discounts for broadband services “take steps to safeguard children online, including ‘monitoring online activities of minors.’”[43] CIPA further requires that public schools block access to harmful websites and monitor “online activities of minors and the operation of a technology protection measure with respect to any of its computers with internet access.”[44] However, CIPA does not give schools proper guidelines for implementing this monitoring policy.[45]

The PPRA mandates that no student shall be required to submit survey or evaluation information relating to their political ideology, any psychological problems or those of their family members, sexual behaviors or attitudes, illegal behavior, religious affiliations, or income.[46] It also states that agencies receiving federal funding shall consult with parents on the collection, disclosure, or use of a student’s personal information.[47] That said, parents do not have to be consulted on the collection, disclosure, or use of their student’s personal information regarding instructional and curricular materials.[48]

Several of the aforementioned laws contain serious loopholes. Primarily, the lack of specificity in FERPA could not only give third parties access to students’ data but could also allow schools to disclose student information and data to law enforcement officials.[49] Additionally, other federal child privacy laws that require monitoring of online activities to protect children, like CIPA, are overly broad and do not offer proper guidelines concerning limitations on monitoring, creating an opportunity for unnecessary student surveillance.[50] Also, even though the PPRA mandates that agencies consult with parents regarding the collection and disclosure of their child’s personal information, it does not require such parental consultation when the child’s personal information relates to educational materials, meaning that parents have no control over how EdTech companies handle their child’s personal data.[51] Finally, COPPA falls short by only protecting children under thirteen and by not including a private right of action, so even if an entity violated the law, many concerned parents and families would not be able to seek redress.[52] To conclude, although several federal laws were drafted with the intention of protecting student data privacy, their overly broad parameters, lack of implementation guidelines, and lack of a private right of action provide ample opportunity for schools and third parties to inappropriately use and access student personal data.

II. Analysis

a. Uses of School Surveillance

Many schools that use forms of online student surveillance justify doing so as a means of improving school safety, such as by monitoring for cyberbullying or identifying potential school shooters. Numerous state cyberbullying laws give schools almost unlimited power over student online and electronic activity, regardless of whether it occurs on- or off-campus.[53] Specifically, some laws implicitly authorize surveillance by permitting or requiring schools to discipline students for cyberbullying.[54] Conveniently, these laws tend to avoid discussing how schools may discover cyberbullying occurrences.[55] In fact, one of the most effective ways to monitor cyberbullying is to use or develop a comprehensive monitoring system for online student activity.[56] To achieve this, several schools and school districts pay external companies to monitor their students’ online and electronic activity, regardless of where the students are and when they are electronically active.[57]

Additionally, local school districts use digital surveillance technology to monitor for potential school shooters, but many states’ policies concerning this monitoring are overly broad.[58] For example, the Chapel Hill-Carrboro City Schools district in North Carolina enacted a policy stating that no student, employee, or visitor has a reasonable expectation of privacy over anything they create, store, send, delete, receive, or display when using the school system’s network, devices, email system, or school-issued technology resources.[59] Further, schools gravitate toward partnerships with third-party software companies because such partnerships give schools the opportunity to access a wide range of student-created content covering a vast period of time.[60]

One popular safety management platform, Gaggle, monitors all words typed by students and can target certain words considered troubling. The platform then notifies the school and law enforcement of any potential red flags it finds.[61] Another popular platform is Social Sentinel, a social media scanning software that scans students’ social media platforms and searches students’ words, images, and videos.[62] Proponents of schools partnering with safety management platforms argue that their use is justified by the clear need to prevent self-harm, violence, and verbal harassment.[63] Nonetheless, accurately predicting the likelihood of future violent behavior or crime is exceedingly difficult, which undermines the justification for subjecting students to such pervasive surveillance schemes.[64] Additionally, novel legal questions arise as to whether schools should notify their students that they are using surveillance technology to monitor student online activity.[65]
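For readers unfamiliar with how such keyword-based monitoring works mechanically, the short Python sketch below illustrates the general technique. It is a hypothetical illustration only, not Gaggle’s or Social Sentinel’s actual software; the watchlist terms, categories, and alert handling are invented for the example.

# Hypothetical sketch of keyword-based student-activity flagging.
# The flagged terms, categories, and alert handling are illustrative
# assumptions; they do not reflect any vendor's actual word lists or logic.
import re
from dataclasses import dataclass

WATCHLIST = {  # assumed example terms mapped to a category label
    "hurt myself": "self-harm",
    "bring a gun": "violence",
    "hate you": "harassment",
}

@dataclass
class Alert:
    student_id: str
    matched_term: str
    category: str
    excerpt: str

def scan_text(student_id: str, text: str) -> list[Alert]:
    """Return an alert for every watchlist term found in the student's text."""
    lowered = text.lower()
    return [
        Alert(student_id, term, category, text[:80])
        for term, category in WATCHLIST.items()
        if re.search(re.escape(term), lowered)
    ]

if __name__ == "__main__":
    sample = "I might bring a gun to the game tomorrow"
    for alert in scan_text("student-123", sample):
        # A deployed system would notify administrators or law enforcement;
        # this sketch simply prints the flag.
        print(f"[{alert.category}] {alert.student_id}: '{alert.matched_term}'")

Even this toy version highlights why over-flagging is a concern: a single matched phrase, stripped of context, is enough to generate an alert for administrators or law enforcement.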

Some schools also use surveillance cameras with cloud-based storage. When using such technologies, administrators can browse video-imaging of students in real time on any device connected to the cloud.[66] Some of the cloud systems even use facial recognition software, especially since many surveillance companies grant schools free facial recognition software.[67] The fact that student video-imaging is accessible on any device connected to the cloud system is problematic, as it relies on the premise that no parties accessing the cloud have nefarious intentions for the students.

Finally, schools have begun using biometric data and artificial intelligence to collect student information and promote school safety.[68] Tech companies are creating and selling to school districts new devices designed to decrease the likelihood of school shootings, like advanced cameras and body scanners that use biometrics and artificial intelligence to recognize faces, detect weapons and gunshots, and track people’s locations in schools.[69] Some of the technology sold to schools not only collects images, but also collects fingerprints and tracks children’s facial expressions.[70] Some schools also implement location tracking through an “e-hallpass” that continuously logs a student’s time and location in hallways via a radio frequency identification microchip inserted inside the student’s school identification.[71]

b. Legal Implications

The widespread use of electronic surveillance over students has serious implications for the nexus between the Fourth Amendment’s protection against unreasonable searches and seizures and students’ data privacy. As of now, little case law exists that explicitly covers a student’s rights under the Fourth Amendment regarding surveillance.[72] Many courts and legislators believe that minors do not have as many liberty rights, including privacy rights, as adults due to their lower developmental and cognitive capabilities.[73] As such, many schools argue that students have no reasonable expectation of privacy.[74] That said, child constitutional rights advocates argue that affording children fewer privacy liberties because of their developmental capabilities is counterintuitive.[75] Specifically, these advocates argue that because children have less developed decision-making abilities, their privacy rights should be more protected than adults’, especially since children do not completely understand the extent to which their data is revealed.[76]

Additionally, several of the laws intended to protect student privacy have major shortfalls. Primarily, policymakers have not prioritized protecting the student data that private companies collect.[77] For instance, FERPA does not clarify whether information obtained from school surveillance technologies is part of an educational record that would have privacy protections.[78] Some people argue that school districts can work around FERPA’s requirement that would otherwise prevent student data from being shared with third parties by classifying EdTech companies as “school officials.”[79] Furthermore, the 2008 rulemaking that weakened FERPA[80] protections allows schools to disclose student biometrics or online activity information to service providers who then have unregulated control over the student data.[81]

Additionally, under the school officials exception,[82] if video surveillance is in the purview of the school’s law enforcement unit, the surveillance recordings would not be considered a protected educational record. The surveillance would only be protected if the school’s law enforcement does not maintain it.[83] Similarly, the broadness of the “school official” and “legitimate educational interest” definitions allows schools to classify EdTech companies as officials with a “legitimate educational interest.”[84] As such, schools have extremely wide discretion in discerning which parties are given complete access to student information under the current laws.[85]

Moreover, because FERPA explicitly allows certain third parties to be designated as school officials,[86] security companies handling student biometric data could be considered school officials, giving them the ability to potentially sell extremely personal student data. To counter this possibility, the Department of Education has previously stated that a school requiring parents to accept a third-party learning platform’s conditions in order to enroll students in the platform violated FERPA by conditioning enrollment on a waiver of students’ privacy rights.[87] Nevertheless, this guidance neither prevents other schools from creating similar conditional enrollment schemes nor prevents such platforms from selling student information.[88] Ultimately, the lack of specificity in defining certain FERPA provisions makes it ambiguous how private companies may maintain or share student information, increasing the likelihood that student data will not be kept private.[89]

Next, COPPA only covers children under thirteen, leaving countless students without certain privacy protections.[90] COPPA also does not provide a private right of action and can only be enforced by the Federal Trade Commission, meaning that even if a school or third party violated COPPA, countless people would be unable to seek relief.[91] Overall, the lack of judicial recourse for COPPA violations makes the probability of widespread data breaches significantly higher.[92]

Finally, current case law does not establish concrete protections against schools’ ability to surveil and collect data on their students. Both Vernonia School District, which discussed students’ lowered reasonable expectation of privacy,[93] and T.L.O., which created boundaries for determining what is considered an unreasonable search and seizure in schools,[94] have major shortfalls related to surveillance and student privacy. Primarily, both cases only discuss searches on school grounds, and both require that students have a reasonable expectation of privacy in the object of the search.[95] Further, students’ reasonable expectation of privacy online is questionable because, by choosing to reveal information and communications online and electronically, students expose themselves to tech companies and other third parties under the third-party doctrine,[96] which states that individuals forfeit their reasonable expectation of privacy when they voluntarily provide information to third parties.[97]

This is, of course, problematic because students’ lives are so ingrained in online culture that they do not really have a choice in how much exposure they have to technology, which is especially concerning considering that schools monitor student digital activity off-campus.[98] For instance, the COVID-19 pandemic increased children’s overall exposure to online and electronic activity by creating (1) school atmospheres reliant on online educational platforms for teaching,[99] and (2) social atmospheres reliant on the digital world for a social outlet free from the risk of contracting the COVID-19 virus.[100]

That said, students may have some legal protection, as court precedent creates possibilities for pushback regarding the necessity of student surveillance. The Supreme Court’s decision in Mahanoy Area School District v. B.L. has many interesting potential legal implications for students’ reasonable expectation of privacy. Because educators may regulate messages considered highly disruptive,[101] schools could justify their vast surveillance of student electronic activity by arguing that they are trying to protect the school environment from potentially highly disruptive or dangerous situations. Although seemingly well-intentioned, such broad surveillance does not mean school officials will easily be able to find or regulate highly disruptive messages.[102]

Additionally, the Court’s reasoning in Clapper,[103] where the plaintiffs lacked standing because they could not prove that government surveillance of their communications was “certainly impending,”[104] cuts differently for students, whose online activity is all but certain to be surveilled. In states with wide-reaching cyberbullying laws that circumvent the Fourth Amendment’s protections against unreasonable searches, and in school districts that use social media and online monitoring companies, students could plausibly allege that such overreaching surveillance is “certainly impending.”[105] Therefore, students may have standing to argue that schools are using pervasive methods of surveillance that violate the Fourth Amendment’s safeguards against unreasonable searches.[106]

In summary, schools are caught in a catch-22 of sorts where they will either be blamed for school shootings if they refrain from surveilling their students[107] or criticized by students, parents, advocacy groups, and other interested parties if they deploy surveillance technology. However, regardless of this dichotomy, the ultimate lack of regulations and concrete protections over student data privacy combined with schools’ overreliance on EdTech and surveillance companies to improve the educational experience puts students at a significant risk of companies capturing their data and using it for ulterior purposes.[108]

c. Social Policy Implications

  1. Technology Concerns

In response to the Marjory Stoneman Douglas school shooting, many schools began collaborating with local law enforcement and security technology companies to implement school surveillance systems.[109] However, surveillance technology is not only used to prevent school shootings, cyberbullying, and self-harming behaviors; it is also used to prevent gang or criminal activity amongst students.[110] Security technology companies initially created social media monitoring systems for use by law enforcement, but over time, schools also started implementing biometric technology to ensure school safety.[111] For example, some schools have implemented biometric tools, like fingerprint identification and facial recognition, among others, to monitor students.[112] To promote their technologies, some companies offer schools free surveillance systems[113] that monitor students’ emails, documents, and chat messages and notify school administration if students use certain flagged words.[114] Even though schools rely on surveillance technologies for school safety, little evidence definitively proves that the technology actually makes schools safer.[115] This means that schools may share student data with law enforcement officials and security companies without proof that their surveillance schemes prevent school shootings, cyberbullying, or self-harming behaviors.[116]

In addition to biometric and surveillance technology, the use of artificial intelligence in educational technologies and practices is likewise increasing.[117] In theory, artificial intelligence relies on machine learning, the concept that systems “can learn from data, identify patterns, and make decisions with minimal human intervention.”[118] In practice, artificial intelligence in schools looks more like creating personalized learning for students based on their unique educational needs.[119] Schools can even use artificial intelligence to crossmatch biometric data of problematic individuals with people inside school buildings and notify administrators if cameras on the premises detect their presence.[120]
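To make the crossmatching idea described above more concrete, the following Python sketch compares a face embedding captured from a camera frame against a stored watchlist of embeddings using cosine similarity. It is a simplified, hypothetical example rather than any vendor’s actual pipeline; the embedding vectors, the 0.95 similarity threshold, and the alerting step are assumptions made for illustration.

# Hypothetical sketch of watchlist crossmatching with face embeddings.
# The vectors, threshold, and alerting step are illustrative assumptions.
import numpy as np

WATCHLIST_EMBEDDINGS = {  # assumed identity -> face-embedding vector
    "flagged-person-A": np.array([0.1, 0.7, 0.2, 0.6]),
    "flagged-person-B": np.array([0.9, 0.1, 0.3, 0.2]),
}
SIMILARITY_THRESHOLD = 0.95  # illustrative cutoff, not an industry standard

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(camera_embedding: np.ndarray) -> list[str]:
    """Return watchlist identities whose stored embeddings resemble the capture."""
    return [
        person_id
        for person_id, stored in WATCHLIST_EMBEDDINGS.items()
        if cosine_similarity(camera_embedding, stored) >= SIMILARITY_THRESHOLD
    ]

if __name__ == "__main__":
    capture = np.array([0.12, 0.69, 0.21, 0.58])  # embedding from one camera frame
    for match in check_against_watchlist(capture):
        # A deployed system would notify administrators; the sketch just prints.
        print(f"Possible watchlist match: {match}")

In practice, how such a system performs turns largely on where the similarity threshold is set and on how accurate the underlying model is across different demographic groups, which is precisely the concern raised in the discussion that follows.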

While artificial intelligence may foster safer school environments, its potential negative impacts cannot be ignored. Generally, the lack of understanding of how artificial intelligence actually works obscures the exact way such software is used on children.[121] In fact, studies have shown that biometric security systems can be circumvented by outside artificial intelligence manipulation.[122] However, it takes a high level of knowledge and skill to bypass biometric software, even with artificial intelligence capabilities.[123] Regardless, the fact that artificial intelligence is not foolproof illustrates that companies need to develop the technology further before such technologies are employed on students.[124] Similarly, the lack of diversity in the artificial intelligence field creates a risk of implicit bias regarding race, gender, and age.[125] For example, facial recognition software is known for being less accurate when reading black faces than white faces.[126] Also, because younger people are still developing and gradually changing, facial recognition software is less accurate for younger faces.[127] Therefore, using facial recognition technologies in schools is especially problematic because they are less likely to yield accurate results.[128]

Moreover, because facial recognition results for younger individuals are less accurate than results for older demographics, companies are more likely to store children’s biometric information in facial recognition databases in order to improve internal datasets and algorithms.[129] Overall, schools’ wide-reaching use of surveillance technologies on students, whether through artificial intelligence, biometric data collection, or other methods of monitoring student online activity, does not provide a foolproof method of ensuring school safety.

  2. Equity Concerns

Although schools justify widespread student surveillance as improving safety and educational outcomes,[130] surveillance software can be problematic because low-income students and students of color are more likely to be monitored.[131] In seventeen states, the majority of public school attendees come from low-income households.[132] Likewise, among schools predominantly serving low-income individuals, those that assign equipment and provide access to internet networks may be the primary or only point of access to certain electronic data, services, and equipment for their students, meaning that students who rely on school technologies to access the Internet are more likely to be surveilled than students from higher socio-economic backgrounds.[133] Realistically, students who rely on schools for these technologies do not have the option of foregoing access to them if they want an education that is as equitable as possible.[134]

Likewise, certain demographics already experience electronic surveillance at disproportionately high rates.[135] Schools that serve higher populations of students of color are more likely to use intense surveillance measures than schools serving primarily white students.[136] Unfortunately, the use of extreme surveillance measures on students leads to a higher chance of involvement in the criminal justice system.[137] The combination of surveillance that allows school officials to discover more code of conduct violations and the heightened scrutiny that minority students already experience due to implicit bias,[138] especially when their online activity is flagged due to cultural misunderstanding,[139] may even perpetuate the school-to-prison pipeline.[140] Further, when surveillance and artificial intelligence technologies are more likely to flag something innocuous as a threat worthy of heightened surveillance, schools subject students of color to unnecessary exposure to law enforcement.[141] Additionally, studies show that strict school security and surveillance measures elicit distrust and disunity among school community members, correlating with lower student satisfaction with their education.[142]

Currently, the artificial intelligence and surveillance technology industries are predominantly non-minority.[143] As long as this remains true, these technologies will likely continue perpetuating implicit biases triggered by presumptions of crime and violence.[144] In summary, although broad student surveillance helps school officials discover more code of conduct violations, it also heightens existing disciplinary disparities in schools and perpetuates the school-to-prison pipeline.[145]

  3. Student Self-Expression Concerns

Generally, three categories exist for determining a school’s authority to surveil its students, based on whether the school’s motivation for surveillance has (1) no nexus, (2) a limited nexus, or (3) a substantial nexus to the school or school-related activity.[146] Most states acknowledge that schools have either a limited or substantial nexus, which consequently grants schools broad authority to surveil student online and electronic activity.[147]

Supporters of school surveillance and EdTech methods argue several points in favor of using such monitoring methods. First, for example, supporters argue that using EdTech to monitor classroom behavior by codifying positive and negative conduct permits teachers to document and provide feedback on student behavior quickly and efficiently.[148] However, creating an online environment where students are aware they are being monitored results in a psychological surveillance state, making students feel as though they are being watched at all times.[149] Consequently, students may not freely express their views, especially views that could be controversial.[150] Suppressing expression of free thought ultimately restricts the academic learning environment by creating settings and situations where students are less likely to challenge one another.[151] Second, those who believe that schools have become dangerous and require technology to create safe environments also believe that surveillance promotes security, welfare, and school safety.[152] Nevertheless, normalizing pervasive surveillance so that it becomes a mundane part of life could potentially generate future case law where people have neither a reasonable expectation of privacy nor any sense of privacy that society would deem reasonable, especially over online or electronic activity.[153] Third, students may avoid sharing difficult personal experiences on discussion platforms out of fear that they may be unnecessarily disciplined or potentially referred to law enforcement.[154]

Also, certain surveillance companies, like Securly, send parents weekly reports and push notifications detailing their children’s internet searches and browsing histories.[155] Even though parental awareness of student online activity can protect children from partaking in damaging online activity, sending parental notifications can also be problematic. For instance, in certain situations, a student could face negative repercussions at home if they were searching information related to their LGBTQ+ identity. Likewise, a student could also be punished or face problems at home if they were searching for information related to sexual education or abortion access, presuming that their parents disapproved. Further, parents who do not have access to a device capable of receiving push notifications could be at a disadvantage as opposed to parents with access to such notifications, creating an equity issue between families whose schools use these systems. Ultimately, relaying children’s complete internet search and browsing histories to their parents strips these children of their autonomy and outlet for self-exploration.

Schools’ electronic surveillance of students infringes on several kinds of student privacy rights. First, students have a right to intellectual privacy, which is the freedom of their own thoughts and beliefs.[156] They also have a right to spatial privacy, which establishes protections against intrusions on their physical and social spaces to give them the space necessary to develop their own thoughts and beliefs.[157] When combined, the rights of intellectual and spatial privacy grant students the freedom to explore and develop new ideas and provide spaces where student communications are able to remain confidential.[158] Unrestrained school surveillance impacts students’ intellectual and spatial privacy rights by giving school officials excessive access to students’ personal thoughts and beliefs.[159]

In summary, there are several detrimental consequences of schools’ overreaching student surveillance. First, adults may overestimate and overreact to potential risks, which harms children’s ability to learn how to conduct risk assessments of their own.[160] Second, children are more likely to devalue their own privacy, which creates a generation of adults with a significantly impaired understanding of the importance of privacy.[161] Third, using surveillance to control children limits their autonomy and sense of self.[162] Fourth, students lose a significant variety of their privacy rights when they are subject to excessive school surveillance.[163] Lastly, awareness that a school is surveilling children’s online actions and activities could result in self-censoring, harming the development of students’ capacity to control their own thoughts and actions because they constantly act to satisfy expectations set by authority figures.[164]

III. Suggestions

a. Modernize the Family Educational Rights and Privacy Act of 1974

From a policy perspective, as long as schools continue to rely on surveillance companies to monitor student activity, existing laws must be updated to protect students’ data privacy. Primarily, Congress should modernize FERPA[165] to better balance student privacy protections so that the advanced security technology and EdTech companies employed to improve school safety do not misuse student data.[166] First, although FERPA indicates that students’ biometric records are PII, Congress should explicitly define in FERPA that a biometric record includes a student’s fingerprint, retina or iris scan, geolocation, handprint, and facial recognition scan. This would serve to protect personal information that could be collected using advanced surveillance technologies.[167] Further, Congress should expand the current definition of PII, which covers “other information that, alone or in combination, is linked or linkable to a specific student that would allow a reasonable person in the school community … to identify the student,”[168] to include examples of linkable information from indirect identifiers, like internet and browser history, IP addresses, and social media history. By offering specific examples of indirect PII, FERPA would establish more stringent protections over students that would require explicit exceptions for situations in which student information may be shared.[169]

Congress should also create a private right of action under FERPA, which would allow families who believe their student’s personal information was shared without consent to seek judicial redress.[170] Additionally, as it stands, FERPA only prevents disclosures of personally identifiable information, meaning that any data that is not PII could be collected. Therefore, Congress should specifically amend FERPA to include protections for students’ data created from online and electronic activity in connection with educational purposes, even if such data does not qualify as PII.[171] In a similar vein, “educational purposes” should be defined to include any activity conducted in relation to schooling, any data created on a school-provided device, and any data collected for surveillance purposes. Such a clarification would protect student data created through involvement with EdTech platforms, work conducted on a school device, or surveillance by school officials monitoring student activity.

Further, FERPA’s current definition of protected “education records” only includes documents kept or collected by a school.[172] Expanding this definition to include student-related records kept by third parties whose services were sought to aid the educational and school environment would mean that data collected by EdTech and surveillance companies would be considered a protected educational record. Also, the type and frequency of data shared with law enforcement[173] would likely decrease if the Department of Education issued guidance explicitly stating that the definition of “school official”[174] with a “legitimate educational interest”[175] should not include third parties, such as EdTech or surveillance companies, that could seek to profit from the sale of private student data.[176] Finally, the Department of Education could impose FERPA’s sanctions when third parties misuse student data to disincentivize such entities from doing so in the future.[177]

b. States Should Enact Data Privacy Laws

Because student data privacy laws at the federal level generally focus on parental rights and consent[178] and Congress is so polarized, state legislatures may be the best vehicle for passing protective student data privacy laws. State legislatures could also enhance their own school privacy laws to limit the collection of biometric data in public schools.[179] For example, a Minnesota law states that a student’s educational data cannot be disclosed to third parties unless either a legal guardian has given permission or a relevant exception applies.[180] Such an approach, if adopted by other state legislatures, would likely protect students from EdTech or surveillance technology companies improperly redistributing student data. Importantly, the Minnesota law states that data “created, received, maintained[,] or disseminated by a technology provider” is not the property of the provider and must either be destroyed or returned to the school.[181] Statutory language that prevents a third party from claiming ownership over student data and requires that all such data be given to schools helps to safeguard students’ sensitive information.

Likewise, California’s student privacy laws prohibit EdTech companies from commercializing student data and collecting it for non-educational purposes, and the law applies regardless of whether the company is working with a school.[182] Creating laws that place the onus of protecting student data on EdTech companies ensures that these companies will not exploit children’s data. If other states choose to enact similar laws, they could even create private rights of action so that families have a legal avenue in case EdTech companies still choose to misuse the data. Moreover, state student privacy laws should require that third parties notify schools and families in the event of a data breach;[183] give students the option to control what types of their information are stored, like social security numbers, social media information, or biometric data;[184] and require schools and companies to publish and inform families of the types of information they collect and the purposes for which the information will be used.[185] States could also enact laws requiring that schools delete students’ private information after a certain period of time.[186] Finally, states could enact specific privacy laws protecting students from the commercial use of their biometric data.[187] For example, in 2014, Florida completely banned collecting biometric data in public schools.[188] More realistically, though, states could pass laws that allow schools to collect student biometric data only with parental consent.[189]

To conclude, considering schools’ heavy reliance on third-party data collection entities for educational and surveillance purposes, the serious lack of regulation over student data collection is of great concern and could be mitigated by improving existing federal laws and enacting protective state data privacy laws.

IV. Conclusion

Many schools use surveillance technology to help teachers with classroom management and curriculum design.[190] Schools may also use surveillance technology to promote school safety by reducing cyberbullying incidents, self-harming behaviors, and school shootings.[191] Although these reasons seem to benefit students, the provisions in existing laws regulating student and child data privacy are so vague that surveillance technology and EdTech companies have seemingly unrestricted access to student data. Ultimately, schools’ overreliance on these companies, combined with the many loopholes in current privacy protection laws, creates serious legal and social policy implications for students that should not be overlooked, especially as society deepens its dependence on electronic and online technologies.

 

[1] Lucinda Gray & Laurie Lewis, Nat’l Center for Educ. Stat., Use of Educ. Tech. for Instruction in Public Schools: 2019-20 8 (2021); see, e.g., Sarah F., 36 EdTech Statistics & Facts: The State of Technology in Education, Stream (Jun. 2, 2021), https://getstream.io/blog/edtech-statistics/ (various studies have indicated that using technology to create challenge-based games in online learning significantly improves educational outcomes); see id. (“forty-five percent of schools reported having a computer for each student.”); Josh Howarth, 53+ EdTech Industry Statistics (2022), Exploding Topics (Sept. 5, 2022), https://explodingtopics.com/blog/edtech-stats.

[2] Id.

[3] Barbara Fedders, The Constant and Expanding Classroom: Surveillance in K-12 Public Schools, 97 N.C.L. Rev. 1673, 1681 (2019).

[4] Id. at 1682.

[5] U.S. Const. amend. IV (the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures).

[6] Katz v. United States, 389 U.S. 347, 353 (1967).

[7] Id. at 360.

[8] New Jersey v. T.L.O., 469 U.S. 325 (1985).

[9] Id.

[10] Id. at 341.

[11] Id.

[12] Id. at 342.

[13] Id. at 341.

[14] Vernonia Sch. Dist. 47J v. Acton, 515 U.S. 646 (1995).

[15] Emily Suski, Beyond the Schoolhouse Gates: The Unprecedented Expansion of School Surveillance Authority Under Cyberbullying Laws, 65 Case W. Res. L. Rev. 63, at 94 (2014).

[16] Clapper v. Amnesty Int’l USA, 568 U.S. 398 (2013).

[17] Suski, supra note 15, at 97; see also Clapper v. Amnesty Int’l USA, 568 U.S. 398, 401 (2013) (characterizing the plaintiff-respondents).

[18] See Suski, supra note 15, at 98; see also Clapper, 568 U.S. at 401 (requiring that the alleged injury be “certainly impending”).

[19] See infra text accompanying notes 95-98.

[20] Brannum v. Overton County Sch. Bd., 516 F.3d 489 (6th Cir. 2008).

[21] Maya Weinstein, School of Surveillance: The Students’ Rights Implications of Artificial Intelligence as K-12 Public Schools Security, 98 N.C.L. Rev. 438, at 463.

[22] Id.

[23] Id. at 464.

[24] Id.

[25] When an officer discovers information that they would not have otherwise found without using a technology, courts are more likely to hold that using the technology violated one’s reasonable expectation of privacy. Id.

[26] Mahanoy Area School District v. B.L., No. 20-255, slip op. (S. Ct. Jun. 23, 2021) https://www.supremecourt.gov/opinions/20pdf/20-255_g3bi.pdf.

[27] Id. at 1.

[28] Id. at 3.

[29] See generally id. at 7 (describing three features that distinguish a school’s efforts to regulate off-campus versus on-campus speech).

[30] Mahanoy Area School District, slip op. (2021).

[31] See infra text accompanying notes 93-94.

[32] Fed. Trade Comm’n, Children’s Online Privacy Protection Rule (“COPPA”), https://www.ftc.gov/legal-library/browse/rules/childrens-online-privacy-protection-rule-coppa.

[33] Fedders, supra note 3, at 1684.

[34] Id.

[35] Weinstein, supra note 21, at 466 (FERPA protects photographs and videos of students, as they are PII.).

[36] Id. at 468.

[37] Id. at 466.

[38] Id. at 469.

[39] Id.

[40] Id.

[41] Id. at 472.

[42] Id. at 473.

[43] New Research Probes Surveillance of Students’ Online Activities, Government Technology (Sept. 22, 2021), https://www.govtech.com/security/new-research-probes-surveillance-of-students-online-activities.

[44] Andy Froelich, The Increasingly Covert and Invasive Surveillance of Students and Its Visible Role in the School-to-Prison Pipeline, 40 Child. Legal Rts. J. 118, 123 (2020).

[45] Id.

[46] 20 U.S.C. § 1232h(b).

[47] 20 U.S.C. § 1232h(c)(1)(E).

[48] 20 U.S.C. § 1232h(c)(4)(A)(iii).

[49] See supra text accompanying notes 31-35.

[50] See supra text accompanying notes 36-38.

[51] See supra text accompanying notes 39-41.

[52] See infra text accompanying notes 82-84.

[53] Suski, supra note 15, at 64.

[54] Id. at 74.

[55] Id.

[56] Id. at 68.

[57] Id.

[58] Fedders, supra note 3, at 1686.

[59] Id.; see also Chapel Hill & Carrboro City Schools Policy, CHCCS Policy Code: 3225 Computer, Network, and Internet Usage, Social Networking, https://sites.google.com/site/educationcollaboration/guidelines/chccs.

[60] Fedders, supra note 3.

[61] Id. at 1687.

[62] Id. at 1691; see also Lindsey Barrett, Ban Facial Recognition Technologies for Children? And for Everyone Else, 26 B.U.J. Sci. & Tech. L. 223, at 238-39 (describing two Israeli facial recognition software companies school districts in Indiana and Texas employ that were created for use in prison systems and army checkpoints in the West Bank).

[63] Fedders, supra note 3, at 1687.

[64] Id. at 1689; but see Liz Bowie, Baltimore City student laptops are monitored for mentions of suicide. Sometimes, the police are called., Baltimore Sun (Oct. 12, 2021, 4:26 PM) (where using the monitoring system, GoGuardian, allowed schools to identify nine students as having severe mental health crises, two of whom had never had any mental health care).

[65] See generally Julia Angwin, Back to Surveillance School, The Markup, (Sept. 17, 2022, 8:00AM) https://themarkup.org/newsletter/hello-world/back-to-surveillance-school (explaining that twenty percent of parents do not know their child’s school uses monitoring technology).

[66] Fedders, supra note 3, at 1692.

[67] Id. at 1693; see also Barrett, supra note 62, at 238 (mentioning that as surveillance technology is becoming more affordable, it is also becoming more prevalent).

[68] Weinstein, supra note 21, at 445 (Biometric technology uses the “automated technique of measuring a physical characteristic or personal trait of an individual and comparing that characteristic or trait to a database for purposes of recognizing that individual,” which may include “fingerprints, DNA, facial recognition, iris and retina scans, vein structure, gait, and voice recognition.”).

[69] Id. at 441.

[70] Id. at 442.

[71] Id. at 447; see also Andrew Hope, Governmentality and the ‘Selling’ of School Surveillance Devices, 63 The Socio. Rev. 840, at 843 (2015) (As of 2013, 3% of schools tracked student movements by inserting radio frequency identification device microchips into school uniforms, bracelets, or identification cards.).

[72] See supra Section I, Case Law; see also Weinstein, supra note 21, at 465.

[73] Fedders, supra note 3, at 1707.

[74] Suski, supra note 15, at 68.

[75] See generally Anne C. Dailey, Children’s Constitutional Rights, 95 Minn. L. Rev. 2099, 2099 (2011) (discussing how arguing that children have a limited right to privacy impedes their ability to understand the importance of privacy, as they grow into autonomous adults).

[76] Fedders, supra note 3, at 1707.

[77] Id. at 1683.

[78] Id.

[79] Id.

[80] Weinstein, supra note 21, at 468.

[81] Id.

[82] Id.at 466.

[83] Id.

[84] Id. at 470.

[85] Id. at 472.

[86] Id. at 473.

[87] Id.

[88] Id.

[89] Id. at 468.

[90] Fedders, supra note 3, at 1684; see also supra text accompanying notes 24-26 (listing protections, like providing a reasonable means for a parent to review the personal information collected and offering them a chance to refuse the collection).

[91] Fedders, supra note 3, at 1684.

[92] Id. at 1685.

[93] Suski, supra note 15, at 94.

[94] Id. at 93 (A search in schools is reasonable based on (1) whether the action was justified at its inception, and (2) whether the search as actually conducted was reasonably related in scope to the circumstances that justified the initial interference.).

[95] Id. at 96.

[96] Id.

[97] Cong. Rsch. Serv., R43586, The Fourth Amend. Third-Party Doctrine 1 (2014) (defining the third-party doctrine); see id. at 21 (Justice Sotomayor’s concurrence in United States v. Jones states the third-party doctrine “is ill suited to the digital age, in which people reveal a great deal of information … to third parties in the course of carrying out mundane tasks.”).

[98] Suski, supra note 15, at 96.

[99] Cathy Li & Farah Lalani, The COVID-19 pandemic has changed education forever. This is how., World Economic Forum (Apr. 29, 2020), https://www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning/ (discussing the general increase in educational technology use worldwide).

[100] Saijun Zhang, et al., Teens’ Social Media Engagement during the COVID-19 Pandemic: A Time Series Examination of Posting and Emotion on Reddit, 18 Int’l J. of Env’l Rsch. and Pub. Health 1, at 1 (2021) (a study discussing teenagers’ heightened use of social media during the pandemic due to concerns over social distancing); see also Colleen McClain, How parents’ views of their kids’ screen time, social media use changed during COVID-19, Pew Rsch. Ctr. (Apr. 28, 2022) (discussing how parents noticed their children’s social media use increased during the pandemic).

[101] See supra text accompanying notes 20-23.

[102] See, e.g., Benjamin Herold, Digital Surveillance and School Safety: 5 Things Administrators Need to Know, Educ. Week (Jun. 12, 2019), https://www.edweek.org/leadership/digital-surveillance-and-school-safety-5-things-administrators-need-to-know/2019/06 (where the company, Gaggle, flagged more than nine thousand online student incidents in a Washington State school district and a majority were for minor violations like profanity).

[103] Suski, supra note 15, at 97 (holding that a federal law permitted certain parties’ communications to be monitored warrantlessly without violating the Fourth Amendment).

[104] Id. at 98.

[105] Id. at 99.

[106] Id.

[107] Weinstein, supra note 21, at 458 (mentioning that schools could be liable for negligence after a school shooting if they do not use surveillance technology on their students).

[108] Fedders, supra note 3, at 1714.

[109] Froelich, supra note 44, at 120; see also id. at 122 (After the Parkland shooting, the Florida legislature passed a law mandating the development of a centralized database for students that would combine their criminal and social-service records and their social media history, creating a significant opportunity for student surveillance.).

[110] Id.

[111] Id.

[112] Id. at 118; see, e.g., Bob Hand, Biometrics in Schools: 4 Ways Biometric Data can be used to Enhance Learning, eLearning Indus., (Feb. 25, 2018), https://elearningindustry.com/biometrics-in-schools-data-enhance-learning-4-ways (discussing using fingerprint scanners at classroom doors to make attendance more efficient, and using radio-frequency identification chips to track that students arrive at school and home safely).

[113] See generally Barrett, supra note 62, at 238 (warning how facial recognition software is growing more prevalent in schools, as it becomes more financially affordable).

[114] Froelich, supra note 44, at 122.

[115] Weinstein, supra note 21, at 443.

[116] Id.

[117] Currently, the global artificial intelligence educational market value is expected to exceed six billion dollars by 2024. Id. at 446.

[118] Id. at 445.

[119] Id. at 447.

[120] Id. at 448.

[121] Id. at 451.

[122] Id. at 452; see, e.g., Biometric security beginning to pique cybercriminals’ interest, Intel471 (Mar. 8, 2022), https://intel471.com/blog/biometric-security-threats-cybercriminal-underground (A vulnerability in a facial recognition software “was discovered that allowed an attacker to bypass biometric authentication with a spoofed image.”).

[123] Weinstein, supra note 21, at 453.

[124] Id.

[125] Id.

[126] Barrett, supra note 62, at 248 (describing how algorithms based primarily on datasets of white people likely contribute to the inefficiencies of facial recognition for people of color); see, e.g., id. at 249 (stating that black women experience the highest rates of false positives in facial recognition).

[127] Id. at 457; see also Barrett, supra note 62, at 250 (False-positives are two to five times more likely to occur for women than men).

[128] Id. at 252.

[129] Id. at 253 (explaining that companies are more likely to focus on collecting information from populations their algorithms and datasets lack, which tends to be children or people of color).

[130] Fedders, supra note 3, at 1675.

[131] Id. at 1679.

[132] Suski, supra note 15, at 82.

[133] Id.

[134] Id. at 83.

[135] Weinstein, supra note 21, at 443.

[136] Jason P. Nance, Implicit Racial Bias and Students’ Fourth Amendment Rights, 94 Ind. L.J. 47, at 52 (2019).

[137] Id.

[138] Id. at 55 (Implicit bias can be described by a two-system theory. System one is processing that is quick, automatic, associative, involuntary, and unconscious; system two is conscious processing. People generally make racial associations in the first system of implicit bias.).

[139] Weinstein, supra note 21, at 470.

[140] Nance, supra note 136, at 76.

[141] Id. (“When schools use intense surveillance methods in conjunction with extreme discipline measures, such as zero-tolerance policies, school officials automatically suspend … or refer students to law enforcement when they identify students.”); but see Bowie, supra note 64 (explaining that the only instance school police officers are used after identifying an online suicide risk is when school psychologists and social workers are unavailable, and the officer goes to the family’s home to conduct a wellness check using trauma-informed care and behavioral crisis response techniques).

[142] Weinstein, supra note 21, at 443.

[143] Nance, supra note 136, at 60.

[144] Id.

[145] Froelich, supra note 44, at 130.

[146] Suski, supra note 15, at 70.

[147] Id.

[148] Alex Jiahong Lu, et al., Data Work in Education: Enacting and Negotiating Care and Control in Teachers’ Use of Data-driven Classroom Surveillance Technology, 5 Ass’n for Comput. Mach. 452, at 461.

[149] Id. at 462.

[150] Weinstein, supra note 21, at 450.

[151] Id.

[152] Hope, supra note 71, at 847.

[153] Id. at 852.

[154] Nance, supra note 136.

[155] Herold, supra note 102.

[156] Suski, supra note 15, at 101.

[157] Id.

[158] Id.

[159] Id. at 102.

[160] Fedders, supra note 3, at 1710.

[161] Id.

[162] Id.

[163] See supra text accompanying notes 148-151.

[164] Fedders, supra note 3, at 1711.

[165] Congress last modernized FERPA in June of 2002. U.S. Dep’t of Educ., Legislative History of Major FERPA Provisions, https://studentprivacy.ed.gov/resources/legislative-history-major-ferpa-provisions#:~:text=This%20document%20explains%20some%20of,updated%20in%20June%20of%202002.

[166] Weinstein, supra note 21, at 474.

[167] Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g.

[168] Family Educational Rights and Privacy Act, 34 CFR § 99.3 (2022).

[169] Elise Young, Educational Privacy in the Online Classroom: FERPA, MOOCs, and the Big Data Conundrum, 28 Harv. J. Law & Tech. 549, at 555 (2015).

[170] Id. at 564.

[171] Id.

[172] “Education records” includes those that (1) contain information directly related to a student; and (2) are maintained by an educational agency or institution, or by a person acting for such agency or institution. Mary Margaret Penrose, In the Name of Watergate: Returning FERPA to its Original Design, 14 N.Y.U.J. Legis. & Pub. Pol’y 75, at 87 (2011).

[173] See supra text accompanying notes 31-34 (explaining the “school officials exception”).

[174] See supra text accompanying note 35 (illustrating how “school official” can have many interpretations).

[175] Supra text accompanying note 76.

[176] Explicitly specifying that EdTech and surveillance companies do not have a legitimate educational interest because they are profiting from their partnerships with schools would prevent them from disclosing student data to outside parties without permission from the school or parental consent. Joanna Tudor, Legal Implications of Using Digital Technology in Public Schools: Effects on Privacy, 44 J.L. & Educ. 287, at 294 (2015).

[177] Young, supra note 169, at 564.

[178] Student Data Privacy Regulations Across the U.S.: A Look at How Minnesota, California and Others Handle Privacy, LearnPlatform (Jun. 17, 2022), https://learnplatform.com/blog/edtech-management/student-data-privacy-regulations.

[179] Weinstein, supra note 21, at 475 (referencing how Arizona, Florida, Kansas, and Louisiana all have laws limiting biometric data collection in public schools).

[180] LearnPlatform, supra note 178.

[181] Id.

[182] Id.

[183] State Student Privacy Policy¸ Electronic Privacy Information Center, https://archive.epic.org/state-policy/student-privacy/.

[184] Id.

[185] Id.

[186] Id. (In 2018, New Hampshire passed a law requiring that educational agencies delete students’ records either upon request after graduation or by their twenty-sixth birthday.).

[187] Weinstein, supra note 21, at 475.

[188] Id. at 476.

[189] Id. (where Illinois, Louisiana, Kansas, and Arizona all passed laws allowing biometric data collection but only after obtaining consent from a parent or legal guardian).

[190] Lu, supra note 148, at 452.

[191] See supra text accompanying note 56.