The Children’s Online Privacy Protection Act (COPPA), a federal law enacted in 1998, requires that operators of commercial websites, online services, and mobile apps obtain parental consent before collecting personal information from children under the age of 13. This includes not only traditional identifying information, such as children’s names and addresses, but also “biometric” identifiers — including scans and recordings of children’s faces and voices — as well as metadata about kids’ online behavior, such as their user or browsing histories (preserved as data “cookies”).
Private companies collect this kind of personal information all the time. This is not new. Nor are the associated privacy concerns: Watchdogs have been sounding the alarm about internet privacy for decades. What feels different now, or at least since the advent of the COVID-19 pandemic, is the mass migration of schools toward remote instruction on commercial online platforms.
As schools have ratcheted up remote online instruction, have they adequately safeguarded students’ privacy? The answer is no, according to those who have taken to the courts.
Litigation over student privacy
Recent student privacy cases illuminate the distance we’ve come from cases involving, for example, a physical search of a student’s purse (see New Jersey v. T.L.O., U.S. Supreme Ct., 1985) or the divulging of a student’s grades to her peers (see Owasso Indep. Sch. Dist. v. Falvo, U.S. Supreme Ct., 2002).
In April 2020, the parent of two Illinois students, identified as H.K. and J.C., sued Google, alleging that the company illegally collected, stored, and used children’s biometric and personal information without the consent of their legal guardians, in violation of both COPPA and an even more specific Illinois law — the Biometric Information Privacy Act (BIPA).
According to the plaintiffs, Google has “infiltrated the primary and secondary school system in this country” by widely distributing Chromebook laptops to more than half of the nation’s schoolchildren. The laptops come with the G Suite for Education platform (since renamed Google Workspace for Education), which “creates, collects, stores and uses” biometric identifiers of children using the platform, as well as their names, email addresses, contact lists, passwords, geolocation, browsing history, and other online behavior.
Hector Balderas, the New Mexico attorney general, has filed a similar lawsuit against Google, warning that the consequences of the company’s tracking of children “cannot be overstated.” He avers that “children are being monitored by one of the largest data mining companies in the world, at school, at home, on mobile devices, without their knowledge and without the permission of their parents.”
As for the uses of student information, Balderas acknowledges that Google had (by 2014) stopped explicitly using children’s information for advertising purposes. (This is an odd concession given that, as recently as September 2019, the Federal Trade Commission [FTC] fined YouTube, a Google subsidiary, $170 million for “illegally sucking up kids’ data so it could target them with ads,” and the agency is still seeking to understand what web-based companies are doing with the troves of personal data they’ve extracted from online users.) But Balderas goes on to accuse Google of unlawfully using student data for other commercial purposes, including product development and improvement.
Google insists that it is acting in accordance with the law. So far, a federal judge has agreed, dismissing the New Mexico suit and stating that, per federal guidelines, COPPA allows online operators to transfer responsibility for the consent process to schools, which can either consent to the collection of student information in lieu of parents or, when required, obtain parents’ consent by acting as “intermediaries” between parents and operators. (Balderas has appealed the ruling; as of this writing, the case is pending before the Tenth Circuit Court of Appeals. The lawsuit brought by the Illinois students is also still pending.)
Google isn’t alone in being accused of unlawfully collecting biometric and other personal data. See, for example, Wilcosky v. Amazon (Ill., 2021); Vance v. Microsoft (Wash., 2021); In re: Zoom Video Communications (Cal., 2021); Thakkar v. ProctorU (Ill., 2021); Hazlitt v. Apple (Ill., 2020); Patel v. Facebook (Cal., 2018); and In re Nickelodeon (3d Cir., 2016).
As additional courts uphold FTC guidance stating that schools are, in many situations, directly responsible for providing consent or obtaining parents’ consent before subjecting children to online “data sucking” (a horrid term, yet one that perfectly connotes online predation), we can expect more schools to come under scrutiny for not doing this properly. Providing or obtaining meaningful consent in this realm is a tricky business, especially when schools don’t have a full understanding of the reach of online platforms, how school employees are using them, or how commercial operators share or use private student information.
Zooming into students’ homes
Putting aside the legality of Silicon Valley’s behavior, some people worry that online technology makes it far too easy for schools and educators themselves to invade students’ privacy.
On one level, we already treat some incursions into privacy as unavoidable and even necessary to simulate in-person interaction: No one bats an eye when communication platforms like Zoom, Skype, and FaceTime allow public and private school employees to literally peer into students’ homes and family lives.
But some online technologies enable educators and schools to go far beyond physical gazing; they’re specifically designed for surveillance. For example, classroom management software such as GoGuardian and Securly enables teachers to control students’ screens, manipulate their documents, and observe what they are doing online during class. Remote proctoring apps such as ProctorU, Proctorio, Respondus, and Examity allow students to be monitored and controlled in a variety of ways while they are taking exams at home. Security software, available through Bark, Gnosis IQ, Gaggle, and Lightspeed and installed on school-issued devices, allows artificial intelligence bots and living humans alike to monitor students’ online interactions and notify school officials of any behavior indicating a possible health or safety concern. Student activity is now tracked by off-site “security specialists” and can be shared with school resource officers or other law enforcement (see Fedders, 2019). You don’t have to be leading the cavalry to protect civil liberties or stanch the school-to-prison pipeline for this to make you queasy.
Schools’ use of these technologies has begun to garner broader legal scrutiny from advocates and school communities because of not just its intrusiveness but also its potential for abuse. In Robbins v. Lower Merion Sch. Dist. (Pa., 2010), for example, a student accused his high school of — unbeknownst to students or their parents — spying on students at home (even when they were sleeping or undressing) by activating webcams in school-issued laptops. The plaintiffs also claimed that the school was capturing and storing private images of students and their surroundings. The district eventually revealed that it possessed more than 56,000 images obtained through school-issued laptops; it ended the case via a monetary settlement with the plaintiffs.
Ed tech and the future
The rapid growth and utilization of online and remote learning tools have had effects on privacy that the public (and the law) has yet to fully understand and grapple with. Paradoxically, the well-intentioned push to preserve children’s education and safety during the pandemic may have accelerated the evisceration of their privacy.
Consider that, in a normal classroom setting, students don’t have a legal expectation of privacy or much freedom over their own behavior. During the pandemic, thousands of schools did their best to recreate this milieu remotely. Even from afar, teachers still needed to look over students’ work, corral their attention, monitor them during tests or quizzes, and watch out for signs of erratic or unhinged behavior. Their purpose was not to spy on students, but to teach them and keep them safe, just as they would at school. And there can be no doubt about the considerable benefits conferred by online technology, especially during the pandemic. But striving to replicate the in-school experience has come at a price.
Have we become inured to our loss of privacy online? In 2016, in a lawsuit against Nickelodeon, the Third Circuit Court of Appeals observed: “Most of us understand that what we do on the Internet is not completely private. How could it be? . . . We recognize, even if only intuitively, that our data has to be going somewhere.” The court seemed to express the public’s collective (if uneasy) acquiescence to the fact that, every time we go online, we allow ourselves to be tracked through cookies and algorithms that enable private companies to figure out who we are as consumers and to profit in some way.
Yet, the pandemic-driven proliferation of digital tools within the ed tech sector may test the limits of our acquiescence. As the Nickelodeon court went on to caution: “[N]ot everything about our online behavior is necessarily public.” In fact, numerous federal and state laws prohibit the collection, use, or disclosure of certain kinds of private information — even if some of those laws, like the Family Educational Rights and Privacy Act (FERPA), are woefully outdated (see Underwood, 2017) and others, while newer (such as California’s data privacy laws), fail to account fully for emerging technologies (see Fedders, 2019).
It’s likely that very few of us — including students, educators, parents, policy makers, and judges — are entirely aware of the implications of the wholesale release of student data and metadata. We don’t know whether or how youth (to say nothing of adults) are being “captured” and targeted as current and future customers and consumers. We don’t know to what extent student data will fall into the wrong hands through leaks, hacks, or injudicious disclosures by school officials. We’ve already lost our (data) cookies. The challenge now is to figure out what else we might lose.
References
Fedders, B. (2019). The constant and expanding classroom: Surveillance in K-12 public schools. North Carolina Law Review, 97(6).
Underwood, J. (2017). You say ‘records,’ and I say ‘data.’ Phi Delta Kappan, 98(8), 74-75.
This article appears in the February 2022 issue of Kappan, Vol. 103, No. 5, pp. 64-65.
ABOUT THE AUTHOR

Robert Kim
Robert Kim is the executive director of the Education Law Center, based in Newark, NJ. His most recent book is Education and the Law, 6th ed.

