
In late January, Mark Zuckerberg, CEO of Meta Platforms Inc., and other scions of Silicon Valley were hauled before Congress to explain how their social media companies were not, in fact, destroying the mental and physical well-being of generations of children. “I’m sorry for everything you’ve all gone through,” Zuckerberg said to the students and families who were attending the hearing because they had been victimized by online abuse and exploitation.

But the legal story begins some years before this event, during the onset of the pandemic. Hundreds of districts and dozens of states — alarmed at the effects of social media on students — began filing lawsuits in courts all over the country. In late 2022, many of these actions were combined into a single case (see In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, N.D. Cal., 2023).

The plaintiffs allege that the companies owning and operating the world’s most popular social media platforms — including Instagram, Facebook, YouTube, TikTok, and Snapchat — “target” children as a core market and design their platforms to “addict” them. This has resulted in not only rampant sexual exploitation and “sextortion” of children but also pervasive anxiety, depression, insomnia, and self-harm — in short, a “youth mental health crisis.”

The case (which is, as of early March, still in the pretrial stages) is sure to interest every educator who has witnessed the effects of smartphone addiction on their students. For political pundits, the sheer volume and variety of plaintiffs is noteworthy: At least 41 states have joined this or related lawsuits. These days, it’s unheard of for that many Democratic and Republican attorneys general to find an area of so much agreement.

For legal scholars, the case is no less intriguing. The case illuminates the slipperiness of attempting to contain exposure to the internet using legal doctrine that long predates the internet. Before the 1990s, nearly every personal injury or “tort” resulted from an encounter with a live human, an animal, or an identifiable object. A bar brawl, for example. A sleepy surgeon. A neighbor’s dog. Cigarettes. Product liability law — a subcategory of tort law that spells out when manufacturers, distributors, or sellers of products are liable for any harm they cause — is particularly focused on tangible personal property distributed commercially for public use or consumption. (The narrow focus serves to distinguish disputes involving products from those involving services or intangible concepts or ideas, for which different laws apply.)

As the In re: Social Media case reveals, times have changed. Suddenly, even the most basic arguments used in product liability cases are no longer so straightforward. Is social media “tangible”? Is it even a “product”? When you’re attempting the legal equivalent of nailing cyberspace to a chalkboard, one thing has become clear: We’ve left the corporeal world. Things have gotten so meta.

How social media impacts children — and schools

Near the outset of his testimony before the U.S. Senate Judiciary Committee (almost exactly one year before the social media platform CEOs testified before the same body), Mitch Prinstein, chief science officer at the American Psychological Association, said something very ominous: “Within the last 20 years,” he observed, “the advent of portable technology and social media platforms is changing what took 60,000 years to evolve.”

He went on to explain that social media use in children coincides with a delicate period in human development involving “intricate and precise interactions among neural, biological, social, and contextual systems.” The adolescent brain develops in a “specific, predetermined sequence,” Prinstein said. The subcortical areas associated with “human craving for visibility, attention, and positive feedback from peers” develop years before the prefrontal cortex region, which allows adults to better “inhibit behavior and resist temptations.” “When it comes to youths’ cravings for social attention,” Prinstein concluded, “they are all gas pedal with no brakes.”

The lawsuits against social media companies underscore the theme of developmental harm to children. Plaintiffs chronicle mental and physical repercussions (including anxiety, depression, and attempted suicides) resulting from students’ social media use. They also point to the effect on schools. For example, in Seattle School District No. 1 v. Meta Platforms Inc. et al. (W.D. Wash., 2022), Seattle Public Schools chronicled multiple impacts of students’ social media use, including the need to hire additional counselors and medical staff; conduct staff training; develop responsive lesson plans and curriculum; and address upticks in truancy, cyberbullying, property damage, and other disciplinary issues stemming from social media activity.

The In re: Social Media Addiction case

The plaintiffs in In re: Social Media Addiction aren’t just opining that students are suffering from social media addiction or from the abuse directed at them by other users of social media platforms. Instead, they allege that social media companies are 1) intentionally designing their platforms to create compulsive or addictive use among minors and 2) failing to provide adequate safeguards and notifications to parents or address violations of child safety laws occurring on their platforms.

These distinctions are important. In a pretrial opinion, the court stated that, in cases related to product-based negligence, social media companies don’t owe children “a duty to protect them from harm from third party users of defendants’ platforms.” (Educational institutions, in contrast, have a responsibility under civil rights laws to address harassment or violence among students or other third parties over whom they exercise some form of control.) Instead of holding social media companies responsible for the abusive behavior of their users, the court held that the case would turn on the obligation of social media companies to design safe “products” and to warn users of any “defects” that might exist in those products.

The court went on to list a host of product defects it would consider. These include the failure of social media platforms to adequately verify users’ ages, implement parental controls or notifications, limit the length and frequency of use sessions, or enable reporting of suspected child sexual abuse content.

Product liability vs. free speech

The case includes an interesting tension between personal injury (product liability) and First Amendment (free speech) law. In short, when much of the injury to users comes from speech protected under the First Amendment, what, if anything, can the companies be liable for? The court addressed the tension in this way, quoting a passage from Winter v. G.P. Putnam’s Sons (9th Cir., 1991):

A book containing Shakespeare’s sonnets consists of two parts, the material and print therein, and the ideas and expression thereof. The first may be a product, but the second is not. The latter, were Shakespeare alive, would be governed by copyright laws, among others. These doctrines applicable to the second part are aimed at the delicate issues that arise with respect to intangibles such as ideas and expression. Products liability law is geared toward the tangible world.

Drawing from this, the court separated out the features of social media platforms into two overarching buckets: 1) features that are more akin to tangible services, property, or products, and 2) features that are more akin to “ideas, content, and free expression.” Bucket 1 is subject to product liability law. Bucket 2, protected by the First Amendment, is not.

Not surprisingly, the social media companies argued that just about everything they do falls in Bucket 2, meaning that the “First Amendment protects them from liability for the speech they publish as well as for all choices they have made in disseminating them.” The court agreed — but only up to a point.

For example, the court held that social media companies’ practice of creating, timing, and clustering “notifications” (including “elevated status,” “trophies,” and other awards for frequent users) to draw users back to their platforms and encourage addictive use was “speech” entitled to First Amendment protection.

On the other hand, the court held that the use of “filter” tools that enable users to edit photos and videos before posting or sharing them was not “speech” entitled to First Amendment protection, but rather a set of “neutral, non-expressive tools” provided by the companies. (Plaintiffs had argued that these filters enabled the spread of “idealized” images that damaged the self-esteem of young users. Failing to inform users when an image had been altered or edited made it difficult for users to discern what was real and what was not.)

Protecting children in the metaverse

So far, plaintiffs have succeeded in convincing the court in In re: Social Media Addiction that social media companies are answerable for some aspects of their “product” when it comes to protecting children. But there remains a sense that social media is a square peg that doesn’t quite fit product liability’s round hole. Although Facebook, Instagram, and other platforms rely heavily on tangible objects such as cellphones to reach users, the court has determined that the apps themselves are not “tangible” in the way that cellphones are. “It is the phones that vibrate, make sounds, or otherwise manifest, physically, defendants’ [apps]” — not the social media platforms themselves, said the court. Nor did the court buy the analogy that social media was the equivalent of downloadable “tangible personal property,” similar to “buying a container from the Container Store.”

Will plaintiffs ultimately prevail in their argument that product defects exist in the metaverse? Or will the silent epidemic of social media addiction continue to lurk, ubiquitous but invisible within the legal system we’ve created, like cosmological dark matter?

This article appears in the April 2024 issue of Kappan, Vol. 105, No. 7, p. 62-63.

ABOUT THE AUTHOR

Robert Kim

Robert Kim is the executive director of the Education Law Center, based in Newark, NJ. His most recent book is Education and the Law, 6th ed.
