Alexis Spence’s lawsuit alleges Instagram hurts teenagers by design.

Alexis Spence created her first Instagram profile when she was only 11 years old. Her parents were not aware that she circumvented the device’s parental controls, created multiple secret Instagram accounts, and hid Instagram on her device by changing the app’s icon. Today she is 20 and, she says, suffers from addiction, anxiety, depression, self-harm, eating disorders, and suicidal ideation.

On June 6, with the help of the Social Media Victims Law Center, Spence filed a lawsuit against Meta Platforms (formerly known as Facebook), as the parent company of Instagram, alleging that Instagram’s design caused these mental and physical harms. Similar lawsuits have failed in the past, largely because social media platforms have successfully claimed immunity under Section 230 of the Communications Decency Act. However, Spence’s lawsuit is different—and Meta should be worried.

First, Spence’s lawsuit follows Frances Haugen’s disclosure of Facebook’s internal research, known as the Facebook Papers, which shows Meta knew its social media products are highly addictive and harmful to teenage girls. Second, this is one of the first lawsuits to allege design defects: that certain product features and capabilities built into Instagram are fundamentally defective, and that these defects caused Spence’s mental and physical injuries. Third, instead of alleging injuries attributable to the content uploaded to the platform by its users, it targets Instagram’s features and capabilities in a way that is content-neutral.

In late 2021, Haugen received international attention as the Facebook whistleblower. The Facebook Papers, which she shared with journalists and during her testimony before Congress, suggested that the company believed the use of its social media products could lead to serious mental health issues, such as anxiety, depression, and eating disorders. In fact, Facebook had a name for these harms: “Suicide and Self-Injury,” or SSI. For instance, one of the slides released in the Facebook Papers was titled, “But, We Make Body Image Issues Worse for 1 in 3 Teen Girls.”

While some experts have challenged the research as inconclusive, Haugen’s testimony detailed how Facebook knew of its products’ potential dangers, especially to preteen and teen girls. She also discussed the Facebook Papers’ overarching theme: Instagram’s success and growth depend on the acquisition and retention of tweens and teenagers. Despite Meta’s policy that all users must be at least 13 years old to create an Instagram account, the Facebook Papers revealed that the company deliberately studied 10- to 12-year-olds and how they used the platform.

And most importantly for Spence’s lawsuit, the Facebook Papers also revealed that Facebook conducted a study of its “Likes” feature called Project Daisy. In the study, the control group, Group A, kept the publicly displayed like counter, while for Group B, Facebook hid the number of likes from the user. In the end, Facebook concluded, “Likes, comments, and Direct Messages are all very important to teens. … Teens seem very sensitive to receiving interactions and their emotions are directly correlated to the type and [number] of interactions they receive.”

Spence’s suit claims Meta is liable under a products liability theory. In general, a products liability claim can take three forms: manufacturing defect, design defect, and marketing defect. A manufacturing defect may occur when, during the manufacturing process, the product deviates from the intended design, leading to harm or injury to the consumer. A design defect may occur when the design of the product itself causes harm or injury to the consumer. A marketing defect may occur when the manufacturer fails to provide proper instructions or warnings regarding the product.

Other lawsuits against the social media company have attempted to hold it liable under products liability claims. In 2021, in a consolidated ruling, the Texas Supreme Court affirmed the dismissal of three lawsuits against Facebook that primarily alleged marketing defects. In those cases, victims of sex trafficking alleged that they became entangled with their abusers through Facebook and said that the company should be held liable for failing to warn and prevent sex trafficking. Facebook moved to dismiss these claims pursuant to Section 230.

Section 230 generally protects online platforms from legal liability for transmitting, removing, labeling, or hiding problematic third-party content. Courts have interpreted its immunity broadly, but recently, Section 230 has faced tremendous criticism from scholars, politicians, and regulators on both sides of the aisle.

In the Texas cases, the state Supreme Court said Section 230 barred the claims because they were rooted in the content posted on the platform by malicious users. The lawsuits sought to impose liability on Facebook for its failure to combat or remove third-party content, which is specifically barred by Section 230.

Defenders of Section 230’s broad interpretation have been critical of these products liability claims. For instance, the Cato Institute argued that the use of products liability claims improperly circumvents Section 230’s protections. From Cato’s perspective, Section 230’s primary beneficiary is the internet user, not the platform, because Section 230’s purpose is to protect users’ free speech rights under the First Amendment. And if plaintiffs were to prevail with their products liability claims, it would frustrate Section 230’s purpose.

However, not everyone has been critical of products liability claims. Professor (and occasional Slate contributor) Danielle Citron, who has written extensively on Section 230, argues that its protections should not extend to activity that has little or nothing to do with free speech, such as the distribution of a dangerous product. And at least one court appears to agree. In 2021, the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 does not shield Snapchat from liability for an alleged design defect related to its speed filter. In Lemmon v. Snap Inc., the appellate court concluded that Snap could be held liable for foreseeable injuries resulting from the use of its speed filter because the existence of this filter does not rest on any third-party content.

One case that might be instructive in Spence’s claims against Meta is Herrick v. Grindr LLC, which sought to hold the dating app for gay men liable under a products liability claim. In that case, Matthew Herrick met J.C. on the app, and after their relationship ended, J.C. created a fake profile imitating Herrick. Using the fake profile, J.C. invited other men to Herrick’s home for violent sex. Herrick reported the fake profile to Grindr and local law enforcement, but his complaints went unaddressed. So he sued Grindr and asserted that it failed to incorporate common safety features or warn of how Grindr can be used as a “stalking weapon.”

The federal district court dismissed Herrick’s complaint on Section 230 grounds. On appeal, he argued that Section 230 did not bar his claims because they concerned the app’s design, not J.C.’s speech on the app. But in 2019, the U.S. Court of Appeals for the 2nd Circuit upheld the dismissal, concluding that Herrick’s alleged injuries were caused by the content J.C. submitted to the app, not by the app’s design.

The causation analysis put forth in Herrick v. Grindr LLC may be the most important precedent for Spence’s lawsuit. In fact, both the Texas cases and Herrick illustrate a dispositive issue in lawsuits alleging products liability claims: whether the platform itself actually caused the harms or injuries alleged by a plaintiff. But for J.C.’s speech on Grindr, Herrick would not have experienced the alleged harassment and stalking. J.C.’s unlawful acts (speech) were the actual cause of Herrick’s alleged injuries, not Grindr’s acts or omissions. In the Texas cases, but for the predators’ actions, the victims would not have become entangled in sex trafficking.

Spence’s lawsuit against Meta, by contrast, focuses on Instagram’s “Feed + Profile and Explore,” “Friend Recommendations,” and “Likes” features, with the last likely being the strongest claim. Her claims can be reframed as: Regardless of the content posted on Instagram by its users, these product features and capabilities caused the injuries alleged.

In Lemmon v. Snap Inc., the availability and use of the speed filter did not depend on a third party’s content. In Spence’s case, Instagram’s feed, explore, and likes features similarly do not depend on any third party’s content. However, unlike in the Texas cases, where the content of the sex traffickers played a significant role in entangling the victims, in Spence’s case, Instagram’s core features “entangled” her in the mental and physical injuries. Regardless of the content she or other users posted to Instagram, it was the “Likes” feature that caused the anxiety, depression, and addiction alleged. And as Project Daisy illustrated, per the Facebook Papers, but for the Likes feature, tweens and teens would not have experienced the anxiety, depression, and addiction.

Further, in Herrick, J.C.’s content ultimately caused the harassment and stalking against Herrick. But in Spence’s case, regardless of whatever content Instagram showed her, the Feed + Profile and Explore’s design (e.g., endless scrolling) elicits the emotional response that drives the apparent addiction among tweens and teens. As Spence alleges, Meta’s use of psychological manipulation techniques—sometimes referred to as persuasive design—caused the well-established hallmarks of clinical addiction.

But for Meta’s failure to conduct reasonable age verification, Spence would not have been exposed to Instagram’s features and design. But for the likes feature, Spence would not have experienced the anxiety and depression that stem from the number of likes collected. But for the endless feed and explore features, Spence would not have experienced the clinical addiction that these features were designed to promote.

While Spence’s case is not a slam dunk, its focus on content-neutral features as the actual cause of her mental and physical injuries stands the greatest chance of holding Meta liable. Even though Meta will likely argue that all of Spence’s claims are barred by Section 230, only some of them—the claims that rely or depend on content posted by third parties—should be barred. Her claims that rely on content-neutral features, however, should survive. And because the U.S. District Court for the Northern District of California, where Spence’s case was filed, has a greater understanding of technology than many other courts, the suit may survive Meta’s motion to dismiss, but only if the district court applies a proper causation analysis.

And it’s important that Spence’s content-neutral claims are allowed to proceed. As I’ve generally argued in the past, social media platforms will not change their behavior, including how they design their platforms, unless the incentive structure changes. Under the status quo, social media platforms are incentivized to maximize user engagement because it increases advertising revenues. Section 230 allows platforms to maximize user engagement without regard to any mental or physical injury that the platforms may cause their users.

Further, setting aside whether Section 230 defenders’ interpretation is correct or proper, the reasons for Section 230 immunity fall flat with respect to Spence’s lawsuit. Here, there is no speech to protect. And if the primary beneficiary of Section 230 protection is the internet user, then it follows that platforms should not be allowed to use Section 230 immunity for the harms the platforms directly cause their users.

Simply put, if platforms are exposed to legal liability (e.g., damages), then the calculus changes—protecting the mental health or physical safety of their users may suddenly align with maximizing profits or, at the very least, avoiding excess costs. This, after all, is what products liability law is intended to protect: the safety of users and consumers.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.