By Amy Howe
on Feb 19, 2023
at 2:51 pm
Twitter headquarters in downtown San Francisco. (Sundry Photography via Shutterstock)
In 2015, ISIS conducted a series of coordinated attacks around Paris that killed 130 people and wounded nearly 500 more. Two years later, 39 people were killed in an ISIS attack on an Istanbul nightclub during the early hours of New Year’s Day. This week, the Supreme Court will hear oral arguments in a pair of cases arising from the attacks. The justices’ decisions in Gonzalez v. Google and Twitter v. Taamneh could reshape legal liability for some of the nation’s largest technology companies.
Gonzalez v. Google
The question at the center of Gonzalez, which will be argued on Tuesday, is the scope of Section 230 of the Communications Decency Act of 1996, which generally shields tech companies from liability for content published by others. The justices will consider whether that landmark statute protects internet platforms when their algorithms target users and recommend someone else’s content.
The question comes to the court in a lawsuit filed by the family of Nohemi Gonzalez, a 23-year-old American woman who was killed in the 2015 ISIS attack on a Parisian bistro, La Belle Équipe. They brought their lawsuit under the Antiterrorism Act, arguing that Google (which owns YouTube) aided ISIS’s recruitment by allowing ISIS to post videos on YouTube that incited violence and sought to recruit potential ISIS members, and by recommending ISIS videos to users through its algorithms.
A divided panel of the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 protects such recommendations, at least if the provider’s algorithm treated content on its website similarly. The majority acknowledged that Section 230 “shelters more activity than Congress envisioned it would.” However, the majority concluded, Congress – rather than the courts – should clarify how broadly Section 230 applies. The Gonzalez family then went to the Supreme Court, which agreed last year to weigh in.
In the Supreme Court, the Gonzalez family insists that recommendations are not always shielded from liability under Section 230. Whether they are protected, the family says, hinges on whether the defendant can meet all of the criteria outlined in Section 230, which bars providers of “an interactive computer service” from being “treated as the publisher … of any information provided by” a third party. For example, the family argues, Section 230 does not protect a defendant from liability for recommendations that contain material that the defendant itself created or provided, such as URLs for the user to download or “notifications of new postings the defendant hopes the user will find interesting,” because in that scenario, the information would not be provided by someone else.
A website like YouTube is also not shielded from liability, the family continues, when it provides unsolicited recommendations that it thinks will appeal to users. In that scenario, the family asserts, the defendant is not providing access to a computer server (because the user is not making a request) and therefore is not acting as a “provider … of an interactive computer service.”
Because Section 230 does not always provide tech companies with immunity for their recommendations, the family concludes, the 9th Circuit should not have thrown out the family’s claim. But, the family stresses, even if Google is not entitled to immunity under Section 230, that is only the beginning of the inquiry: The family must then show that Google can be held liable under federal antiterror laws for its recommendations.
The Biden administration agrees with the Gonzalez family that the court of appeals was wrong to dismiss its claim based on YouTube’s recommendations of ISIS content, but its reasoning focuses only on how YouTube’s algorithms operate and on their effect. YouTube’s suggested videos, the administration notes, appear on the side of each user’s YouTube page and will “automatically load and play when a selected video ends.” In so doing, the administration explains, YouTube “implicitly tells the user that she ‘will be interested in’” the content of that video – which is a separate message from the message in the video itself. Therefore, the administration concludes, although the family may ultimately “face obstacles” in proving their claims under the ATA, Google and YouTube are not entitled to immunity under Section 230 because the family is seeking “to hold YouTube liable for its own conduct and its own communications, above and beyond its failure to block ISIS videos or remove them from the site.”
In their brief on the merits, Google and YouTube condemn terrorism and emphasize that they have taken “increasingly effective actions to remove terrorist and other potentially harmful conduct.” But Section 230 bars the family’s claims against them for YouTube’s recommendation of ISIS-related videos, they maintain, because the provision provides immunity from claims that treat the defendant as a publisher. And just as a newspaper acts as a publisher when it puts together an opinion page filled with essays and columns written by other people, the companies write, YouTube acts as a publisher when its algorithms “sort and list related videos that may interest viewers so that they do not confront a morass of billions of unsorted videos.”
Google and YouTube urge the justices not to “undercut a central building block of the modern internet.” If Section 230 does not protect YouTube’s efforts to organize the videos that others post on its site, they caution, neither Gonzalez nor the Biden administration has a “coherent theory that would save search recommendations and other basic software tools that organize an otherwise unnavigable flood of websites, videos, comments, messages, product listings, files, and other information.”
Google and YouTube offer the justices an off-ramp, noting that the Gonzalez family’s claims in this case are “materially identical” to the claims in Twitter v. Taamneh, which will be argued on Wednesday. If the court were to rule that the Taamneh family’s claim cannot go forward under the ATA, the tech companies tell the justices, then the Gonzalez family’s claims also cannot go forward, so there would be no need for the justices to decide whether Google and YouTube are shielded from liability under Section 230.
Twitter v. Taamneh
In the Twitter case, the justices agreed to decide whether Twitter (along with Facebook and Google, which were also defendants in the lower courts) can be held liable, regardless of Section 230, for aiding and abetting international terrorism based on ISIS’s use of the companies’ platforms.
The lawsuit was filed by the family of Nawras Alassaf, a Jordanian citizen who was among the 39 people killed in the January 2017 ISIS attack at the Reina nightclub in Istanbul. The Taamneh family filed a lawsuit in federal court in California under the Antiterrorism Act, which allows U.S. nationals to sue anyone who “aids and abets, by knowingly providing substantial assistance,” international terrorism. The family contended that Twitter and the other tech companies knew that their platforms played an important role in ISIS’s terrorism efforts but, despite extensive press coverage and government pressure, did not act aggressively to keep ISIS content off those platforms.
The 9th Circuit allowed the Taamneh family’s aiding-and-abetting claim to go forward. It acknowledged that the tech companies’ policies bar users from posting content that promotes terrorism, and that the tech companies regularly removed posts with ISIS-related content. And although it stressed that “[n]ot every transaction with a designated terrorist organization will sufficiently state a claim for aiding-and-abetting liability under the ATA,” it concluded that the Taamneh family had done so in this case. Twitter went to the Supreme Court, which agreed last year to weigh in.
In the Supreme Court, Twitter urges the justices to overturn the 9th Circuit’s ruling. The company argues that a defendant can only be held liable under the ATA, as amended by the Justice Against Sponsors of Terrorism Act, when it has provided substantial assistance for a specific act of international terrorism – such as the attack on the Reina nightclub. But the plaintiffs have not even alleged that the terrorists responsible for the Reina attack ever used Twitter.
Twitter’s actions also fell short of the kind of “knowing” assistance required for liability under the ATA, the company says. It is not enough that Twitter knew that terrorists used its platforms, even though Twitter’s policies barred them from doing so. Instead, Twitter argues, it can only be held liable if it knew about “specific accounts that substantially assisted the Reina attack” and knew “that not blocking those accounts would substantially assist such an attack.” But, it stresses, the plaintiffs concede that Twitter “rarely knew about specific terrorist accounts or posts,” and they do not allege that Twitter “knew about yet failed to block any account or post that was used to plan or commit the Reina attack or any other terrorist attack.”
The Biden administration agrees that the 9th Circuit’s decision should not stand, but it takes a slightly different (and broader) view of liability than Twitter. In its view, a defendant could in some circumstances be held liable under the ATA even when it did not specifically know about the terrorist attack that led to a victim’s injury, or if it did not provide support for that act. But, the government adds, plaintiffs must allege more than that the defendants have simply provided “generalized support to a terrorist organization through the provision of widely available services” – and the Taamneh family has not done so in this case.
The Taamneh family counters that the ATA was intended to provide plaintiffs with “the broadest possible basis” to sue companies and organizations that provide assistance to terrorist organizations. And the text of the ATA, the family says, makes clear that it does not require a connection between the assistance that the defendant provides and a specific terrorist attack: It is enough that the defendant provided assistance to the broader terrorist organization. “Twitter’s proposed interpretation of” the ATA, the family writes, “would implausibly segregate a particular terrorist act from the overall campaign of terror of which it was an integral part, requiring courts to ignore the often long chain of events which enabled a foreign terrorist organization to mount such an attack.”
Both sides warn of dire consequences if the other side prevails. Twitter suggests that the family’s theory could create a “novel and boundless conception of aiding-and-abetting liability” that could expose aid organizations and NGOs to liability if they provide assistance that eventually reaches and assists ISIS’s general operations, even if there is no connection to a specific terrorist attack.
Facebook and Google echo Twitter’s concerns. They tell the justices that a ruling for the family could mean that social-media companies could be sued under the ATA “for virtually any terrorist attack ISIS ever commits, at any time and anywhere in the world, simply because their efforts to prevent ISIS members or supporters from exploiting their services were not, in a jury’s estimation, sufficiently ‘aggressive.’” That liability, they continue, could extend to a wide range of other companies whose products or services could be used by terrorists.
But the Taamneh family says that Twitter’s construction of the law would be so narrow that it would be almost useless: It would only apply, for example, “to a fellow terrorist who handed a shooter a firearm” and “could not as a practical matter be applied to the types of outside assistance that most matters to terrorist organizations, such as contributions, banking services, and social media recommendations.” Twitter’s theory, the family posits, would also “require a type of knowledge which almost no one but a terrorist would usually possess.”
Even as the justices grapple with the weighty questions in the Google and Twitter cases, they are also aware that another pair of cases involving social-media companies is lurking on the horizon. In January, the justices asked the Biden administration for its views on the challenges to controversial laws, enacted in Florida and Texas, that seek to regulate the content-moderation policies of social-media companies like Facebook and Twitter. Both laws were passed in response to beliefs that social-media companies were censoring their users, particularly those expressing conservative beliefs. If, as expected, the Florida and Texas cases eventually return to the Supreme Court, the court’s rulings could create a conundrum for tech companies: A decision that curtails Section 230 could require tech companies to remove content in order to avoid expanded legal liability, while the Texas and Florida laws could restrict the companies’ ability to do so.
This article was originally published at Howe on the Court.