Bin Laden, Burglars, and Banks: The U.S. Supreme Court Considers Twitter’s Role in Terrorism

U.S. Supreme Court justices whipped out their dictionaries and a deep bucket of metaphors Wednesday in a cumbersome attempt to understand whether social media companies can be held legally liable under anti-terrorism laws for promoting ISIS videos. Stolen jewellery, banks, imaginary burglars, and a young Osama bin Laden were all invoked in a testy two-hour oral argument.

The historic hearing came just one day after justices heard arguments for and against removing liability protections for recommendation algorithms currently covered under Section 230 of the Communications Decency Act, which shields tech companies from legal liability for what their users post. Combined, the court’s rulings in the two cases could fundamentally alter the way social media platforms host content on the internet, and thus change the everyday experience of millions of people online.

What did the justices say about ISIS, Twitter, and content moderation?

The arguments kicked off with Seth Waxman, Twitter’s lawyer, struggling to coherently respond to a hypothetical scenario posed by Justice Clarence Thomas related to the definition of aiding and abetting. If a friend loans a gun to a known burglar and murderer who is “otherwise a good guy,” and that gun is then used in a crime, has the gun’s original owner aided in the crime?

That question set the tone for the justices’ line of questioning, during which Waxman repeatedly said Twitter should not be held liable for hosting terrorist content because it does not specifically know whether an alleged terrorist on the platform will actually end up carrying out an attack. Many alleged terrorists or users sympathetic to alleged terrorist groups also use Twitter for what it’s best at: shitposting and doomscrolling.

Justices were sceptical of Waxman’s answers, though, and suggested the mere presence of alleged ISIS members on the service could be akin to a ticking time bomb.

“If you know ISIS is using it, you know ISIS is going to be doing bad things, you know ISIS is going to be committing acts of terrorism,” Justice Amy Coney Barrett said during her questioning. Justice Elena Kagan reiterated that statement in her questioning.

“You’re helping by providing your service to those people, with the explicit knowledge that those people are using it to advance terrorism,” Kagan said. Waxman, in response, tried to carve out a distinction between Twitter actively helping terrorists commit crime and terrorists inadvertently being aided by Twitter’s failure to remove all related content.

In a rebuttal towards the end of the hearing, Waxman revisited the burglar-with-a-gun scenario and said Twitter, in this case, was more like Walmart, which sells guns in stores throughout the country while knowing that someone, somewhere may end up using one to commit a felony.

“Nobody will say they [Walmart] are aiding and abetting particular crimes,” Waxman said.

Photo: Chip Somodevilla, Getty Images

What’s at stake in this terrorism case against Twitter?

Twitter v. Taamneh stems from a lawsuit filed by the relatives of Nawras Alassaf, a 23-year-old who was killed in a 2017 ISIS attack on an Istanbul nightclub that left 39 people dead. Alassaf’s relatives sued Twitter, claiming the company aids and abets terrorist activity by allowing some ISIS-related content to persist on its platform. Twitter maintains it does not knowingly provide assistance to terrorist groups, even if they use its platform for promotion.

Unlike Gonzalez v. Google on Tuesday, which grappled with the scope of tech’s liability protections under Section 230 of the Communications Decency Act, the Twitter case focuses squarely on whether claims like these can be brought under the Anti-Terrorism Act. The two are connected, though, and a ruling weakening Section 230 immunity for services like recommendation algorithms could potentially open them up to liability under terrorism laws.

Though the case specifically involves Twitter, its implications could affect any company that hosts user-generated content. As a result, Google and Meta filed briefs in Twitter’s support. The Biden administration also filed a brief backing Twitter, saying the plaintiffs had failed to show Twitter knowingly provided assistance to terrorists. Other Twitter supporters, like the Knight First Amendment Institute, fear an expansive interpretation of aiding and abetting liability could lead platforms to overcorrect and censor constitutionally protected, and potentially valuable, speech. In practice, that could mean social media companies allowing user-generated content on their sites only after human review, which would be next to impossible for Twitter given its scale. On the other hand, tech companies could decide it’s safer to simply remove any post that mentions terrorism altogether to steer clear of lawsuits. Both scenarios, critics say, are bad for free expression.

“Neither speech about terrorism nor speech by someone associated with a terrorist group is categorically unprotected, and the government can’t directly or indirectly suppress these broad swaths of political speech,” the Knight Institute wrote in a brief supporting Twitter. “The First Amendment is meant to protect against exactly this type of government intrusion.”

Bin Laden goes to the bank, and other weird hypotheticals

During her line of questioning Wednesday, Justice Kagan asked U.S. Deputy Solicitor General Edwin Kneedler if he believed banks should be held liable for aiding terrorist activity if they offered monetary services to Osama bin Laden. Kneedler, who supports Twitter’s position, stammered before eventually admitting he believed banks would be liable in that scenario. That admission led Kagan to press Kneedler on why that same logic wouldn’t apply to Twitter.

Kneedler went on to note that his concerns about the court’s ruling aren’t limited to Twitter. Responding to questions from Justice Ketanji Brown Jackson, Kneedler said he feared an expanded interpretation of Anti-Terrorism Act liability could impede commonplace business practices at many non-tech businesses, banks included.

“We are concerned about not extending it [the Anti-Terrorism Act] so far that legitimate business activities could be inhibited,” Kneedler said. “That is a concern that should enter into the analysis.”

Bin Laden made another appearance later, during questioning from Justice Brett Kavanaugh. In that case, Kavanaugh asked the lawyer representing the plaintiffs if he believed CNN should be held liable for aiding and abetting terrorist activity when it aired an early interview with bin Laden in which he declared war on the U.S. The plaintiffs’ attorney eventually responded, “I think the First Amendment would solve that problem.”

Jose Hernandez and Beatriz Gonzalez, stepfather and mother of Nohemi Gonzalez, who died in a terrorist attack in Paris in 2015, arrive to speak to the press outside of the U.S. Supreme Court following oral arguments in Gonzalez v. Google on February 21, 2023 (Photo: Drew Angerer, Getty Images)

Justices seem sympathetic to Big Tech’s concerns

The Supreme Court began its two-day examination of tech and Section 230 on Tuesday with oral arguments in Gonzalez v. Google. The case, brought by the parents of a college student killed during a 2015 ISIS attack in Paris, alleges YouTube, a Google subsidiary, aided and abetted terrorism by boosting terrorist content with its recommendation algorithm. That argument rests on the assumption that Section 230 immunity does not extend to recommendation algorithms. Tech companies and supporters of broad liability protections reject that premise and fear limiting Section 230’s scope could open platforms up to a potentially devastating wave of lawsuits.

“Without Section 230, the major platforms would likely survive, but existence of innovators and smaller online sites would be put at great risk,” said John Morris, a principal at the Internet Society.

Eric Schnapper, the lawyer representing the plaintiffs in both cases, repeatedly brought up YouTube thumbnails, which he confusingly equated to a person sending an email. Schnapper argued that YouTube’s generation of URLs and thumbnail images meant thumbnails were no longer mere third-party content covered by Section 230, but rather entirely new content partially created by YouTube.

“Our contention is [that] the use of thumbnails is the same thing under the statute as sending someone an email and saying, ‘You might like to look at this new video now,’” Schnapper said.

Justices weren’t convinced by that argument, with Justices Alito and Jackson both saying they were confused by Schnapper’s logic, or lack thereof. Part of that confusion may stem from the justices’ shaky understanding of how social media algorithms work. Justice Kagan made that point explicitly, telling the courtroom that the justices really aren’t expert authorities on technology.

“We’re a court,” Kagan said. “We really don’t know about these things. These are not like the nine greatest experts on the internet.” Kagan and Kavanaugh both expressed apprehensions about the court’s ability to properly adjust legal protections for tech firms and suggested Congress may be better equipped to settle the issue.

Several of the justices worried that lifting legal immunity for online recommendations could usher in a wave of lawsuits that could hobble parts of the internet. Though some expressed sympathy for lifting immunity in extreme cases, the justices couched that by noting the difficulty of figuring out where to draw the line. Schnapper, responding to a question from Justice Amy Coney Barrett, said it’s possible a user’s retweets or likes could be considered new content outside the scope of Section 230 immunity. That means, in theory, a troubling retweet could lead to a lawsuit if the court sides with Gonzalez.

Even though multiple justices raised doubts over whether Section 230, written in 1996, could have anticipated recommendation algorithms, many legal experts seemed to believe it was unlikely the court would step in to make a change, in part due to the plaintiffs’ lawyer’s lacklustre performance.

“I don’t know if I’ve ever seen lawyers do so much damage to their own cases,” said Tim Wu, a Columbia law professor and former special assistant in the Biden administration. “Schnapper for petitioner was way out of his league and threw away every lifeline thrown to him. Painful to watch such a nationally important issue be so badly argued.”

