
Supreme Court hears oral arguments in Twitter case that could upend the internet

Video: Supreme Court to rule on how we use the internet (01:58, Source: CNN)

What we covered here

  • The Supreme Court is hearing oral arguments in Twitter v. Taamneh, which will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when they host certain user content on their platforms.
  • Twitter’s attorney was grilled on what kind of assistance to terrorists would make a platform liable under the anti-terrorism law. He argued there is a distinction between a defendant’s action that assists a terrorist and mere inaction.
  • The plaintiffs, the family of a person who was killed in an ISIS attack in Istanbul in 2017, allege social media companies, including Twitter, knowingly aided ISIS by allowing some of the group’s content to persist on their platforms.
  • Twitter has previously argued it was immune from the suit thanks to Section 230 and that just because ISIS used the platform to promote itself doesn’t constitute the company’s “knowing” assistance to the terrorist group.

Our live coverage has ended. You can read more about today’s arguments by scrolling through the posts below.


Takeaways from today's Supreme Court hearing on Twitter v. Taamneh

After back-to-back oral arguments this week, the Supreme Court appears reluctant to hand down the kind of sweeping ruling about liability for terrorist content on social media that some feared would upend the internet. 

On Wednesday, the justices struggled with claims that Twitter contributed to a 2017 ISIS attack in Istanbul by hosting content unrelated to the specific incident. Arguments in that case, Twitter v. Taamneh, came a day after the court considered whether YouTube can be sued for recommending videos created by ISIS to its users. 

If you’re just tuning in now, here’s what you need to know:

What’s at stake: The closely watched cases carry significant stakes for the wider internet. An expansion of apps and websites’ legal risk for hosting or promoting content could lead to major changes at sites including Facebook, Wikipedia and YouTube, to name a few. 

For nearly three hours of oral argument, the justices asked attorneys for Twitter, the US government and the family of Nawras Alassaf – a Jordanian citizen killed in the 2017 attack – how to weigh several factors that might determine Twitter’s level of legal responsibility, if any. But while the justices quickly identified what the relevant factors were, they seemed divided on how to analyze them. 

The court’s conservatives appeared more open to Twitter’s arguments that it is not liable under the Anti-Terrorism Act, with Justice Amy Coney Barrett at one point theorizing point-by-point how such an opinion could be written and Justice Neil Gorsuch repeatedly offering Twitter what he believed to be a winning argument about how to read the statute. 

The panel’s liberals, by contrast, seemed uncomfortable with finding that Twitter should face no liability for hosting ISIS content. They pushed back on Twitter’s claims that the underlying law should only lead to liability if the help it gave to ISIS can be linked to the specific terrorist attack that ultimately harmed the plaintiffs.

The key issues at hand: The justices spent much of the time picking through the text of the Anti-Terrorism Act, the law that Twitter is accused of violating — especially the meaning of the words “knowingly” and “substantial.” 

The law says liability can be established for “any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.” 

Justice Sonia Sotomayor seemed unpersuaded by Twitter attorney Seth Waxman’s argument that the company could have been liable if it had been warned that specific accounts were planning a specific attack, but that because those were not the facts of the case, Twitter was not liable in the absence of such activity and such warnings.

Chief Justice John Roberts grappled with the meaning of “substantial” assistance: Hypothetically, he asked, would donating $100 to ISIS suffice, or $10,000?  

What’s next? The Taamneh case is viewed as a turning point for the future of the internet, because a ruling against Twitter could expose the platform – and numerous other websites – to new lawsuits based on their hosting of terrorist content in spite of their efforts to remove such material. 

While it’s too early to tell how the justices may decide the case, the questioning on Wednesday suggested some members of the court believe Twitter should bear some responsibility for indirectly supporting ISIS in general, even if the company may not have been responsible for the specific attack in 2017 that led to the current case. 

But a key question facing the court is whether the Anti-Terrorism Act is the law that can reach that issue – or alternatively, whether the justices can craft a ruling in such a way that it does. 

Rulings in the cases heard this week are expected by late June.

You can read more takeaways from today’s arguments here.

Oral arguments in the Twitter case wrap up


The Supreme Court wrapped up arguments in the Twitter case about two and a half hours after the hearing started. Before the hearing ended, Twitter lawyer Seth Waxman offered a brief rebuttal to his opponents’ arguments. 

Justice Jackson hypothesizes why Twitter didn't take Gorsuch's lifeline


It may be perplexing why Twitter didn’t leap at Justice Neil Gorsuch’s offer of a way out by focusing on the ATA’s mention of “person,” but Justice Ketanji Brown Jackson appeared to reason out why Twitter — and the US government, as well — might be concerned about that.

In an exchange with US Deputy Solicitor General Edwin Kneedler, Jackson said: “I’m wondering whether the concern about that is, if you’re focusing on the person [who committed a terrorist act]… that it seems to take the focus away from the act itself … you could aid and abet a person who committed the act, even if it’s not with respect to that act.”

In other words, that interpretation of the law could significantly broaden ATA liability, such that anyone could be sued for knowingly and substantially helping any person who ultimately went on to commit some act of terrorism.

Plaintiffs' attorney Eric Schnapper is speaking now

Eric Schnapper, attorney for the plaintiffs, is presenting his argument now.

Schnapper is the same attorney who yesterday represented the Gonzalez family in the oral arguments in the Google case.

Amy Coney Barrett walks DOJ through a ruling in Twitter's favor


What would a ruling in Twitter’s favor look like? Justice Amy Coney Barrett asked the US government’s Kneedler to help her game out how the Justice Department would like the Supreme Court to shape its opinion, and their exchange showed how the DOJ was concerned about the potential fallout for banks.

Such a line of questioning doesn’t guarantee that Barrett will vote to reverse the appellate decision that allowed the suit against Twitter to proceed under the anti-terrorism law. But it illuminates how the Justice Department sees the scope of the law and highlights the key questions that Barrett is grappling with.

She seemed to understand a point made by Kneedler: that the statute should apply to assistance that is linked to the act of terrorism that injured the plaintiff, rather than just to the general enterprise of a terrorist group. But she balked a bit at Kneedler’s assertion that, for a defendant to be liable, the interaction between it and the terrorist group would need to be individualized or “face to face,” rather than something more remote.

She said such a standard would be “a little bit trickier” for the court to draw.

Kneedler later clarified that companies that offer the service in question to all comers should be held liable if there is an additional allegation of knowledge or a specific allegation.

Through his back and forth with Barrett, Kneedler made clear that the US government was concerned about how a decision in the Twitter case would affect not social media platforms but rather banks that could be sued under the law.

“You’re trying to make sure that whatever we said about social media companies wouldn’t get banks off the hook when they had those kinds of special relationships,” Barrett said.

“[The law] doesn’t necessarily require that you know that a particular person is going to commit a particular act, if you know – because of the proximate relationship with the person you’re assisting – that there are a group of acts that they are about to commit or that they have an ongoing practice of committing,” Kneedler said.

US government argues that broadening ATA liability could inhibit innocuous or routine business

In an exchange with Justice Sonia Sotomayor, the US government’s attorney argued that a ruling against Twitter holding that routine business activity – as opposed to targeted assistance – can lead to liability could threaten innocuous behavior.

“We are concerned about not extending it so far that legitimate business activities could be inhibited, that banks for example in underdeveloped parts of the world and charities that may depend on those banks, concerns that they may pull back as legitimate businesses,” said US Deputy Solicitor General Edwin Kneedler. “That is a concern that should enter into the analysis.”

Justice Kagan asks DOJ if Twitter is like a bank that knew it was offering its services to Osama bin Laden

Justice Elena Kagan offered a hypothetical for US Deputy Solicitor General Edwin Kneedler about a bank that offered its services to a known terrorist.

Once Kagan clarified that the terrorist in her hypothetical is someone particularly well known to be a terrorist, like Osama bin Laden, Kneedler said he thought that bank would be liable under the anti-terrorism law at issue in Wednesday’s case.

Kagan then grilled Kneedler on what differentiated Twitter in this case from the bank in her hypothetical. Kneedler tried to insist that there was a difference in the “nature” of the interactions, prompting Kagan to ask if it had to be like “personal banking” to be covered under the law.

DOJ now presenting its views on the anti-terrorism law

US Deputy Solicitor General Edwin Kneedler is now up to present the Justice Department’s views on the case.

The Justice Department argues that Twitter should not be liable under the law.

Regarding how Congress wrote the Justice Against Sponsors of Terrorism Act, Kneedler said, “Congress ensured that JASTA does not reach so broadly as to inhibit legitimate and important activities by businesses, charities and others.”

Justice Jackson doesn't seem to buy a key Twitter argument


Justice Ketanji Brown Jackson seemed unpersuaded by Twitter’s argument that Anti-Terrorism Act (ATA) liability only arises when a defendant’s actions substantially assist a specific act of international terrorism, as opposed to merely providing general support to the terrorist group.

“You have to know that you are providing substantial assistance to an act of international terrorism … that happened to be a terrorist attack that injured the plaintiff,” Twitter attorney Seth Waxman said.

There is a “gulf,” Waxman said, between that and general support provided to a terrorist organization not connected to a specific attack.

Jackson pushed Waxman to “explore” that gulf more fully, before ultimately concluding, “I don’t know that I see clearly that distinction.”

The court is referencing Halberstam v. Welch, a key case on the tests of liability for assisting a crime

The Supreme Court is hearing oral arguments in Twitter v. Taamneh, in which plaintiffs have alleged that social media companies, including Twitter, knowingly aided ISIS in violation of a US antiterrorism law by allowing some of the group’s content to persist on their platforms.

The plaintiffs in the case include the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017.

The case was heard by the 9th U.S. Circuit Court of Appeals, which used the three-element framework from the 1983 Halberstam v. Welch decision to rule that social media companies could be liable for aiding and abetting an act of international terrorism.

Twitter took this ruling to the Supreme Court for review, which brings the elements of Halberstam v. Welch into focus. Here’s a guide:

What’s the case: Bernard Welch, Jr. was a burglar in a relationship with Linda Hamilton, who also handled some secretarial duties for him. In 1980, Welch was arrested for killing Michael Halberstam while burglarizing Halberstam’s home. Halberstam’s widow filed a wrongful-death lawsuit against both Welch and Hamilton. Hamilton contested the lawsuit, claiming that she did not know about Welch’s activities and that she believed Welch ran a legitimate business. The district court ruled that Hamilton was jointly liable with Welch.

Why is the case being referenced here: The Halberstam v. Welch case provided a three-element framework that the Ninth Circuit applied to this Twitter case. That framework requires that:

  1. The party aided by the defendant must perform a wrongful act that causes an injury
  2. The defendant must be generally aware of their role in the wrongful act at the time that they’re providing assistance
  3. The defendant must knowingly and substantially provide assistance.

The Ninth Circuit first determined that “the party” from the first element was ISIS rather than the individual gunman, and that therefore the complaint sufficiently alleged element one. In the same vein, the Ninth Circuit concluded that the “principal violation” could refer to ISIS’s more general campaign of terror rather than the specific attack that caused injury.

The Ninth Circuit then concluded that the allegations that the defendants were generally aware that ISIS used their platforms to recruit, raise funds, and spread propaganda, and that they did not take proactive, aggressive measures to restrict ISIS-related content, showed that element two was also met.

For the same reason, the Ninth Circuit concluded that the defendants “knowingly” assisted the principal violation, leaving only the question of whether they “substantially” assisted, which involves another multi-factor test from Halberstam.

Halberstam v. Welch also provides a six-part test to determine whether assistance qualifies as substantial and knowing:

  1. The nature of the act assisted
  2. The amount and kind of assistance
  3. The defendant’s presence at the time of the wrongful act
  4. The defendant’s relationship to the actor
  5. The defendant’s state of mind
  6. The duration of the assistance

So as the Supreme Court hears oral arguments, the justices are focused on defining substantial assistance and what it means to knowingly provide that assistance, given the precedent set in the Halberstam v. Welch case.

Twitter lawyer resists lifeline offered by Justice Gorsuch during questioning on anti-terrorism statute 

Justice Neil Gorsuch offered Twitter’s attorney the opportunity to reframe his arguments around the “aid and abet” language in the anti-terrorism statute. Twitter attorney Seth Waxman turned him down.

Gorsuch started the inquiry by asking whether the better interpretation of who is liable under the anti-terrorism law is a defendant who aided or abetted a person who committed an act of international terrorism. He noted that the lawsuit against Twitter doesn’t link Twitter to the three people involved in the attack on the Istanbul nightclub.

However, Waxman stressed repeatedly that Twitter viewed the object of the “aid and abet” language in the anti-terrorist statute to be the criminal activity. “I think it is perfectly fine to read the object as the person who committed the act of international terrorism, but it is in the nature of abetting criminal activity,” the attorney said.

Gorsuch suggested Waxman was refusing to read the statute for what it said.

What to know about the underlying law at issue in the Twitter case 

The justices have spent much of the oral arguments so far picking apart the text of the Anti-Terrorism Act, the law that Twitter is accused of violating. The key provision establishes liability for “any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.”

The justices are chiefly concerned in this part of the argument with two issues: How to define “substantial assistance,” and what it means to “knowingly provide” that assistance.

Justice Sonia Sotomayor said the issues of “knowingly” and “substantial” were “the center of the issue.”

If the Turkish government had told Twitter about specific accounts that were planning a specific attack, and Twitter did nothing to remove those accounts, then Twitter could be liable in that situation under the ATA, Seth Waxman, an attorney representing Twitter, said.

But the facts in this case are not at all the same, he added, saying that Twitter is simply being accused of not doing enough to remove general ISIS content that wasn’t linked to the specific terrorist attack that gave rise to the case.

Sotomayor seemed unpersuaded, at least on the “knowing” standard.

“Willful blindness is something that we have said can constitute knowledge,” she told Waxman.

Twitter lawyer grilled on what kind of assistance to terrorists would make platform liable under key law

Confronting Twitter attorney Seth Waxman, Justices Sonia Sotomayor and Elena Kagan tag-teamed a series of questions about what kind of assistance to terrorists could make a social media website liable under the anti-terrorism law — with Kagan expressing frustration that Waxman’s somewhat off-point answers weren’t responding to their inquiry.

Sotomayor asked whether there was a difference between a defendant providing a terrorist a gun versus providing money — an “indirect,” “fungible” form of assistance. Other court rulings have said defendants are liable for providing financial assistance, she noted.

Under her and Kagan’s follow-up questioning, Waxman attempted to argue there was a distinction between a defendant taking an action that assisted a terrorist and a defendant’s inaction that helped a terrorist. He said that, in this case, the plaintiffs were arguing Twitter was not doing enough to police terrorist conduct on its platform.

Kagan redirected Waxman to answer the justices’ question rather than focus on how the complaint was framed, telling him that she was “going to rewrite their complaint for them.”

The liberal justices’ line of questioning came after Waxman appeared to struggle to address a hypothetical scenario posed by Justice Clarence Thomas. Thomas asked how the law would treat him if he provided a gun to a friend who went on to commit a crime.

Waxman replied there weren’t enough facts in the hypothetical to address Thomas’ query. Sotomayor then stepped in and appeared to throw Waxman a lifeline, directing him back to Twitter’s brief and overall argument — focusing on the claim that a defendant must have knowledge that the assistance he gives to a third party will lead to a specific terrorist act in order to face liability under the Anti-Terrorism Act.

Justice Thomas digs into key issue of case in first question to Twitter's attorney 


Justice Clarence Thomas has the first question for Twitter’s lawyer and digs into a key issue of the case.

He asks whether Twitter sees the anti-terrorism law as requiring the defendant to have knowledge of the specific terrorist attack in question — in this case, an attack on an Istanbul nightclub — or just a general knowledge of terrorists.

Twitter attorney Seth Waxman clarified that that was not what Twitter is arguing in the case.

NOW: Supreme Court hears oral arguments in Twitter case that could shape the internet's future


The Supreme Court is in session and justices are now hearing oral arguments in Twitter v. Taamneh, a case which could decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms have hosted certain user content.

What both sides have said: The plaintiffs in the case — the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017 — have alleged that social media companies including Twitter had knowingly aided ISIS in violation of a US antiterrorism law by allowing some of the group’s content to persist on their platforms despite policies intended to limit that type of content.

Twitter has said that just because ISIS happened to use the company’s platform to promote itself does not constitute Twitter’s “knowing” assistance to the terrorist group, and that the company cannot be held liable under the antiterror law because the content at issue in the case was not specific to the attack that killed Alassaf.

What’s at stake: This case, along with Gonzalez v. Google, carries significant stakes for the wider internet. An expansion of apps and websites’ legal risk for hosting or promoting content could lead to major changes at sites including Facebook, Wikipedia and YouTube, to name a few.

How today's arguments will unfold

Supreme Court oral arguments on Twitter v. Taamneh will start soon. Here’s what we know about how the day will go.

Seth Waxman, an attorney representing Twitter, will present first.

Then we’ll hear from Deputy Solicitor General Edwin Kneedler, representing the US government.

Rounding out the list will be Eric Schnapper for the plaintiffs. Schnapper is the same attorney who yesterday represented the Gonzalez family in the oral arguments in the Google case.

As with yesterday’s arguments, it’s possible things could run long.

Justice Gorsuch is still under the weather


Justice Neil Gorsuch remains ill and will be absent from the courtroom for the second straight day, the Supreme Court said.

Gorsuch, as he did with Tuesday’s oral arguments in Gonzalez v. Google, will participate remotely via phone.

Why lawyers may raise the pivotal CompuServe and Stratton Oakmont lawsuits in today's SCOTUS arguments

During today’s oral arguments, you may hear references to “CompuServe” and “Stratton Oakmont” — here’s what you need to know about these pivotal lawsuits that helped pave the way for Section 230’s creation.

Both cases were decided in New York courts.

In 1991’s Cubby v. CompuServe, the court held that CompuServe, an early internet portal, could not be sued for libel over a defamatory forum post by one of its users, because CompuServe did not, as a rule, engage in any content moderation.

Four years later, in Stratton Oakmont v. Prodigy, a different court held that the online service Prodigy could be sued because it routinely moderated some content and thus should be liable for violative content on its forums that it had failed to remove.

The conflicting rulings prompted US lawmakers to worry that these sorts of lawsuits might strangle the nascent internet in its crib.

Thus, Section 230 was born, explicitly immunizing websites for engaging in content moderation. The new law, passed in 1996, did not require that websites be politically neutral or that they moderate content in a specific way.

The point, its authors have argued, was to protect the ability of websites to innovate new ways of content moderation. Now, as the Supreme Court weighs whether to scale back Section 230’s protections for content moderation, the question is how far the justices may go and what will happen to websites that face new liability as a result.

Section 230 will be mentioned a lot today. Here's what to know about the law that made the modern internet. 


Congress, the White House and now the US Supreme Court are focusing their attention on a federal law that’s long served as a legal shield for online platforms.

The Supreme Court is hearing oral arguments on two pivotal cases this week dealing with online speech and content moderation. Central to the arguments is Section 230, a federal law that’s been roundly criticized by both Republicans and Democrats for different reasons but that tech companies and digital rights groups have defended as vital to a functioning internet. In Tuesday’s oral arguments on Gonzalez v. Google, the law came up multiple times.

Tech companies involved in the litigation have cited the 27-year-old statute as part of an argument for why they shouldn’t have to face lawsuits alleging they gave knowing, substantial assistance to terrorist acts by hosting or algorithmically recommending terrorist content.

Here are key things to know about the law:

  • Passed in 1996 during the early days of the World Wide Web, Section 230 of the Communications Decency Act was meant to nurture startups and entrepreneurs. The legislation’s text recognized the internet was in its infancy and risked being choked out of existence if website owners could be sued for things other people posted.
  • Under Section 230, websites enjoy immunity for moderating content in the ways they see fit — not according to others’ preferences — although the federal government can still sue platforms for violating criminal or intellectual property laws.
  • Contrary to what some politicians have claimed, Section 230’s protections do not hinge on a platform being politically or ideologically neutral. The law also does not require that a website be classified as a publisher in order to “qualify” for liability protection. Apart from meeting the definition of an “interactive computer service,” websites need not do anything to gain Section 230’s benefits – they apply automatically.
  • The law’s central provision holds that websites (and their users) cannot be treated legally as the publishers or speakers of other people’s content. In plain English, that means any legal responsibility attached to publishing a given piece of content ends with the person or entity that created it, not the platforms on which the content is shared or the users who re-share it.
  • The seemingly simple language of Section 230 belies its sweeping impact. Courts have repeatedly accepted Section 230 as a defense against claims of defamation, negligence and other allegations. In the past, it’s protected AOL, Craigslist, Google and Yahoo, building up a body of law so broad and influential as to be considered a pillar of today’s internet. In recent years, however, critics of Section 230 have increasingly questioned the law’s scope and proposed restrictions on the circumstances in which websites may invoke the legal shield.

Read more about Section 230 here.

Analysis: Tuesday's oral arguments on Google and the internet's future left justices confused at times


Nine justices set out Tuesday to determine what the future of the internet would look like if the Supreme Court were to narrow the scope of a law that some believe created the age of modern social media. After nearly three hours of arguments, it was clear that the justices had no earthly idea.

On several occasions, the justices said they were confused by the arguments before them – a sign that they may find a way to dodge weighing in on the merits or send the case back to the lower courts for more deliberations. At the very least they seemed spooked enough to tread carefully.

Justice Elena Kagan even suggested that Congress step in. “I mean, we’re a court. We really don’t know about these things. You know, these are not like the nine greatest experts on the internet,” she said to laughter.

That hesitancy, coupled with the fact that the justices were wading for the first time into new territory, suggests the court, in the case at hand, is not likely to issue a sweeping decision with unknown ramifications in one of the most closely watched disputes of the term.

Tech companies big and small have been following the case, fearful that the justices could reshape how the sites recommend and moderate content going forward and render websites vulnerable to dozens of lawsuits, threatening their existence.

Oral arguments drifted into a maze of issues, raising concerns about trending algorithms, thumbnail pop-ups, artificial intelligence, emojis, endorsements and even Yelp restaurant reviews. But at the end of the day, the justices seemed deeply frustrated with the scope of the arguments before them and unclear of the road ahead.

Even Thomas, who has previously expressed reservations about the scope of Section 230, seemed skeptical of the plaintiffs’ arguments. He sought clarification from Schnapper on how one might distinguish between algorithms that “present cooking videos to people who are interested in cooking and ISIS videos to people interested in ISIS.”

Justice Samuel Alito asked whether Google might have been simply organizing information, instead of recommending any kind of content. “I don’t know where you’re drawing the line,” he said.

Chief Justice John Roberts tried to make an analogy with a book seller. He suggested that Google recommending certain information is no different than a book seller sending a reader to a table of books with related content.

When Lisa Blatt, a lawyer for Google, stood up, she warned the justices that “exposing websites to liability for implicitly recommending third-party content defies the text [of 230] and threatens today’s internet.”
