Big tech has a lot to worry about these days: congressional scrutiny, layoffs and more. But this week, all eyes will be on the U.S. Supreme Court as it takes up two cases with the potential to revolutionize the operation of social media companies.
The two cases, Gonzalez v. Google and Twitter v. Taamneh, stem from tragedies caused by terrorist attacks. The families of the victims are asking the justices to crack the hard shell of immunity from lawsuits arising out of third-party content posted on interactive websites such as Twitter, Facebook and YouTube.
The two cases have similar facts, but they raise slightly different questions as they come to the Supreme Court. Let’s take one at a time.
Algorithmic recommendations face scrutiny
In the Google case, to be argued on Tuesday, the estate and relatives of Nohemi Gonzalez sued the social media giant. Nohemi was a U.S. citizen and student in November 2015 when she was killed by terrorists who attacked a Paris bistro. The Islamic State group (IS) claimed responsibility.
In their lawsuit, Nohemi’s relatives claimed that Google, through YouTube, violated the Antiterrorism Act of 1990. That law authorizes American nationals to sue for injuries “by reason of an act of international terrorism.” It imposes liability on “any person who aids and abets, by knowingly providing substantial assistance” to anyone who commits “an act of international terrorism.”
The lawsuit claimed that Google allowed IS to post videos inciting violence and recruiting members. It also claimed that YouTube recommended IS videos to users through an algorithm identifying users who might be interested in those videos.
Google successfully moved to dismiss the relatives’ lawsuit by raising Section 230 of the Communications Decency Act of 1996. Section 230, much criticized today by some members of Congress, Justice Clarence Thomas and others, immunizes interactive websites such as Facebook, YouTube and Twitter from lawsuits arising from third-party content posted on those sites.
The U.S. Court of Appeals for the Ninth Circuit affirmed the trial court’s dismissal of the relatives’ lawsuit. In the Supreme Court, they have narrowed their case by asking the justices whether the Google-owned YouTube enjoys Section 230 immunity from a claim based on its algorithmic recommendations of third-party content to its users.
The Gonzalez appeal will be the first time that the Supreme Court has examined Section 230, which was enacted almost 30 years ago to encourage the growth of the internet.
A question of aiding and abetting
Gonzalez’s relatives are also part of the high court’s second case involving Twitter, which will be argued on Wednesday. While Gonzalez was murdered in Paris, Nawras Alassaf, Sierra Clayborn, Tim Nguyen and Nicholas Thalasinos were killed in separate terrorist attacks by IS in Istanbul and San Bernardino, California.
[Photo caption: In 2015, at least 130 people were killed in Paris in a coordinated attack carried out by Islamic State group terrorists. The Supreme Court will consider whether social media companies bear responsibility for content carried on their platforms that helps terrorist organizations communicate, fundraise and recruit. Photo by Jeff J Mitchell/Getty Images]
The families also sued Google, Twitter and Facebook under the Antiterrorism Act. They claimed that those platforms — by hosting and recommending IS content, particularly content used for recruitment, fundraising and communications — “knowingly provided substantial assistance” under the act and “aided and abetted” an act of international terrorism.
The relatives’ claims were dismissed without the trial court relying on Section 230. The Ninth Circuit again affirmed the dismissals, but with one exception. The appellate court said the family of Nawras Alassaf had plausibly stated an aiding-and-abetting claim that should be reconsidered by the trial court. Twitter, joined by the other two platforms, then asked the Supreme Court to review that decision.
Decisions could reshape social media
The immunity issues and how the justices decide them, whether under Section 230 or the Antiterrorism Act, could have sweeping implications for social media platforms — the content they post and the content they take down. Not surprisingly then, more than 70 “friend-of-the-court” briefs, mainly from the tech community supporting the platforms, have been filed in the high court. Reflecting the broad interests at stake, other briefs have been filed by states, religious groups, gun control organizations, business groups, former national security officials and members of Congress, among others.
The Biden administration has filed a brief in the Google case, arguing that Section 230 bars the claims by Gonzalez’s relatives that YouTube failed to block or remove third-party content, but that it does not shield YouTube from liability for its targeted recommendations of IS content to its users. In the Twitter case, the administration asked the justices to rule in favor of the social media platforms, noting that the plaintiffs “allege that defendants knew that ISIS and its affiliates used defendants’ widely available social media platforms, in common with millions, if not billions, of other people around the world, and that defendants failed to actively monitor for and stop such use.”
Those allegations, the administration argued, do not “plausibly” allege that Twitter “knowingly provided substantial assistance” to an international act of terrorism.
The two cases have many other aspects likely to engage and even bedevil the justices as they wade into this special arena for the first time. And it is not likely the last time that they will do so.
Perhaps even more controversial and significant are two cases awaiting the court’s decision on whether to hear them next term. NetChoice and the Computer and Communications Industry Association have challenged Florida and Texas state laws enacted in response to conservative complaints about censorship.
In NetChoice v. Paxton, the trade associations argue that the First Amendment is violated by a Texas law barring social media platforms with at least 50 million active users from blocking, removing or demonetizing content based on the users’ views. They contend the law also would prevent them from removing harmful content. A federal appellate court ruled in favor of the state.
A different federal appellate court ruled in favor of NetChoice’s challenge to a similar Florida law. The state has turned to the Supreme Court with its appeal.
The judge writing the Florida opinion said: “The question at the core of this appeal is whether the Facebooks and Twitters of the world — indisputably ‘private actors’ with First Amendment rights — are engaged in constitutionally protected expressive activity when they moderate and curate the content that they disseminate on their platforms.”
The justices have asked the U.S. solicitor general for her views on whether to grant review to the cases. The split between the two appellate courts increases the chances that the justices will agree to take the cases.