
Supreme Court takes up a divisive issue: Should tech companies have immunity over problematic user content?

The family of Nohemi Gonzalez, who was killed in the 2015 Paris terrorist attacks, claims YouTube aided and abetted the spread of violent Islamist ideology.

The Supreme Court on Monday stepped into the politically divisive issue of whether tech companies should have immunity over problematic content posted by users, agreeing to hear a case alleging that YouTube aided and abetted the killing of an American woman in the 2015 Islamic State terrorist attacks in Paris.

The family of Nohemi Gonzalez, one of 130 people killed in a series of linked attacks carried out by the Islamic State militant group, commonly known as ISIS, argued that YouTube’s active role in recommending videos overcomes the liability shield for internet companies that Congress created in 1996 as part of the Communications Decency Act.

Nohemi Gonzalez. (Cal State via Facebook)

The provision, Section 230 of the act, says internet companies are not liable for content users post. It has come under heavy scrutiny from the right and the left in recent years, with conservatives claiming that companies are inappropriately censoring content and liberals saying social media companies are spreading dangerous right-wing rhetoric. The provision leaves it to companies to decide whether certain content should be removed and does not require them to be politically neutral.

The stakes could be enormous, because recommendations are now the norm for online services, not just YouTube. Apps such as Instagram, TikTok, Facebook and Twitter long ago began to rely on recommendation engines, or algorithms, to decide most of what people see, rather than emphasizing chronological feeds.

NetChoice, a trade group for tech corporations, said the industry needs as much flexibility as possible to decide what to take down or leave up.

“Without moderation, the internet will become a content cesspool, filled with vile content of all sorts, and making it easier for things like terrorist recruitment,” NetChoice counsel Chris Marchese said.

In a separate move, the court said Monday it would hear a related appeal brought by Twitter about whether the company can be liable under a federal law called the Anti-Terrorism Act, which allows people to sue people or entities who “aid and abet” terrorist acts. The same appeals court that handled the Gonzalez case revived claims brought by relatives of Nawras Alassaf, a Jordanian citizen killed in an Islamist attack in Istanbul in 2017, who accused Twitter, Google and Facebook of aiding and abetting the spread of militant Islamic ideology, which the companies deny. The question of Section 230 immunity has not yet been addressed in that case.

Twitter’s lawyers said in court papers that the company provides “generic, widely available services to billions of users who allegedly included some supporters of ISIS” and that it has regularly enforced policies preventing terrorists from using its services. The ruling in the Twitter matter will also affect the claims against Facebook and Google in the same case.

Gonzalez was a 23-year-old college student studying in France when she was killed while dining at a restaurant during the wave of attacks, which also targeted the Bataclan concert hall.

Her family is seeking to sue Google-owned YouTube, alleging it allowed ISIS to spread its message. The lawsuit targets YouTube’s use of algorithms to suggest videos for users based on content they have previously viewed. YouTube’s active role goes beyond the kind of conduct Congress intended to protect with Section 230, the family’s lawyers allege. They say in court papers that the company “knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence” that helped it recruit supporters, some of whom then conducted terrorist attacks. YouTube’s video recommendations were key to helping spread ISIS’ message, the lawyers say. The plaintiffs do not allege that YouTube had any direct role in the killing.

Gonzalez’s relatives, who filed their 2016 lawsuit in federal court in Northern California, hope to pursue claims that YouTube violated the Anti-Terrorism Act. A federal judge dismissed the lawsuit, but it was revived by the San Francisco-based 9th U.S. Circuit Court of Appeals in a June 2021 decision that also resolved similar cases the families of other terrorist attack victims had brought against tech companies.

Google’s lawyers urged the court not to hear the Gonzalez case, saying in part that the lawsuit would be likely to fail whether or not Section 230 applies.

The Supreme Court has previously declined to take up cases about Section 230. Conservative Justice Clarence Thomas has criticized it, citing tech giants’ market power and influence.

Another related issue likely to head to the Supreme Court concerns a law Republicans enacted in Texas that seeks to prevent social media companies from barring users who make inflammatory political comments. On Sept. 16, a federal appeals court upheld the law, which the Supreme Court had blocked in May from going into effect.