
Big Tech Censorship: Section 230 makes its way to the Supreme Court

The Supreme Court is studying two cases that directly involve the provision that exempts technology platforms from liability for the content published on them.

The Supreme Court and Big Tech (Wikimedia)



Section 230 of the Communications Decency Act of 1996 is heading to the Supreme Court, which will be tasked with ruling on two cases involving this measure. The provision, considered by many to be essential to the development of technology companies, states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

With these words, the law protects large technology platforms (Twitter, Facebook and Instagram, among others) from lawsuits based on the content their users publish. In other words, it exempts these companies from liability for the content published on them. Until now. The Supreme Court has agreed to hear two cases that could change everything. They are also the first cases to bring the major technology companies before this court, something that has not happened since the Internet was created.

Should Google be blamed for the Bataclan massacre?

Gonzalez, Reynaldo et al. v. Google will determine to what extent Google can be held responsible for the Bataclan massacre that took place in mid-November 2015 in Paris. The lower courts dismissed the claim, and the Supreme Court has now agreed to take up the case.

The lawsuit, filed by the family of one of the victims, Nohemi Gonzalez, alleges that the Big Tech company should bear some responsibility for the videos inciting Islamist violence that were found on YouTube in the days after the attacks:

Whether Section 230 [the rule that in principle relieves companies of responsibility for the content of their users] applies to these algorithmically generated recommendations is of enormous practical importance [...]. Interactive computer services consistently direct such recommendations, in one form or another, to virtually every adult and child in the United States who uses social networks.

Google tried to defend itself, arguing that the only link between the Paris attackers and YouTube was that one of the terrorists frequently used the platform and, on one occasion, was featured in an ISIS propaganda video: "This court should not lightly adopt a reading of section 230 that threatens the basic organizational decisions of the modern internet."

Can Big Tech be sued for alleged complicity in terrorist acts?

The second case, Twitter v. Taamneh, concerns the 2016 attack on a nightclub in Istanbul that killed 39 people. In contrast to the Bataclan lawsuit, the lower courts allowed this case to proceed, ruling that Twitter, Facebook and Google should bear some responsibility for what happened at the Reina club during the 2016 New Year's Eve party. Now, it will be up to the Supreme Court to decide.

This case, unlike the previous one, has nothing to do with content recommended by an algorithm. It will determine whether social networks can be sued for alleged complicity in an act of terror. Here, the court will try to define what responsibility these platforms bear when hosting user posts that support terrorist groups, even if those posts do not refer to a specific attack.

The outcome of these two cases could change the Internet as we know it. Moreover, it brings back to the table a problem found all too often in Big Tech: censorship.
