News

Section 230 before the Supreme Court

In the US, Section 230 of the Communications Decency Act protects social media platforms from liability for the content hosted on their platforms. MOW posted about this in October, when the Supreme Court agreed to hear Gonzalez v Google and Twitter v Taamneh. Both cases relate to social media’s failure to moderate content and accuse YouTube and Twitter of complicity in the promotion of inciting content which drove ISIS recruitment and, therefore, indirectly, terrorist attacks in Paris and Istanbul.

See our post here.

There have recently been some highly significant developments in Gonzalez v Google. Last month the Department of Justice released an amicus brief asking the Supreme Court to vacate the judgment of the Ninth Circuit Court of Appeals, which upheld the district court’s decision in favour of Google that the plaintiffs’ non-revenue Antiterrorism Act (ATA) claims were barred by Section 230. The court likewise dismissed the plaintiffs’ revenue-sharing claims, which were based on ISIS-affiliated users having received revenue from Google-owned AdSense, because they did not “plausibly allege an ATA violation”.

The DoJ’s brief does not entirely endorse the arguments advanced by the plaintiffs, but it does argue that Section 230(c)(1) should not insulate online platforms from liability in all instances.

Section 230 does not, for instance, insulate a review website from claims that it manipulates the order of published reviews to extort businesses. Likewise, the DoJ argues that design choices can fall outside Section 230 immunity, drawing on a statement by Justice Thomas in Malwarebytes v Enigma Software. Justice Thomas notes that, in a number of decisions, courts have extended Section 230 far beyond its reasonable bounds. In Jane Doe No. 1 v. Backpage.com, for instance, the First Circuit Court of Appeals found that Backpage was shielded from accusations that it had violated federal prohibitions on sex trafficking because of its status as host rather than publisher of the illegal content. This was despite the fact that Backpage “deliberately structured its website to facilitate illegal human trafficking” by “accept[ing] anonymous payments, fail[ing] to verify emails, and stripp[ing] metadata from photographs to make crimes harder to track”.

The Department of Justice, like Justice Thomas, concludes that a blanket application of Section 230 is nonsensical. The legislation was designed to protect free expression online, not to provide a handy defence for website operators who knowingly promote and facilitate criminal activity on their platforms.

Instead, a flexible interpretation should apply: in Gonzalez v Google, the DoJ argues, Google should not be held responsible under Section 230 as the speaker or publisher of ISIS content hosted on YouTube. Google can, however, be challenged for “designing and implementing algorithms that result in the communication of a distinct message from YouTube”, in this case the promotion of radicalising ISIS content.

This absence of immunity does not necessarily mean Google will be found liable; liability will still need to be proven by the plaintiffs. But the distinction the DoJ draws between content hosting, where Section 230 shields intermediaries from liability, and content promotion, where it may not, is certainly useful.

Facebook, YouTube, Instagram and Twitter will, of course, in their singular mission to capture as many eyeballs as possible, continue to tailor newsfeeds to individual preferences, but they may well be encouraged to be more sparing in algorithmically promoting radicalising or damaging content.

See further analysis here.