Law in the Internet Society

Immunity for Platforms and its Utility in 2017

-- By EddyBrandt - 10 Nov 2017


In 1995, amid the internet's explosion of growth, a case was decided that threatened to derail the growth of online service providers (a term I use in this essay to refer primarily to providers of online platforms). The court in Stratton Oakmont v. Prodigy Services Co. held the defendant, Prodigy Services, liable for defamatory material posted by third-party users on its online forum. Prodigy's "conscious choice, to gain the benefits of editorial control, opened (Prodigy Services) up to a greater liability than... other computer networks that make no such choice." Stratton Oakmont. This decision was promptly abrogated by Section 230 of the Communications Decency Act of 1996 (the Act). The Act's rationale was clear: if we treat internet platforms as publishers for legal purposes, and impose liability on them for the illegal acts of their third-party users because the platforms chose to filter content, then the providers of those services will have less incentive to screen offensive material. As a society we have an interest in that screening, so we will not impose liability on internet platforms on the basis of their coquetry with publication. But has this symbiosis run its course? Has society's give outweighed its take?

Is Section 230 suited for 2017?

Society has certainly held up its half of the bargain: providers of many sorts have escaped liability for tortious acts committed by their users on their platforms. Goddard v. Google; Barnes v. Yahoo!, Inc. And while complaints have been filed in the courts as well as with the FEC, companies like Facebook have yet to face any liability as publishers. But much like the child explorers of the internet whom the Act sought to inoculate against indecent images in 1996, today young people and adults alike browse platforms awash in a greater, and likely unforeseen, form of danger: false information that masquerades as truth - fake news. The depth of Russia's interference in the 2016 US presidential election, involving thousands of paid-for ads on Facebook, has become public knowledge. Less well known are the news stories from the developing world. False information, shielded from government oversight by encryption, has set off mob attacks in India, killing several people. Facebook, which lacks an office in Myanmar, has become a breeding ground for hate speech and virulent posts about the Rohingya. Political institutions and real lives are at stake. Where is the bargained-for filtering that justifies the immunity granted to these platforms?

The Path Forward

It has been argued that leniency was crucial to Silicon Valley's explosion: that legal immunity subsidized a nascent industry, much as 19th-century common law embraced industrial development. And we need not disavow entirely the benefits of flexible regulation for online service providers to acknowledge that new circumstances necessitate change. Whether through Facebook's ad-targeting algorithms or Twitter's disposition toward soundbite-style communication, which allows bots and trolls to drown out more reasoned debate, curated social feeds are being manipulated to devastating effect on public discourse. And, as never before, tremendous power to direct the flow of information now belongs to a very small group of private individuals, whose decisions on the matter will have far-reaching consequences for the whole world, and life-or-death consequences for many.


One response to the mayhem - the one the platform giants advocate - is to let the companies self-regulate. There are arguments for at least some level of self-regulation: Professor Urs Gasser of Harvard University contends that platforms have not only the incentives to clean up their act, but also reservoirs of data, and the capacity to combine those incentives and resources into effective action. And on this front Facebook is responding: amid public outcry over the 2016 election, the company has embarked on a public relations campaign and implemented various features for combating the spread of false information. But the status quo of loose regulation and broad legal immunity has already produced damaging outcomes for many, and with a rapidly shifting news cycle that threatens to leave those failures in the past, there is little reason to believe that society can rest on these assurances alone.


As Gasser points out, total deference is insufficient on its own; gap-filling regulation, with an eye toward transparency, will be needed where self-regulation falls flat. Across the Atlantic, some have begun to act proactively and shown a willingness to encroach on the immunity of the platform giants: a new German law that fines social networks large sums for failing to remove hate speech posted by users recently went into effect, and Prime Minister Theresa May has stated that Britain is examining the role of Google and Facebook, and the publisher/platform distinction that has so far served to immunize both from much liability. And now, even in the United States, legislative measures that until recently would have drawn not a sliver of support from online platforms are winning their approval.

The regime of immunity has, in some relevant sense, failed. Society's response, still in the making, is unclear. But one thing is certain: without public discourse on legal liability for online platforms, it becomes much more difficult to imagine any serious change to a status quo that has facilitated a structural blow to American political institutions, and spurred several horrors in the developing world, right before our eyes.

Your draft adequately introduces the issues, though the introduction to the introduction is too long; everything from Stratton Oakmont to the 2016 election can be put succinctly in three or four sentences.

You haven't shown why there is any argument for continued "safe harbor" immunity at all. The "infant industry" subsidy makes no sense with respect to companies strong enough to control elections and sway governments. They are media companies, equipped not only with their own First Amendment rights like the publishers with whom they compete, but with special immunities that others don't have. They can no longer claim that they don't edit or shape content; that's the source of immense market power for them. They can only depend on the idea, statutorily defined, that the user is "another content provider," just like themselves for all the difference it makes to section 230.

So the place to begin is: the Web was centralized by the platform companies based on an extraordinary subsidy to centralization of function without aggregation of responsibility. A range of bad social effects immediately followed. Now the platform companies ask that "self-regulation" be decreed for the perpetuation of the subsidy, for which no sufficient argument has been given unless one believes that they are not powerful enough already, and need dispensation from the requirements of the rule of law as it applies to everyone else. Capacious as the First Amendment's protections of the media are, they need more. Just as Mr Zuckerberg bought all the houses around his own because he needed more privacy, right?

I think the best route to improvement here is to take the bull by the horns and give the best case you can for offering the legal immunity subsidy to the companies as they are now. If you can do that, and the case is any good, you have a Hell of an essay, and Kevin Martin of Facebook has a job for you.




r5 - 04 Dec 2017 - 21:53:47 - EbenMoglen