Prodigy Redux
Once we acknowledge that virtual worlds are just another type of online provider, the arguments being advanced to regulate their conduct sound very familiar. That's because we dealt with complaints about online providers "censoring" their customers at least 15 years ago.
Specifically, the Ludlow/EA incident mirrors a seminal event in Internet law. In the late 1980s, Prodigy Networks was a leading commercial online service that offered a self-contained universe of interactive tools, such as email, chat, message boards, and file downloads. In 1990, Prodigy terminated the accounts of subscribers who complained about its practices, which led to claims that Prodigy engaged in censorship.[35] Prodigy responded that it could control user-submitted content to create a family-friendly environment, just as a newspaper has the right to make editorial decisions about what it publishes.[36]
Prodigy may not have fully appreciated the consequences of its response. By analogizing itself to a newspaper, it implicitly invited courts to treat it like a newspaper in other respects as well. Five years later, a court did just that. In the 1995 Stratton Oakmont v. Prodigy decision, the New York Supreme Court (a trial court, despite its name) held that Prodigy could be held liable for user-submitted defamatory content on its network, just as a newspaper would be liable for publishing defamatory content.[37]
The Stratton Oakmont decision sent shockwaves through the nascent Internet industry. Providers seeking to offer family-friendly services feared they would be liable if they failed to catch and remove harmful user-generated content. Other providers felt compelled to implement new controls over user content, even if such efforts would inhibit the community's development or prove cost-prohibitive. Either way, the threat of liability pushed providers to increase their censorship of users.
Fortunately for the Internet's development, Congress overturned Stratton Oakmont nine months later by enacting Section 509 of the Communications Decency Act,[38] codified at 47 U.S.C. § 230 ("Section 230"). Section 230 grants online providers near-blanket immunity from liability for their users' content.[39] This immunity applies whether or not the online provider tries to control content it deems objectionable, meaning that online providers can figure out the best way to serve their communities. With this legal protection, a thousand online communities have bloomed, spanning the spectrum from tightly controlled to virtually unregulated.[40] This diversity has allowed individuals to find venues that serve their needs, giving customers the power to reward (or punish) providers for their choices. Section 230 played a nontrivial role in the Internet's ascent as a dominant medium, a development from which we have all benefited.
Prodigy's experience from the early 1990s teaches a valuable lesson. We want to give providers the option to exercise control over content they deem objectionable. As a result, we give providers a tremendous incentive—near-absolute immunization from liability—to exercise this option.[41] Yet those who object to EA's private censorship want to strip this discretion away from providers, just like those who complained about Prodigy 15 years ago. Fortunately, we know that the Prodigy story ends happily, with the proliferation of diverse and robust online communities. Why try to rewrite this ending?