- Measuring Wikipedia's Success
- Threats to Wikipedia
- Wikipedia's Response to the Vandal and Spammer Threats
- Conclusion
Wikipedia’s Response to the Vandal and Spammer Threats
The previous section explored how vandals and spammers constantly attack Wikipedia. This section considers how the Wikipedia community has responded to those threats.
Increased Technological Barriers to Participation
Over time, Wikipedia has implemented technological measures to make it harder for spammers, vandals, and casual users to add or edit site content, including:
- Restricting the creation of new articles only to registered users38
- Blocking IP addresses of repeat offenders, such as a controversial block of all IP addresses owned or operated by the Church of Scientology39
- Requiring new and anonymous users to solve a CAPTCHA40 before adding new external links41
Also, Wikipedia administrators can technologically restrict editing of certain pages.42 A page with “full protection” means that only Wikipedia administrators can edit the page, and a page with “semi-protection” can be edited only by autoconfirmed43 Wikipedia users.44 Although articles covered by full protection remain relatively rare,45 “[s]emi-protection is now quite common for pages on subjects in the news headlines.”46
All of these practices restrict, and therefore are inconsistent with, free editability. Overall, however, Wikipedia’s current technological restrictions are fairly modest. For the most part, anyone can edit Wikipedia at any time, and the current technological hurdles modify that statement only slightly. Nevertheless, Wikipedia has been progressively adding new editing restrictions, which I think is consistent with a macro-trend to slowly “raise the drawbridge” on the existing site content and suppress future contributions.47 If so, Wikipedia may be incrementally moving away from free editability.
Recently, the English-language Wikipedia site has been considering a more dramatic movement away from free editability: a technological measure called Flagged Revisions.48 (Several Wikipedia sites around the world, including Germany’s and Russia’s, already deploy Flagged Revisions.)49 Flagged Revisions would make edits from casual contributors effectively invisible until approved by a more trusted Wikipedia editor.50
Flagged Revisions would change Wikipedia in two significant ways. First, many contributors would no longer be able to instantly publish their contributions. Second, ultimate publication of most users' contributions would be predicated on an editor accepting the contribution.51 Thus, Flagged Revisions would mark the effective end of Wikipedia's free editability. Everyone could still try to make edits, but only a fraction of those edits would be approved for publication, and the remainder would be effectively discarded.
At the time of this writing (April 24, 2010), Wikipedia is planning to try a less restrictive alternative to Flagged Revisions called Flagged Protection and Patrolled Revisions.52 Flagged Protection is an alternative to categorizing problematic pages as semi-protected or fully protected, both of which prevent editors with insufficient credentials from editing the page at all. Instead, problematic pages could be subject to Flagged Protection, which would allow everyone to edit the page, but only contributions from editors with the requisite credentials would be published to unregistered readers immediately.53 All other changes would require some level of approval before being published to unregistered readers. Although Flagged Protection is consistent with more drawbridge-raising, it is, in some ways, more permissive than the current semi- and full-protection options because everyone can still edit every page (even if their edits never get approved).54 Further, so long as any of the protection options (semi, full, or flagged) remain infrequently used, these measures do not really change the general proposition that anyone can freely edit most of Wikipedia.
Patrolled Revisions allows editors with the requisite credentials to mark certain edits as not vandalism,55 informing other editors that they do not need to spend time making the same no-vandalism determination. Thus, Patrolled Revisions facilitates communication among editors and enhances the anti-vandalism systems already in place.
Collectively, Flagged Protection and Patrolled Revisions are part of the drawbridge-raising progression, but they are also consistent with the current assessment that Wikipedia has avoided significant incursions on free editability. Part 2 of this article suggests that more dramatic technological measures are inevitable.
Increased Social Barriers to Participation
Although Wikipedia has successfully resisted significant technological barriers to editing, I think its main barriers to user participation currently are social, not technological. For example, even without Flagged Revisions, many user contributions simply do not remain published on the site because other editors quickly delete new articles56 and revert edits.57 In these cases, the user contributions may be momentarily published but are quickly erased. Knowing that it is hard to make sustainable contributions, some users choose not to participate.58 Other users whose contributions are erased never come back.59
Why has it become so hard for users to make contributions that actually stick? Xenophobia is a major contributing factor.60 Due to the constant threat of spam and vandalism, some Wikipedia editors become socialized to assume that site edits are made by bad folks for improper purposes,61 thus developing a “revert first” mentality.
The adverse presumptions especially apply to unregistered or unsophisticated users who do not comply with Wikipedia's cultural rituals, such as signing talk pages.62 By failing to conform to the rituals, these contributors implicitly signal that they are Wikipedia outsiders, which increases the odds that Wikipedia insiders will target their contributions as a threat. As one book says, "If you're editing and aren't logged in, you're in some sense a second-class citizen on the site. Expect less tolerance of minor infractions of policy and guidelines."63 This insider xenophobia is a more significant incursion on free editability than any technological measure because it leads to quick screening of user contributions, both illegitimate and legitimate.
Even if social barriers presumptively block free editability, anyone can overcome these barriers by becoming a Wikipedia insider. Insider status is open to everyone and does not depend on any credentials, experience, or specific domain expertise.64 However, becoming a Wikipedia insider requires more than just showing up. To gain enough status to reduce the chances of xenophobic reversions, a contributor must incur non-trivial costs. The contributor is expected to build a user page,65 learn Wikipedia-specific technological codes,66 discuss proposed changes with other editors before editing an entry,67 submit to an arcane dispute resolution process,68 learn a “baffling culture rich with in-jokes and insider references,”69 and survive a sometimes rough-and-tumble milieu.70
Thus, becoming a Wikipedia insider requires a fairly significant commitment. For many contributors, the benefits of insider status are not worth these required investments,71 leaving those contributors (and their contributions) vulnerable to xenophobic reversion. As a result, despite Wikipedia's vast readership, only a few of those readers have the actual ability to make lasting improvements to the site.72