Wikipedia and the Politics of Verification

Slashdot regular contributor Bennett Haselton writes "The reports of Sinbad's death become greatly exaggerated. A Wikipedia contributor is unmasked as a fraud, raising questions about why he wasn't called out earlier. NBC airs a piece about how anybody can edit any article on Wikipedia, and errors creep in as a result. (Duh.) But what's most frustrating about all these controversies surrounding Wikipedia is that news reports describe these incidents as if they are a permanent, unsolvable problem with any type of community-built encyclopedia, when in fact there seems to be a straightforward solution." More words follow. Just click the link.

In its simplest form, couldn't a person's academic credentials be verified by sending a confirmation link to their .edu e-mail? (Which could be identified as a faculty address either by a domain name like "faculty.schoolname.edu", or by a Web page in the faculty section of the school's Web site identifying the person's e-mail address?) And then once the user's bona fides have been verified in this or some other way, couldn't they put their seal of approval on any article whose contents need to be considered reliable, or that readers want to cite as an authoritative source? In this way, with only a few minutes of effort and without changing a single word of the article, its value is increased many times -- surely one of the best possible trade-offs in terms of effort versus reward. (As for the question of "What experts would do this?", the answer is, presumably the same people who contribute to sites like Wikipedia currently. If their motives are altruistic in the first place, hopefully they would be willing to take this extra step if they knew it would increase the article's usefulness.)
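
To make the idea concrete, here is a minimal sketch of what that confirmation step might look like. Everything in it -- the helper names, the regular expression for spotting faculty-style addresses, the in-memory token store -- is invented for illustration, not anything Wikipedia or Citizendium actually does.

    # Minimal sketch of .edu-based credential confirmation (illustrative only).
    # The domain check and token store are stand-ins for whatever a real site
    # would use; "send_email" is passed in so the sketch stays self-contained.
    import re
    import secrets

    PENDING_TOKENS = {}  # token -> email address awaiting confirmation

    # Assumption: a faculty address looks like user@faculty.school.edu or user@school.edu
    FACULTY_EDU = re.compile(r"^[\w.+-]+@(?:faculty\.)?[\w-]+(?:\.[\w-]+)*\.edu$", re.IGNORECASE)

    def request_verification(email, send_email):
        """Mail a one-time confirmation link if the address looks like a .edu faculty address."""
        if not FACULTY_EDU.match(email):
            return False
        token = secrets.token_urlsafe(16)
        PENDING_TOKENS[token] = email
        send_email(email, "Confirm your credentials: https://example.org/confirm?token=" + token)
        return True

    def confirm(token):
        """Called when the user follows the link; returns the verified address, or None."""
        return PENDING_TOKENS.pop(token, None)

In practice the domain pattern alone wouldn't distinguish faculty from students -- exactly the objection raised in the comments below -- so the check against the school's faculty directory page mentioned above would still be needed.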

Something like this model is planned by the operators of Citizendium.org, a Wikipedia alternative (I balk at using the word "rival" although it is inevitable that people will see them that way). The last time I wrote about Citizendium, some thought it sounded like such a valentine to the project that they wondered if I was a shill; actually, sometimes a project just comes along that aligns almost exactly with what I would have done if I could have re-done a popular project like Wikipedia with a few design changes, and when that happens, I just say so. Some others may have wondered if I was sucking up for a board position or something. No, that would be, like, work. But I think they have some good ideas that will make them a more useful alternative in some cases, unless Wikipedia copies back some of their ideas in order to serve both needs at once, which would also be a good thing.

Consider the two major issues on which Citizendium is planning to take a different approach from Wikipedia: (1) user verification, and (2) putting published articles into an "approved" state under the stewardship of a credentialed editor, who has to sign off on any future changes to the article. The issue of user verification can be further divided into two sub-issues: (a) verifying users for the purpose of ascertaining their credentials, and (b) verifying users for the purpose of limiting the amount of vandalism committed by new users under pseudonyms. (While editorial control on Citizendium means that it is not possible to vandalize the public-facing version of an article after it has gone into an "approved" state, users can still vandalize an article while it is a "work in progress" being built up towards the first milestone where it can be approved. Citizendium founder Larry Sanger says that such vandals are surprisingly, pathetically motivated even though their work is only seen by a small audience.)

On the first issue, the one of verifying user credentials, I think the verification of .edu addresses especially would be a cheap and easy way to increase the value of every article that that user writes or signs off on. I don't think, however, it's necessary to go as far as Citizendium is currently planning on going, by requiring real names and biographies of all users. My thinking is that if an article is synthesized by 100 monkeys with typewriters but the finished product is given the blessing of a credentialed professor of physics, it's pretty much just as reliable as if the professor had written it themselves. And if the same article gets the blessing of multiple credentialed experts, it could justifiably be considered more reliable than many printed sources written by a single author. The point is that the credentials that matter are those of the people who stake their reputation on the accuracy of the article, not necessarily those of the people who contribute to it. So on this front, I think that while Wikipedia asks too little of users' backgrounds, Citizendium's current plan would ask too much, because as long as you have the credentials of one person who has signed off on an article, collecting non-verifiable bios of the article's other contributors doesn't actually gain anything.

The other side of verifying credentials is the use of credentials to prevent vandalism. In this situation it's not necessary to verify that the user actually is who they say they are; the system only needs to ensure that the same user is not signing up over and over again after previous accounts get banned for abuse. (You could ban users by IP address, but tools like Tor make it easy for users to connect from what appears to be a different IP address every time.) A blog post from Citizendium founder Larry Sanger lists three possible approaches instead: (a) requiring existing user X to vouch for new user Z before Z can join; (b) requiring new user Z to provide a link to a "credible" Web page establishing their identity; or (c) requiring new user Z to provide a link to a "credible" Web page of some person X who can vouch for Z's identity. I don't know how quickly a system could grow by referrals only -- after all, I was surprised that GMail took off so quickly during the period when you could only join with an "invite" from an existing user. Then again, GMail was giving away something for free that almost everyone could use, so most people who wanted it would find themselves closely linked to someone else who had it. Citizendium, on the other hand, asks not what they can do for you but what you can do for them, and so might not achieve enough penetration to spread by referrals only.
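
All three of Sanger's options boil down to the same gate: a new account doesn't become a full member until someone already trusted is on record vouching for it. Here is a rough sketch of option (a), with invented names and data structures.

    # Rough sketch of approach (a): a new account stays pending until an
    # existing, verified user vouches for it. Class and method names are
    # invented for illustration.
    class Registry:
        def __init__(self):
            self.verified = set()   # established users who may vouch for others
            self.pending = {}       # new username -> voucher username (or None)
            self.vouchers = {}      # activated username -> who vouched for them

        def register(self, new_user):
            self.pending[new_user] = None

        def vouch(self, voucher, new_user):
            if voucher in self.verified and new_user in self.pending:
                self.pending[new_user] = voucher

        def activate(self, new_user):
            voucher = self.pending.get(new_user)
            if voucher is None:
                return False
            # Keep the voucher on record so later abuse can be traced back to it.
            self.vouchers[new_user] = voucher
            self.verified.add(new_user)
            del self.pending[new_user]
            return True

Options (b) and (c) would replace the vouch() call with a check of the linked Web page, but the bottleneck is the same one raised above: whether enough trusted users exist to seed the referral chain in the first place.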

I suggested that one alternative would be to send a postcard to each new user's physical address with a unique six-digit number, which they would have to enter to complete their registration, thereby verifying that each new user really was unique. The problem here, apart from the privacy concerns, is the delay that users would incur before their registration was complete, which would take away the "instant gratification" of starting to contribute right away. (You could let users edit before their address is verified, but that would just enable the same person to keep re-creating new accounts with unique but fake addresses, and use them to commit vandalism before the accounts were found out.)

Another idea would be that a new user's first, say, three edits would go into a queue to be reviewed by verified users, and once those first three edits have been approved, the user is able to make edits in real time. (Since anybody could check that a new user's edits were not spam, any Citizendium volunteer who happened to be online could review the latest entries in the edit queue and approve them, so the queue would be cleared very quickly.) It's true that a user could game this system by, for example, submitting three minor improvements, and then using their unblocked account to vandalize articles while they're being worked on. However, even in this case, the "vandal" would probably end up making a net positive contribution to the site, because of the three small improvements they'd already made. If it would cost a legitimate Citizendium volunteer more effort to make those three small improvements than it costs to let a new user make those constructive changes, then ban them and revert their destructive changes once they're caught committing vandalism (and the latter wouldn't take much effort at all), then Citizendium has actually gotten a good deal out of the "vandal"! (To make this work, a user's first contributions could not be "neutral" changes like replacing one word with a synonym; they would have to be actual improvements, even small ones, thus ensuring that the net effect of a potential "vandal" is positive.) There may be other possible solutions; these are just alternatives in case the model of referral by trusted users turns out not to work.
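
As a sketch, with a made-up threshold and invented names, the probation queue described above might look like the following; the key property is that nothing from an unproven account goes live until a volunteer approves it, and only approved edits count toward the three-edit threshold.

    # Illustrative sketch of a probation queue for new users; the threshold
    # and structure are assumptions, not anything Citizendium has announced.
    APPROVAL_THRESHOLD = 3

    class EditQueue:
        def __init__(self):
            self.approved_counts = {}   # user -> number of edits approved so far
            self.queue = []             # (user, edit) pairs awaiting review

        def submit(self, user, edit, publish):
            if self.approved_counts.get(user, 0) >= APPROVAL_THRESHOLD:
                publish(edit)           # trusted user: edits go live in real time
            else:
                self.queue.append((user, edit))

        def review(self, is_improvement, publish):
            """Any volunteer who is online can drain the queue; per the article,
            only genuine (if small) improvements count, not neutral edits."""
            while self.queue:
                user, edit = self.queue.pop(0)
                if is_improvement(edit):
                    publish(edit)
                    self.approved_counts[user] = self.approved_counts.get(user, 0) + 1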

Now switching to the other side of the reliability issue: whether the default article displayed to the public for a given topic should be the latest "stable" version approved by credentialed users, or the very latest version incorporating all edits submitted by any user whatsoever. Having talked with members of the Citizendium and Wikipedia communities in their respective forums, I've found there appear to be three schools of thought on the article stability issue. The first is that the whole idea of putting articles into an "approved" state and moderating all changes going forward goes against the "spirit" of wikis in general and Wikipedia in particular. The second, suggested on the Wikipedia discussion list by Sheldon Rampton, is that it would be a useful feature if credentialed users could select certain page versions in the page history and "sign off" on the accuracy of one of those past versions; the page displayed by default would be the bleeding-edge latest one (with all of the possible vandalism and inaccuracies that entails), but users who wanted a reliable, citable source could look in the history. The third school of thought is that reliability is so valuable that the default page displayed to the public and carrying the stamp of the project should be the latest version approved by credentialed editors -- the model that Citizendium currently has in mind.

I'm not really partial to the first view, since I think the success of the project should be measured by how well it achieves its goals (whatever you define those goals to be) and not by whether it kept to its original "spirit". Since Wikipedia has far more readers than contributors, if your motivations for contributing to or maintaining Wikipedia are at all oriented towards doing good for other people, presumably meeting the needs of readers is more important than keeping the party going for contributors (provided, of course, that the environment for contributors is at least pleasant enough to keep them contributing). The choice between the second and third points of view is more interesting. There's no obvious best-of-both-worlds choice here, because what motivates many contributors (the fact that their changes go live to the entire world, right away) is also what motivates vandals.

On the other hand, the problem doesn't sound unsolvable. You could go with the Citizendium model of editor-approved changes but create a prioritized system for "urgent" updates, for changes made to an article to incorporate current events. Suppose users (who have been verified using one or more of the methods above) are each issued a certain number of "credits" that they can use to mark a proposed update as an urgent, breaking change. (Misusing these credits to mark as "urgent" changes that really aren't would be considered abuse tantamount to spamming or vandalism.) Then let's say, for example, Anna Nicole Smith dies. A user could submit this change to the Anna Nicole Smith article, along with a link to a reliable news source (e.g. a wire service story) and a credit marking the change as "urgent". Since an editor would not need any particular expertise to view the article and verify that the change was accurate, any editor could review the "urgent request queue" and approve that particular change for publication; the queue would therefore be checked frequently throughout the day, and urgent updates would get pushed through quickly. Thus the site could keep pace with breaking current events without the kind of inaccuracies that plagued Kenneth Lay's Wikipedia entry when he died.
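
A sketch of that credit mechanism, with an invented credit allotment: the point is only that marking a change "urgent" spends a scarce credit and drops it into a queue that any editor can clear after checking the cited source.

    # Hypothetical sketch of the "urgent update" credits; the allotment and
    # queue handling are assumptions for illustration.
    STARTING_CREDITS = 5

    class UrgentUpdates:
        def __init__(self):
            self.credits = {}        # user -> remaining urgent credits
            self.queue = []          # (user, article, change, source_url)

        def submit_urgent(self, user, article, change, source_url):
            remaining = self.credits.setdefault(user, STARTING_CREDITS)
            if remaining <= 0:
                return False         # out of credits: use the normal review route
            self.credits[user] = remaining - 1
            self.queue.append((user, article, change, source_url))
            return True

        def review_next(self, editor_confirms, publish, flag_abuse):
            """Any editor can check the cited source and push the change live;
            bogus 'urgent' requests are treated like spam or vandalism."""
            if not self.queue:
                return
            user, article, change, source_url = self.queue.pop(0)
            if editor_confirms(change, source_url):
                publish(article, change)
            else:
                flag_abuse(user)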

So there's a trade-off there, between displaying all the latest changes by default (motivating people to contribute but also running the risk of vandalism) and displaying only the latest editor-approved page. Where there is no trade-off, as far as I can see, is in the option of simply having an editor-approved version of a given page -- whether it's displayed by default, or only stored in the version history where people can look for it. To me, both of these steps seem to consist of pure gain for relatively little effort:

  1. Verify credentials of academic professionals by sending a confirmation link to their .edu address.
  2. Allow them to give their "blessing" to certain versions of a page in the page history, so that users can rely on those specific page versions and even cite them as sources where appropriate.
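
Putting the two steps together, a sketch of what the sign-off record might look like (revision IDs, names, and storage are all placeholders):

    # Sketch of step 2: a verified expert "blesses" a specific revision, and
    # readers can ask for the newest blessed revision to read or cite.
    # The expert registry and revision IDs are placeholders.
    class SignOffs:
        def __init__(self, verified_experts):
            self.verified_experts = set(verified_experts)  # e.g. from the .edu check above
            self.blessed = {}                               # article -> {revision_id: [experts]}

        def bless(self, expert, article, revision_id):
            if expert not in self.verified_experts:
                return False
            self.blessed.setdefault(article, {}).setdefault(revision_id, []).append(expert)
            return True

        def latest_citable(self, article, history):
            """history: the article's revision IDs in order, oldest to newest."""
            blessed = self.blessed.get(article, {})
            for revision_id in reversed(history):
                if revision_id in blessed:
                    return revision_id, blessed[revision_id]
            return None, []

Whether latest_citable() feeds the default page (the Citizendium model) or just a "citable version" link in the history (Rampton's suggestion) is exactly the trade-off discussed above; the record itself is useful either way.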

So I hope that Citizendium will help bring more prominence to the idea, and that something similar might get incorporated back into Wikipedia. The approval of an identity-verified expert can improve an article's value so much, for such comparatively little extra effort, that it makes no sense not to have that option.


Comments:
  • by CastrTroy ( 595695 ) on Wednesday March 28, 2007 @12:27PM (#18516745)
    As far as I know, .edu addresses are for United Statesian schools only. I live in Canada, and every school I know uses .ca addresses. So using a system like this, only Americans could participate. Also, again, I'm unsure of what credentials are required to get a .edu domain, and whether or not you have to be an actual school, or how easy it is to fake. Also, I believe students as well as professors have email addresses as @schoolname.edu, so I don't believe that just having a .edu address is enough to say you are qualified.
  • Re:ok I'll bite (Score:5, Informative)

    by eln ( 21727 ) on Wednesday March 28, 2007 @12:29PM (#18516767)
    You make a good point about subjects where no credentials exist (a PhD in Happy Days-ology?), but this is an imperfect solution even when credentials are available.

    Even if credentials are valid, that doesn't stop people from making errors. If a credentialed professor puts his stamp of approval on a piece of information on the Wiki that he mistakenly believes is correct, that makes it a lot harder to correct that mistake, because the professor will be seen by most of the editors as the authority on the subject. Even credentialed professors are not infallible.

    Also, what about a professor with a PhD in one field posting inaccurate information in another, closely related, field? The professor may honestly believe his information is correct, but that doesn't make it so. In this case, the incorrect information will stay on the Wiki until someone with more relevant credentials can successfully argue against it.

    The only way to make sure information is correct (or at least in line with current academic thought on the subject) is to back up every statement of fact on the Wiki with a peer-reviewed authoritative source, such as a study published in a reputable journal or something like that. While this may sound like an ideal, I don't think it's really a reachable goal if you want to maintain the vast amount of knowledge that Wikipedia currently maintains.

    The issue here is whether Wikipedia wishes to remain true to its original goal of being a community-edited encyclopedia or not. If so, it has to deal with the problems that come from that, and which are exacerbated by enormous popularity. The more popular you are, the more "undesirables" you attract: vandals and people who are trying to use your platform to spread an agenda. The only way to effectively combat this is with dedicated editors willing to spend an enormous amount of time policing the site for vandals, and doing research on disputed statements of fact. Trusting credentials, even verified ones, rather than research is a shortcut, and one that just isn't going to work as well as people seem to think it will.
  • by Mycroft_514 ( 701676 ) on Wednesday March 28, 2007 @12:54PM (#18517121) Journal
    .edu addresses can easily be faked, as evidenced by one George Burgess, who works at the University of Florida in Gainesville. He is cited by publications all over as a shark expert and as "Dr. Burgess". In reality, he only has a BS degree in an unrelated field and happens to maintain a portion of the University's museum assets. He rolls the .edu address into "his" respectability by hosting his shark website on the university ISP.

    Another example was a professor at the United States Naval Academy (a full professor) who was quickly terminated when it came to light that he had faked his resume and did not have a PhD.

    Sorry, your idea just doesn't cut the mustard.
  • by jandrese ( 485 ) <kensama@vt.edu> on Wednesday March 28, 2007 @12:59PM (#18517179) Homepage Journal
    Well, once you finish your paper you should go back and correct the errors you found in Wikipedia (don't forget to cite your sources). I find that in general the more citations an article has, the more reliable it generally is, at least for scholarly subjects.
  • Re:ok I'll bite (Score:4, Informative)

    by ZachPruckowski ( 918562 ) <zachary.pruckowski@gmail.com> on Wednesday March 28, 2007 @01:09PM (#18517315)
    Three quick comments:

    1) You're correct that no one is infallible, but we have two or more experts on every approved article. That certainly helps reduce the odds of mistakes. That keeps one expert from running roughshod over a whole field too.

    2) Having credentials does not excuse people from citing sources. Fortunately, most people with credentials understand "reliable sources" better than most others. In the last three months, I haven't run into the issue of an expert saying "I've got a PhD, I don't need a source."

    3) We have articles which are approved, and others which are not. This allows for articles on Happy Days or articles that aren't up to snuff yet to exist, while preventing negative drift of good articles [wikipedia.org] by locking those articles from unreviewed changes.

    Zach Pruckowski - Citizendium Executive Committee
  • Re:ok I'll bite (Score:3, Informative)

    by jonbritton ( 950482 ) on Wednesday March 28, 2007 @01:26PM (#18517557) Journal
    The power of a wiki is that anyone can edit, so anyone can fix the mistake.

    More to the point, anyone can edit/revise/delete, and you're supposed to read the system with that in mind. Should we really be building public resources that are aiming at flawless, objective truth? Or should we be encouraging everyone to develop better bullshit detectors, read more skeptically and demand reasonable evidence?

    I read plenty of Wikipedia articles that are just nonsense. Not that I can refute their claims, but I (or anyone with a semester of college composition under their belt) can spot an underdeveloped point, "facts" stated without reference or evidence, or opinion and speculation. An encyclopedia doesn't replace a textbook, and it isn't meant to -- it's basically an essay about the subject.

    It's easy enough. Before you start reading Wikipedia, read this: http://en.wikipedia.org/wiki/Wikipedia:Manual_of_Style [wikipedia.org] See any of the sins mentioned in there, in the article you're reading? Get a second opinion. It's the internet, there's no shortage of informal, shallow overviews out there.
  • Re:ok I'll bite (Score:3, Informative)

    by voice_of_all_reason ( 926702 ) on Wednesday March 28, 2007 @01:47PM (#18517873)
    "wikipedia is an encyclopedia and as a result all material must be properly referenced"

    You are incorrect. Note that the page says it is "a guideline... and is considered a standard that all users should follow. However, it is not set in stone and should be treated with common sense".

    It applies this guideline to "all material that is challenged or likely to be challenged." Therefore, "The sun is made up mostly of hydrogen" does not need a reference.
  • Re:ok I'll bite (Score:4, Informative)

    by nuzak ( 959558 ) on Wednesday March 28, 2007 @02:08PM (#18518171) Journal
    It's true that credentials don't automatically equal veracity, even when subjected to peer review. What's hard to dispute however is that they're still far more trustworthy than the current situation in Wikipedia. Did you know phlegm causes vomiting in rodents? That's a fact that was in WP before I deleted it for lack of a cite, let alone the dubiosity (no, I'm not particularly fascinated with phlegm: I was looking up the notion of "humours", e.g. phlegmatic, and surfed there). That's wikiality for you.
  • by Anonymous Coward on Wednesday March 28, 2007 @03:42PM (#18519405)
    That's what you think. Apparently there are a bunch willing to do it without rewards.

    Let me dumb it down for you: The dude wasn't talking to you.
