Submission + - Algorithm Rates Trustworthiness of Wikipedia Pages

paleshadows writes: Researchers at UCSC have developed a tool that measures the trustworthiness of each Wikipedia page. Roughly speaking, the algorithm analyzes the entire seven-year user-editing history and uses the longevity of content to learn which contributors are the most reliable: if your contribution lasts, you gain "reputation," whereas if it is edited out, your reputation falls. The trustworthiness of newly inserted text is a function of the reputation of all its authors, a heuristic that proved successful in identifying poor content. The interested reader can take a look at this demo (a random page whose white/orange background marks trusted/untrusted text, respectively; note the "random page" link on the left for more demo pages), this presentation (PDF), and this paper (PDF).
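The longevity-based reputation idea can be sketched as follows. This is a minimal, hypothetical illustration, not the UCSC researchers' actual algorithm; the function names, the per-fragment attribution, and the simple gain/loss update rule are all assumptions made for clarity.

```python
def update_reputations(revisions, gain=1.0, loss=1.0):
    """Sketch of longevity-based reputation (illustrative, not the real algorithm).

    Each revision is a pair (author, set_of_text_fragments). An author's
    reputation rises for every fragment of theirs that survives into the
    next revision and falls for every fragment that is edited out.
    """
    reputation = {}
    owner = {}       # fragment -> author who originally contributed it
    prev = set()     # fragments present in the previous revision
    for author, fragments in revisions:
        reputation.setdefault(author, 0.0)
        # Fragments removed since the last revision hurt their contributors.
        for frag in prev - fragments:
            reputation[owner[frag]] -= loss
        # Fragments that survived reward their contributors.
        for frag in prev & fragments:
            reputation[owner[frag]] += gain
        # New fragments are attributed to the current editor.
        for frag in fragments - prev:
            owner[frag] = author
        prev = fragments
    return reputation, owner


def trust(fragments, reputation, owner):
    """Trust of a piece of text: here, the minimum reputation among its
    contributors (one of several plausible aggregation choices)."""
    return min(reputation[owner[f]] for f in fragments)


# Toy edit history: alice's text mostly survives, carol removes one fragment.
history = [
    ("alice", {"a", "b"}),
    ("bob",   {"a", "b", "c"}),
    ("carol", {"a", "c"}),
]
rep, own = update_reputations(history)
print(rep)                      # alice's surviving text earns her the most
print(trust({"a", "c"}, rep, own))
```

Aggregating by the minimum makes a passage only as trustworthy as its least-reputable contributor, which matches the intuition that a single unreliable edit can taint a paragraph; a weighted average would be a gentler alternative.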