For a long time, the online encyclopedia Wikipedia has been criticized for not being a fully reliable source: anyone can edit it anonymously, and dis- and misinformation can be posted and may even persist. There is no consistent indication of reliability. Starting this fall, however, the ‘WikiTrust’ feature could have a great impact on the trustworthiness of Wikipedia.
WikiTrust is a system created by UCSC Wiki Lab researchers that indicates how trustworthy Wikipedia contributions are by assigning different shades of orange as the background color of new or edited text. Its algorithm calculates each author’s reputation: if an author’s contributions are preserved or built upon, he or she gains reputation, and if they are swiftly deleted or edited, he or she loses reputation. The shade of orange is derived from the author’s reputation; the lighter the shade, the more likely the author is to be trusted (see some screenshots here). When users view a page and do not edit or delete an author’s text, they implicitly contribute ‘trust’ to that author. This way, information that persists on a page is more likely to be accurate and reliable, and edits from unreliable sources might be noticed faster.
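To make the mechanism concrete, here is a hypothetical, heavily simplified sketch of such a reputation system in Python. The function names, starting values, and update amounts are all illustrative assumptions; the actual WikiTrust algorithm analyzes edit distances between revisions and is considerably more sophisticated.

```python
# Illustrative sketch only -- not the actual WikiTrust algorithm.
# Assumed model: reputation rises when a contribution survives review
# and falls when it is quickly deleted or rewritten; the text's
# background shade is then derived from the author's reputation.

def update_reputation(reputation: float, contribution_survived: bool,
                      amount: float = 1.0) -> float:
    """Raise reputation when a contribution is preserved or built upon,
    lower it (never below zero) when it is swiftly edited away."""
    if contribution_survived:
        return reputation + amount
    return max(0.0, reputation - amount)

def trust_shade(reputation: float, max_reputation: float = 10.0) -> float:
    """Map reputation to a background shade between 0.0 (deep orange,
    untrusted) and 1.0 (white, trusted)."""
    return min(reputation, max_reputation) / max_reputation

# A new author starts with zero reputation, so their text is deep orange.
rep = 0.0
rep = update_reputation(rep, contribution_survived=True)   # edit persisted
rep = update_reputation(rep, contribution_survived=True)   # built upon
rep = update_reputation(rep, contribution_survived=False)  # edit reverted
print(trust_shade(rep))  # 0.1 -> still a fairly dark shade of orange
```

The sketch also shows the cold-start issue discussed below: even a genuine expert’s early contributions are rendered in a dark shade, because shade tracks accumulated reputation rather than actual expertise.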
Actually, the WikiTrust software isn’t new at all. It has been available as an extension for MediaWiki since November 2008, and people who run their own wiki on MediaWiki can use the extension for free. The Wikimedia Foundation has also demoed WikiTrust a couple of times. But sometime this fall the researchers expect the (demo) feature to be added to the entire encyclopedia. Registered users will soon have the option to turn on a ‘trust info’ tab and view the colored text to find out more about the reliability of (the edits on) a page.
The Wikipedia community never really liked the word ‘truth’. As the WikiTrust wiki states: “Of course, the algorithms implemented in WikiTrust cannot discover ‘truth’, and cannot discover false information when all editors and visitors agree with it.” The concept is based on consensus, which is nothing new under the sun. Nevertheless, WikiTrust revolves around trusting the information on Wikipedia, and the system perhaps gives users a reason to place more general trust in it. But why wouldn’t people ‘trust’ pages containing errors and misinformation? The majority of users probably use Wikipedia very briefly and don’t bother about an author’s reputation. WikiTrust’s algorithm might therefore mark dis- and misinformation as trustworthy, if the author has a high reputation and nobody bothers to edit or delete that particular contribution. Conversely, an author will always start with a low reputation; even if you are truly an expert in a specific field of study, your first entries won’t give you a high reputation, no matter how knowledgeable your contribution is.
Despite the good intention of making Wikipedia a more reliable source, there are already skeptics who don’t believe WikiTrust will make a positive difference. A number of critical questions could arise: Does WikiTrust really improve Wikipedia’s reliability through authors’ reputation, or does it enable dis- and misinformation to be perceived, over time, as credible information by the (actions of the) crowd? Will the system separate the expert from the ‘lying amateur’, or will it instead keep experts from participating, since they’ll initially have the same reputation as the ‘lying amateurs’?
Before long, the WikiTrust software will be rolled out across the entire encyclopedia, and it will be very interesting to keep track of its impact. Will it be used often? And if so, are users aware of their influence on authors’ reputation? Should they be? Can an author’s reputation be misleading? Only time will tell. And ‘time’ itself will become an ever more important factor in improving the reliability of Wikipedia.