Technology

Wikipedia to Tinge Suspect Entries With Orange Cast

Wikipedia plans to roll out a new feature with the goal of enhancing the site’s credibility.

Called “WikiTrust,” the optional feature color-codes entries based on reliability, according to a Wikipedia page describing the new development.

The Wikimedia Foundation did not return TechNewsWorld’s request for comment in time for this article’s deadline.

Check Text Tab

The color-coding tool gives users a “check text tab” that reveals the author, origin and reliability of the text. The intent is to highlight spam, surreptitious changes and outright information-tampering by contributors who might have ulterior motives for making changes, such as a company’s competitors or a politician’s opposition party.

WikiTrust lets users see the edit compared to the original text, and it allows them to access information on the author of the edit. The trustworthiness of an entry is computed according to the author’s reputation, as well as the reputation of users who have subsequently revised the text and the article where the text appears.
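
Few details of the computation have been published, but the description above suggests a blend of the original author’s reputation with that of later revisers. The minimal Python sketch below illustrates one way such a score could work; the function name, the 0.6 author weighting and the averaging of reviser reputations are all hypothetical assumptions, not WikiTrust’s actual formula.

    def text_trust(author_reputation, reviser_reputations, author_weight=0.6):
        """Hypothetical trust score in [0, 1] for a span of text.

        Starts from the original author's reputation, then nudges the
        score toward the reputations of editors whose later revisions
        left the text in place (implicitly vouching for it). The
        weighting is illustrative, not WikiTrust's published formula.
        """
        trust = author_weight * author_reputation
        if reviser_reputations:
            # Surviving review by reputable editors raises confidence.
            endorsement = sum(reviser_reputations) / len(reviser_reputations)
            trust += (1 - author_weight) * endorsement
        return max(0.0, min(1.0, trust))

    # Text by a mid-reputation author, later revised and kept by two
    # high-reputation editors, earns a moderately high trust score.
    print(text_trust(0.5, [0.9, 0.8]))  # 0.64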

Color codes associated with “text trust,” as it’s called, are displayed as background colors in the check text tab: White indicates high trust, while less trustworthy text is highlighted in orange, with darker shades corresponding to lower computed levels of trust.
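
A straightforward way to render that scheme is to map each trust score to a background color that fades from deep orange at zero trust up to plain white at full trust. The Python sketch below does exactly that; the 0.9 cutoff and the particular orange hue are assumptions for illustration, not Wikipedia’s published palette.

    def trust_color(trust):
        """Map a trust score in [0, 1] to a CSS background color:
        white for high trust, deepening shades of orange as the
        computed trust falls. Threshold and hue are assumed."""
        trust = max(0.0, min(1.0, trust))
        if trust >= 0.9:
            return "#FFFFFF"  # fully trusted text keeps a white background
        # Blend the green and blue channels from a deep orange (trust 0)
        # up toward white (trust approaching the 0.9 cutoff).
        fade = trust / 0.9
        green = int(0xA5 + (0xFF - 0xA5) * fade)
        blue = int(0xFF * fade)
        return f"#FF{green:02X}{blue:02X}"

    for score in (1.0, 0.7, 0.4, 0.1):
        print(score, trust_color(score))  # whiter = more trustworthy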

Wikipedia presented a demo of the technology, based on an experimental Firefox extension, at Wikimania 2009. The extension is still in beta; Wikipedia plans to release a full demo shortly.

Tailoring Edits

It is difficult to gauge the efficacy of the tool in the absence of more information, such as how Wikipedia will compute an author’s reputation. Will judgments be rendered automatically through some technological means, or manually by individuals?

Depending on how well the tool works, it could be an important step forward for Wikipedia as it seeks to establish greater authority for its ubiquitous online encyclopedia.

While many, perhaps even most, Wikipedia entries are factually accurate and complete, there have been several cases in which entries were edited by authors with suspect motives. These incidents often involve politicians or celebrities.

Wikipedia has implemented safeguards for the most well-known figures. U.S. President Barack Obama’s Wikipedia page is semiprotected and can be edited only by established registered users.

Apart from such high-profile cases, though, many people, especially in academia, view even routine Wikipedia entries on noncontroversial topics as suspect and have banned the site as a source for research papers.

“Right now, Wikipedia doesn’t have the standing of a published encyclopedia, and part of that is because it allows just about any entry to be edited by anyone — there is no proof that these authors or editors have the necessary background to speak authoritatively on a subject,” Ken Saunders, president of Search Engine Experts, told the E-Commerce Times.

The new tool may alter that perception.

“Anything, in fact, that would help users understand the quality or potential lack of quality of an edit would be an improvement,” added Saunders.

College students rely too heavily on Wikipedia, said Scott Testa, a business professor at Cabrini College, but the problem is not limited to that site. Rather, many younger people have developed the habit of automatically turning to the Internet to find authoritative resources.

Wikipedia is not riddled with errors, in Testa’s view: “Nine times out of 10, I would say, an entry is accurate.”

The problem is the remaining 10 percent.

“That has soured it as a source, especially for people in academia,” noted Testa.

WikiTrust will help — as will the larger push to greater transparency and credibility on the Web.

“Web 2.0 information sources, in general, are moving in this direction,” Testa observed. “It is a sign that the technology is maturing.”

1 Comment

  • This article is skewed in favor of the accuracy of Wikipedia. I’ve been a Wikipedia editor for 4 years, and I’ve got about 50 pages on my watchlist. I’d estimate that a quarter of the edits are vandalism, a quarter are self-interested (either someone with an ax to grind or someone promoting their own website), a quarter are by well-meaning but uninformed people, and a quarter are good edits by knowledgeable people. For the articles that no one’s watching, the 75% of poor edits remain in place except for obscenities, which are removed by bots. There’s a lot of good information on Wikipedia, and there’s a lot of pure nonsense. Don’t believe anything you read there without checking a reliable source for verification. The citations at the end of the article are a good place to start.
