Wikipedia has mechanisms to avoid falsehoods? TRUE


Founded in 2001, the online encyclopedia Wikipedia relies on its community of users to develop its content, which, even after two decades, continues to annoy teachers and experts who question its reliability. The Rumour Detector wanted to know about the mechanisms in place to guarantee the quality of its articles. 

The origin of the critiques 

The online encyclopedia is based on an unusual model of knowledge creation: Wikipedia favors the search for consensus within the community of collaborators, rather than individual expertise. The bet made by its founders is therefore that the contribution of a large number of collaborators makes it possible to create high-quality content, while minimizing personal interests. 

According to the page dedicated to the administration of the site, the governance of Wikipedia is ensured by the community of users and not by the Wikimedia Foundation, which is responsible for funding. Even the text of the policies and guidelines is the result of collective writing.

It is this absence of formal rules and the lack of clarity of quality standards that have prompted criticism of Wikipedia, as studies devoted to this encyclopedia have observed since the 2000s. Business management researcher Chitu Okoli of Concordia University published a review in 2009. He found that many critics assumed that this way of working favored lower quality publications. 

Four of Wikipedia's error-limiting mechanisms

However, there are rules. In 2020, two American researchers who investigated how Wikipedia works concluded that the online encyclopedia had implemented practices and policies that worked well enough that, in their words, Wikipedia had “become one of the rare places on the Internet able to combat problematic information”. Here are four examples. 

1) Some editor actions are restricted 

The basic principle of wiki technology is that any Wikipedia user can edit the content. The volunteers who write and edit the pages are called editors. However, new editors, i.e. those who have had an active account for less than 4 days or who have made fewer than 10 edits, are limited in the actions they can take. For example, they cannot create new pages or edit pages designated as “sensitive topics”. This process aims to reduce “vandalism” by malicious editors, as well as errors caused by inexperience. 

2) Concern for neutrality 

Two of the five founding principles also aim to prevent missteps. On the one hand, Wikipedia “is an encyclopedia”, which means that no “original research” is allowed and it is not a space for opinions: articles must aim for “as much accuracy as possible”. With this in mind, articles should be written with a concern for neutrality, citing “authoritative sources on the subject”. 

3) Protocols for settling disagreements 

In a conflict between two editors, Wikipedia encourages discussion to reach a consensus. It is also possible to ask for a third opinion, to use the mediation space, to discuss with the community on one of the many bulletin boards, to ask for comments and, as a last resort, to call on an arbitration committee. The members of the arbitration committee can then impose penalties if a user's conduct is found to be at fault. For example, they can prohibit that user from editing a particular page, or several pages that address the same subject. 

4) Control mechanisms 

According to the site's administration policies, certain higher-level editors have the role of ensuring compliance with the principles and guidelines. Thus, the “stewards” have full access to all the elements of the site and can modify the access rights of any user. They can intervene in case of vandalism. 

In addition, “administrators” are editors who can delete certain pages or “protect” others. A “protected” page is one whose editing possibilities are limited. This helps prevent vandalism or “edit wars” (when two clans with conflicting opinions clash). Administrators can also block an editor, who is then no longer able to edit Wikipedia content. 

Finally, Wikipedia uses software robots, called “bots”, which analyze contributions from editors and warn Wikipedia administrators of potential trolls or vandals. Some of these bots are also said to be very effective at erasing false information, reverting a page to its previous version when an edit seems suspicious. 
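As an illustration only, a rule-based revert heuristic of the kind described above might look like the following minimal sketch. This is a hypothetical toy, not the logic of Wikipedia's actual bots (which rely on far richer signals, including machine learning); the blocklist, the deletion threshold, and the page-history structure are all assumptions made for the example.

```python
# Hypothetical sketch of a rule-based anti-vandalism bot.
# The blocklist, threshold, and history format are illustrative
# assumptions, not Wikipedia's actual bot logic.

SUSPICIOUS_WORDS = {"idiot", "fake", "lol"}  # toy blocklist

def looks_suspicious(old_text: str, new_text: str) -> bool:
    """Flag an edit that blanks most of a page or inserts blocklisted words."""
    # Large deletions are a classic vandalism signal.
    if len(new_text) < 0.2 * len(old_text):
        return True
    # Words present in the new revision but not the old one.
    added = set(new_text.lower().split()) - set(old_text.lower().split())
    return bool(added & SUSPICIOUS_WORDS)

def review_edit(history: list[str]) -> list[str]:
    """Revert the latest revision if it looks suspicious."""
    if len(history) >= 2 and looks_suspicious(history[-2], history[-1]):
        history.append(history[-2])  # restore the previous version
    return history

# A suspicious edit is reverted; the page ends up back at its old text.
page = ["Wikipedia is a free online encyclopedia.", "lol this is fake"]
review_edit(page)
print(page[-1])  # → Wikipedia is a free online encyclopedia.
```

A real bot would of course fetch revisions through the wiki's API and score edits with many more features, but the revert-on-suspicion loop is the core idea.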

The difficulty in measuring reliability 

Despite these measures, the encyclopedia is not immune to errors. One of the best known concerns the American journalist John Seigenthaler. In 2005, his Wikipedia page alleged that he had been involved in the assassinations of John and Robert Kennedy. It took four months before this information was corrected. Many experts, however, believe that such anecdotes are the exception rather than the rule. In 2005, the journal Nature compared scientific articles from Wikipedia to as many articles on the same subjects from the Encyclopaedia Britannica. The Nature team came to the conclusion that the number of errors was similar in the two encyclopedias: 4 inaccuracies per article on average for Wikipedia, compared to 3 for the “classic” encyclopedia. The Encyclopaedia Britannica reacted quickly, declaring that the study was devoid of merit. Nature rejected these criticisms. 

Since then, other studies have attempted to assess the accuracy of Wikipedia, either by comparing it to other sources or by calling on experts in a particular field. For example, in 2006, an American historian compared 25 biographies on Wikipedia, on Encarta (Microsoft's online encyclopedia) and in the American National Biography Online, from Oxford University. He concluded that the latter offered the most detailed texts, but that, in the end, both presented practically no factual errors. 

In 2011, an American political scientist analyzed the articles devoted to the leading candidates in the 155 gubernatorial elections in the United States between 1998 and 2008. He found “no errors”. 

According to Alberta scientists who in 2014 looked at the quality of Wikipedia articles in the areas of health, nutrition and medicine, the information there is generally of good quality, but significant errors and omissions are said to be quite common. 

Researchers from Saudi Arabia who, in 2015, evaluated 47 articles on cardiovascular diseases, found several inaccuracies, mainly omissions.

A study by orthopedic surgeons in 2019 concluded that the anatomical information on Wikipedia is appropriate, but not equivalent to that in anatomy textbooks. 

Wikipedia cites a few of these studies on a page titled “Reliability of Wikipedia”. The Wikimedia Foundation funded a study in 2012 comparing content in three languages with that of online encyclopedias. 

In an opinion piece published in 2019, Darius Jemielniak, a member of the board of directors of the Wikimedia Foundation, alleged that the errors that can be found on Wikipedia are different from those found in other publications. For example, the content of an article can be replaced by insults if it is about a known personality. These acts of vandalism, however, are rare and easy to identify. 

The Canadian researcher Chitu Okoli, who in 2009 conducted a review of the studies carried out on Wikipedia up to then, felt that Wikipedia should not be compared to expert publications, but rather to other encyclopedias. Also, Wikipedia's accuracy seems to vary by domain. German pharmacists wrote in 2014 that Wikipedia is an accurate source of information on drugs for students in the health sector. 

Wikipedia and the lack of trust 

The debate on the use of Wikipedia as an academic resource is still ongoing, researchers from the Netherlands underlined in October 2022. American authors who, in 2020, examined Wikipedia's policies to combat disinformation say for their part that many students are told to avoid Wikipedia. 

Something that Darius Jemielniak of the Wikimedia Foundation deplores. According to him, professors underestimate the ability of collaborators who are not experts to spread knowledge. He believes that Wikipedia challenges the hierarchy of academic institutions, which could be unsettling for some professors. 

That said, Wikipedia itself calls for caution. 

The encyclopedia stresses that its content is user-generated and that its articles are therefore not error-proof. The quality of an article can also be influenced by the biases, interests, education and background of the editor who wrote it.

Wikipedia therefore does not recommend using one of its articles as a source for another article. Instead, it suggests citing the original source cited in the Wikipedia article. 

The Alberta researchers cited above also suggest using Wikipedia as a starting point for research, or as an additional source. This is what journalists do, noted a Stanford University study in 2018, which compared their methods for assessing the reliability of a site with those of historians and students. The researchers observed that the journalists left a “dubious” site to see what was said about it on Wikipedia, then went straight to the footnotes of the Wikipedia entry to orient their research. 

Link to the original article: 
https://www.sciencepresse.qc.ca/actualite/detecteur-rumeurs/2023/02/23/wikipedia-has-mechanisms-to-avoid-false-true
This article is part of the Rumour Detector section. 
