Wikipedia:No. Wikipedia is NOT biased

Regularly on talk pages, some users complain that "Wikipedia is biased", usually when their POV-pushing edits get reverted. Other editors claim that Wikipedia is indeed biased towards certain points of view. Both statements are incorrect.

Wikipedia is a methodology

Wikipedia is not only collaborative editing software; it is first and foremost a set of rules, the WP:Five pillars. These pillars include both editorial and behavioral guidelines for constructing Wikipedia. But more than that, these rules also define a methodology to triage information and retain knowledge.

Indeed, one of the primary goals of an encyclopedia, or of any system interested in representing or furthering human knowledge, is to separate knowledge from beliefs.[1] Verifiability and neutrality of an article's content are two pillars that define such a methodology, just as the sciences have the scientific method.

How can this fix editors' POV?

Of course, editors have their own points of view: nobody can be neutral towards every topic. This is largely recognized and accepted in the community (at least the English-speaking one). However, this is precisely why such a methodology is necessary: how could any entity define which points of view, in other words opinions, are "good" and should be kept, and which should be rejected? And how could it do so with such a numerous and diverse editor base?

The solution chosen by Wikipedia is to not rely on editors' opinions at all, but only on reliable sources. This methodology prevents Wikipedia from swaying towards editors' points of view: their opinions are irrelevant; what matters is what reliable sources say about knowledge that is verifiable, something completely external to editors that can be assessed in a neutral, almost mechanical way.

Thus, not only is it incorrect to state that Wikipedia has a bias, albeit a good one; such a statement also defeats its own purpose, by ignoring the methodology that allows building knowledge rather than belief.

So is Wikipedia perfect?

Having a methodology focused on knowledge building does not mean that Wikipedia is against beliefs: beliefs are a useful and most extraordinary ability of the human mind. Wikipedia is simply not interested in them, except when they are so widespread that they become notable and verifiable, e.g., as part of a culture.

Nor does this mean that Wikipedia is never biased: a methodology is neither an instantaneous nor a foolproof solution; it takes time to converge[a] to a 100% encyclopedic state satisfying a totally neutral and verifiable point of view. Furthermore, as with any methodology, there are methodological biases that editors introduce from time to time.

Finally, the methodology itself has its own flaws: nobody and nothing is perfect, and "verifiability is not truth",[b] so even if Wikipedia correctly converges to a 100% encyclopedic state on some articles according to its own methodology, those articles can still be far from the truth. Other criteria may be devised; Wikipedia makes no pretense of holding the best methodology for triaging knowledge, and other examples abound, such as WikiWikiWeb (the original wiki), other wikis, or the scientific method.

Why is a "good pov" source bad?

Sure, a "bad pov", for example claiming beneficial health effects for procedures that are not only quackery but dangerous, such as inedia, is not only misinforming but can have disastrous consequences.

But a "good pov" can be just as detrimental: for example, if we assume that doubting everything is a good pov, we could label a procedure as ineffective despite good evidence that it may be beneficial, and perhaps the only known treatment for a specific pathology. Hence, in this case too, not only can a good pov be misinforming, it can also turn readers away from something that could have helped them.

That is not to say that we should abandon reasonable skepticism and accept any quackery claim at face value, but that whatever the pov, misinformation can be just as detrimental, whether it is "good" (pseudoskepticism) or "bad" (quackery). Nobody can be correct all the time.

Instead of relying on any single source, whether it has a good or bad pov, the solution is simple: rely on a method. This naturally produces scientific skepticism. That is why secondary (peer-)reviewed sources are more reliable than primary ones: they follow a methodology to triage information and ensure it is correct.[c] Wikipedia merely does the same at the next level, being a tertiary source, by relying on multiple such secondary (peer-)reviewed sources.

Since the methodology is of paramount importance for Wikipedia, accepting sources that violate the methodology solely because they have a "good pov" incurs as much risk as accepting "bad pov" sources: it risks accepting beliefs as knowledge, instead of triaging knowledge from beliefs.


Notes

  1. ^ There is actually some scientific evidence that Wikipedia's WP:Bold, revert, discuss process allows articles to converge to a more neutral state when editors with opposing views are involved, by means of phenomena identified as "opposites attract", "unsegregated discussion" and "productive friction". More information can be found in Ideological bias on Wikipedia and in particular these references.[2][3]
  2. ^ This quote comes from an older formulation of the Verifiability policy; it has since been turned into an essay, but its essence remains in the policy.
  3. ^ The qualitative difference between primary and secondary sources is a simplification that works most of the time, but sometimes a secondary source may be bad and a primary source can be good if used with care. What matters is the source's reliability, but whatever the criteria used, reliability is always based on an assessment of the source's methodology, and thus secondary sources are more likely to be reliable than primary sources, although there are exceptions. Note also that context matters: a source can only be qualified as secondary with regard to a specific piece of information, not in general, as all sources are primary for something, as exemplified by news articles.

References

  1. ^ Vogt, Katja Maria (September 2012). Belief and Truth: A Skeptic Reading of Plato. Oxford University Press. ISBN 9780199916818.
  2. ^ Greenstein, Shane; Gu, Yuan; Zhu, Feng (March 2017) [October 2016]. "Ideological segregation among online collaborators: Evidence from Wikipedians". National Bureau of Economic Research (w22744). doi:10.3386/w22744.
  3. ^ Holtz, Peter; Kimmerle, Joachim; Cress, Ulrike (23 October 2018). "Using big data techniques for measuring productive friction in mass collaboration online environments". International Journal of Computer-Supported Collaborative Learning. 13 (4): 439–456. doi:10.1007/s11412-018-9285-y.