Voz Media US (Voz.us)

A study finds that Wikipedia shows ideological bias by linking negative words to the political right

Meanwhile, words such as "joy" are associated with the political left. The report analyzed 1,628 words spanning different connotations ("positive," "neutral," and "negative").

Wikipedia main page (Screenshot: Wikipedia.org)


A new study by the Manhattan Institute finds politically biased content on Wikipedia. The report indicates a tendency among editors of the digital "encyclopedia" to use language conveying the emotions of "anger," "disgust," and "fear" on pages about topics associated with the "moderate" political right.

On the other hand, the "moderate" political left is often associated with language connoting "joy."

The report's authors analyzed 1,628 words with different connotations (classified into "positive," "neutral," and "negative" emotions) and measured how often these words appear on pages about right-leaning figures relative to left-leaning ones:

We find a mild to moderate tendency in Wikipedia articles to associate public figures ideologically aligned right-of-center with more negative sentiment than public figures ideologically aligned left-of-center (...) We also find prevailing associations of negative emotions (e.g., anger and disgust) with right-leaning public figures; and positive emotions (e.g., joy) with left-leaning public figures.

These tendencies suggest there is political bias embedded in Wikipedia articles.
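The word-frequency comparison described above can be sketched in a few lines. The mini-lexicon and sample texts below are hypothetical simplifications for illustration; the study's actual lexicon contained 1,628 annotated words.

```python
import re
from collections import Counter

# Hypothetical mini-lexicon; the study used 1,628 words classified
# by connotation ("positive," "neutral," "negative").
LEXICON = {
    "anger": "negative", "disgust": "negative", "fear": "negative",
    "joy": "positive", "calm": "neutral",
}

def sentiment_counts(text):
    """Count lexicon words in a text, bucketed by connotation."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(LEXICON[w] for w in words if w in LEXICON)

# Illustrative comparison between two made-up article snippets.
right_article = "Critics voiced anger and fear over the proposal."
left_article = "Supporters expressed joy at the announcement."

print(sentiment_counts(right_article))  # negative words dominate
print(sentiment_counts(left_article))   # positive words dominate
```

Comparing the relative frequency of each bucket across many articles, grouped by the subject's ideological alignment, is the basic shape of the analysis the report describes.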

The trend is predominant in the U.S.

The trend is most pronounced on the profile and biography pages of U.S. presidents, Supreme Court justices, members of Congress, governors, mayors, and journalists, as well as public figures from other Western countries:

These prevailing associations are apparent for names of recent U.S. presidents, U.S. Supreme Court justices, U.S. senators, U.S. House of Representatives congressmembers, U.S. state governors, Western countries’ prime ministers, and prominent U.S.-based journalists and media organizations.

This biased information feeds tools such as OpenAI's ChatGPT and other artificial intelligence platforms, which are trained on Wikipedia content and draw on it to answer users' questions:

We find some of the aforementioned political associations embedded in Wikipedia articles popping up in OpenAI’s language models. This is suggestive of the potential for biases in Wikipedia content percolating into widely used AI systems.

Wikipedia's neutrality policy is not being enforced

According to its own rules, Wikipedia requires its content to reflect a neutral point of view (NPOV). The report argues, however, that "Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles."

Wikipedia’s neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results suggest that Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles.

The report recommends that Wikipedia use "advanced computational tools" to flag politically biased content, such as a scoring system that lets readers identify such language.
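A scoring system of the kind the report proposes might, in rough sketch, summarize an article's sentiment balance as a single number. Everything here (the word sets, the formula, the threshold idea) is a hypothetical illustration, not the report's actual method:

```python
# Hypothetical word sets; a real system would use a large annotated lexicon.
NEGATIVE = {"anger", "disgust", "fear"}
POSITIVE = {"joy", "trust", "hope"}

def bias_score(text: str) -> float:
    """Return (negative - positive) / total sentiment words, in [-1, 1].

    0.0 means balanced (or no sentiment words); +1.0 all negative;
    -1.0 all positive.
    """
    words = [w.strip(".,;:!?") for w in text.lower().split()]
    neg = sum(w in NEGATIVE for w in words)
    pos = sum(w in POSITIVE for w in words)
    total = neg + pos
    return 0.0 if total == 0 else (neg - pos) / total

print(bias_score("The speech provoked anger, fear, and some hope."))
```

A reader-facing tool could surface this score next to an article, highlighting pages whose balance departs sharply from the average for comparable subjects.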