Voz Media (voz.us)

Wikipedia went to war (and lost)

Faced with a brutal decline and at a time when artificial intelligence threatens to make its business model obsolete, the community that manages the world's most consulted encyclopedia decided that its top priority was to rewrite the history of the Arab-Israeli conflict.
Go woke, go broke. The formula rarely fails.

Jimmy Wales, founder of Wikipedia. ZUMAPRESS.com/Cordon Press.

Wikipedia was born with an extraordinary promise: a free encyclopedia, built collaboratively by humanity, neutral by design and accessible to all. For several years it functioned, with imperfections, as the quintessential digital reference. With more than 15 billion monthly visits, it became the first result that appears when we search for anything on Google, the source that feeds the knowledge panels of search engines and, crucially, one of the main corpora with which artificial intelligence models are trained.

That privileged position was a huge asset... and too great a temptation for certain ideological groups.

Since at least 2022, and with an intensity that skyrocketed after the Hamas attacks of October 7, 2023, a group of approximately 40 coordinated editors carried out what different researchers and journalists describe as a systematic campaign to alter the narrative about Israel, demonize Zionism and offer a biased narrative regarding the Arab-Israeli conflict.

These are not isolated issues or minor editorial disputes. According to an investigation by Unpacked Media, the operation was methodical: the group used a private Discord server to assign editing tasks as if it were a political campaign, with task management boards and "office hours" to coordinate its members. The group, identified as Tech for Palestine (TFP), went so far as to attempt to manipulate the Wikipedia pages of British politicians before the U.K. election.

"The power in Wikipedia is not held by the one who knows the most, but by the one who has the most free time and organization."

The changes introduced were of substance, not form. For example, articles that historically described Zionism as a national liberation movement were rewritten to present it as a form of settlement colonialism or racial supremacy. References to the ancestral Jewish presence in Israel were minimized or removed from numerous articles, subordinating the historical continuity of the Jewish people to politically expedient narratives. And they went further: descriptions of Hamas and Hezbollah were watered down, their designations as terrorist organizations questioned or removed, and their records of suicide bombings deleted to construct a more "innocent" image of terrorists.

According to an investigation by the NPOV portal cited by Jewish Onliner, pro-Iranian regime editors worked for years on Wikipedia deleting information about the mass executions of political prisoners in 1988, removing references to Tehran's denial of the Holocaust, and promoting Iranian state media as reliable sources while discrediting independent journalists and dissidents. A single editor, identified as "Iskandar323" (currently facing a ban on the site) made more than 49,000 edits to 16,000 pages, including the removal of thousands of words about human rights violations just days after October 7.

The term "vindication jihad," coined by Iranian Supreme Leader Ali Khamenei himself, describes a campaign of soft warfare in cyberspace aimed at "rewriting reality itself." Wikipedia became, according to researchers, one of its theaters of operations.

The co-founder is alarmed, belatedly

The most revealing episode of the crisis came in November 2025, when Wikipedia's own co-founder, Jimmy Wales, personally intervened to point out the "particularly egregious" bias in the article titled Gaza Genocide, according to reports from Times of Israel.

The article, which since its inception in July 2024 had become one of the most visited articles on the entire internet, stated in its first sentence that the genocide of Gaza was the "ongoing, intentional, and systematic destruction of the Palestinian people in the Gaza Strip carried out by Israel during the Gaza war." Not as a view of what certain agencies hold, but as fact. Wales called it one of the worst Wikipedia articles he'd seen in a long time and noted that it flagrantly violated the encyclopedia's principle of neutrality. In a message posted on the article's discussion page, he explained that a neutral formulation should begin by acknowledging that "multiple governments, NGOs, and legal bodies have described or rejected the characterization of Israel’s actions in Gaza as genocide."

The principle of a neutral point of view, Wales reminded editors, was non-negotiable and could not be trumped by other policies or by editor consensus. The clarification was necessary, but it came too late and did not solve the underlying problem, because Wikipedia's problem is not just one of political militancy: it is structural. The encyclopedia is run by a relatively small set of very active editors who dominate entire subject areas. There are no experience or credential requirements to become an administrator or arbitrator. Administrators are unpaid volunteers chosen by other editors, and their authority derives from mastery of the site's internal rules, not from knowledge of the topics they moderate. Understanding the anatomy of these editor wars is fundamental to understanding why power on the platform is held not by whoever knows the most, but by whoever has the most free time and organization.

These factions succeed using three digital guerrilla tactics.

  • First, editorial filibustering, a war of attrition: since the rules require "consensus," if an academic expert corrects a definition to bring it into line with the historical record, a coordinated group reverts the edit within minutes, claiming "lack of neutrality." If the expert insists, they report him en masse. The academic, who has a real job, eventually gives up; the activist keeps control of the page.
  • Second, source washing: coordinated groups vote en bloc to classify research bodies that expose their biases as "unreliable sources." Simultaneously, they elevate ideologically sympathetic state media or NGOs, shielding their modifications under a cloak of supposed documentary legitimacy.
  • And finally, the capture of language and conceptual framework: the ultimate goal is not only to change data but to change the entire frame. By rewriting articles to insert the jargon of critical theory, they trap historical concepts inside an ideological framework. Once that terminology is consolidated in the first paragraph of a locked article, the new reality is fixed and the counterpart is delegitimized.

This creates a systemic vulnerability: coordinated groups with patience and knowledge of internal procedures can win editorial disputes simply by exhausting bona fide editors. The Arbitration Committee, the highest dispute resolution body, has dozens of pending cases and limited ability to detect organized campaigns. Experts consulted by JNS describe this as a "recurring trend" in which pro-Israeli content is demoted, merged with broader articles or outright removed. The consequence is an information disaster: a platform run by obsessive amateurs, ideologically motivated editors and moderators who ignore or enable problems they should prevent.

The problem beyond Wikipedia and false solutions

The impact of this bias doesn't stop at Wikipedia. Google bases more than 60% of its knowledge panels (those informational boxes that appear in search results when we ask about a concept, person or place) on Wikipedia content. This means that what is edited there reaches billions of people directly without most of them knowing they are reading manipulated material.

But there is an even more troubling and more far-reaching effect: large artificial intelligence language models use Wikipedia as a central source for their training data and for answering queries. When a user asks a chatbot about Zionism, Hamas or the Arab-Israeli conflict, the system may be using propaganda as if it were objective knowledge. This is what the Iranian regime, with its "vindication jihad," realized before many Western analysts: if you can control the encyclopedia, you control not only what people read but what the artificial intelligence will learn tomorrow.

Faced with the imminent danger of AI becoming a propaganda disseminator, development labs have had to turn their training pipelines into a technological battlefront. To avoid blindly repeating a hijacked encyclopedia, they apply multiple layers of defense.

The first is what is called corpus dilution: more algorithmic weight is given to published books, journalistic essays and verifiable historical databases than to the collaborative encyclopedia, seeking to counteract amateur rewriting. In addition, they apply Reinforcement Learning from Human Feedback (RLHF), in which evaluators test the models on highly sensitive topics. If the system responds using the biased framing of a vandalized article, that response is penalized and one that reflects the multiplicity of perspectives and objective facts is preferred.
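The corpus-dilution idea can be sketched as a simple weighted sampler over training documents. This is a minimal illustration only: the source categories and weight values below are invented for the example, not the configuration of any real lab's pipeline.

```python
import random

# Hypothetical per-source trust weights: higher-trust corpora are
# sampled more often than the collaborative encyclopedia.
SOURCE_WEIGHTS = {
    "published_books": 3.0,
    "journalistic_essays": 2.0,
    "historical_databases": 2.0,
    "collaborative_encyclopedia": 0.5,
}

def sample_training_docs(docs, n, seed=0):
    """Sample n documents, weighting each by its source's trust score.

    `docs` is a list of (source, text) pairs; sources absent from
    SOURCE_WEIGHTS get a neutral weight of 1.0.
    """
    rng = random.Random(seed)
    weights = [SOURCE_WEIGHTS.get(src, 1.0) for src, _ in docs]
    return rng.choices(docs, weights=weights, k=n)
```

With these illustrative weights, a book-sourced document is six times more likely to be drawn than an encyclopedia page, so vandalized encyclopedia text contributes proportionally less to what the model learns.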

Added to this are the red teams, specialists who deliberately attack the AI during its development, probing it with conspiracy theories to identify vulnerabilities and adjust parameters. And finally, grounded responses (grounding), which force the AI to seek real-time information from vetted news sources before issuing an answer on geopolitics. Despite these defenses, it is a constant arms race: detecting explicit hate speech is easy for a machine, but detecting the subtle omission of historical context is a major challenge. This is where the irony becomes especially cruel for Wikipedia.
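The gating logic behind grounding can be sketched in a few lines: before answering, require retrieved snippets from an allowlist of vetted sources, and decline rather than improvise when none are available. The domain allowlist and function names here are hypothetical, chosen only to illustrate the pattern.

```python
# Hypothetical allowlist of vetted news domains (illustrative only).
VETTED_DOMAINS = {"reuters.com", "apnews.com"}

def grounded_answer(query, retrieved_snippets):
    """Answer only when supported by vetted sources.

    `retrieved_snippets` is a list of (domain, text) pairs returned by
    some retriever; snippets from unvetted domains are discarded.
    """
    supported = [text for domain, text in retrieved_snippets
                 if domain in VETTED_DOMAINS]
    if not supported:
        return "Insufficient vetted sources to answer."
    # A real system would synthesize a response from `supported`;
    # this sketch only demonstrates the gating step.
    return "Answer grounded in %d vetted snippet(s)." % len(supported)
```

The design choice is to fail closed: with no vetted evidence, the model returns a refusal instead of falling back on whatever its training corpus (encyclopedia included) happens to contain.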

The Wikimedia Foundation faces a major threat: if users can ask an artificial intelligence model directly and get synthesized answers, why visit the encyclopedia at all? Organic search traffic (which is Wikipedia's oxygen) is dropping as search engines integrate AI-generated answers. The relevance of the collaborative encyclopedia model is questioned more every day.

The sensible response to this threat would have been to adapt: improve editorial quality, build stronger verification mechanisms, perhaps incorporate AI tools to scale the review process. Instead, in June 2025 it was revealed that the Foundation had conducted a test on the mobile version of the site that displayed AI-generated summaries at the top of longer articles. The goal was to improve accessibility for readers who find long articles difficult to scan quickly. The reaction from the editor community was outright rejection. The test was paused.

The anecdote reveals the priorities of the community that manages Wikipedia: rather than adapting to the world to come, it prefers to preserve political control and hold its editorial trenches. A community that rewrites history has no intention of ceding control to algorithms.

Wikipedia went to war and is losing it. When it most needed to build credibility to withstand the AI onslaught, it turned itself into an ideological tool. The market has an answer for that: the same one it had for so many institutions that put political agendas ahead of quality and reliability. Users migrate. Relevance evaporates.
