Wikipedia's Existential Crisis: AI, Politics, and Dwindling Volunteers [2025]

Wikipedia at 25: A Crisis of Relevance in the Age of AI

Wikipedia launched on January 15, 2001, during a moment when the internet still felt like a vast, uncharted frontier. The idea was radical: a free, collaboratively edited encyclopedia that anyone could contribute to, that ran on donations, and that refused to become a commercial product. For two decades, it worked spectacularly. The site became the fifth most visited website on Earth. Teachers assigned Wikipedia articles. Journalists cited it. Medical professionals consulted it. Millions of people discovered information they needed without paying a cent.

But as Wikipedia approaches its 25th birthday, the site isn't celebrating. Instead, it's fighting for its survival. The threats are coming from directions nobody predicted in 2001, and they're converging in ways that feel genuinely catastrophic.

The first threat is political. In 2024 and 2025, prominent figures on the political right began openly attacking Wikipedia as biased, controlled by leftists, and unreliable. Elon Musk branded the site "Wokepedia," alleging it's been hijacked by far-left activists. Tucker Carlson devoted a 90-minute podcast to claiming Wikipedia is "completely dishonest and completely controlled on questions that matter." Congressional Republicans sent letters accusing Wikipedia of "information manipulation." The Heritage Foundation announced it would "identify and target" the volunteer editors themselves, not just the organization.

The second threat is technological. AI companies have been scraping Wikipedia's content at massive scale to train language models. ChatGPT, Claude, Gemini: they all trained on Wikipedia's freely licensed information. Meanwhile, instead of visiting Wikipedia directly, people now ask their AI chatbots questions and get answers summarized from Wikipedia content without the original source getting credit, traffic, or visibility.

The third threat is existential to the organization itself: volunteers are disappearing. New editor registrations have dropped by more than a third since 2016. The average editor is aging. Younger people aren't joining. Wikipedia became a victim of its own success: it's so comprehensive that it no longer needs constant expansion the way it did in the 2000s. So contributors moved on, to newer platforms, different hobbies, and new careers.

Behind all of this is a deeper crisis of belief. Wikipedia was founded on principles that feel quaint in 2025: the belief that neutrality matters, that sourcing information carefully matters, that volunteering for the public good matters, that a noncommercial project can survive and thrive. In an era of algorithmic rage, political tribalism, and the conviction that "there is no truth, only power," these principles seem almost naively optimistic.

Last month, a veteran Wikipedia editor named Christopher Henner published an essay that circulated widely through the editor community. In it, he expressed a fear that haunts longtime contributors: Wikipedia is becoming a "temple filled with aging volunteers, self-satisfied by work nobody looks at anymore." He wasn't being melodramatic. The data backs him up. According to Wikipedia's own statistics, the site lost more than a billion visits per month between 2022 and 2025.

The Political Attack on Wikipedia's Neutrality

Wikipedia's problems with political actors aren't new, but they've escalated sharply. The organization has always dealt with edit wars over contentious topics. Someone rewrites the opening paragraph of an article about a politician. Someone else changes it back. Editors argue on the talk pages. Eventually, they reach consensus or an administrator locks the page.

But the 2024-2025 assault on Wikipedia is fundamentally different. It's not editors fighting editors anymore. It's organized political movements trying to delegitimize the entire project.

In October 2024, Republican congresspeople James Comer and Nancy Mace sent Wikipedia a letter, part of a congressional investigation, accusing it of "information manipulation." Rather than defend itself aggressively, as the Wikimedia Foundation did in 2010 when the FBI demanded it remove the agency's seal, the organization sent back a respectful explainer about how Wikipedia works. The shift in tone reveals how much has changed. In 2010, a government agency making an unreasonable demand could be refused because society still had faith in legal processes. In 2025, the organization knows the political environment is less predictable and potentially more hostile.

The Heritage Foundation's announcement that it would "identify and target" volunteer editors crossed a new line. This wasn't criticism of Wikipedia's content or structure. It was a threat to find and potentially harm the individuals who volunteer their time and labor to maintain the site. For many editors, the thought of being doxxed or harassed for editing an article became a genuine concern.

Elon Musk's "Wokepedia" jab was more casual but equally damaging. When a billionaire with 200 million social media followers calls your project corrupt and ideologically captured, it reaches far more people than any thoughtful critique ever could. The charge spread rapidly among conservative audiences. Within weeks, there were calls to create a "neutral" alternative to Wikipedia, one without left-wing bias. Several such projects have been attempted over the years (Conservapedia, Everipedia, others), but none have achieved anything close to Wikipedia's scale or utility.

The accusation of bias stings partly because Wikipedia does employ humans, and humans have biases. But the project's editorial process is specifically designed to counteract individual bias. Articles on contentious topics are supposed to present multiple viewpoints. Citations are required. Vandalism and bad-faith edits are caught and reverted. Talk pages show the reasoning behind editorial decisions. It's not perfect—no human system is—but it's far more rigorous than the casual assertions made on social media.

What makes the political attack particularly dangerous is that it's not falsifiable. If Wikipedia publishes something conservatives dislike, that proves bias. If Wikipedia publishes something conservatives like, that proves they're trying to seem balanced to deflect criticism. There's no argument that can win against that framing. The Wikimedia Foundation eventually realized this and stopped trying to win the political argument at all.

AI Scraping: The Irony of Machines Training on Human Knowledge

In 2022, OpenAI released ChatGPT, and suddenly everyone was talking about AI. By 2023, it became clear that essentially every major AI model had been trained on Wikipedia's content. The models trained on large portions of the internet, but Wikipedia represented a particularly valuable source: information vetted by humans, cited with links, written clearly, and released under a Creative Commons license that permits reuse but requires attribution.

The irony is profound. AI companies' sudden market dominance was partly built on Wikipedia's existence. Wikipedia spent 20 years building a repository of reliable information. Volunteers spent millions of hours writing, editing, fact-checking, and improving articles. And then, when machines could ingest all that work at once, it became a training dataset—valuable but ultimately extractive.

Wikipedia's founders and current leaders understood this was coming. They knew that AI would eventually be trained on the site's content. That was acceptable, even anticipated. The problem is what happened after training. AI companies built products that bypass Wikipedia entirely. When someone asks ChatGPT a question, they get an answer that synthesizes information from Wikipedia (and thousands of other sources), but the user never visits Wikipedia, never sees the citations, never learns where the information came from. Wikipedia loses traffic, loses visibility, loses the sense that it's an essential resource.

Between 2022 and 2025, Wikipedia lost more than a billion visits per month. That's not entirely because of AI—there are other factors. But the timing is too convenient to ignore. People are getting their information from AI summaries instead of reading Wikipedia articles directly.

Further, the relationship became asymmetrical. AI companies used Wikipedia to build billion-dollar products. Wikipedia received nothing in return. Google has paid billions for content and data over the years. Meta has paid for content. But OpenAI, Anthropic, Google (with Gemini), and Meta (with Llama) trained their models on Wikipedia for free. Wikipedia didn't ask for payment because the license technically didn't require it. But the philosophical principle was violated: the work of volunteers was harvested to enrich corporations.

The Wikimedia Foundation has been cautiously optimistic about potential solutions. They've suggested that AI companies could fund Wikipedia's APIs, paying for access to structured data. Some companies have made small donations. But this doesn't really solve the problem. The core issue is that humans are still necessary to create good AI models, but the economic incentives no longer point to supporting human-powered knowledge production.
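
To make the attribution half of that idea concrete, here is a minimal sketch, not anything the Foundation has actually proposed, of reuse that keeps credit attached. It uses Wikipedia's public REST summary endpoint; the article title, the User-Agent contact, and the returned fields are illustrative choices:

```python
import requests

# Illustrative only: fetch an article summary from Wikipedia's public REST
# API while keeping the attribution fields that AI answer engines often drop.
HEADERS = {"User-Agent": "attribution-demo/0.1 (contact@example.org)"}  # placeholder


def fetch_with_attribution(title: str) -> dict:
    """Return an article summary plus the data needed to credit the source."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    resp = requests.get(url, headers=HEADERS, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return {
        "text": data["extract"],                                # the summary itself
        "source_url": data["content_urls"]["desktop"]["page"],  # link back to the article
        "license": "CC BY-SA 4.0",                              # Wikipedia's text license
    }


summary = fetch_with_attribution("Encyclopedia")
print(summary["text"][:160], "...")
print("Source:", summary["source_url"], "|", summary["license"])
```

The detail that matters is that the link and the license travel with the text. That is precisely what disappears when a chatbot paraphrases the same material.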

There's a bitter technical irony here too. AI models trained recursively on their own synthetic data suffer from what researchers call "model collapse." The models degrade over time because they're learning from increasingly compressed, distorted versions of information. Human-written, human-vetted content from sources like Wikipedia actually helps prevent this problem. The AI companies need Wikipedia more than Wikipedia needs them, but the power dynamic makes it impossible for Wikipedia to extract fair value from that relationship.

The Volunteer Crisis: The Graying of Wikipedia

Wikipedia's most critical resource has never been computing power or servers. It's been volunteer editors. The site runs on thousands of people who donate their time, knowledge, and passion to building an encyclopedia that anyone can use.

For most of Wikipedia's history, that pipeline of new volunteers seemed reliable. People discovered the site, found an article that was incomplete or wrong, and fixed it. They discovered they enjoyed the work, the community, the sense of contributing to something meaningful. They returned regularly. Some became power users, editing hundreds or thousands of articles. A smaller group became administrators, helping to manage disputes and enforce policies.

That pipeline has broken. New user registrations dropped by more than a third between 2016 and 2025. According to Wikipedia's own data, the median age of active editors is increasing every year. The community is aging, and it's not being replaced.

There are several reasons for this. First, the low-hanging fruit is gone. In the early 2000s, Wikipedia needed thousands of articles about basic topics. Now it has millions of articles. New contributors can still improve existing articles, but the work is less immediately visible and rewarding. You can't write an article about a major historical event that gets millions of views in your first week. You can correct a typo or add a reference to a section, which is less psychologically rewarding.

Second, the barrier to entry has increased. Wikipedia's editing interface is intimidating for newcomers. The policies are complex. The community can be unwelcoming. If you make a mistake, your edit gets reverted. If you get aggressive or break rules, you might get blocked. The friendly, forgiving culture of the early days has become more formalized and bureaucratic.

Third, competing platforms have drawn away potential contributors. Reddit, TikTok, Discord, Twitter: these platforms offer immediate feedback, community engagement, and social rewards. They're designed to be addictive. Wikipedia's interface looks like it was designed in 2005 (because, mostly, it was). It requires patience and delayed gratification.

Fourth, and perhaps most significantly, the motivation has shifted. In the 2000s, contributing to Wikipedia felt like participating in a revolutionary project. The internet was still new. The idea of a free, collaboratively-built encyclopedia was novel and exciting. Now, in 2025, that novelty is gone. Wikipedia isn't revolutionary anymore. It's an institution, boring and reliable, but not exciting.

The consequences of this decline are serious. Wikipedia's English version still has thousands of active editors, so it isn't at immediate risk of collapse. But specialized topics are increasingly neglected. Articles about niche subjects, non-English versions of Wikipedia, and emerging topics struggle to find volunteers willing to build and maintain them. An article about a scientific discovery written in 2010 might never be updated again. An article about a developing political situation might contain outdated information that nobody bothers to fix.

Veteran editors describe a pervasive sense of decline. Christopher Henner's essay captured this feeling acutely. He worried that Wikipedia is becoming a museum, a record of work done in the past, rather than a living, growing project. The last major wave of new editor recruitment happened in the early 2010s. A decade later, those editors are the old guard, mentoring even fewer newcomers.

Wikipedia's response has been to try to attract younger users through short-form video content. In October 2024, the project launched a TikTok, Instagram, and YouTube presence called "Depths of Wikipedia," republishing interesting facts from articles in short, snappy formats. The initial results were promising: over 23 million views across platforms. But views don't translate into editors. The videos might get people to visit Wikipedia, but there's no evidence they're converting viewers into contributors.

The Information Ecosystem Is Changing

Wikipedia's decline in traffic and visibility is happening against the backdrop of a collapsing local news infrastructure. For decades, Wikipedia editors relied on local newspapers as sources. If an editor wanted to write about a local politician, a community organization, or a regional event, they could cite the local paper's reporting. That sourcing foundation is eroding.

According to the Pew Research Center, the U.S. has lost roughly 2,500 newspapers since 2005. Thousands more have reduced their staff to skeleton crews. Entire cities have lost their local news organizations. When Wikipedia editors can't find reliable, published sources about local topics, they can't write or expand articles. The encyclopedia becomes increasingly hollowed out: strong on nationally and internationally important topics but weak on local, regional, and niche subjects.

This also makes Wikipedia more vulnerable to certain kinds of manipulation. One of Wikipedia's core strengths was that the sourcing requirement meant you couldn't just make things up. You had to cite something published elsewhere. But if reliable secondary sources disappear, that protection weakens. And if editors diminish in number, there are fewer people to catch and revert bad-faith edits.
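
In practice, that protection is people watching a feed of edits. As a rough sketch of what patrolling looks like, the snippet below reads the same recent-changes stream, via the MediaWiki Action API, that human reviewers and anti-vandalism tools monitor. The exact parameters and the byte-delta heuristic are simplified assumptions, not a real patrol bot:

```python
import requests

# Simplified sketch of a patrol feed: the recent-changes stream that editors
# and anti-vandalism tools watch for suspicious edits on English Wikipedia.
API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "patrol-demo/0.1 (contact@example.org)"}  # placeholder

params = {
    "action": "query",
    "list": "recentchanges",
    "rctype": "edit",                      # edits only, no log entries
    "rcprop": "title|user|comment|sizes",  # "sizes" adds oldlen/newlen per change
    "rcshow": "!bot",                      # skip bot edits; humans review the rest
    "rclimit": 10,
    "format": "json",
}
resp = requests.get(API, params=params, headers=HEADERS, timeout=10)
resp.raise_for_status()

for change in resp.json()["query"]["recentchanges"]:
    delta = change["newlen"] - change["oldlen"]
    # Large removals or additions are a common first-pass vandalism signal.
    print(f'{change["title"][:40]:40s} {delta:+7d} bytes  by {change["user"]}')
```

Fewer volunteers means fewer eyes on exactly this kind of feed.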

Meanwhile, the internet itself has fragmented. In the 2000s, if you wanted general information about a topic, you went to Wikipedia. Now you might ask a chatbot, search Reddit for opinions, watch a YouTube video, or consult a specialized subreddit or Discord community. The generalist encyclopedia is competing against specialized, personalized, conversational alternatives.

Political Instability and Censorship Threats

Wikipedia's challenges aren't limited to the United States. Around the world, authoritarian governments have been increasingly hostile to the platform.

In mainland China, the Great Firewall blocks every language version of Wikipedia, the Chinese and English editions included. The Chinese government considers the site a threat because it contains information about human rights abuses, political dissidents, and historical events the government prefers to suppress. Hundreds of millions of Chinese citizens have no legal way to access Wikipedia.

Saudi Arabia has imprisoned Wikipedia editors for documenting the country's human rights violations on the platform. When individuals risk imprisonment for editing an encyclopedia, it reveals just how threatening authoritarian governments find the idea of a freely-accessible, uncensored repository of information.

The UK is considering age-gating Wikipedia under its Online Safety Act, which would require identification to access the site. This would fundamentally change Wikipedia's nature. Part of its power has always been the ability to access information anonymously, without surveillance or tracking. Age verification requirements would create records of who's accessing what information, potentially chilling access to sensitive topics.

These threats are existential in a different way than the domestic political attacks or the AI disruption. They're about the principle of access. If governments around the world can effectively block or restrict Wikipedia, the project's mission becomes impossible. You can't build a universal reference work for all of humanity if significant portions of humanity are denied access.

Wikimedia Foundation's Diplomatic Pivot

The Wikimedia Foundation, the nonprofit organization that operates Wikipedia, has undergone a leadership transition in recent years. In 2023, a new CEO, Bernadette Meehan, took over. Her background is unusual for the role. She spent years as a foreign service officer and served as an ambassador. She's experienced in negotiation, diplomacy, and international relations.

According to the Foundation's communications, Meehan's appointment signals a strategic shift: the Wikimedia Foundation is moving from a technology-focused organization to a diplomatically focused one. In the face of political attacks, government threats, and corporate pressure, it needs skills that go beyond software engineering or nonprofit management.

When Republican congresspeople accused Wikipedia of bias, the Foundation's response wasn't to aggressively defend itself or publish a point-by-point refutation. Instead, it published a respectful explainer about how Wikipedia's editorial process works. The approach is cautious, diplomatic, seeking to de-escalate rather than win the argument. It reflects a recognition that in today's political environment, the traditional tools of persuasion and argument don't always work.

But even the best diplomat faces challenges in the current environment. You can't negotiate with an algorithm. You can't reason with political movements that have decided in advance that your organization is the enemy. You can't compete fairly with AI companies that have vastly more resources and market power.

The Foundation is pursuing several strategies simultaneously. It's seeking financial support from technology companies for API access and research. It's trying to improve its public image and reach younger audiences through social media. It's investing in volunteer recruitment and mentoring. It's working with governments to resist censorship. But all of these efforts feel somewhat incremental given the scale of the challenges.

The AI Dependency Paradox

Here's where the situation becomes genuinely complicated. AI models perform better when trained on high-quality, human-vetted information. Wikipedia is one of the best sources of that kind of information on the internet. As AI becomes more central to how people access information, the quality of Wikipedia becomes more important, not less.

Researchers have documented that AI models trained recursively on their own synthetic data degrade over time. This is called model collapse. If an AI system trains on its own outputs, those outputs become increasingly distorted and simplified. The models lose nuance, factual accuracy, and reasoning capability.
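
A toy simulation makes the dynamic easy to see. In the sketch below, a deliberately simplistic stand-in for real language-model training, each generation's "model" is just a normal distribution fitted to the previous generation's output. With no fresh human data, estimation error compounds and the distribution's spread decays:

```python
import random
import statistics

# Toy model-collapse demo: each "model" fits a normal distribution to its
# training data, then the next generation trains only on its synthetic output.
random.seed(0)
SAMPLES = 50  # small samples make the compounding estimation error visible

data = [random.gauss(0.0, 1.0) for _ in range(SAMPLES)]  # human-written "truth"

for generation in range(1, 21):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    # No new human data: generation N+1 learns only from generation N's output.
    data = [random.gauss(mu, sigma) for _ in range(SAMPLES)]
    if generation % 5 == 0:
        print(f"gen {generation:2d}: mean={mu:+.3f}  stdev={sigma:.3f}")

# Run long enough, and the stdev drifts toward zero: the tails of the original
# distribution vanish, a toy analogue of models losing nuance and diversity.
```

Real collapse is subtler than this, but the mechanism is the same: without a steady supply of human-generated data, the sampled world keeps shrinking.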

The solution, according to researchers, is to continuously train on human-generated, human-verified content. Wikipedia is exactly that kind of content. It's written by people, edited by people, fact-checked against cited sources, and maintained by a community with standards.

So we have a paradox: AI companies desperately need Wikipedia to exist and remain high-quality, but their business model gives them no incentive to support it. They can train on Wikipedia's content for free, and the fact that people increasingly use AI to get answers instead of visiting Wikipedia directly is fine from their perspective. It makes their product more useful.

From Wikipedia's perspective, the situation is frustrating. The value they created is being harvested by corporations worth billions, while Wikipedia's traffic declines and its volunteer base shrinks. The conversation between Wikipedia and AI companies has been friendly but unproductive. The Foundation has asked for financial support. A few companies have made donations. But there's been no structural change to align incentives.

The Case for Wikipedia's Importance

Despite all these challenges, there's a strong case that Wikipedia matters more now than ever. In an era of misinformation, algorithmic distortion, and AI hallucination, reliable information sources are increasingly valuable.

Wikipedia isn't perfect. Articles can contain errors. Some topics are undercovered. Some are covered from perspectives that reflect the demographics of the editor base, which is predominantly English-speaking, male, and concentrated in wealthy countries. But the project's commitment to sourcing, its requirement for citations, its willingness to label disputed claims, and its comparative transparency about editorial decisions make it more trustworthy than most alternatives.

When a researcher wants to understand a topic, Wikipedia is often a good starting point. When a journalist needs to quickly understand a subject they're covering, Wikipedia is a useful reference. When a student wants to learn about something, Wikipedia is usually reliable enough for preliminary research.

Further, Wikipedia represents a model of knowledge production that's increasingly rare: collaborative, noncommercial, human-centered, and transparent. In a world where most large information systems are proprietary, algorithmic, and profit-driven, Wikipedia's existence is itself valuable. It demonstrates that another way is possible.

The site is also genuinely useful in ways that AI chatbots aren't. Wikipedia's hyperlinks let you explore connections between topics. The talk pages show you disagreements and reasoning. The citation links take you to primary sources. AI chatbots give you summaries without those surrounding structures. Wikipedia gives you context.
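
That surrounding structure is also machine-readable. As a small illustration (the article title and limit here are arbitrary), the MediaWiki Action API can list an article's internal links, the connective tissue a chatbot's flat answer strips away:

```python
import requests

# Illustration: list an article's internal links via the MediaWiki Action API.
# These links are the browsable context that a flat AI summary discards.
API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "context-demo/0.1 (contact@example.org)"}  # placeholder

params = {
    "action": "query",
    "titles": "Encyclopedia",  # arbitrary example article
    "prop": "links",
    "pllimit": 25,
    "format": "json",
}
resp = requests.get(API, params=params, headers=HEADERS, timeout=10)
resp.raise_for_status()

for page in resp.json()["query"]["pages"].values():
    for link in page.get("links", []):
        print(link["title"])  # each one is a thread a curious reader can pull
```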

The Existential Question: Does Wikipedia's Mission Still Matter?

Underneath all the specific challenges—political attacks, AI competition, volunteer decline, government censorship—is a deeper existential question: Do the principles Wikipedia was founded on still matter in 2025?

Wikipedia was built on five pillars: it is an encyclopedia; it is written from a neutral point of view; it is free content that anyone can use, edit, and distribute; its editors should treat one another with respect; and it has no firm rules. Layered on top are content policies requiring verifiability and reliable sourcing. These principles reflected beliefs about how knowledge should be created and shared.

In 2025, these principles seem almost naive. Neutral point of view? In an era where people live in ideological bubbles, where every fact is contested, where "your truth" and "my truth" are accepted as equally valid, the idea that you can write an article that everyone agrees is fair seems quaint. Sources matter? When people increasingly get their information from AI summaries, social media recommendations, and echo chambers, the idea that careful sourcing will win out seems optimistic.

Yet that skepticism might be exactly why Wikipedia matters more than ever. In a world fragmenting into tribalism and relativism, an institution committed to verifiability, sources, and neutral presentation of multiple viewpoints is countercultural. It's not fashionable. It won't make you rich. It won't get you famous on social media. But it might actually help people understand reality.

The challenge for Wikipedia is communicating that value to younger generations who never experienced a world without the internet. For people who grew up with Google, Wikipedia's existence might feel inevitable, like asking why we need libraries. But Wikipedia isn't inevitable. It requires constant maintenance, volunteer effort, and resources. If it collapsed, rebuilding it would take decades.

The Road Ahead: Can Wikipedia Survive the Next 25 Years?

Wikipedia's next 25 years will determine whether it becomes an artifact of the internet's early idealism or a continuing force for making knowledge freely available. Several things need to happen.

First, the organization needs to solve the volunteer problem. This might require fundamental changes to how editing works. The interface needs modernization. The barrier to entry needs lowering. The community needs to become more welcoming to newcomers. The project needs to give contributors reasons to stick around beyond altruism.

Wikipedia has experimented with mobile editing, simplified editing interfaces, and AI-assisted writing tools. But these changes have been incremental. A more radical restructuring might be necessary. Perhaps Wikipedia needs to be less like a museum of articles and more like a living, growing platform.

Second, Wikipedia needs a sustainable financial model. Currently, it relies on small donations from millions of people and occasional major grants. This approach has worked for 25 years. But it's precarious. If donation revenue declined, the organization would struggle. One solution is to negotiate with AI companies and other organizations that benefit from Wikipedia's content. Another is to develop fee-based services for researchers, organizations, and enterprises that need API access or specialized tools.

Third, Wikipedia needs to find ways to coexist with AI rather than compete with it. This might mean developing structured data that AI systems can use more effectively. It might mean requiring AI companies to provide clear attribution and links back to Wikipedia. It might mean creating research partnerships that support Wikipedia's mission while also supporting AI development.

Fourth, the organization needs to address the political environment while preserving neutrality. This is genuinely difficult. You can't satisfy people who have decided in advance that you're biased. But you can communicate clearly about your processes, involve diverse perspectives in governance, and demonstrate transparency. You can show that Wikipedia's decisions are made by editors following established rules, not by a shadowy elite.

Fifth, Wikipedia needs to expand in languages and regions that are currently underrepresented. The English Wikipedia is relatively strong, but the Hindi, Arabic, Spanish, and Chinese Wikipedias are much smaller. As the internet becomes more global and non-English speakers become a larger share of its users, Wikipedia's inability to serve those communities effectively becomes a bigger problem.

The Ideological Context: From Optimism to Skepticism

Wikipedia's current crisis reflects a broader historical shift. In the early 2000s, the internet seemed full of democratic potential. Blogs would democratize publishing. Wikipedia would democratize knowledge. Open-source software would democratize software development. Forums and message boards would create horizontal communities. The technology optimists believed that decentralized, user-powered systems would outcompete centralized, corporate ones.

What actually happened was more complicated. Some decentralized systems thrived (open-source software, Wikipedia). Others got captured by corporations (social media). Others became wastelands of abuse and misinformation (forums, message boards). The idealism of the early internet collided with the realities of human nature, economic incentives, and political power.

Wikipedia has survived longer than most idealistic internet projects. But its survival is increasingly threatened by the world that emerged after the 2008 financial crisis, the rise of social media, the concentration of tech wealth, and the fragmentation of shared information spaces.

Younger people don't remember a time when Wikipedia felt revolutionary. For them, it's just another website. The volunteers who built Wikipedia in the 2000s are older now, often busy with career and family obligations. The energy that created Wikipedia was the energy of young people in a young internet. Recapturing that energy is nearly impossible.

Geopolitical Dimensions: Wikipedia as a Soft Power Battleground

Wikipedia is increasingly becoming a geopolitical battleground. Authoritarian governments recognize that the platform is a way for their citizens to access information the state would prefer to suppress. Democratic governments are concerned about Wikipedia becoming a vector for misinformation. Tech companies see Wikipedia as both an asset (for training) and a competitor (for information access).

China's blocking of Wikipedia is straightforward authoritarianism. The government doesn't want its citizens accessing information about human rights abuses or political dissidents. Russia has intermittently blocked Wikipedia or pressured it to remove articles about the war in Ukraine. Saudi Arabia imprisons editors who document human rights violations.

But the threats from democratic governments are more subtle. The UK's proposed age-gating requirement would make Wikipedia less accessible to young people and would require identity verification. The EU's digital regulations might impose compliance burdens that are difficult for a nonprofit to manage. The U.S. political attacks don't yet involve direct legal pressure, but the threat is implicit.

Wikipedia's strength is that it's been able to maintain a position of rough independence from all these powers. It's not owned by any government or corporation. Its hosting infrastructure, while dependent on commercial data centers, isn't directly controlled by any nation. Its volunteer base spans the globe. But this independence is increasingly fragile.

The Role of Wikimedia Foundation Leadership

The appointment of Bernadette Meehan as CEO was a deliberate strategic choice. The Foundation recognized that Wikipedia's challenges require diplomatic skills as much as technical expertise. Meehan's background in foreign service and ambassadorial roles signals that the organization is shifting toward a model where engaging with governments, corporations, and international bodies is central to the mission.

This approach has both promise and risks. The promise is that experienced diplomacy might allow Wikipedia to negotiate better terms with tech companies, resist government pressure, and find sustainable funding. The risk is that diplomatic compromise might require concessions on Wikipedia's principles.

For example, if the UK's age-gating proposal becomes law, should Wikipedia comply or resist? Compliance means losing some readers, but it avoids legal conflict. Resistance might be more principled but could result in legal battles the nonprofit might lose.

Similarly, if the U.S. government decides Wikipedia is politically biased and threatens consequences, how should the Foundation respond? The diplomatic option would be to engage with government officials, try to educate them about Wikipedia's processes, and perhaps make some procedural changes to address concerns. The principled option would be to refuse to change content or processes in response to political pressure.

These aren't easy decisions. Diplomacy requires flexibility. Principles require firmness. Finding the right balance is the central challenge of Meehan's leadership.

What Wikipedia's Crisis Reveals About the Internet

Wikipedia's current moment reveals something important about how the internet has evolved. The early vision was of a network that would decentralize power, democratize information, and level the playing field. What actually emerged is a new form of concentration.

A handful of tech companies (Google, Meta, Amazon, Microsoft, Apple) control most of the internet infrastructure and traffic. A handful of AI companies are racing to build the next layer of information mediation. Meanwhile, the institutions that might counterbalance this power—like Wikipedia, like local news, like libraries—are struggling.

Wikipedia's survival matters because it's one of the few genuinely independent information institutions left. If Wikipedia were to collapse, the internet would become even more dominated by commercial platforms and corporate interests. There would be no major free reference work, no noncommercial space where information is organized by actual human knowledge rather than engagement metrics.

From this perspective, Wikipedia's challenges are urgent not just for the organization itself but for the health of the internet as a whole. Whether society values free, reliable information enough to support it is an open question.

Potential Solutions: Building Wikipedia's Future

Wikipedia isn't without options for addressing its challenges. Several paths forward exist, though none are easy.

Revenue diversification: The Foundation could expand beyond donations. Offering API access to organizations, creating premium research tools, developing partnerships with universities and libraries—these could generate revenue while serving Wikipedia's mission. The key is ensuring that any commercial activity doesn't compromise Wikipedia's core principle of free access.

Technology improvements: Better editing interfaces, AI-assisted article writing, improved mobile experiences, and community tools could lower barriers to contribution and make editing more rewarding. Wikipedia has experimented with these, but more aggressive modernization is needed.

Governance reform: Expanding Wikipedia's governance structure to include more diverse voices, from different countries, languages, and backgrounds, could address concerns about bias and make the project more representative of global knowledge.

Content strategy: Rather than trying to cover everything, Wikipedia could focus on becoming exceptionally good at certain domains. Being the best source for scientific topics, historical events, and biographical information might be more sustainable than trying to be equally comprehensive across all subjects.

Partnerships: Formal partnerships with universities, libraries, archives, and research institutions could provide resources, expertise, and legitimacy while extending Wikipedia's reach and credibility.

Legal and policy advocacy: Rather than just defending against attacks, Wikipedia could proactively advocate for policies that support free and open knowledge. This might include supporting net neutrality, resisting government censorship, and promoting open licensing.

The Future of Knowledge in an AI Era

Ultimately, Wikipedia's fate is tied to a broader question: What will the future of knowledge look like when AI can generate, synthesize, and present information more quickly than humans can?

One possibility is that human knowledge production becomes a niche, artisanal activity. Humans write thoughtful essays and books for people willing to pay for high-quality information. AI provides free, fast, shallow summaries for the masses. This would be a loss for knowledge quality and access.

Another possibility is that human knowledge production remains central but is mediated through commercial platforms. Google, OpenAI, and others would essentially hire people to write and edit content, then present that content through AI interfaces. This preserves human knowledge work but removes it from the commons.

A third possibility is that Wikipedia or something like it survives and adapts, continuing to be a space where humans collaborate to create reliable, free, publicly-owned knowledge. This would require significant changes but isn't impossible.

Wikipedia's next 25 years will be defined by whether it can navigate these possibilities. That will require financial sustainability, volunteer engagement, technological adaptation, and political skill. It will require treating Wikipedia not as a historical artifact but as a living project that must evolve or fade.

The stakes are higher than Wikipedia itself. They're about whether human knowledge remains a public good, owned by no one, accessible to everyone, or whether it becomes another commodity captured by corporate interests.

The Personal Stories Behind Wikipedia

Behind the statistics and structural challenges are millions of individual stories. There are the volunteers who have spent 15 years contributing to Wikipedia with almost no recognition. There are the teenagers in developing countries who discovered Wikipedia, found information they couldn't get anywhere else, and had their horizons expanded. There are the journalists, researchers, and students who have relied on Wikipedia to do their work better.

There's also the story of Wikipedia's co-founder Jimmy Wales, who had a vision for free knowledge and helped build an organization that has served billions of people. And the story of the Wikimedia Foundation staff and volunteers who have dedicated their careers to maintaining and improving the project despite constant challenges.

Wikipedia is ultimately a human project about human knowledge. It works because people care about accuracy, about attribution, about making information available to others. If those values disappear, if the volunteers give up, if society decides that Wikipedia isn't worth supporting, then Wikipedia will die. Not with a dramatic crash, but with a gradual fade as articles go unmaintained and the community ages out.

The question facing Wikipedia today is whether enough people still believe in those values enough to keep the project alive and relevant for another 25 years.
