Wikipedia had to fight to establish its legitimacy—and now it faces a new existential threat posed by generative AI

By Meghan Bartels, edited by Eric Sullivan
Ian Ramjohn remembers the first time he edited Wikipedia. It was 2004, when the site was just three years old, and its information about the government of his home nation of Trinidad and Tobago was a decade out of date. But with little more than his Internet connection, he corrected the page in minutes. “That was huge,” he says. “I got hooked pretty much right away.”
Ramjohn is an ecologist by training, and for more than a decade, he has worked at the nonprofit organization Wiki Education, a spinoff of the Wikimedia Foundation, another nonprofit that hosts the site. For a decade before that, even while he taught in several adjunct professor roles in the U.S., Ramjohn was also a dedicated Wikipedian, as the site’s editors are known, editing articles on Trinidadian history, as well as topics such as figs and palms.
He started editing early enough to have watched Wikipedia’s credibility evolve. It began as a site that was strongest in niche topics—pop culture and tech were overindexed—only to grow into the Internet’s first stop for background on an enormous range of subjects, science included. Whether you want a list of microorganisms that have been exposed to the vacuum of space, a description of every bone in the human body or a guide to the mountains of Jupiter’s supervolcanic moon Io—for many readers, it’s still the quickest way to catch up on a topic.
But now, 25 years after Wikipedia’s founding, it is losing visitors. The Wikimedia Foundation has reported that human page views fell about 8 percent over certain months in 2025 compared with 2024. An external analysis of data from the company Similarweb conducted by the consulting firm Kepios found that the site’s average monthly visits fell by more than one billion between 2022 and 2025. Wikimedia and outside researchers have argued that artificial-intelligence-powered searches play a part because these searches reduce the number of people who click through to source sites. Wikipedia’s trust infrastructure—which includes citations, edit histories, talk pages and enforceable policies—was shaped in part by disputes among editors and visitors to the site over the coverage of evolution, climate change and health topics. What happens when a reference site’s content still matters but fewer people visit, cite or edit it?
Beloved and Reviled
Wikipedia launched on January 15, 2001. The Y2K panic was in the past, Save the Last Dance ruled the box office, and the site’s co-founder Jimmy Wales posted the first home page of what he envisioned would become a fast, free crowdsourced encyclopedia. The experiment grew quickly: By 2003, the English-language site had 100,000 articles. And by 2005, Wikipedia was the Internet’s most popular reference site.

In its earliest days, the website relied on people to write about what they knew, which resulted in contributors citing weak sources, including personal blogs. As readership and editing surged, Wikipedians no longer all knew one another. As the editor ranks swelled, the site codified guidelines such as prioritizing neutrality and flagging unsupported claims for removal, Ramjohn says. Editors began enforcing sourcing more consistently. Early controversies over how Wikipedia presented topics such as evolution, compared with “intelligent design,” as well as debates over how to convey the science and denial of climate change, also led to stricter enforcement and, in some high-conflict cases, even page protection on articles, Ramjohn recounts.
But even as the site took steps to become more trustworthy and its popularity surged, many instructors, especially in higher education, continued to discourage using Wikipedia as a source. “There was always a lot of ‘Don’t use Wikipedia,’” says Diana Park, a science librarian at Oregon State University, who has co-taught a course about the website.
A central concern was skepticism about the website’s accuracy because of its open-editing policy. What later became clear is that the same feature could also weed out errors and address outright vandalism. “I think the part of Wikipedia that people don’t know is the peer review,” Park says. “Wikipedia isn’t this Wild West of information, where anyone can just put anything. There are people holding what is there accountable.”
P. D. Magnus, a philosopher of science at the University at Albany, State University of New York, learned this firsthand in 2008, when he ran an experiment inserting 36 factual errors into articles about prominent historical philosophers. More than a third were removed within 48 hours—a rate that held when he repeated the experiment 15 years later, in 2023. (Never fear—on both occasions, he removed any remaining inaccuracies after the 48-hour window.) Researchers have repeatedly evaluated Wikipedia’s accuracy, often finding it comparable to that of traditional encyclopedias.
“It’s not a source you would go to if your life were on the line,” Magnus says. “But it’s a perfectly good source for lots of information where there isn’t much of a stake in getting it right.”
An Educational Tool
As Wikipedia’s use grew, some educators softened their stance, encouraging students to use the site to find sources they could then dig into directly. Others took a different approach, assigning students to edit Wikipedia entries themselves—many through Wiki Education.
Jennifer Glass, a biogeochemist at Georgia Institute of Technology, is one of those professors; she has incorporated Wikipedia editing into her teaching since 2018. She wanted a student project that emphasized the concise and technical but understandable writing style that the site uses. And although she hadn’t done much editing for Wikipedia herself, she was impressed by the website’s breadth of content.
Each semester, her students write one article from scratch about a topic they research, from dolomitization to the tropopause. Glass says the project teaches them the value of institutional access to published literature and the skill of fact-checking their writing line by line.
In the full course on Wikipedia that Park has co-taught at Oregon State University, she has had a similar experience. “It’s always kind of a joy to see students take charge of something that they’ve been told for so long is wrong,” she says. Digging deep into Wikipedia—learning how it has come to be and how to present information on the site—teaches students how verifiability, talk pages and edit histories work, as well as how to trace claims back to primary sources. “It’s about being able to use it in the right situation and time,” Park says.
Another Wikipedia?
The ability to evaluate the quality of information, as well as the skills required to present accurate knowledge online, may matter even more now, 25 years into Wikipedia’s existence. The importance of information literacy has only grown with the rise of content generated by AI and large language models (LLMs), sparking new debates about reliability, accountability and correction.
Many tech companies have rapidly deployed this kind of AI, including Google, which has introduced AI-generated summaries atop some search results. The LLMs feeding this technology, trained on large datasets drawn from the web, produce fluent answers simply by predicting the next word, without providing support for each claim.
But as in the earliest days of Wikipedia, there is no public edit history for AI-generated text. Recall the experience that hooked Ramjohn on editing Wikipedia—the power to fix what was wrong on the Internet, no intermediary required. Users can’t correct an AI summary the way they can fix a sentence on Wikipedia.
In fact, Magnus attributes Wikipedia’s success to four factors: its community, its firm editing policies, the ability to review a page’s entire history and the noncommercial nature of the site. AI summarizers generally lack Wikipedia’s public accountability and transparent governance.
And Wikipedia’s supporters worry that, even after the site has weathered 25 years, generative AI may draw attention and traffic away from it. “Definitely LLMs are an existential threat to Wikipedia,” Ramjohn says. Fewer visitors mean fewer new editors for Wikipedia, and less frequent visits mean slower correction of errors added to the site—even as Wikipedians report that they have been struggling to keep up with a growing volume of AI-generated text. If fewer people visit and fewer people edit, the system that made Wikipedia self-correcting—and unusually resilient—could weaken.