How AI-generated content is upping the workload for Wikipedia editors

As AI-generated slop takes over increasing swathes of the user-generated Internet thanks to the rise of large language models (LLMs) like OpenAI’s GPT, spare a thought for Wikipedia editors. In addition to their usual job of grubbing out bad human edits, they’re having to spend an increasing proportion of their time trying to weed out AI filler.

404 Media has talked to Ilyas Lebleu, an editor at the crowdsourced encyclopedia, who was involved in founding the “WikiProject AI Cleanup” project. The group is trying to come up with best practices to detect machine-generated contributions. (And no, before you ask, AI is useless for this.)

A particular problem with AI-generated content in this context is that it’s almost always improperly sourced. The ability of LLMs to instantly produce reams of plausible-sounding text has even led to whole fake entries being uploaded in a bid to sneak hoaxes past Wikipedia’s human experts.
