At a moment when it feels like the dream of the internet is fading, Wikipedia remains a pleasant outlier: an actual website with free, accessible information, and a reprieve from the algorithmic, profit-oriented world of social media. While the organization hasn’t been entirely immune to controversy—earlier this year, the Wikimedia Foundation drew backlash from editors after placing AI-generated summaries at the top of articles—Wikipedia is making an effort to keep itself free from the slop that has begun to congest so much of the internet.
A new article by Emma Roth, published in The Verge, details some of the methods the site’s editors use to identify low-quality AI-generated content. Per Roth, three of the main signs editors check for, though not the only ones, are:
- “Writing directed toward the user, such as ‘Here is your Wikipedia article on…,’ or ‘I hope that helps!’”
- “‘Nonsensical’ citations, including those with incorrect references to authors or publications.”
- “Non-existent references, like dead links, ISBNs with invalid checksums, or unresolvable DOIs.”
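To give a sense of what an “invalid checksum” means in practice: ISBN-13 numbers carry a built-in check digit, so a fabricated ISBN will usually fail a simple arithmetic test. Here is a minimal Python sketch of that check; the function name and example numbers are illustrative, not anything Wikipedia’s editors actually run, and a real verification pass would also need to cover ISBN-10 and things like unresolvable DOIs.

```python
def isbn13_checksum_ok(isbn: str) -> bool:
    """Return True if a 13-digit ISBN passes its checksum.

    ISBN-13 rule: digits are weighted 1, 3, 1, 3, ... and the
    weighted sum (including the final check digit) must be
    divisible by 10.
    """
    digits = [c for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return total % 10 == 0


# A real ISBN passes; a made-up one usually does not.
print(isbn13_checksum_ok("978-0-306-40615-7"))  # True
print(isbn13_checksum_ok("978-1-234-56789-0"))  # False (bad check digit)
```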
Other details can also be tells that something was artificially generated, like the now-familiar overuse of em-dashes and promotional language.
Roth also cites a recent article from 404 Media about how Wikipedia has adopted a policy that lets editors quickly remove articles they determine to be poorly generated AI content; previously, an article flagged for deletion would enter a seven-day discussion period in which the community decided whether it should be removed. But with one editor describing being “flooded non-stop with horrendous drafts,” the group has had to act much faster. It’s a real problem, but at least Wikipedia is taking it seriously and doing its best to keep at least one place on the internet from getting too sloppy. You can check out the whole article here.