Lorie Wimble is the "Liberal Voice" of Conservative Haven, a political blog, and has 2 astounding children. Find her on Twitter.

Wikipedia is using artificial intelligence to attract more contributors


Although a handful of restrictions have been put in place over the years to reduce online vandalism, people are still more or less free to create and modify Wikipedia pages however they please. While this openness is one of Wikipedia's defining characteristics and greatest strengths, it also opens the door to misinformation and vandalism, and even with a small army of dedicated editors, these remain a massive problem for the website. To make matters worse, Wikipedia's efforts to crack down on that kind of thing have cost it a significant chunk of its active contributors, since a simple mistake in an article can cause hours of work and writing to be automatically deleted by editors. To remedy this, Wikipedia has decided to use the power of artificial intelligence to make itself more contributor-friendly and efficient.

Software trained to know the difference between an honest mistake and intentional vandalism is being rolled out in an effort to make editing Wikipedia less psychologically bruising. It was developed by the Wikimedia Foundation, the nonprofit organization that supports Wikipedia.

One motivation for the project is a significant decline in the number of people considered active contributors to the flagship English-language Wikipedia: that figure has fallen by 40 percent over the past eight years, to about 30,000. Research indicates that the problem is rooted in Wikipedians' complex bureaucracy and their often hard-line responses to newcomers' mistakes, enabled by semi-automated tools that make deleting new changes easy.

Aaron Halfaker, a senior research scientist at the Wikimedia Foundation who helped diagnose that problem, is now leading a project trying to fight it, one that relies on algorithms with a sense for human fallibility. His ORES system, short for "Objective Revision Evaluation Service," can be trained to score the quality of new changes to Wikipedia and judge whether an edit was made in good faith. Halfaker invented ORES in hopes of improving the tools that help Wikipedia editors by surfacing recent edits and making it easy to undo them with a single click. Those tools were invented to meet a genuine need for better quality control after Wikipedia became popular, but an unintended consequence is that new editors can find their first contributions wiped out without explanation because they unwittingly broke one of Wikipedia's many rules.
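To make the idea concrete, scoring an edit for "good faith" can be sketched as a classifier over simple features of the change and the editor. The features, word list, and weights below are illustrative assumptions for this article, not ORES's actual model, which is trained on edits labeled by Wikipedians:

```python
# Toy sketch of scoring an edit as "good faith" vs. "likely damaging".
# All features and weights here are hypothetical; the real ORES service
# uses machine-learned models trained on labeled Wikipedia edits.

def score_edit(added_text: str, editor_is_registered: bool,
               editor_edit_count: int) -> float:
    """Return a rough probability that the edit was made in good faith."""
    score = 0.5  # neutral prior

    # Crude vandalism signals: profanity-like tokens and all-caps shouting.
    bad_tokens = {"!!!", "lol", "stupid"}  # hypothetical word list
    words = added_text.lower().split()
    if any(tok in words for tok in bad_tokens):
        score -= 0.3
    if added_text.isupper() and len(added_text) > 20:
        score -= 0.2

    # Experienced, registered editors are more likely acting in good faith.
    if editor_is_registered:
        score += 0.1
    if editor_edit_count > 100:
        score += 0.2

    # Clamp to a valid probability range.
    return max(0.0, min(1.0, score))
```

A review tool could then surface only edits scoring below some threshold for human attention, rather than presenting every new change as a candidate for one-click reversion. That triage step is what lets an algorithm distinguish a newcomer's honest mistake from deliberate damage.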


