Wikipedia Run By Robots? Researchers Develop AI That Updates Site Using ‘Humanlike’ Style, Grammar
by John Anderer, February 15, 2020
CAMBRIDGE, Mass. — Wikipedia, the world’s free online encyclopedia, is a true wonder of the internet. As vast a resource as it is on quite literally any subject one can imagine, its main knock is that it allows anyone to make edits and provide updates. That means articles sometimes show outdated or inaccurate information. Misinformation feels like it’s reaching a fever pitch in 2020, and in an effort to create a more accurate internet, researchers at MIT have developed an advanced AI system capable of automatically identifying and correcting inaccurate or outdated information in Wikipedia articles.
For now, Wikipedia articles are painstakingly reviewed and slowly corrected thanks to the efforts of human volunteers all over the world. The team at MIT says its system will be able to do these editors’ jobs faster and more efficiently, all while producing language largely similar to how a human would write or edit.
At least for now, some humans are still necessary for the new MIT system to work properly. In practice, a human would type rough but accurate information into an interface, without having to worry about forming full sentences or using proper grammar. The system would then search through Wikipedia, identify any inconsistencies, and rewrite the offending sentences to reflect the accurate information in a humanlike manner. In the future, the research team says it plans to improve the AI system to the point that no human input at all will be necessary.
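To make the pipeline concrete, here is a minimal toy sketch of its first step: given a rough, unstructured claim from a human, retrieve the article sentence most likely to need rewriting. This is purely illustrative and not the MIT system, which uses trained neural models for both retrieval and rewriting; the example names (`best_candidate`, the sample article) are invented, and word overlap stands in for learned relevance scoring.

```python
import re

def tokens(text):
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_candidate(claim, sentences):
    """Return the article sentence sharing the most words with the claim."""
    claim_words = tokens(claim)
    return max(sentences, key=lambda s: len(claim_words & tokens(s)))

article = [
    "The company was founded in 1998.",
    "Its headquarters are in Boston.",
    "It employs 2,000 people worldwide.",
]

# Rough human input: no full sentence, no grammar required.
claim = "employees 3,500 worldwide 2020"

print(best_candidate(claim, article))
# Flags the employment sentence as the one to rewrite.
```

A real system would then compare the claim against the flagged sentence, reason about where they contradict, and generate a fluent replacement; here that second stage is left out entirely.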
“There are so many updates constantly needed to Wikipedia
articles. It would be beneficial to automatically modify exact portions of the
articles, with little to no human intervention,” says Darsh Shah, a PhD student
in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and one
of the lead authors, in an MIT release. “Instead of hundreds of people working on
modifying each Wikipedia article, then you’ll only need a few, because the
model is helping or doing it automatically. That offers dramatic improvements
in efficiency.”
While this isn’t the first bot or system developed to update Wikipedia articles, it is by far the most advanced. Earlier systems merely do away with blatant vandalism on Wikipedia or add very limited bits and pieces of information into already formed sentences. This new system, however, provides a much more complex service: taking a piece of unstructured information, identifying where it is needed on Wikipedia, and writing a new sentence to fit.
“The other [bot] tasks are more rule-based, while this is a task
requiring reasoning over contradictory parts in two sentences and generating a
coherent piece of text,” Shah notes.
The system isn’t limited to Wikipedia, either; researchers also used it to reduce bias in a popular fact-checking database. In fact, the study states that the system is useful in identifying fake news, an ever-present problem in today’s online landscape. More specifically, by reducing bias, it can help train other AI systems designed to find and eliminate fake news.
Today, we all take websites like Wikipedia for granted, and it’s easy to forget that not long ago an information source like Wikipedia would have sounded like something out of a science fiction novel. Now it seems that, a few years down the road, an encyclopedia that updates itself very well may be reality.
The study is being presented at the AAAI Conference on
Artificial Intelligence.