
MIT develops AI tech to edit outdated Wikipedia articles


dufus


The technology uses machine learning to automatically spot and rewrite outdated information, with the goal of aiding human editors

 


 

Artificial intelligence could be used to rewrite outdated Wikipedia articles, reducing the workload for human editors, thanks to a system developed by researchers at the Massachusetts Institute of Technology (MIT).

 

Bots have been used to edit Wikipedia in the past, and while AI systems are now capable of generating text and checking facts, they are not always reliable and often struggle to mimic the tone of human writers.

 

By training its system on two datasets, one containing pairs of sentences and another matching claims to relevant Wikipedia sentences, the researchers at MIT produced an AI that can find outdated text in Wikipedia pages and rewrite the affected passages in a human-like fashion.

 

The AI system works by taking new or updated information typed into its user interface as an unstructured sentence, one written without regard to grammar or style. The algorithm then finds the relevant Wikipedia entry, pinpoints the section the updated information refers to, and rewrites the sentence to incorporate the amended details in a human-like fashion.
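As a rough illustration of the retrieval step only (not the MIT model itself, which uses trained neural networks), the "pinpoint the sentence being updated" part can be sketched with a simple word-overlap heuristic. The toy article and claim below are invented for the example:

```python
import re

def tokens(text):
    """Lowercase word tokens, used for a crude overlap score."""
    return set(re.findall(r"[a-z']+", text.lower()))

def find_target_sentence(article_sentences, claim):
    """Pick the article sentence sharing the most words with the claim."""
    return max(article_sentences, key=lambda s: len(tokens(s) & tokens(claim)))

# Toy article and an unstructured update claim.
article = [
    "The bridge opened in 1990.",
    "It carries about 40,000 vehicles per day.",
]
claim = "bridge carries 55,000 vehicles per day"

target = find_target_sentence(article, claim)
print(target)  # -> "It carries about 40,000 vehicles per day."
```

A real system would then fuse the claim into the retrieved sentence; here the overlap score merely locates which sentence needs rewriting.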

 

The idea is that the AI removes the need for human editors to laboriously search through and amend Wikipedia entries.

 

“There are so many updates constantly needed to Wikipedia articles. It would be beneficial to automatically modify exact portions of the articles, with little to no human intervention,” says Darsh Shah, a PhD student in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and one of the lead authors of a paper detailing the AI research. “Instead of hundreds of people working on modifying each Wikipedia article, then you’ll only need a few, because the model is helping or doing it automatically. That offers dramatic improvements in efficiency.”

 

The AI can also help identify so-called 'fake news', as it has been trained to distinguish legitimate additions to a Wikipedia entry from information that is factually incorrect. The system seeks out evidence and other information to ascertain whether such a claim is true or false.
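A minimal sketch of that kind of claim verification, using plain word overlap rather than the trained model the researchers describe (the threshold, function names, and example sentences are all invented for illustration):

```python
import re

def words(text):
    """Lowercase word and number tokens."""
    return set(re.findall(r"[a-z']+|\d+", text.lower()))

def is_supported(claim, evidence_sentences, threshold=0.8):
    """Crude verification: a claim counts as supported if some evidence
    sentence covers at least `threshold` of the claim's tokens.
    Numbers count as tokens, so a changed figure breaks support."""
    claim_words = words(claim)
    return any(
        len(claim_words & words(sentence)) / len(claim_words) >= threshold
        for sentence in evidence_sentences
    )

evidence = ["The company reported revenue of 2 billion dollars in 2019."]
print(is_supported("revenue was 2 billion dollars in 2019", evidence))  # True
print(is_supported("revenue was 5 billion dollars in 2019", evidence))  # False
```

The real system compares a claim against retrieved evidence with a neural model; the overlap check above only conveys the shape of the decision, not its accuracy.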

 

The system is not ready for full use on Wikipedia just yet: human editors scored it four out of five for factual accuracy and three-and-a-half out of five for grammatical accuracy. As such, it outperforms other AI tools but isn’t quite able to mimic human editors just yet.

 

sauce

Just now, dufus said:

isn’t quite able to mimic human editors just yet. 

 

lie

