Wikipedia is at risk from artificial intelligence


The community of volunteers behind the online encyclopedia Wikipedia is divided on how to respond to the tide of AI-generated content and misinformation.

As generative AI reaches more and more aspects of culture, Wikipedia's editors are divided on how it should be used.

In a recent community call hosted by the Wikimedia Foundation, the nonprofit that maintains the world's largest open encyclopedia, participants disagreed over whether large language models should be used to generate content.

Artificial intelligence's ability to fabricate
While some believe that tools like ChatGPT can help draft and summarize articles, others remain cautious. The concern is that machine-generated content must be balanced with human review, and that lesser-known wikis could be flooded with bad content that overwhelms their volunteers.

While generative AI tools are good at producing human-sounding text, that text often contains misinformation and even cites non-existent academic papers and articles.

This often results in summaries that appear accurate but, on closer inspection, turn out to be completely fabricated.

Amy Bruckman, professor and senior associate dean for interactive computing at Georgia Tech, says large language models are only as good as their ability to distinguish fact from fiction.

“The only way we can do this is with [large language models], but edit it and have people verify the sources,” Bruckman, who wrote a book about Wikipedia, told Motherboard.

Researchers have quickly found that OpenAI's ChatGPT makes for an unreliable research tool, which is bad news for students who rely on the chatbot to write their papers.

Sometimes it invents articles and their authors. Other times it attributes the work of lesser-known scholars to more prominent ones, inflating the apparent credibility of the content.

OpenAI itself has said the model “hallucinates” when it invents facts, a framing that AI experts have criticized as a way of deflecting blame from its tools for spreading misinformation.

“The risk for Wikipedia is that people could lower the quality by adding unchecked content. I see nothing wrong with using it as a first draft, but everything needs to be verified,” Bruckman added.
