Wikipedia has a problem. And the company not so long ago renamed from Facebook may have the answer.

Let's back up. Wikipedia is one of the largest-scale collaborative projects in human history, with more than 100,000 volunteer editors contributing to the construction and maintenance of a massive encyclopedia of millions of articles. Over 17,000 new articles are added every month, while tweaks and additions to existing articles happen constantly. The most popular articles have been edited thousands of times.

The challenge is accuracy. The encyclopedia's very existence shows that large numbers of humans can come together to create something positive, but for its articles to be useful, they need to be grounded in facts. This is where citations come in. The idea is that users and editors alike can confirm facts by adding or clicking hyperlinks that trace a statement back to its source.

Citation needed

Say I want to confirm that President Barack Obama traveled to Europe and then Africa in 1988, where he met many of his paternal relatives for the first time. I just have to look at the citations for that sentence and see whether the referenced books and articles back it up.

Without a citation, there's no evidence the author didn't simply conjure the words out of the digital ether, which is why "citation needed" is the most damning phrase on all of Wikipedia. Reading a claim with that tag attached is like listening to someone recite a fact while making finger quotes in the air.

The Wikipedia logo on a pink background.

But citations alone don't tell us everything. I could write that I was the 23rd highest-earning tech journalist in the world last year and that I once gave up a lucrative modeling career to write for Digital Trends, and both claims might sound plausible enough.

Unless you actually clicked the hyperlinks and noticed that they don't support my claims at all, but instead lead to unrelated pages on Digital Trends, you, like the majority of readers who have never met me, would leave this article with some very false impressions. In a world of information overload, the mere existence of a citation can pass for a factual endorsement.

Meta wades in

But what if editors add citations that don't actually link to pages supporting the claims? One recent example: a Wikipedia article noted that Blackfeet Tribe member Joe Hipp was the first Native American boxer to challenge for a world title, and linked to a website as its source. The cited page made no mention of boxing or of Joe Hipp at all.

In the Hipp case, the fact itself was accurate, even if the citation was not appropriate. But it's easy to see how the same trick could be used to spread misinformation.

Mark Zuckerberg introduces Facebook's new name, Meta.

Meta believes it has a way to help. Meta AI, the social media giant's research and development lab, has developed a machine learning model that can automatically check whether a citation actually supports a claim. It could prove to be among the most impressive bots yet put to work on the encyclopedia.

"I think we were driven by curiosity at the end of the day." What was the limit of this technology? We didn't know if this artificial intelligence could do anything meaningful in this situation. No one had done something like that before.

Understanding meaning

Meta's new tool effectively analyzes the content a citation links to and cross-references it with the claim it is supposed to support. And this isn't just a simple text-string comparison.

Petroni said one component looks at the semantic similarity between the claim and the source. "We built an index of all these websites by chunking them into passages and giving an accurate representation for each passage," he explained. "Two chunks of text with the same meaning will be represented in a very close position in the space where all these passages are stored."
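Meta hasn't shared drop-in code for that component, but the idea Petroni describes, embedding passages so that chunks with similar meaning sit close together in a vector space, can be sketched with an off-the-shelf sentence-embedding library. Everything below (the model choice, the example claim, and the candidate passages) is an illustrative assumption rather than part of Meta's actual system.

```python
# A minimal sketch of embedding-based claim/passage matching, assuming an
# off-the-shelf sentence-embedding model rather than Meta's retrieval stack.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

claim = "Joe Hipp was the first Native American boxer to challenge for a world title."

# Passages "chunked" from candidate source pages (made-up examples).
passages = [
    "Hipp, a member of the Blackfeet Tribe, became the first Native American "
    "to challenge for a world heavyweight boxing title.",
    "The museum's new exhibit focuses on traditional Blackfeet beadwork.",
    "Local weather this weekend is expected to be mild with light rain.",
]

# Encode the claim and the passages into the same vector space.
claim_vec = model.encode(claim, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

# Passages whose meaning is close to the claim get the highest cosine scores,
# even when they share few exact words with it.
scores = util.cos_sim(claim_vec, passage_vecs)[0]
for passage, score in sorted(zip(passages, scores.tolist()), key=lambda p: -p[1]):
    print(f"{score:.2f}  {passage}")
```

In a production system, those passage vectors would live in a nearest-neighbor index built over a web-scale corpus, so a claim could be matched against billions of chunks rather than three.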

A single-pane comic from xkcd about Wikipedia citations.
xkcd

Beyond flagging weak citations, the tool can also suggest better references, surfacing the sources that would best support a given point. Petroni is wary of the comparison to a factual spellcheck, flagging problems and suggesting fixes, but that's an easy way to think about what it could one day do.
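To make the "factual spellcheck" analogy concrete, here is one way such a check-and-suggest step could be wired up on top of the similarity scoring sketched above. The function name, threshold, and overall flow are hypothetical illustrations, not a description of Meta's tool.

```python
# Hypothetical "factual spellcheck" wrapper: verify the current citation and,
# if it looks weak, suggest the best-matching passage from an index instead.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice


def suggest_citation(claim, cited_passage, indexed_passages, threshold=0.5):
    """Return the cited passage if it plausibly supports the claim,
    otherwise the closest-matching passage from the index."""
    claim_vec = model.encode(claim, convert_to_tensor=True)
    cited_vec = model.encode(cited_passage, convert_to_tensor=True)
    if util.cos_sim(claim_vec, cited_vec).item() >= threshold:
        return cited_passage  # the existing citation looks fine
    # Otherwise rank all indexed passages and propose the best one.
    index_vecs = model.encode(indexed_passages, convert_to_tensor=True)
    scores = util.cos_sim(claim_vec, index_vecs)[0]
    return indexed_passages[int(scores.argmax())]
```

The hard parts, of course, are choosing the threshold and building the index; the sketch only shows the shape of the check.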

There's still plenty of work to be done before it reaches that point, though. Petroni describes the current system as a proof of concept that isn't yet usable: for it to work in practice, it would need a fresh index covering far more data than the team currently has, and that index would need to be updated every day as new information appears online.

In principle, that could include multimedia as well. The system might one day direct users to a great documentary available online when the evidence backing a claim is hidden there rather than on a text webpage.

A question of quality

Other challenges remain, too. One of the trickiest is independently grading the quality of sources, a notoriously difficult area. Would a passing reference to a subject in The New York Times be a more suitable, high-quality citation than a more comprehensive treatment in a lesser-known source? Should a mainstream publication rank more highly than a non-mainstream one?

One rough heuristic, familiar from web search, holds that a high-quality source is one that attracts a large number of incoming links from other high-quality sources. At present, there is nothing like this in Meta's AI.
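For a sense of how that link-based heuristic works, here is a toy version of the idea, iteratively passing each site's score along its outgoing links so that sites endorsed by well-regarded sites end up with high scores. The site names and link graph are invented, and nothing suggests Meta's system works this way; it simply illustrates the kind of signal the tool currently lacks.

```python
# A toy PageRank-style score: a site that other well-scored sites link to
# scores highly itself. The link graph below is entirely made up.
links = {
    "reliable-news.example": ["archive.example", "journal.example"],
    "journal.example":       ["reliable-news.example"],
    "archive.example":       ["reliable-news.example", "journal.example"],
    "fringe-blog.example":   ["reliable-news.example"],
}

damping = 0.85
scores = {site: 1.0 / len(links) for site in links}

for _ in range(50):  # power iteration until the scores settle
    new_scores = {site: (1 - damping) / len(links) for site in links}
    for site, outgoing in links.items():
        share = damping * scores[site] / len(outgoing)
        for target in outgoing:
            new_scores[target] += share
    scores = new_scores

for site, score in sorted(scores.items(), key=lambda x: -x[1]):
    print(f"{score:.3f}  {site}")
```

In this toy graph, the site nobody links to ends up with the lowest score, which is the intuition behind treating incoming links as endorsements of quality.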

For the system to be truly effective, it would probably need something of the sort. Imagine setting out to "prove" the most reprehensible opinion you can think of for inclusion on a Wikipedia page: if the only test of whether something is true is whether similar sentiments can be found published somewhere else online, then virtually any claim could technically be "proven" correct.

Petroni said the team is interested in trying to model the trustworthiness of a source. Wikipedia already keeps lists of websites it considers trustworthy and untrustworthy, and it would be great, he suggested, to find an efficient way to promote the reliable ones.
