Informed Crowds Can Effectively Identify Misinformation

08/17/2021
by Paul Resnick, et al.

Can crowd workers be trusted to judge whether news-like articles circulating on the Internet are wildly misleading, or do partisanship and inexperience get in the way? We assembled pools of both liberal and conservative crowd raters and tested three ways of asking them to make judgments about 374 articles. In a no research condition, they were simply asked to view the article and then render a judgment. In an individual research condition, they were also asked to search for corroborating evidence and provide a link to the best evidence they found. In a collective research condition, they were not asked to search, but instead to review links collected from workers in the individual research condition. The individual research condition reduced the partisanship of judgments. Moreover, the judgments of a panel of sixteen or more crowd workers were better than those of a panel of three expert journalists, as measured by alignment with a held-out journalist's ratings. Without research, the crowd judgments were better than those of a single journalist, but not as good as the average of two journalists' ratings.
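To make the evaluation concrete, here is a minimal sketch of how one might compare a crowd panel's aggregated judgments against a held-out journalist's ratings. It assumes numeric misleadingness scores and uses Pearson correlation as the alignment measure; the abstract does not specify the paper's exact metric, and the function names and toy data below are hypothetical illustrations, not the authors' code.

```python
import random
from statistics import correlation, mean  # correlation() requires Python 3.10+

def panel_alignment(crowd_ratings, holdout_ratings, panel_size, trials=1000):
    """Estimate how well the average rating of a random crowd panel of
    `panel_size` raters tracks a held-out journalist's ratings.

    crowd_ratings: per-article lists of numeric ratings, one per crowd rater
    holdout_ratings: one numeric rating per article from the held-out journalist
    Returns the mean Pearson correlation across resampled panels.
    """
    scores = []
    for _ in range(trials):
        # For each article, draw a random panel and average its ratings.
        panel_means = [
            mean(random.sample(raters, panel_size)) for raters in crowd_ratings
        ]
        scores.append(correlation(panel_means, holdout_ratings))
    return mean(scores)

# Hypothetical toy data: 5 articles rated by 20 crowd workers each on a
# 1-7 misleadingness scale, plus one held-out journalist's ratings.
random.seed(0)
crowd = [[random.randint(1, 7) for _ in range(20)] for _ in range(5)]
journalist = [2, 5, 6, 1, 4]
print(panel_alignment(crowd, journalist, panel_size=16))
```

Under this setup, sweeping `panel_size` from small to large would show how alignment with the held-out journalist improves with panel size, which is the kind of comparison behind the sixteen-or-more finding reported above.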
