Characterizing Abhorrent, Misinformative, and Mistargeted Content on YouTube

05/20/2021
by Kostantinos Papadamou et al.

YouTube has revolutionized the way people discover and consume video. Although the platform provides easy access to a wealth of well-produced and trustworthy videos, abhorrent, misinformative, and mistargeted content is also common. YouTube is plagued by several types of problematic content: 1) disturbing videos targeting young children; 2) hateful and misogynistic content; and 3) pseudoscientific misinformation. While YouTube's recommendation algorithm plays a vital role in increasing user engagement and the platform's monetization, its role in unwittingly promoting problematic content is not entirely understood. In this thesis, we shed light on the degree of problematic content on YouTube and on the role of the recommendation algorithm in the dissemination of such content. Following a data-driven quantitative approach, we analyze thousands of YouTube videos to investigate: 1) the risks of YouTube media consumption by young children; 2) the role of the recommendation algorithm in the dissemination of misogynistic content, focusing on the Involuntary Celibate (Incel) community; and 3) user exposure to pseudoscientific content on various parts of the platform, and how this exposure changes based on the user's watch history. Our analysis reveals that young children are likely to encounter disturbing content when they randomly browse the platform. By analyzing the Incel community on YouTube, we find that Incel activity is increasing over time and that platforms may play an active role in steering users towards extreme content. Finally, when studying pseudoscientific misinformation, we find that YouTube suggests more pseudoscientific content for traditional pseudoscientific topics (e.g., flat earth) than for emerging ones (e.g., COVID-19), and that these recommendations are more common on the search results page than on a user's homepage or in the video recommendations section.


Related research

"It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations (10/22/2020)
Deradicalizing YouTube: Characterization, Detection, and Personalization of Religiously Intolerant Arabic Videos (06/30/2022)
Understanding the Incel Community on YouTube (01/22/2020)
An Audit of Misinformation Filter Bubbles on YouTube: Bubble Bursting and Recent Behavior Changes (03/25/2022)
Algorithmic Extremism: Examining YouTube's Rabbit Hole of Radicalization (12/24/2019)
No Video Left Behind: A Utility-Preserving Obfuscation Approach for YouTube Recommendations (10/14/2022)
Auditing the Biases Enacted by YouTube for Political Topics in Germany (07/21/2021)
