If YouTube’s algorithms radicalize people, it’s hard to tell from the data

Image: Rioters attacking police at the US Capitol. YouTube's recommendation algorithm probably didn't send them to Washington, DC. (credit: Brent Stirton / Getty Images)

We've all seen it happen: Watch one video on YouTube and your recommendations shift, as if Google's algorithms think the video's subject is your life's passion. Suddenly, all the recommended videos—and probably many ads—you're presented with are on the topic.

Mostly, the results are comical. But there has been a steady stream of stories about how the process has radicalized people, sending them down an ever-deepening rabbit hole until all their viewing is dominated by fringe ideas and conspiracy theories.

A new study released on Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can't rule out the existence of online radicalization, it does suggest that radicalization is far from the typical experience. Instead, fringe content appears to circulate within a larger, self-reinforcing community.
