YouTube reduces conspiracy videos but struggles to rein in climate change denial and more

YouTube has decreased the amount of conspiracy videos it recommends but still struggles to rein in content pushing climate change denial and ancient aliens, study says

  • Researchers say that YouTube has mitigated conspiracy videos on its platform
  • A study assessed 8 million videos over 15 months
  • Conspiracies are now recommended 40 percent less often than when the platform first announced its efforts to reduce them
  • They say efforts to help mitigate radicalization are still ongoing 

Researchers say that YouTube’s efforts to crack down on videos purveying conspiracy theories haven’t been entirely in vain.

According to a new study from experts at the University of California, Berkeley, the Google-owned platform has succeeded in reducing conspiracy recommendations.

The study analyzed more than 8 million video recommendations over the past 15 months and used an algorithm that rated each video – reading its title, description and comments – on a scale of 0 to 1 for the likelihood that it peddled conspiracies.

To help account for errors, they only counted videos that scored 0.5 or higher on that scale.
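The study’s actual classifier is not described in this article, so the snippet below is only an illustrative sketch of the general approach: a simple supervised text model (the scikit-learn pipeline and the training examples are assumptions, not the researchers’ method) that scores a video’s title, description and comments between 0 and 1 and keeps only videos at or above the 0.5 cutoff.

```python
# Illustrative sketch only: the study's real model is not detailed here.
# Assumes a hand-labelled training set and a basic text-classification pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = conspiratorial, 0 = not conspiratorial
train_texts = [
    "the earth is flat and NASA hides the truth",
    "how to bake sourdough bread at home",
]
train_labels = [1, 0]

# TF-IDF features fed into a logistic regression classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

def conspiracy_score(title: str, description: str, comments: str) -> float:
    """Return a 0-to-1 likelihood that a video is conspiratorial."""
    text = " ".join([title, description, comments])
    return float(model.predict_proba([text])[0][1])

def flag_videos(videos, threshold=0.5):
    """Keep only videos scoring at or above the threshold, to limit false positives."""
    return [
        v for v in videos
        if conspiracy_score(v["title"], v["description"], v["comments"]) >= threshold
    ]
```

In a setup like this, raising the cutoff trades recall for precision, which is one way to account for classification errors when counting how often conspiratorial videos are recommended.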

 YouTube has made some progress in its effort to clamp down on conspiracy videos on its platform, according to a new study by researchers from the University of California, Berkeley 

Researchers found YouTube’s claim that it had lowered conspiracy video recommendations by 50 percent in June last year to be ‘mostly consistent’ with their own analysis, though that number is constantly changing.

The study says conspiracy videos are now about 40 percent less common than when the crackdown began, having rebounded from their low point in June.

In 2018, the number of conspiracy videos being recommended peaked, with about 10 percent of all suggested content being conspiratorial in nature. 

Researchers say that the platform’s efforts have varied depending on category.

For instance, they say it has all but wiped out some categories of conspiracy videos, including flat earth theories and videos claiming that the US government orchestrated the September 11 terror attacks in New York City. 

Other varieties of video have been allowed to remain on the platform, however, including those alleging the Great Pyramids were built by aliens and others denying human-driven climate change.

According to researchers, this disparity could be more a matter of YouTube’s priorities, with the company attempting to hash out what qualifies as a threat to its users and the public and what should remain.

Though the efforts mark significant progress, researchers are quick to note that they don’t necessarily signify a victory over conspiracies on the platform. 

In fact, they say that the threat of radicalization on YouTube remains as present as ever.

YouTube has come under fire for its role in radicalizing some users. Experts say its personalized recommendation algorithm is key to the problem (stock)

‘The overall reduction of conspiratorial recommendations is an encouraging trend. Nonetheless, this reduction does not make the problem of radicalization on YouTube obsolete nor fictional, as some have claimed,’ they write.

Though the study was among the most comprehensive analyses of conspiracy videos on YouTube, researchers were not able to draw a link between the decline and the likelihood that someone might be radicalized by watching the content.

As noted by the New York Times, researchers were only able to assess videos recommended on YouTube without being logged in, meaning that personalized recommendations, which the majority of its users interact with, remain an unknown.

Until a study formulates a way to account for the personalized videos of an individual user, researchers say it will be impossible to judge the actual impacts that conspiracy videos have on radicalization.