As the 2024 US presidential election approaches, the role of content distribution platforms like YouTube is coming under scrutiny, as authorities, lawmakers, and regulators worry that moderation policies are simply not strict enough to filter out harmful content and deter extremists from disseminating it.

In recent years, YouTube has faced growing criticism for its role in the spread of extremism and hate speech.

Researchers have long argued that the platform’s powerful recommendation algorithm plays a key part in “radicalizing” users by leading them down “rabbit holes” of increasingly fringe and harmful content.

A study published in the journal Science Advances in 2023 provided empirical evidence on this question.

The researchers, led by Brendan Nyhan from Dartmouth College, found that the consumption of alternative and extremist content on YouTube is not primarily driven by the platform’s recommendations but rather by a small group of users who are already subscribed to these channels.

“The problem of potentially harmful content on YouTube is real,” says Nyhan. “The challenge is understanding the nature of the problems, so we can think about how best to address it.”

The study tracked the browsing behavior of 1,100 people, including individuals who had repeatedly engaged with this type of content.

The researchers observed that only a small percentage of participants visited these "alternative channels", which promoted topics such as men's rights activism, and that only 6% of the group viewed content associated with extremist views.


Moreover, the study found that the majority of people who were exposed to harmful content were already subscribed to these channels or to similar ones, meaning that they had opted in to consume it.

This suggests that YouTube's algorithm was not frequently recommending this kind of content to non-subscribers during the study period. To the extent that the sample is representative of the broader YouTube user base, these findings undercut the widespread media narrative about "rabbit hole" recommendations.

Nyhan notes that these results could be attributed to the changes YouTube made to its recommendation system in 2019. The company claimed at the time that those changes to the algorithm reduced watch time of "borderline content" by 50% and watch time of such content from non-subscribers by 70%.

A Small Group of Users Is Responsible for Most of the Viewing Time for These Videos

YouTube's algorithm may not be leading people to extremist channels, but the study did reveal a concerning pattern: people who scored high on hostile sexism and racial resentment were more inclined to seek out these videos and engage with the channels that post them.

The majority of those who view harmful content are already subscribed to extremist channels.

“What really stands out is the correlation between content subscribers’ prior levels of hostile sexism and more time spent watching videos from alternative and extremist channels,” says Nyhan. “We interpret that relationship as suggesting that people are seeking this content out.”

Meanwhile, Annie Chen, a co-author from the CUNY Institute of State and Local Governance, explains: “Although almost all participants use YouTube, videos from alternative and extremist channels are overwhelmingly watched by a small minority of participants with high levels of gender and racial resentment.”


There appears to be a small yet important segment of YouTube users who account for the majority of the viewing time for extremist content. The study shows that fewer than 2% of users were responsible for 80% of the viewing time for this type of content.

Even though small, this group still does harm on the platform, as the high engagement and viewing figures it generates lend these videos visibility and credibility.

In addition, the phenomenon highlights how YouTube is acting as a hosting and distribution platform for fringe communities, even if the recommendation algorithm is not the initial driver of viewership.

Much of the Harmful YouTube Content Is Embedded in External Websites

YouTube's content guidelines already screen certain types of content.

Although the study allays, to some extent, concerns about YouTube leading people down an extremism "rabbit hole", the researchers cautioned that they were not exonerating the platform altogether.

“YouTube’s algorithms may not be recommending alternative and extremist content to nonsubscribers very often, but they are nonetheless hosting it for free and funneling it to subscribers in ways that are of great concern,” Nyhan says.

The platform is still being used as a channel to disseminate harmful content, as creators are allowed to publish these materials and grow their audiences. They may even profit if they manage to monetize the content.


As Christo Wilson, a co-author from Northeastern University, explains: “If you’re already a political partisan and you’re going to websites with a particular leaning, that’s then leading you to YouTube channels and videos with the same kind of lean. If you started in a place where you’re being exposed to bad stuff, you end up in a place where you’re being exposed to more bad stuff.”

Wilson’s previous research has shown that much of the extremist and alternative content consumption on YouTube actually occurs off the platform as users find these videos embedded on other websites that cater to fringe political views.

His findings show that, although the algorithm is not promoting the content directly, the platform gives extremists a powerful tool to reach their audiences: high-quality videos hosted on a robust distribution infrastructure that can be embedded anywhere.

YouTube Must Implement Stricter Measures to Stop the Proliferation of This Content

The researchers agree that YouTube must enforce its content moderation policies more rigorously and take a more aggressive approach to significantly reduce the presence of extremist content on its platform.

Although the company has already made changes to its algorithm, the fact that this type of content still circulates widely among viewers who already agree with the harmful views these channels promote underscores the need for stricter policies and measures.

“They are aiding and abetting these fringe communities out there on the web by hosting videos for them,” says Wilson. “If they had stronger content-moderation policies, that would definitely help address this.”

Nyhan agrees, noting that YouTube should scrutinize channels that are frequently embedded on websites linked to the spread of misinformation and fake news. “If they see a particular channel being embedded in a website that is a known purveyor of misinformation, that channel should probably be scrutinized,” he stressed.

The platform already demonetizes and removes most of these channels and their content, but it still enables extremists by allowing them to host videos and grow large communities of like-minded users.

YouTube's response to the proliferation of extremist content will become critical during the upcoming U.S. presidential election. Both parties have engaged in heated debates and ideological clashes that have driven polarization in the country to high levels.

Although this study reveals that YouTube's algorithm does not directly promote extremism, the platform still helps it thrive indirectly, as channels can disseminate these ideas to their viewers through its robust content distribution infrastructure.

As Nyhan emphasizes: “The problem of potentially harmful content on YouTube is real. The challenge is understanding the nature of the problems, so we can think about how best to address it.”