New videos on YouTube reach viewers in two main ways: if someone is subscribed to a creator, that creator's new videos appear in the user's subscription box. Otherwise, YouTube relies on a recommendation algorithm to surface new content.
This algorithm has been reworked and tweaked a number of times in the past few years, to the satisfaction and frustration of creators and viewers alike. The most comprehensive change came a few years ago, when YouTube updated its algorithm to weigh signals such as likes, dislikes and watch time before recommending videos to others. Later, after complaints that people's recommendations were too similar, it was changed again to pull recommendations from "a wider set of topics." Now there's another drastic change - YouTube's algorithm is set to reduce the spread of conspiracy-related content recommended to viewers.
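To make the idea concrete, here is a minimal sketch of what an engagement-weighted ranking could look like. This is purely illustrative - YouTube's actual system is proprietary and far more complex, and the signals, weights and smoothing below are my own assumptions:

```python
# Hypothetical sketch only: YouTube's real ranking system is proprietary
# and far more complex. The signals, weights and smoothing here are
# illustrative assumptions.

def engagement_score(watch_time_ratio: float, likes: int, dislikes: int) -> float:
    """Blend average watch-time ratio (0-1) with like/dislike feedback."""
    # Laplace-smoothed approval so videos with very few votes
    # aren't over- or under-rewarded.
    approval = (likes + 1) / (likes + dislikes + 2)
    return 0.7 * watch_time_ratio + 0.3 * approval

def recommend(candidates: list[dict], top_n: int = 10) -> list[dict]:
    """Rank candidate videos by engagement score, highest first."""
    ranked = sorted(
        candidates,
        key=lambda v: engagement_score(
            v["watch_time_ratio"], v["likes"], v["dislikes"]
        ),
        reverse=True,
    )
    return ranked[:top_n]
```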
In a recent blog post, YouTube described this change as reducing the spread of harmful misinformation and "borderline content," referring to "content that comes close to — but doesn't quite cross the line of — violating their Community Guidelines." They didn't provide a clear definition of what constitutes such harmful misinformation, but they gave a few loose examples, such as "videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11."
At first, this change comes off as a little biased, as if YouTube is simply censoring videos they don't agree with. However, considering the platform's history of promoting extremist ideologies and conspiracist content over mainstream news and factual information, there is a slight chance it could be a good thing.
First, with this algorithm tweak, videos won't simply be deleted - which I would absolutely object to as censorship. Videos with such misinformation and "borderline" content will stay up; they just won't show up in others' recommendations, meaning users can still find and watch them by searching for them on the site or by subscribing to the channels that post them. Plus, the allegations that YouTube hosts and amplifies not only extremist political ideologies but also misinformation about real-life events are legitimate concerns that need to be addressed.
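In other words, the change targets distribution, not hosting. A toy sketch of that distinction, using a hypothetical data model (YouTube has published nothing about its internals), might look like this:

```python
# Hypothetical sketch of the policy described above: "borderline" videos
# stay hosted but are dropped from the recommendation pool. The flag and
# data model are assumptions, not YouTube's actual implementation.

def recommendation_pool(videos: list[dict]) -> list[dict]:
    """Borderline videos are excluded from recommendations only."""
    return [v for v in videos if not v.get("borderline", False)]

def search_results(videos: list[dict], query: str) -> list[dict]:
    """Search still surfaces everything - nothing is deleted."""
    return [v for v in videos if query.lower() in v["title"].lower()]

def subscription_feed(videos: list[dict], subscribed: set[str]) -> list[dict]:
    """Subscribers still see new uploads from channels they follow."""
    return [v for v in videos if v["channel"] in subscribed]
```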
Just this month, The Washington Post found that a YouTube search for "RBG," Supreme Court Justice Ruth Bader Ginsburg's initials, produced few results from reliable news sources and instead surfaced videos from the alt-right, some of which falsely claimed that liberal doctors were keeping her alive with illegal drugs. A separate investigation by BuzzFeed News found that when users searched for news-related content, YouTube would consistently recommend highly speculative conspiracy videos and content from hate groups instead of mainstream news. Conspiracies go viral even during breaking news events: right after the high school shooting in Parkland, Florida, a video claiming one of the survivors was a "crisis actor" topped YouTube's trending page. Similarly, after the 2017 Las Vegas shooting that left 58 dead, the highest-trending videos on YouTube were ones claiming the attack was a complete hoax.
This promotion of extremist conspiracies and ideologies over news content is deeply troubling, especially in a political climate already saturated with "fake news" and biased reporting. So a new algorithm that stops recommending dangerous misinformation and extremist ideas, and instead bolsters recommendations for factual news content, could be a good thing.
However, I'm not getting my hopes up yet. YouTube has seen some success in reworking its algorithm, like when they changed the system to filter out clickbait. But they've also badly misstepped, like when people discovered that the newly updated algorithm was restricting and demonetizing LGBT+ content. So we'll just have to wait and see how this change affects the platform and its community - and in the meantime, we should be cautious about which recommended videos we click on.