Over the course of its nearly 13-year life, YouTube has refined its system to provide users with an optimal viewing experience alongside ways to discover more content catered to their preferences. Sometimes, however, things don’t go to plan. YouTube’s recommended video algorithm has unfortunately been leading viewers astray with controversial and sometimes extreme content.
YouTube’s algorithm to determine suggested videos has been described as one of the “largest scale and most sophisticated industrial recommendation systems in existence,” and it needs to be, given the sizeable 1.5 billion user base that Google’s platform holds.
While this often gets plenty of accurate hits, leading to many happy users and contributing to approximately 70 percent of YouTube’s entire viewership, it can sometimes steer viewers towards more controversial channels featuring extreme viewpoints or conspiracy theories – or, in the latest public misstep, see YouTube notifying everyone of Logan Paul’s condemned comeback video.
This isn’t quite as bad as the provocative content pushed by YouTube’s algorithm last October, when trending videos claimed the Las Vegas shooting was a government conspiracy – a problem which inevitably prompted YouTube to change its algorithm considerably and take a more manual approach. But it is still intrusive for those that have never searched for or watched any similar content.
Are you kidding me @YouTube!? I’m having a hard time receiving accurate notifications from my actual subscribed channels and this crap pops up!? I’ve never once clicked a video of his… at least try to hide the favoritism… pic.twitter.com/oXtIanLSoU
— Rachel Monday (@rachlm12) February 5, 2018
YouTube went on to apologise to all who received an intrusive notification of Logan Paul’s new video and assured disgruntled users that only his subscribers would receive them in future, but his content continues to pop up in the recommended videos of those that have never touched anything similar to his.
For those not in the know, Paul’s comeback video centres around him bragging that his subscriber count increased significantly despite his three-week break in the wake of his earlier controversial video concerning suicide. This has prompted plenty of backlash from the media, which Paul anticipated, stating that people can “crucify me, vilify me, and I can promise you one thing, guys – I’m not going anywhere.”
It seems that the platform has a long way to go in refining its algorithm, which appears to prioritise trending videos over content that’s actually suited to the individual viewer. “YouTube returns fewer videos from highly partisan channels,” notes the Wall Street Journal, but searches still surface conspiracy videos where they shouldn’t.
KitGuru Says: I’ve found many videos via my recommendations and different results from my searches that I’ve surprisingly enjoyed, while not seeing many conspiracy videos at all. Perhaps I’m in the minority. How has your experience with YouTube’s recommended videos been? Any strange results?