YouTube has revealed plans to combat conspiracy video misinformation by displaying blurbs from Wikipedia articles next to its video player. This is news to Wikipedia, which has gone on record saying it has never entered an official discussion or partnership with the platform.
The Google-owned video platform has come under repeated criticism for its recommendation algorithm promoting false information such as conspiracy videos, particularly in the wake of tragedies like the mass shootings seen throughout the United States.
In an effort to stem the spread of misinformation, YouTube CEO Susan Wojcicki revealed in a conversation with Wired that the platform will integrate text boxes she called “information cues,” drawing on third-party websites to display accurate information around the video player. This won’t be limited to conspiracy videos, either, as these cues will also appear on topics and events that have sparked significant debate.
The Wikimedia Foundation stated that it wasn’t aware of these plans and has not entered a formal partnership with YouTube, as the platform suggests. This has prompted concerns that YouTube would be exploiting volunteers as free labour, given that Wikipedia already relies on thousands of volunteer editors to monitor and eventually thwart “conspiracies, pseudo-science, fringe theories, and more.”
Katherine Maher, the Wikimedia Foundation’s executive director, highlights that Wikipedia already lacks the “support that is critical to our sustainability,” which makes YouTube’s move seem all the more exploitative. Maher also calls into question the reliability of Wikipedia’s information, noting that the platform doesn’t “want you to blindly trust us.”
Wikipedia’s whole point is to let users easily check citations, edit inaccurate information and generally contribute to the platform, but this cuts both ways. Maher notes that it often makes the online encyclopaedia “faster than a search engine in offering up-to-date information on world events,” but this includes “celebrity death hoaxes” and plenty of other information that goes unregulated and unchecked.
From YouTube’s wording, Wikipedia might not be the only site in its sights, but the company has yet to name any other platform.
KitGuru Says: Combating false information with unregulated, potentially false information is quite self-defeating. I also don’t think it shines a good light on the Google-owned platform to forgo an official partnership with, and support for, those it is relying on to combat its own problem. What do you think about seeing Wikipedia articles beside YouTube videos?