YouTube has been taking heat for “not doing enough” to stop or limit so-called conspiracy videos on its platform. While the Google-owned company says it is making strides in this area, critics argue its efforts fall short.
Some of the most common targets for criticism are conspiracy theory videos that promote junk science, such as claims about a flat earth, “false flag” school shootings, or “government chemtrails” poisoning people. Even though these videos are routinely debunked by experts on the topics they cover, they have a very dedicated following, especially online.
Purveyors of these kinds of conspiracies have become quite adept at using social media like YouTube, Facebook and Twitter to spread their messages to willing listeners.
So, how can these companies combat the conspiracies without turning their platforms into draconian places where every video is scrutinized before being uploaded? That’s a question YouTube CEO Susan Wojcicki is trying to answer. At a recent conference, Wojcicki said one idea was to include Wikipedia links that point viewers to accurate information about the topics these conspiracies address.
Critics quickly pointed out the flaw in that plan: Wikipedia has a credibility loophole of its own. Just about anyone can edit a Wikipedia page to say whatever they please, and moderators cannot possibly keep up. Wikipedia is trying to address this, mainly by restricting editorial access to pages on certain topics… but, of course, that strategy is far from foolproof.
So, YouTube had to go back to the drawing board. The company said it would soon offer links from other verified sources in addition to Wikipedia. When critics and the media asked which sources, YouTube spokespeople stayed mum.
At this point, there really isn’t a concrete, workable, fully-effective strategy for cutting down on nonsense online. The fact of the matter is, if you open it up to everyone to post what they want, you will get the full spectrum of the human experience, with all the good, bad, ugly and unbelievable.
That means, essentially, it’s a game of digital whack-a-mole. When an out-of-bounds video gets popular, moderators can pull it, flag it or shut down the account. However, that’s still like using an eyedropper to drain Lake Superior. There are far too many videos for moderators to keep up with, and when one is taken down, its creator can simply open a new account and upload another.
In the end, while having less misinformation online would be a good thing for all users, especially young, impressionable kids who don’t yet have fully developed filters in place, the reality is that no workable solution exists yet. Until one does, YouTube will just have to manage the complaints and the malcontents as best it can.