YouTube and Censorship
by Caroline Hurley
YouTube has become a hot topic as the company came under fire for its controversial policies on hate speech and some unintended consequences of its new algorithms. What exactly is going on, and who is responsible for it? Well, there are a lot of different issues pooling together, so let's break them down.
YouTube’s Confusing Reaction to Homophobic and Racist Content
For several years, a comedian named Steven Crowder has been posting videos that contain homophobic and racist insults directed at Carlos Maza, a gay Mexican journalist at Vox. When Maza complained, YouTube initially refused to take down Crowder’s content, finding that it did not violate the platform’s hate speech policy.
After backlash on social media, YouTube retracted its decision and demonetized Crowder’s channel. The company’s reasoning, however, was unclear. Some of its statements indicated that the action against Crowder was a result of his offensive merchandise (e.g., T-shirts claiming “Socialism is for F**s”) rather than his video content.
Overall, many, including Maza, were left unsatisfied with the handling of this issue, as Crowder is still making money from merchandise and reaching a wide audience with hateful speech.
Meanwhile, conservatives blamed Maza and Vox for what they saw as a targeted war on free speech. Fox News called Maza “a fascist posing as a victim” and Ted Cruz tweeted his opposition to YouTube “playing God”.
YouTube’s Algorithm Helps Pedophiles Find Videos
YouTube garnered more negative news attention this week when a particularly concerning pattern was exposed: if a viewer is already watching erotic content, the website’s recommendation algorithm suggests videos of young girls. The progression from pornographic to pedophilic content starts with videos of girls dressed in childlike clothing and leads all the way to home videos of entirely unsuspecting toddlers.
YouTube’s hidden pedophile community and its relationship to content featuring children has been up for debate for a while. Earlier this year, YouTube reacted to backlash against disturbing comments by taking down some videos and limiting advertisements and comments on others.
After this week’s added concerns, Republican senator Josh Hawley announced that he will be proposing a bill banning the recommendation of videos of minors.
YouTube’s Algorithm Is Beneficial to Extremist Alt-Right Channels
When consumers complained of clickbait in the form of misleading titles and thumbnail photos, YouTube changed the focus of its algorithm from number of views to watch time. The policy change had several effects:
It prioritized creators who make videos that engage the viewer for a longer time. Sounds good, right?
It allowed more time for YouTube to display ads. Okay, everyone hates ads, but a reasonable decision for a company to make.
It benefited the alt-right and neo-nazi community of YouTube. Yikes.
As it turns out, alt-right YouTubers were the perfect beneficiaries of this update. Their videos are commonly in elongated “rant” form, and their views are so radical that they can use dramatic titles without exaggerating.
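The incentive shift described above can be illustrated with a toy ranking sketch. Everything here is hypothetical (the video data, the scoring functions, and the metric names are invented for illustration, not drawn from YouTube’s actual system), but it shows why a long video that holds attention can beat a clickbait video with far more clicks once the ranking metric changes:

```python
# Hypothetical example: how switching the ranking metric from raw views
# to total watch time changes which video comes out on top.
videos = [
    {"title": "Clickbait short", "views": 1_000_000, "avg_minutes_watched": 0.5},
    {"title": "Hour-long rant", "views": 200_000, "avg_minutes_watched": 40.0},
]

def rank_by_views(vs):
    # Old metric: raw click/view count.
    return sorted(vs, key=lambda v: v["views"], reverse=True)

def rank_by_watch_time(vs):
    # New metric: total watch time = views * average minutes watched per view.
    return sorted(vs, key=lambda v: v["views"] * v["avg_minutes_watched"], reverse=True)

print(rank_by_views(videos)[0]["title"])       # the clickbait video wins on views
print(rank_by_watch_time(videos)[0]["title"])  # the hour-long rant wins on watch time
```

Under the view-count metric the clickbait video wins (1,000,000 views vs 200,000), but under watch time the long rant wins (8,000,000 total minutes vs 500,000), which is the dynamic the article describes.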
And it’s not just a theoretical danger. The New York Times published an article recounting one young man’s journey down the alt-right rabbit hole, exposing just how real of a danger it is to give white supremacists and other violent radicals this platform.
What’s Next?
As the week concludes, it’s a lot for YouTube’s leadership to consider. Does YouTube’s system not only allow, but reward, content that is bigoted? Could the consumption of this content lead to more pedophilic behavior or hate crimes? Is it YouTube’s fault if it does? How do you pick and choose which videos and comments are acceptable and which are not? Can YouTube’s liberal brand stand to offend so many of its viewers with inaction?
The question it all boils down to is one that only the company can decide: Does YouTube have a greater responsibility toward the safety of its viewers or complete and total freedom of speech?