Picture two of the safest statements most people can generally agree with:
- “Lena Dunham is trash”
- “Policing the internet does more harm than good”
Last week, Susan Wojcicki, the chief executive of YouTube, which is owned by Google, announced she isn’t planning to sic Ms. Dunham’s version of feminism upon the world.
No, that would be far too innocent and laughable.
Instead, Wojcicki gave an interview and penned an article in The Daily Telegraph unveiling a major company decision to “root out violent extremism and content” on the platform, attempting to justify it by saying this loosely defined extreme content “endangers children.”
In a drastic deviation from YouTube’s previous conduct, which relied on its own algorithm to screen for inappropriate content, Wojcicki announced in the interview:
“We will continue the growth of our teams, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”
In her article she writes:
“As the CEO of YouTube, I’ve seen how our open platform has been a force for creativity, learning and access to information. I’ve seen how activists have used it to advocate for social change, mobilize protests, and document war crimes. I’ve seen how it serves as both an entertainment destination and a video library for the world. I’ve seen how it has expanded economic opportunity, allowing small businesses to market and sell their goods across borders. And I’ve seen how it has helped enlighten my children, giving them a bigger, broader understanding of our world and the billions who inhabit it.
But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.”
The publication, linking to another Daily Telegraph article, uses the June attack on London Bridge to draw a connection between radicalization and YouTube, as justification for strict oversight and the removal of “hateful speech.”
In the run-up to her early election against Labour leader Jeremy Corbyn, Prime Minister Theresa May made an address pressuring internet companies to change their policies, blaming them for the murders for “creating a safe space for extremism” and calling for international agreements to “regulate cyberspace.”
There are fair points to how both May and Wojcicki have tackled jihadist extremism. The Independent details how Manchester bomber Salman Abedi learned to make an explosive device, allegedly drawing on both YouTube and harmful jihadist/manufacturing sites listed on the dark web.
Wojcicki writes that YouTube’s enforcement teams have reviewed two million videos for violent extremist content since June, resulting in well over 150,000 jihadist videos being taken down. This is a genuinely trying task that must be approached with caution… but where does the line get drawn when it comes to who is “an extremist, a misleader, or a bad actor”?
On TrigTent, we’ve reported on the shady technical practices of Twitter, another social media platform, which has a much broader definition of “extremist”: one that includes jihadists; alt-right agitators such as The National Policy Institute’s Richard Spencer, known for espousing white supremacist views; and center-right critics of politics and religion like Tommy Robinson.
This has resulted in the removal of blue verification checkmarks, turning that feature into an endorsement more than an identity protection, as well as the possible monitoring of users’ personal lives.
We’ve also discussed the trouble regarding YouTube’s suppression of war-crimes footage that could have been used as evidence in court.
“It’s something that keeps me awake at night,” Julian Nicholls, a senior trial lawyer at the International Criminal Court who prosecutes war criminals, told The Intercept. “The idea that there’s a video or photo out there that I could use, but before we identify it or preserve it, it disappears.”
This puts YouTube in a tight situation, caught between sticking by its creators, reserving enforcement for obvious hazards like jihadist content, and letting the limit climb as high as Big Brother can reach.
Ms. Wojcicki stresses in her article that YouTube remains a “force for creativity and learning,” but her leanings become clear when she details how “bad actors” are creating “problematic content.”
“In the last year, we took action to protect our community against violent or extremist content… Now, we are applying the lessons we’ve learned in order to tackle other problematic content. Our goal is to stay one step ahead, making it harder for policy-violating content to surface or remain on YouTube.”
Unlike a constitution, which provides a secure foundation for the people’s rights, liberties, and entitlements, what violates a policy can be ever-changing. This is evidenced by Twitter, which went from its 2009 stance as the “free speech wing of the free speech party” to censorship, removals, and endorsements. It’s the liberty conundrum as explained by Benjamin Franklin:
“Those who would give up Essential Liberty, to purchase a little Temporary safety, deserve neither Liberty nor Safety.”