It appears the #YouTubeWakeUp movement, which called out pedophiles exploiting the online platform, remains a thorn in YouTube’s side. Once again, the video-sharing hub has been forced to take additional steps to keep children from being targeted by its underbelly of sexual predators; those steps now include banning young kids from live streaming without adult supervision, according to a recent blog post from administrators.
For months, YouTube has struggled to monitor and moderate content on its platform involving “minors in risky situations.” The phrase refers to the “soft-core pedophile ring” discovered by Matt Watson, which consisted of millions of user comments across hundreds of videos. These videos featured unwittingly exploited children who served as a form of “gateway drug” content, which this community of pedophiles used to direct one another toward more insidious material. YouTube’s suggestion algorithm appears to have amplified these efforts.
Watson’s original report demonstrated how the YouTube algorithm could lead someone from simply searching bikini hauls with consenting adults to falling down a “wormhole” of disturbing videos featuring these objectified children. This is a fundamental concern considering that the Pew Research Center found 81% of American parents let their children use YouTube, not to be confused with the more restrictive YouTube Kids platform known for curated content such as Peppa Pig and Dora the Explorer, which at least gives the perception of a family-friendly YouTube.
YouTube’s largest advertisers, such as Disney, McDonald’s, Nestlé and “Fortnite” developer Epic Games, weren’t buying the look. They soon began withholding marketing funds from the platform in light of the scandal, forcing executives to update their policies. The changes were announced in a YouTube memo published in February, which outlined plans to remove the ability to leave comments on nearly all videos featuring kids, restricting the ways children’s content could spread inappropriately.
Come June 3, however, the NYT found YouTube’s recommendation system was still suggesting videos of “prepubescent, partially clothed children” to users who had previously watched sexually themed content. It was the same shit, different targets, and all YouTube could say was that it had disabled 400 channels and removed millions of comments. The fundamental problem persisted. Those targets sadly include YouTube creators with “family friendly content” caught in the crossfire of the site’s evolving algorithms.
One of those creators is Stephen Sharer, whose family-friendly channel focuses on innocent challenges featuring silly string and pools. He told Newsweek about YouTube’s responsibilities: “I’ve always wanted to make content that my parents or grandparents could feel comfortable watching and get a positive message from it. Along with that came a young demographic. I want to make sure I provide a positive message and safe environment for everyone.”
“With YouTube being so new,” Sharer continues, “there were a lot of policies that were left up in the air. As time evolved, YouTube has taken precautions and is sometimes learning the hard way, sometimes after it happens. You don’t necessarily send your kid to a playground without watching them and if you do there’s always a risk of something happening. It’s the same concept with YouTube.”
Lawmakers, such as Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), are reportedly “demanding answers” from YouTube after the Harvard researchers cited by the NYT released their findings, even sending a letter to YouTube CEO Susan Wojcicki:
“The sexualization of children through YouTube’s recommendation engine represents the development of a dangerous new kind of illicit content meant to avoid law enforcement detection. Action is overdue,” the senators wrote.
This effort was supported by Sen. Josh Hawley (R-MO), an outspoken critic of big tech, who announced he would introduce legislation requiring sites like YouTube to leave videos of children out of its recommendation engine:
“Every parent in America should be appalled that YouTube is pushing videos of their children to pedophiles,” Hawley said in a statement to The Hill. “It’s equally outrageous that YouTube refuses to take the most effective step necessary to fix the issue.”
This broad-stroke approach may be necessary to undermine the pedophile racket, though it will no doubt deal a blow to family-friendly content creators. When it comes to live streaming, YouTube’s new policies are only enforceable once a stream has been manually reviewed, since the current artificial-intelligence classifiers can’t distinguish between fascist, anti-fascist and historical content, as we’ve reported before, let alone whether someone is actually of a mature age. YouTube claims its AI can “find and remove more of this content,” but I’ll believe that when I see it. Rooting out underground child abuse requires more effort than a monopoly enterprise can handle.
“YouTube appearing in the press again for predatory comments and practices of users of its platform is not a surprise,” one of the advertiser sources told Adweek. “When you’re dealing with a platform that generates 300-plus hours of video per minute, the realities of being able to check and verify the content become daunting. For anyone saying human vetting is the only way to go, be prepared for a vastly reduced YouTube with ‘waiting times’ and liberal arguments of censorship and free speech. If people are asking for machine learning to solve their problems, be prepared for issues like this to keep appearing. No amount of investment in people or technology will solve these issues for YouTube; it’s ingrained in the very DNA of the platform.”