Pornhub’s Video Purge May Be a Preview of a Post-Section 230 Internet

In 2020, no law is so fundamentally misunderstood as Section 230 of the Communications Decency Act, the controversial big tech provision being sniped at from all political sides of Capitol Hill. Passed in 1996 at the dawn of the internet, the law protects any “interactive computer service” from being treated as the publisher or speaker of third-party content — such as when users post defamatory material, advertise sex work, or otherwise break the law. (The shield has carve-outs: it does not cover intellectual property claims or federal criminal prosecutions.)

To put it simply, the law tries to hold individual criminals responsible for their own crimes, shielding establishments and their owners from being targeted unfairly. What does this mean? If a popular highway were used as the getaway route for a robbery — an illegal act that merely abused an opportune thruway — should the highway’s owners and the robbers be treated as legally inseparable? Even if the owners were unwitting accomplices whose road just happened to be key to the crime’s success? What if they profited from the robbers’ toll booth fees? And who is expected to repay the debt — the ones holding the guns, or whichever state or company involved has the biggest wallet?

If the highways of physical transport were held to this legal standard, turning all travelers into their operators’ effective representatives, would it be any surprise if those operators exercised extreme caution — warrantless cargo searches and identity checks, as though your property were their own? Such measures would run against the principles of the Fourth Amendment, robbing citizens of their right to privacy during free movement. On the highways of digital information, a similar ethical dilemma is arising, albeit one murkier in its applications. And the politicians on Pennsylvania Avenue have little interest in discussing the necessary nuance.

On Monday, Pornhub restricted 8.8 million videos after changing its policies to ban unverified users from uploading content to the platform. The world’s most popular erotic website opted for the nuclear option, purging overnight nearly 65% of its video library — everything not uploaded by official content partners or members of its model program. “This means every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat, and Twitter have yet to institute,” the company said in its announcement.
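As back-of-the-envelope arithmetic, those two figures — 8.8 million videos purged, “nearly 65%” of the library — imply a pre-purge catalog of roughly 13.5 million videos, with about 4.7 million verified videos surviving the cut. A quick sketch (the library estimate is derived from the article’s own numbers, not an official Pornhub statistic):

```python
# Rough sanity check of the purge figures reported above.
purged_videos = 8_800_000   # videos restricted on Monday
purged_share = 0.65         # "nearly 65%" of the library

# Estimated total library before the purge, and what remains verified.
estimated_library = purged_videos / purged_share
remaining_verified = estimated_library - purged_videos

print(f"Estimated pre-purge library: ~{estimated_library / 1e6:.1f}M videos")
print(f"Remaining verified content:  ~{remaining_verified / 1e6:.1f}M videos")
```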

This follows a report from the Internet Watch Foundation, which says it found 118 instances of child sexual abuse material on the site across three years — which the platform said was “less than one percent” of content uploaded.

This statistic should be taken with a pinch of salt given the nature of the site. Pornhub’s response was to cite Facebook’s own damning transparency report, which found 84 million instances of child sexual abuse material. It raises the question of how a pornographic site with nowhere near the staff and resources of Facebook could host so few instances of illegal material. Are the users of Pornhub just that perfect, or were the searches conducted on the platform not as thorough as that IWF figure would have us believe?

The IWF states its reports include only a “human-eyes assessment” of “confirmed” cases of child abuse, suggesting there could be an unknown number of unconfirmed, hidden, obscured, or debatable instances of such illicit material. Pornhub has not produced its own transparency report on illegal content, opting instead to purge the website of anything uploaded by even semi-anonymous users. As it stands, the exact number remains unconfirmed.

To be clear, Pornhub was not prosecuted under Section 230 — the law is a shield, not a sword. Rather, its actions seem at least partly motivated by the fallout from a recently published opinion piece by New York Times writer Nicholas Kristof, questioning how the website could continue to profit from videos of genuine exploitation, revenge porn, assault, and rape. Although Kristof engages in flawed, sweeping speculation about what constitutes sex work versus sex trafficking, numerous performers have spoken publicly about how the site continually allows pirated and non-consensual content to remain on the platform, profits from the performers’ work, refuses to take removal actions when pressured, or takes some action only for the same videos to be re-uploaded by different users. A culture of exploitation was at play, whether the cases numbered in the hundreds or the millions.

Child porn became the rotten cherry on top: both Mastercard and Visa dropped their support for the platform in a boycott. Though neither of these instances involves 230 directly, the story does offer a look at the potential fallout of the provision’s demise. Instead of walking the ethical tightrope of prosecuting sex trafficking while protecting sex work, President Trump and President-elect Joe Biden have both broadly called for reforms that would harm the sex industry across the globe, while Sen. Lindsey Graham (R-SC) and Rep. Tulsi Gabbard (D-HI) have gone further, seeking to outright repeal the tech liability shield — which, in no hyperbolic terms, could send the entire internet economy into a panic spiral given the massive legal implications.

Currently, digital giants like Facebook, Twitter, and Google are free to write their own terms of service, which vary in their liberties and restrictions. Morality aside, if the companies themselves keep their noses clean, government prosecution of offenders remains relatively stable and the market is relatively free to self-regulate, however unjust its practices may be. To stop sexual exploitation, the vile scourge of the internet, we deserve a better class of opposition. If users went from third parties to, effectively, first-party representatives of a platform, we would likely see Pornhub’s impulsive response repeated time and again, resulting in ever-stricter self-regulation policies enforced by admins who lack the skills, expertise, resources, time, and manpower to govern digital speech, let alone harmful content. In this de facto climate of prohibition, nuclear options like the one undertaken by Pornhub may become commonplace as sites try to avoid legal reprisal.

Pornhub has announced that these videos will remain removed pending verification and review, with its new verification process set to begin sometime in the new year. This places its once-active creators — those posting consensual, legal, ethical work under non-verified accounts — in a worrying position. According to Vice, Pornhub creators are unable to receive their payouts through the two biggest credit card companies, leaving workers and activists to believe this is a “dangerous, discriminatory decision — one fueled by anti-porn campaigners and conservative activist groups” who want to force the workforce out.

And given that workers can’t upload videos or receive income during the pandemic economy, who can blame them for moving elsewhere? Like any attempt to repeal and replace 230, the purge merely offloads the costs of restriction onto the site’s most vulnerable users rather than the site itself, and doesn’t address how to shut down exploitative productions, how to get victims out of these horrific situations, or how the site will compensate them, if at all.

“If you wonder what the internet would be like without Section 230, Pornhub’s response to losing its payment processors offers a pretty good preview. ‘Verified’ content only; everything else disappears,” tweeted Platformer’s Casey Newton.

In a statement published Friday, Sex Workers Outreach Project Behind Bars wrote that the decision will force more sex workers into the margins, calling it a “war” on sex workers. “We say ‘war against sex workers’ because the damage they do does not impact the labor as much as it affects the laborers who depend on the Pornhub platform to earn a living,” it wrote. “[…] Violence against sex workers includes the societal and institutional violence that has led to the shuttering of our online platforms that give us a measure of safety and allow us the critical resource that is the ability to access banking.”

Pornhub’s announcement backed up this sentiment, speculating that the selective scrutiny it faces — compared with the lack of action against a social media monopoly like Facebook — stems from bias against the erotic content genre rather than from its status, or from how a boycott could hurt the processors financially. And there is some merit to these accusations: as Vice discovered, one of the major online campaigns calling for action against Pornhub is Traffickinghub, which claims to oppose the trafficking of women. It just so happens to be propped up by another group called Exodus Cry, a Christian fundamentalist organization that aims to abolish porn and opposes decriminalizing sex work.

“It is clear that Pornhub is being targeted not because of our policies and how we compare to our peers, but because we are an adult content platform,” the announcement stated. “The two groups that have spearheaded the campaign against our company are the National Center on Sexual Exploitation (formerly known as Morality in Media) and Exodus Cry/TraffickingHub. These are organizations dedicated to abolishing pornography, banning material they claim is obscene, and shutting down commercial sex work. These are the same forces that have spent 50 years demonizing Playboy, the National Endowment for the Arts, sex education, LGBTQ rights, women’s rights, and even the American Library Association. Today, it happens to be Pornhub.”

And it may be Pornhub that establishes the new standard for Congress. Axios journalists Scott Rosenberg, Ina Fried, and Ashley Gold see the new restrictions as a “harbinger of how the web might change if [Congress], as it has threatened, removes a key liability protection for online platforms.” They write: “Every major online platform — Facebook, YouTube, Twitter, TikTok, and beyond — is built on a foundation of the material posted by the public… Before tinkering with Section 230 again, lawmakers should look at SESTA/FOSTA’s record of effectiveness as well as the collateral damage it inflicted. As the Pornhub example shows, public and media pressure might change sites’ behavior faster than legislation.”

We’re essentially presented, then, with a choice of poisons. As I reported last year, SESTA/FOSTA were two pieces of legislation that “took a contradictory sledgehammer” to the issue of online sex trafficking, making the entire sex-work industry more dangerous “as a means of preserving America’s freedoms” while offering little concrete explanation of how they would help victims. Part of the issue was that the bills would completely strip 230 protections from websites that “promote or facilitate prostitution,” whose operators would face “10 to 25 years in a federal prison” per offense. The problems were immense: the bills offered no definition of “promote or facilitate,” gave Congress an uncomfortable amount of leeway to deem any form of filmed sex work — consenting or otherwise — illegal, and offered little insight into how the law would affect prostitutes and clients in countries where sex work is legal.

This was different from other reformist bipartisan Senate bills that would rightly allow victims of rape or sex trafficking to sue porn sites that profit from their images and videos. With a flawed repeal-and-replace strategy, Congress would effectively force Pornhub and other sites into a scenario similar to Tumblr’s, which was removed from necessary outlets like the Apple App Store until it outright banned pornography of any kind. As a society, we’re stuck between granting these powers to mega-corporations like Apple, Visa, and Mastercard or to government authorities who don’t understand the technologies they seek to regulate — whether the case concerns online speech, abuse, commerce, or simple removals.

“I don’t think it’s going to come anywhere close to fixing the whole problem,” argued Scott Berkowitz, CEO of the Rape, Abuse & Incest National Network, speaking with Axios. “For example, verifying posters is an important step, but doesn’t go far enough to ensure that everyone depicted is a willing, consensual adult. There’s been a staggering increase in the amount of child sexual abuse material that’s available in addition to the posting of revenge porn and other videos that are posted without the consent of all participants.”

As such, Pornhub will not only have to increase the resources it puts toward moderation, but also try to distinguish between consenting adults enacting all sorts of sex scenes and “the minefield that is user-generated content.” Berkowitz argues it’s almost impossible to know whether those depicted consented to the acts shown, whether they consented to broad distribution of the video, and whether everyone was of the age of consent.

“The key question is, is [Pornhub] going to implement these changes fully?” added Yiota Souras, senior vice president and general counsel for the National Center for Missing and Exploited Children. “On paper, it’s great. But there must be investment and follow-through.” 

Given Pornhub’s track record, it’s very unlikely to invest anywhere other than puffing up its own pockets and peckers, forcing those evils to take the reins as we face that impossible unknown. As far as I’m concerned, brace for impact. A new era of tech regulation, whether private or public, is just getting started.
