“We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”
This was the appalling response Twitter (now called X) gave to a mother who reported child sexual abuse material (CSAM, a.k.a. “child pornography”) depicting her 13-year-old son.
The material had been posted to Twitter, gathering 167,000 views and 2,000 retweets. When it circulated at his high school, the boy became suicidal.
But when the teenager and his mother reached out to Twitter for help, providing proof that he was a minor and asking for the CSAM to be taken down, Twitter flatly refused. Instead, the company confirmed in writing that it reviewed the content but chose to leave it active. Chose to continue disseminating illegal child sexual abuse material. Chose to continue profiting from the unbearable trauma of a 13-year-old boy.
This is why the NCOSE Law Center is representing the boy and his mother in a lawsuit against Twitter.
One would think a case like this would be a shoo-in. After all, Twitter knowingly possessed and distributed child sexual abuse material, which is illegal under federal law.
But here’s the appalling truth: the NCOSE Law Center has pursued this case all the way up to the 9th Circuit Court of Appeals. And so far, the judges’ rulings have cleared Twitter of all claims.
?????
“How is this possible?!” one may well ask.
It’s possible because of a specific law, which has been repeatedly misinterpreted as granting blanket immunity to tech companies for participating in a vast assortment of crimes. The law is Section 230 of the Communications Decency Act (CDA 230). And it is the single greatest enabler of online sexual abuse and exploitation today.
Video: “Briefing to US Congress,” National Center on Sexual Exploitation, from NCOSE on Vimeo.
How CDA 230 Has Been Misinterpreted
In the early 1990s, something confusing was happening in the courts. Two interactive computer services were sued for defamatory comments posted by third parties. One provider, CompuServe, was not held liable, because it was not clear that it knew or should have known about the comments. The other provider, Prodigy, was held liable, because the court ruled that it should have known about the comments. Why did the court think Prodigy should have known? Because it had made an active effort to monitor and remove harmful content. In doing so, the court reasoned, Prodigy had acted as a “publisher,” rather than merely a host of third-party content.
The perplexing conclusion of these two court cases was as follows: if an online service did little or nothing to curb illegal or harmful content, it would not be held liable. However, if an online service made an active effort to curb such content, it could be held liable precisely because of those efforts.
Clearly, this was unfair and damaging, as it disincentivized companies from trying to combat illegal or harmful content. To resolve this issue, and to combat the dissemination of obscene material to children, Congress passed the Communications Decency Act (CDA) in 1996. Section 230 of the CDA was expressly meant to ensure that an interactive computer service’s “good faith” efforts to monitor third-party content could not be used as grounds for liability.
However, the initial intention of the CDA and Section 230 has since been grossly warped.
In Reno v. ACLU (1997), the Supreme Court struck down most of the CDA’s provisions protecting children from obscene material, but left Section 230 standing. Ironically, Section 230 has since been interpreted by many courts as granting blanket immunity to tech companies for any third-party content, even when those companies are very clearly acting in bad faith, not good faith (as in the Twitter case). Congress’s goal was to avoid disincentivizing tech companies from combatting harmful content, but thanks to the courts’ misinterpretations, that is exactly what has happened! Nothing disincentivizes tech companies from combatting harmful content more than blanket immunity.
How CDA 230 Enables Sexual Abuse and Exploitation
As a result of misinterpretations of CDA 230, tech companies have knowingly facilitated and profited from sexual abuse and exploitation with impunity for more than a decade.
We have been fighting for clarifications of Congressional intent around CDA 230, and we had a victory when Congress passed FOSTA-SESTA in 2018. This law clarified that there is no immunity for tech companies that knowingly or recklessly facilitate sex trafficking and/or promote the prostitution of others (pimping).
However, since the passage of FOSTA-SESTA, trends in the courts have not been encouraging. Reddit was sued by six minors and their parents for ignoring requests to remove child sexual abuse material over an entire decade. Astonishingly, in 2022, the judge ruled in favor of Reddit due to CDA 230 “immunity,” and the survivors received no justice.
Likewise, the NCOSE Law Center and co-counsel’s Twitter case faces an uphill battle due to negative rulings founded on Section 230. Although the lawsuit has not yet concluded, and the NCOSE Law Center continues to fight for a favorable outcome, tech companies across the nation are already citing the 9th Circuit’s ruling in this case to justify violating federal laws against child pornography.
Meanwhile, reports of child sexual abuse material are increasing at a staggering rate, jumping from just 600,000 in 2008 to over 32 million in 2022.
It’s past time for something to change.
Take Action! Call for Clarifications of CDA 230
Together with our allies and supporters (people like YOU!), we are working to pass the EARN IT Act, which clarifies that tech companies do not have immunity for knowingly or recklessly facilitating the distribution of child sexual abuse material.
This is a vital bill that would stop harmful misinterpretations of CDA 230 in courts, give survivors a path to justice, and stem the widespread proliferation of CSAM online. You can learn more about the EARN IT Act here.
Please take 30 SECONDS to help us pass the EARN IT Act by completing the action form below!