Grief claws at your heart as you tell the attorney your daughter’s story.
It’s hard enough to explain how your 12-year-old daughter was raped by a boy in her class.
It’s harder still to explain how a video of the rape was posted to a social media site.
But the part that finally causes hot, angry tears to spill over your cheeks is when you tell the attorney how the social media company refused to take the video down.
“I don’t understand,” you say, furiously wiping the back of your hand across your eyes. “I reported it. I explained that the video was of child sexual abuse. That it’s child pornography. I thought they’d take it down immediately, but no. They were so dismissive . . .”
You tell the attorney that you want to sue this social media company. You understand it isn’t their fault that your child was raped—but it is their fault that the video of her abuse has been viewed over and over again, by hundreds of thousands of individuals who take pleasure in your child’s abuse and perpetuate her trauma. It is the company’s fault that your daughter has had to see comments and messages flooding in, jeering, threatening to find her and rape her too, telling her she should kill herself . . .
When you’ve finished talking, you look to the lawyer for answers, hoping she can help save your child from the abuse and trauma she continues to experience from a single video.
The attorney’s eyes are compassionate . . . but her mouth is set in a grim line.
“I’ll do my best to help you,” she says. “But I have to warn you that I can’t promise success. You see, there’s an ambiguity in the law. Most judges have interpreted current law to mean that online platforms can’t be held liable for third-party uploaded content, regardless of what that content is.”
For a moment, you just stare at her, uncomprehending. “B-but . . .” you stammer, “But the video is illegal, right? And the company knew about it! I reported the video, I told them it was child rape! If it’s illegal and they’re knowingly hosting it, letting it get passed around, letting it rack up views . . . how can they not be held liable for that??”
The above hypothetical story echoes the experience of real parents who tried to fight for justice against online platforms that knowingly disseminated videos of their child’s sexual abuse.
The fight is an uphill battle because of Section 230 of the Communications Decency Act (CDA 230). While never intended to have this effect, CDA 230 has been interpreted to mean that online platforms can’t be held liable for the dissemination of child sexual abuse material (CSAM, a.k.a. “child pornography”), so long as the material is uploaded by a third party.
Fortunately, a crucial bill has been reintroduced which seeks to resolve this problem: the EARN IT Act.
> BREAKING: @RepAnnWagner and @RepSylviaGarcia reintroduced the EARN IT Act! This is the strongest piece of bipartisan legislation to confront the explosion of online child sexual abuse material. #ProtectKidsOnline #Detect2Protect #EARNITAct https://t.co/rcne9hvRNZ
>
> — National Center on Sexual Exploitation (@NCOSE) April 19, 2023
Help Us Pass The EARN IT Act!
The EARN IT Act is the strongest piece of bipartisan legislation to confront the explosion of online child sexual abuse material.
Having been introduced in previous sessions and passed unanimously by the Senate Judiciary Committee twice, the EARN IT Act has been reintroduced by Senators Richard Blumenthal and Lindsey Graham in the Senate, and by Representatives Ann Wagner and Sylvia Garcia in the House of Representatives.
With your help, we hope this will be the year we bring EARN IT all the way to the finish line!
Please take 30 SECONDS to contact your Members of Congress, urging them to Co-Sponsor the EARN IT Act! (Keep reading below the action form for more information on the EARN IT Act)
What Does EARN IT Do?
The EARN IT Act does several key things:
- Clarifies there is no immunity for social media and technology companies that knowingly facilitate the distribution of child sexual abuse material (CSAM)
- Gives victims a path to justice and possibility of restoring their privacy
- Updates existing federal statutes to replace “child pornography” with the more accurate term “child sexual abuse material” (CSAM). This content is crime scene documentation; “child pornography” fails to convey the seriousness of the abuse
- Allows the National Center on Missing and Exploited Children (NCMEC) to update its tools
The Senate version has an additional significant component: it establishes a commission of survivors, technology representatives, civil rights experts, and other stakeholders to recommend best practices for Big Tech to implement in response to the astronomical increase in online sexual exploitation of children—including enticement for sex trafficking.
Does EARN IT Violate My Privacy?
A common misconception about the EARN IT Act is that it poses a threat to the privacy of Internet users. This is not true; it is nothing more than a red herring pushed by Big Tech.
The tools that companies use to scan for CSAM are similar to those they already use to scan for malware and spyware. These existing processes are what help secure platforms and protect privacy.
The EARN IT Act not only protects your privacy but will incentivize companies to protect children’s privacy—especially victims of CSAM whose privacy is being violated in the most heinous way, as images of sexual abuse are distributed via the internet for thousands to view.
For a thorough analysis of the myth that EARN IT is a threat to privacy and encryption, read this blog.
To learn more about the EARN IT Act, please visit endsexualexploitation.org/earnit/