
At the age of 15, Jane met a stranger on Facebook. She and this man had many mutual Facebook friends, but one thing struck her as odd: his profile featured many pictures of young women in sexual positions.
The man began messaging Jane on Facebook quite frequently. He showered her with compliments and was adamant that she pursue a career in modeling.
One day, Jane Doe told the stranger about a family argument. Taking advantage of Jane’s vulnerability in that moment, the man offered her a position as a model and said they should meet in person. When the two met up, he took photos of her in sexual positions, like the ones of the girls on his Facebook profile. Unbeknownst to Jane, the stranger then uploaded the photos to Backpage and advertised her for prostitution, which for a minor is always sex trafficking. Because of the advertisements, Jane Doe was “raped, beaten, and forced into further sex trafficking.”
Despite Facebook’s dangerous features that allow adult predators to easily connect with minors, when Jane Doe filed a lawsuit against the company for facilitating sex trafficking, the case was dismissed under Section 230 of the Communications Decency Act.
If Facebook hadn’t allowed adult strangers to contact and groom children in the first place, this sex trafficking never would have occurred. And yet, the company can’t be held responsible?
John had a turbulent childhood. His father left him and his mother was tragically murdered. At age 15, his female science teacher sensed that John was vulnerable because of these hardships. She got John alone in a room after class and obtained his Snapchat username. She began messaging him on the app, sending sexual pictures of herself.
Eventually the interactions escalated to the teacher sexually abusing John in person. She gave John money to buy certain prescription drugs, which she would use before molesting him. This continued until one day, he overdosed.
During John’s lengthy recovery in the hospital, his legal guardian sued multiple defendants, including Snap, Inc., alleging that Snap failed to prevent the teacher’s abusive behavior and designed an application that allowed sexual predators to thrive.
But the court ruled Snap could not be held liable. Why? Section 230 of the Communications Decency Act.
At 15 years old, John Doe downloaded and created an account on Grindr. He was lonely, and often bullied at school for being on the spectrum. He thought he could find some companionship on Grindr.
Though he was younger than the platform’s professed age limit, there was no robust age verification process in place. All he had to do was check a box claiming that he was over 18.
Almost immediately upon downloading the app, Grindr matched John with four unknown adult males. When he met up with the men in person, they raped him several times.
Undoubtedly, those four adult men shoulder the blame for John’s rape. But so does Grindr.
Grindr claimed to be an 18+ dating app, yet it was aggressively marketed to children. It did not verify John’s age when he signed up for an account. Instead, it welcomed the child onto the app, geolocated him, and matched him with nearby adult predators.
But when John and his family filed a lawsuit against Grindr for facilitating this child sexual abuse, the court dismissed the case under Section 230 of the Communications Decency Act. Section 230 has been interpreted to protect tech companies from facing repercussions when their platforms harm their users, even when the company clearly took an active role in causing that harm.
TRIGGER WARNING: The following content contains descriptions of sexual assault and rape that may be upsetting for some readers.
J.B.’s sex trafficking began when she was only a minor. She was raped by sex buyers, some of whom inflicted physical violence upon her and threatened her with weapons. One night, a man came up to the hotel room to sexually abuse her. When she cried out for help, a hotel customer overheard and called the police.
While J.B. was afraid of the dangers she faced from the sex buyers who raped her, what she feared even more was what her trafficker would do to her if she got him in trouble with the authorities. Terrified for her life, she pleaded with the officers not to arrest him. The police officers left, and the hotel employees allowed the trafficker to stay with J.B. in her hotel room.
All of this was made possible through craigslist. J.B.’s trafficker advertised her on craigslist’s “Erotic Services” webpage.
J.B. eventually filed a lawsuit against craigslist, alleging that it financially benefited from the ads and made “an estimated $36 million in revenue” from trafficking on its site. She also accused the platform of knowing minors were being trafficked on its site and failing to take action to stop it.
Sadly, when J.B. brought her claims to court, the court ruled that craigslist had immunity under Section 230 of the Communications Decency Act.
C.A. was only 12 years old when an adult man groomed her on Twitter and then on Snapchat.
The two first interacted on Twitter, but he quickly found her on Snapchat. Though the perpetrator had previously been charged with engaging in illegal sexual behavior with minors, Snapchat allowed him to create an account. The predator messaged C.A. on Snapchat, bombarding her with child sexual abuse material (CSAM), and then manipulated her into creating CSAM of herself and sending it back to him.
Following these digital exchanges, this child predator sought out C.A. at her home and sexually molested her, recording the acts and distributing them online.
You might be thinking: “Rewind. How did a man who had previously been charged with sexually abusing minors so easily create an account on a platform that is a hotspot for young teens?”
We’re wondering the same thing. Despite Snapchat’s clear disregard for user safety by allowing this predator to join their platform and interact with kids, when C.A.’s family filed a lawsuit against Snap, the case never went anywhere.
The court ruled that due to Section 230 of the Communications Decency Act, Snapchat is immune from facing consequences for the nefarious actions of its users, even when its faulty platform design was what allowed these actions to take place.
Backpage, a website similar to craigslist, was known for online advertisements selling things like personal items or cars. It featured job postings and rental opportunities. Something else it was known for? Sex trafficking.
Jane Doe was sex trafficked as a minor on Backpage. The website had a section entitled “Escort Services,” where users could find “local postings.” These postings were tailored to a user’s location information, allowing users to find ads of women and girls located in their area. Advertisements of the underage Jane were posted on Backpage, and inquiries from sex buyers led to her being raped over 1,000 times.
Jane Doe sued Backpage for allowing this sex trafficking to occur, for facilitating sex trafficking by making it easier for users to sell people online, and for financially benefiting from the illegal behavior.
But courts dismissed her case under Section 230 of the Communications Decency Act.
When she was in her teens, Jane was sex trafficked. Her trafficker repeatedly assaulted her and forced her to record sexually explicit videos of herself, which he later uploaded to pornography websites.
This child sexual abuse material (CSAM) was viewed on multiple pornographic sites over 160,000 times. Jane reported the CSAM to the pornography websites several times, but they took no action to remove it. Only after Jane got an attorney involved did they take it down.
Jane sued these websites for failing to remove illegal child sexual abuse material despite her repeated requests. You would think this case would be a slam dunk. Surely, these websites are in the wrong for refusing to take down videos of Jane’s sexual abuse and instead exploiting her trauma for their financial gain... Right?
Well, you would think.
Tragically, due to Section 230 of the Communications Decency Act, Jane was denied justice, even though she was so clearly wronged. The judge even said that the pornography websites’ suggestion of search terms like “toddler” was “neutral,” and that they could not be held accountable for it.
K.B. met a complete stranger on Instagram who began sending her direct messages and posting comments on her Instagram page. The statements he posted consisted of commonly known grooming tactics, but Instagram failed to monitor these messages, even though it had the technology to do so.
K.B.’s trafficker suggested meeting up in person, and within two days of their first meeting, he began to sex traffic K.B. on Instagram. She was promised the safety of a relationship, a lavish lifestyle, and a family, but instead was met with sexual abuse, threats of being left on the street, and threats of violence if she did not fully participate in her trafficker’s venture.
The accounts used to sex traffic and advertise K.B. were created under fake names, and still exist on the platform to this day. Even after K.B.’s direct trafficker was convicted and sentenced to 40 years, Instagram still has not removed the trafficker’s Instagram account.
When K.B. sued the platform for its faulty product features that contributed to her trafficking, the case was dismissed. Section 230 of the Communications Decency Act comes to Big Tech’s rescue once again.
Jane Doe, a minor, downloaded the Kik messaging app, naive to the hidden dangers that lay beneath its surface.
It wasn’t long before the trouble started. Jane began receiving messages from strange adult men on Kik. They sent her sexually explicit images of themselves and coerced her to do the same.
When Jane’s father discovered what was happening, he was appalled. He reported the behavior to the police immediately. The family also sued Kik for facilitating this sexual abuse, as Kik knew its site was being used to sexually exploit children but did not implement policies to help stop it.
But it didn’t matter. Section 230 of the Communications Decency Act allowed Kik to deflect any accountability, as usual. And Jane and her family were left with nothing after the abuse they suffered at the hands of this tech giant.
At 16 years old, Jane discovered her ex-boyfriend had shared sexually explicit images and videos of her on Reddit without her consent.
She frantically reported the content to Reddit, seeking its removal. But for days, it remained on the website, despite the fact that it amounted to illegal child sexual abuse material (CSAM) since Jane was a minor. Eventually, the content was removed and Jane could breathe a sigh of relief...
...But not for long.
To her dismay, the CSAM was uploaded to Reddit again. Every time she reported it and it was taken down, it was posted again. Jane got Reddit to ban her ex-boyfriend’s account, but he was relentless. He created a new account and continued to repost the CSAM again and again.
Jane felt hopeless. Seeking some justice, she filed suit against Reddit for failing to take adequate measures to prevent CSAM from spreading on its site. Enter Section 230 of the Communications Decency Act: case dismissed.
The court ruled that Section 230 gave Reddit immunity from liability, despite its lethargic approach to content moderation. Reddit was knowingly profiting from advertising on subreddits that were known for spreading CSAM. Yet the court said this just meant Reddit “turned a blind eye” to sex trafficking, and that it could not be sued for benefiting from it.
“We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time.”
This was how Twitter (now X) responded to John Doe, after he reported child sexual abuse material (CSAM) of him that was circulating on the platform.
It all started when he was 13 years old and met a stranger on Snapchat. Believing this person was a girl at his school who had a crush on him, John exchanged nude photos with her. But he was wrong. The “girl” was actually a sexual predator.
The predator began blackmailing John, requiring that he provide more sexually explicit content or have his photos released to his friends and family. John Doe eventually summoned up the courage to block the trafficker.
“You’re making a huge mistake,” the trafficker told him.
Years later, John discovered his images and videos were posted on Twitter. He urgently requested their removal, providing proof of his identity and age at the time, but Twitter refused. The videos remained on the site, accruing over 167,000 views. Only after the Department of Homeland Security intervened did Twitter finally take them down.
When John sued Twitter for refusing to remove the illegal child sexual abuse material, the case was dismissed under Section 230 of the Communications Decency Act, which courts have interpreted to give platforms almost complete immunity from anything connected to user content.
But Twitter knew CSAM depicting John was on its platform. How can it be protected when it knowingly possessed, distributed, and profited from illegal material?
TRIGGER WARNING: Disturbing account of child sexual exploitation
The world was on lockdown from the COVID-19 pandemic. Seeking connection, C.H. logged onto Omegle to video chat with other kids. After ending the call with them, she was randomly placed in another chatroom, but this time, she could not see the other person on the line. The screen was black...
Suddenly, text began to appear on the screen.
The stranger with whom Omegle had connected this 11-year-old started rattling off C.H.’s personal information. He threatened to hack her devices if she did not comply with his demands to remove her clothes and touch herself in a sexual way. She pleaded for him to stop, but he was relentless. She eventually complied, as the predator took screenshots.
After the horrific experience, C.H. told her parents what happened. Heartbroken and disturbed, they promptly called the police. They also sued Omegle for its lack of regulations, which allowed C.H. to be connected with an unknown adult man, ultimately leading to her exploitation.
But their case never went anywhere. Why? Section 230 of the Communications Decency Act.
Despite the fact that it was Omegle’s dangerous product design that allowed an 11-year-old to be connected with an adult predator, Section 230 allowed Omegle to claim it was not responsible for the behavior of users on its website.
This is why Section 230 MUST be repealed. Section 230, the greatest enabler of sexual exploitation, is the sole focus of this year’s Dirty Dozen List. With YOUR help, we can help survivors, like C.H., get the justice they deserve.
*None of the images on this page depict actual survivors
The Dirty Dozen List is an annual campaign that historically called out twelve mainstream entities for facilitating, enabling, and even profiting from sexual abuse and exploitation. Since its inception in 2013, the Dirty Dozen List has galvanized thousands of individuals like YOU to call on corporations, government agencies, and organizations to change problematic policies and practices. This campaign has yielded major victories at Google, Netflix, TikTok, Hilton Worldwide, Verizon, Walmart, US Department of Defense, and many more.
However, despite years of advocacy and victories, sexual abuse and exploitation is increasingly rampant online. Progress is slow and piecemeal, as Big Tech lacks a foundational incentive to prioritize online safety.
It is time to stop accepting incremental change.
The very foundation of our online society, the Communications Decency Act Section 230, was laid in the early days of the Internet back in 1996, but its cracks have now become chasms. What once seemed a necessary legislative underpinning for online business to thrive now stands as the greatest shield protecting technology companies from any and all accountability, especially when it comes to the proliferation of sexual exploitation. Misinterpretations of Communications Decency Act Section 230 grant Big Tech blanket immunity for any and all types of sexual abuse and exploitation they facilitate. Until we amend CDA 230, corporations can’t be held accountable!
Enacted in 1996, Section 230 predates social media, modern smartphones, and the modern internet; it governed just 20 million American users, not today’s 300 million.
Established before the advent of Google (1998) and YouTube (2005), Section 230 has not been meaningfully changed, leaving it outdated for today’s digital giants hosting billions of pieces of content.
Drafted long before the rise of modern social media, this law was already in place for eight years before Facebook launched in 2004 and grew to connect billions of people globally.
By 2012, with Snapchat’s new ephemeral messaging, Section 230 had been static for 16 years amidst vast technological changes.
Despite the rise of deepfake technology and widespread application of AI, the law continues to allow blanket immunity from liability in an era of exponential internet growth.
Call on legislators to repeal Section 230 of the Communications Decency Act
This year, instead of highlighting 12 companies that facilitate sexual exploitation, we are highlighting 12 survivors who were denied justice in the courts because of Section 230 of the Communications Decency Act.
These survivors were silenced, and the companies that enabled their abuse were given immunity.
Misinterpretations of Communications Decency Act (CDA) Section 230 have granted Big Tech blanket immunity for facilitating rampant sexual abuse and exploitation. Until we repeal Section 230, corporations have NO INCENTIVE to make their products safer.
Spread the word to end Section 230
Download powerful graphics to share on social media and help spread the word to repeal Section 230 of the Communications Decency Act.