At 10:15 pm, John DeMay bid his 17-year-old son Jordan goodnight.
Minutes later, Jordan received an Instagram message from someone who appeared to be a sweet, pretty young lady. They began chatting. The “young lady” eventually convinced Jordan to send her a sexual photo.
Suddenly, everything changed. The “young lady” was not a young lady at all, but six men on another continent operating as an organized criminal group. The men used the image they’d received from Jordan to viciously extort him.
By 3:45 am—just five and a half hours after John said goodnight to his son—Jordan died by suicide.
“My home is locked,” John said. “My home is secure. And Instagram allowed 6 men to come into my home at three o’clock in the morning and murder my son.”
John DeMay is one of the many parents who have lost their children to suicide due to sexual extortion and exploitation on social media platforms. Alongside other parents, survivors, and subject matter experts, DeMay testified at an issue briefing hosted by NCOSE: #BigTechBS: How Tech Lies About Platform Safety, Potential Fixes, and the Needs of LGBTQ+ Youth.
Held the day before Big Tech CEOs were grilled by Congress at a January 31st hearing, this briefing called out the lies and excuses tech companies use to avoid accountability for the immense harm they propagate.
As child protection legislation gains momentum in the wake of the January 31st hearing, those lies and excuses continue to be spread in a frenzied attempt at resistance.
So, let’s break down some of those lies here.
“Stop Scapegoating the LGBTQ+ Community!”
Tech companies frequently excuse their failure to provide safeguards for children by claiming that such safeguards would harm the LGBTQ+ community.
Aaron Crowley (Survivor, Pastor, Author, and member of the LGBTQ+ community) called out this tactic at the January 30th issue briefing, saying, “Please stop scapegoating my community. Please stop misusing us.”
Alongside other LGBTQ+ speakers, Crowley explained that safeguards for children on tech devices serve LGBTQ+ youth just as much as, if not more than, other youth:
“Because we are often ostracized by our families and our in-person communities, many LGBT youth turn to online spaces, online communities for safe spaces. But since there aren’t any safeguards, this leaves LGBT kids more vulnerable to cyber-bullying, exploitation, blackmail, being outed, and being exposed to explicit material.”
Research shows that LGBTQ+ youth are statistically more likely to experience sexual exploitation and other harms online. For example, according to 2023 research from Parents Together, LGBTQ+ youth are 2-3x more likely to be asked for sexual images and acts on social media, compared to non-LGBTQ+ youth.
According to 2022 and 2023 research by Thorn, LGBTQ+ youth are more likely than non-LGBTQ+ youth to have online sexual interactions with someone they believe to be an adult (45% vs. 26%) and are also more likely to agree that adults grooming minors online is a common experience (91% vs. 82%).
This makes safeguards crucial for protecting LGBTQ+ youth online. And it makes it especially offensive when Big Tech tries to wiggle its way out of accountability by claiming that such accountability would harm the LGBTQ+ community.
The Kids Online Safety Act (KOSA) Will Not Harm the LGBTQ+ Community
Tech lobbyists have persistently protested the Kids Online Safety Act (KOSA), a bill that would require tech companies to responsibly design products with child safety in mind from the get-go, by claiming it would lead to censorship of the LGBTQ+ community.
At the briefing, Laura Marquez-Garrett, Esq. (Victims’ Attorney, Social Media Victims Law Center, and member of the LGBTQ+ community) explained that KOSA has been carefully revised in consultation with the LGBTQ+ community to specifically address concerns they had about earlier drafts of the bill. Marquez-Garrett said:
“KOSA is not about content. It’s not about censorship … KOSA does not hold platforms liable for what a child asks to see—that’s Section 3b1 of KOSA. It also does not prevent these platforms from using personalized recommendation systems to display content to minors as long as they’re only using information like the language that’s spoken, geolocation, age—that’s section 4e3c … You will not see Big Tech talking about those provisions in KOSA, or several others that the co-authors put in there in specific response to LGBTQ concerns.”
Crowley expressed gratitude for these revisions and considers passage of the updated version of the bill urgent: “I’m so thankful changes have been made and have been implemented to ensure this bill is not misused and weaponized against LGBT people and our online spaces,” he said. “So now, what are we waiting for? We need these protections.”
Jay Benke (Survivor, Human Trafficking Consultant, and member of the LGBTQ+ community) similarly called for passage of KOSA, saying, “I absolutely support KOSA. I believe that all children deserve protections and I believe that queer children deserve the same protection.”
[See also: Addressing Misinformation about the Kids Online Safety Act (KOSA)]
Pornography is Neither Necessary nor Healthy for LGBTQ+ Youth
Tech companies have sometimes tried to rationalize allowing children access to pornography by claiming that such access is necessary for LGBTQ+ youth to explore their sexual identities. For example, in 2021, Apple walked back a promise it had made to automatically blur pornographic images received or sent by all minors (it has since begun automatically blurring nude images only for children 12 and under).
In explaining its decision, Apple cited the supposed needs of LGBTQ+ youth.
Crowley called this excuse “absurd and ridiculous.” He said that pornography doesn’t help people understand their sexuality—rather, it distorts sexuality. He shared his own story as an example:
“I was only 9 years old when I was first exposed to pornography. I didn’t even know what sex was. So when I saw these scenes of sexual violence, I learned sex is violent … It normalized sexual violence for me, so much so that when I was raped, I didn’t even realize I was raped. A group of guys I met on social media raped me. They took pictures and shared them online. The Internet taught me it was normal. And so, because I thought it was normal, I didn’t have a healthy way to process that trauma. Instead, I thought, ‘If it’s going to happen to me anyway, I might as well get paid for it.’ So, when I met my pimp on social media and he gave me the opportunity to do ‘mainstream porn,’ I was already groomed to say yes.”
[See also: Research Spotlight: “Sexual Violence as a Sexual Script in Mainstream Online Pornography”]
LGBTQ+ youth need respectful representation and age-appropriate, healthy sexual education—not pornography.
Pornography’s harm to children has been demonstrated time and again in research. It is the last thing we should be providing to vulnerable LGBTQ+ children who are already struggling with sexuality and already at greater risk for sexual abuse and exploitation.
Exposing the Double Standard
The tech industry enjoys a shocking privilege that no other industry in America enjoys: near-blanket immunity from being sued for the harms it causes. This is due to Section 230 of the Communications Decency Act (CDA 230), which U.S. courts have misinterpreted as meaning tech platforms can’t be held liable for any third-party content.
That’s essential background for understanding how Big Tech CEOs have the brazenness to stand before Congress and a room full of parents whose children died due to social media, and claim that they were doing nothing wrong.
But as Tim Estes (Tech Innovator, CEO of Angel AI) aptly stated at the briefing:
“If this were a different industry, like the airline industry, this would be like the CEO of Boeing showing up in front of Congress and explaining why it was okay when one plane fell out of the sky every week. Because the scale of harms are nearly equivalent.”
It’s time for this double standard to end.
John DeMay vehemently criticized Section 230 of the Communications Decency Act:
“Section 230 needs to go away immediately. If social media companies think that their platforms are so safe and they’re doing these wonderful things that are great, then the government and the people need to get rid of Section 230 and we will let that sort out in court … If they’re truly doing what they need to be doing and what they say that they’re doing and the protections are actually there, then the social media companies will win the lawsuits all day long. And if they’re not, they’re going to go broke.”
It’s a Matter of WON’T, Not a Matter of Can’t
Big Tech loves to pretend that it’s doing everything it can to protect kids, and that if companies aren’t doing more, it’s only because the technology to implement better solutions doesn’t exist.
That’s a lie.
Tech innovator Tim Estes explained that solutions absolutely do exist that would allow Big Tech to stop the devastation being inflicted on millions of children. Big Tech simply chooses not to implement these solutions, because its priority is profit and growth, not safety.
“[Meta] is going to spend 10 billion dollars on Nvidia chips to build an artificial general intelligence system that it plans to open source,” Estes explained. “The company spends less than 3 billion a year on safety. If Meta took one year to do the level of work that they put into their AI research on the kids’ safety problem, this would all go away. And that sadly has been true for years. So that’s the dirty little secret hiding behind historic limitations that are five or six or seven year old excuses when it really is about not being willing to allocate the capital to fix the problem…”
ACTION: Ask your Representatives to Support the Kids Online Safety Act (KOSA)
Take 30 SECONDS to fill out the quick action form below.