Picture this:

You’re a parent of a 13-year-old child who is just starting to discover the world of social media and video games.

As you’re aware that the online world can pose some dangers to kids, you try to do a bit of research on the various apps your child is using, to make sure they’re safe.

Luckily, most platforms have safety centers that boast about their focus on well-being. Grooming and sexual exploitation are barely mentioned, so you assume that means kids are largely protected. In fact, most of the app stores rate the popular platforms as appropriate for ages 12 and up. You feel reassured that your child will be safe.

. . . Or so you thought.

Could you have imagined that an app which was purportedly “built especially for kids” would be rife with sexual predators?

Could you have imagined that your 13-year-old would be groomed by a grown man into sending him sexually explicit images?

Could you have imagined that you’d wake up one night to find your child gone, because they’d agreed to meet this man in person?

These are the all-too-real dangers children face, alone, online every day.

Children deserve safe environments where they can connect, learn, love, and thrive both off and online. Advances in technology have sparked positive, new opportunities for children to grow. However, they have also opened the door to serious threats and profound harms to children’s safety and wellbeing.

Rather than improving with time, these problems are worsening at an exponential rate, with the Internet Watch Foundation reporting 2021 as the worst year on record for online child sexual exploitation.

The scope and magnitude of the dangers may seem daunting. But change is possible if every one of us fights to secure a safer, more joyful future for our children.

True Story of a Minor Exploited on Twitter

*Jane and John are pseudonyms, but the experiences happened to real people.

It was around 11:30 pm when the text message came that would blow Jane’s world apart.

[Your son] is on the phone with my niece right now and he’s discussing that he doesn’t want to live anymore and is suicidal.

Jane couldn’t believe it. In fact, at first she didn’t. She thought, surely there must be some mistake. By all appearances, Jane’s son seemed to be thriving. He was performing well in both academics and extra-curricular activities, he was involved at his church . . . There was simply no way that he was suicidal.

But that night, Jane spoke to her son, and he told her that it was true. He was suicidal because of a video that was being passed around at his school.

As Jane listened to her son, the horrifying story unfolded.

Three years ago, when John was thirteen, a sex trafficker contacted him and his friend on Snapchat, posing as a sixteen-year-old girl. This “girl” sent the two boys nude photos and asked that they send some photos and videos in return.

“Mom, I was thirteen,” John said to his mother, three years later. “I’d never seen a naked girl before.”

So, John and his friend agreed.

Immediately, what had appeared to be a flirtatious conversation devolved into blackmail and extortion. The sex trafficker threatened to send the videos to John’s parents and mentors, as well as post them online, unless John and his friend sent him even more graphic videos of them performing sexual acts.

At this point, John realized that the person he was talking to was dangerous. But he didn’t know how to reach out for help.

The blackmail continued for weeks, and eventually, the sex trafficker demanded that John meet him in person.

John refused.

The sex trafficker made good on his threats. A sexually explicit video compilation of John and his friend was posted to Twitter. There, it gathered 167,000 views and was retweeted more than 2,000 times. It was also passed around by people in John’s school.

John and his mother contacted Twitter multiple times, begging them to take the video down, only to be met with this chilling response:

"We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."

While eventually Twitter did remove the video after a direct request from the Federal Government, the terrifying reality is that any number of those 167,000 people who watched the video could have downloaded and re-uploaded it elsewhere on the internet.

“To this day, we have no idea where it is and how many people are still viewing it,” Jane said.

That video will likely haunt John for the rest of his life.



Current Harms Facing Youth Online:


Grooming and Sex Trafficking

Countless children are being groomed for sexual abuse and sex trafficking on social media platforms and apps, often through direct messages, video calls, and live streaming.

Pornography Exposure Abuse

A lack of filters, or incorrectly configured safety settings, on devices (including school-issued electronics) or apps leaves children at risk of viewing hardcore pornography. This exposes them to the extreme, violent, degrading, abusive, and racist depictions of sex acts in mainstream pornography, disrupts natural childhood sexual development, and can serve as their first or primary “education” about sexual relationships. Childhood pornography exposure, a form of child sexual abuse, is also a major catalyst for child-on-child harmful sexual behavior.

Self-Generated Child Sexual Exploitation Material

There has been a dramatic rise in children producing sexually explicit images of themselves—called self-generated child sexual exploitation material (SG-CSEM) or self-generated child sexual abuse material (SG-CSAM). This includes sending and receiving sexually explicit material (“sexting”), posting sexually explicit images online, livestreaming explicit sex acts, and even selling this content on social media or other platforms.

Misleading App Ratings

App ratings are currently self-assigned, with no accountability and no criteria that account for the factors that increase risk of sexual abuse or exposure to pornography. As a result, many parents are unaware of the risks associated with the apps their children use. Further, there are no industry standards for age-based default filtering.

Why is This Happening?

Technology corporations have prioritized profit and innovation at the expense of child safety, placing the burden on parents and schools—even children themselves—to protect minors from inherently harmful platforms and products. Governmental regulatory agencies have been slow to recognize and address online sexual abuse and exploitation of children.

It is time to put the onus of responsibility back on those in the public and private sector with the power to enact commonsense, systemic solutions. Change can be instigated by private corporations, legislators, and school officials embracing their responsibility to keep children safe so that they can connect, learn, love, and thrive online and off.

What You Can Do To Help

1. Download our Building A Safe Internet brochure to learn how these sectors can act to dramatically improve child online safety.
2. Take action or share your story using the form below.

Download NCOSE's new FREE resource:

Building a Safe Internet so Youth Can Connect, Learn, Love, and Thrive

Submit your email address in this form to receive your FREE resource directly in your inbox!

Take Action

Check Your Local School’s Online Databases

Download the National Center on Sexual Exploitation’s action packet

Chromebook Safety 101

5 Easy Tips for a Better School Year

Public Health Harms of Pornography

Download the research summaries of studies on the harm of pornography

Pornography In Our K-12 Schools

Download the National Center on Sexual Exploitation’s presentation on this issue

Netflix Parental Controls

Talk Today, Safer Tomorrow

10 Easy Conversation Starters To Talk With Kids About the Dangers of Pornography

Share Your Story

Have your kids been exposed to pornography and sexual exploitation online?



You can be a leader in your area! We have many resources to help you.

You can download our “GETTING STARTED” packet right here.

Please let us know that you’re leading the charge in your area and let us help you! We are here to help you organize, coordinate, network, and be successful! Send us your name, phone number, and location. Email public@ncose.com.

Children’s Internet Protection Act (CIPA)

It is important to note that a specific law pertaining to libraries and schools regarding the Internet was passed by Congress in 2000 and was found constitutional by the U.S. Supreme Court in 2003. The ALA and the ACLU commonly misinterpret this law and disseminate misleading information to libraries and schools regarding their rights to place filters on computers. You will likely face this opposition.

The Children’s Internet Protection Act (CIPA) is a federal law enacted by Congress to address concerns about access to offensive content over the Internet on school and library computers. CIPA imposes certain types of requirements on any school or library that receives funding for Internet access or internal connections from the E-rate program – a program that makes certain communications technology more affordable for eligible schools and libraries. In early 2001, the FCC issued rules implementing CIPA.

What CIPA Requires

  • Schools and libraries subject to CIPA may not receive the discounts offered by the E-rate program unless they certify that they have an Internet safety policy that includes technology protection measures. The protection measures must block or filter Internet access to pictures that are: (a) obscene, (b) child pornography, or (c) harmful to minors (for computers that are accessed by minors). Before adopting this Internet safety policy, schools and libraries must provide reasonable notice and hold at least one public hearing or meeting to address the proposal.
  • Schools subject to CIPA are required to adopt and enforce a policy to monitor online activities of minors.
  • Schools and libraries subject to CIPA are required to adopt and implement an Internet safety policy addressing: (a) access by minors to inappropriate matter on the Internet; (b) the safety and security of minors when using electronic mail, chat rooms, and other forms of direct electronic communications; (c) unauthorized access, including so-called “hacking,” and other unlawful activities by minors online; (d) unauthorized disclosure, use, and dissemination of personal information regarding minors; and (e) measures restricting minors’ access to materials harmful to them.
  • Schools and libraries are required to certify that they have their safety policies and technology in place before receiving E-rate funding.
  • CIPA does not affect E-rate funding for schools and libraries receiving discounts only for telecommunications, such as telephone service.
  • An authorized person may disable the blocking or filtering measure during any use by an adult to enable access for bona fide research or other lawful purposes.
  • CIPA does not require the tracking of Internet use by minors or adults.

Visit the Federal Communications Commission’s info page about CIPA by clicking here.


Opposition to CIPA:

In 2001, the ALA and the ACLU challenged the law on the grounds that it required libraries to unconstitutionally block access to constitutionally protected information on the Internet. They specifically argued that “no filtering software successfully differentiates constitutionally protected speech from illegal speech on the Internet.” In 2003, on appeal, the U.S. Supreme Court upheld the law as constitutional, holding that it was permissible to install filters on all school and library computers, and further that it was constitutional to require libraries receiving specific funding to have filters installed.

The ALA and ACLU often argue that it violates a person’s First Amendment rights to have to ask a librarian to remove a filter for a desired search. However, this is exactly what the High Court said was sufficient when an individual wants to access material blocked by a filter: adults may ask the librarian to unblock it. This is an important added barrier to individuals viewing indecent material in our schools and libraries. The mere need to ask would deter most people from attempting to view such material, and the remaining requests would largely be for material that is reasonable or serves a specific purpose other than gratifying one’s personal desire to view porn.

Laws Addressing Internet Filtering:

  • Local: Your local community might have specific regulations already in place. You’ll have to check your specific area.
  • State: Some states have legislation in place already. You can find details about your state here:  http://bit.ly/xPCEuW. The list is not complete, but we are working on providing an updated database for you.
  • Federal: There are a number of federal laws relating to pornography and specifically to filtering in schools and public libraries. See below for more details on these. You can also learn more about federal obscenity laws here: www.WarOnIllegalPornography.com.
  • Children’s Internet Protection Act: A specific law pertaining to libraries and schools regarding the Internet was passed by Congress in 2000 and was found constitutional by the U.S. Supreme Court in 2003. This law mandates that libraries and schools have filters in place if they opt in to receive specific E-rate federal funding. The ALA often recommends that libraries refuse this funding so that they don’t have to filter.

Much of the available hard-core adult pornography online is actually illegal.

One common misconception is that all pornography is legal and protected by the First Amendment. In truth, obscenity (hardcore adult pornography) is prohibited under existing federal laws, which bar the distribution of hardcore, obscene pornography on the Internet, on cable/satellite and hotel/motel TV, and in sexually oriented businesses and other retail shops. Obscenity is not protected by the First Amendment—a point the U.S. Supreme Court has repeatedly upheld. The role of the Federal Government should be, as it has been in the past, to prosecute the major producers and distributors of obscene pornography. However, the U.S. Justice Department is not currently enforcing these laws and has enforced them only sporadically over the last 20 years. As a result, illegal, obscene pornography is flooding our nation, and the harm is great.

Many people do not understand that obscenity is actually illegal. The American Library Association (ALA), the American Civil Liberties Union (ACLU), and other anti-filtering groups often exploit this common misunderstanding to argue against the use of filters. You should point out that filters would block obscenity, which constitutes the majority of the hardcore adult pornography accessed on the Internet—and likely accessed at your library or school.

For details on federal obscenity laws and U.S. Supreme Court rulings upholding these laws, visit https://endsexualexploitation.org/doj/

What is pornography?

The term “pornography” is a generic term, not a legal one. As the Supreme Court noted in the landmark 1973 obscenity case, Miller v. California, 413 U.S. 15, 20 n.2: “‘Pornography’ derives from the Greek (pornē, harlot, and graphos, writing). The word now means ‘1: a description of prostitutes or prostitution 2: a depiction (as in a writing or painting) of licentiousness or lewdness: a portrayal of erotic behavior designed to cause sexual excitement.’ Webster’s Third New International Dictionary (Unabridged 1969)…”

What is obscenity?

The term “obscenity” is a legal term, and in Miller v. California, supra the Supreme Court established a three-pronged test for determining whether a “work” (i.e., material or a performance) is obscene and therefore unprotected by the First Amendment. To be obscene, a judge and/or a jury must determine:

First, that the average person, applying contemporary community standards, would find that the work, taken as a whole, appeals to the prurient interest; AND second, that the work depicts or describes in a patently offensive way, as measured by contemporary community standards, “hardcore” sexual conduct specifically defined by the applicable law; AND third, that a reasonable person would find that the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.

SIDE NOTE: Typical “hardcore pornography” (e.g., a website, DVD, or magazine) consists of little, if anything, more than one depiction of hardcore sex after another (i.e., it’s “wall-to-wall” sex).