Sex Trafficking Survivors Reveal Instagram Is a Popular Online Slavery Auction Block


A few months ago, I met with three young girls, all sex trafficking survivors, in Washington, DC.

They showed me their Instagram accounts, which were set to private. Any parent would assume this means the app is locked down and safe for a minor to use. Yet even with their accounts set to private, these girls regularly received a dozen or more messages from strangers, adult men, who asked them to meet up or to send sexually explicit photos.

Sometimes these men complimented the girls, grooming them to feel loved and to believe the men were their boyfriends. Sometimes they used the sexually explicit photos they had obtained to extort and blackmail the girls into sex trafficking or abuse, and sometimes both.

We know from research that psychological manipulation and coercion are the most prevalent tactics used against victims: falsely proclaiming love, asserting superiority to intimidate, and manipulating other emotional needs. Not all chains are visible, and as the domestic violence sector has shown, psychological coercion can hold individuals in a death grip.

These girls shared how, almost universally, these men would use pornography of sex-trafficked girls to advertise them on Instagram, or would use Instagram's livestream features to auction them off to sex buyers.

While exploiters seeking to groom children and teens once had to find them in person, they can now reach them anonymously with a few clicks.

Instagram is rated as safe for children 12 and older, as are Snapchat and TikTok, yet all three have been used by sex traffickers and abusers to groom and abuse children. These app ratings are misleading and leave parents and children unaware of the risks, because each app gets to rate itself. Right now the industry has no accountability or transparency when it comes to making digital spaces safe for kids.

There are two solutions we need from the tech industry. First, we need both devices and social media apps to embrace age-based default safety settings, where safety features are turned on by default and must be intentionally turned off. This is the opposite of the current system, which assumes you want zero safety measures and forces the parent or individual to go searching for the safety controls to turn them on.

Second, we need an independent app ratings board with sanctioning power for non-compliance. This system would be like the ratings boards created for the movie and video game industries: operating independently of the government and run by a cross-section of industry and child development experts. While Congress would have no control over this independent board, we are calling on Congress to request that the tech industry take the initiative to set up this review board.


