A few months ago, in Washington, DC, I met with three young girls who were survivors of sex trafficking.
They showed us their Instagram accounts, which were set to private. Any parent would assume this means the app is locked down and safe for a minor to use. Yet even with their accounts set to private, these girls regularly received a dozen or more messages from strangers, adult men, asking them to meet up or to send sexually explicit photos.
Sometimes these men complimented the girls, making them feel loved and convincing them they were boyfriends. Other times they used sexually explicit photos obtained from the girls to extort and blackmail them into sex trafficking or abuse, and sometimes both.
We know from the research that psychological manipulation is the most prevalent tactic used to coerce victims: falsely professed love, establishing superiority to intimidate, and exploiting other emotional needs. Not all chains are visible, and as the domestic violence sector has shown, psychological coercion can hold individuals in a death grip.
These girls shared that, almost universally, these men would use pornographic images of sex-trafficked girls to advertise them on Instagram, or would use Instagram's livestream features to auction them off to sex buyers.
While exploiters seeking to groom children and teens once had to find them in person, they can now reach them anonymously with a few clicks.
Instagram, Snapchat, and TikTok are all rated as safe for children 12 and older, yet all three have been used by sex traffickers and abusers to groom and abuse children. These app ratings are misleading and leave parents and children unaware of the risks involved, because each app gets to rate itself. Right now the industry faces no accountability or transparency requirements to make digital spaces safe for kids.
There are two solutions we need from the tech industry. First, we need both devices and social media apps to adopt age-based default safety settings, where safety features are turned on by default and a user must intentionally turn them off. This is the opposite of the current system, which assumes you want zero safety measures and forces the parent or individual to go searching for the controls to turn them on. Second, we need an independent app ratings board with sanctioning power for non-compliance. This system would resemble the ratings boards created for the movie and video game industries: operating independently of the government, and run by a cross-section of industry and child development experts. While Congress would have no control over this independent board, we are calling on Congress to urge the tech industry to take the initiative and establish it.