Despite attempts by social networks to clamp down on child porn, some Twitter users have been swapping illegal images and have used tweets to sexualise otherwise innocent photos.
They begin as innocuous selfies or pictures taken by friends or family members. But in the eyes of a small cohort of warped Twitter users, they become something else entirely.
“The pictures are usually young girls in their school uniform or a swimsuit,” says Joseph Cox, a freelance journalist writing for Motherboard, part of Vice News. “Some have been taken by the girls themselves. It’s not clear whether they’ve then sent them to a boyfriend who’s uploaded them… others appear to have been ripped from their social media sites.”
Cox’s investigation into this underground world started with a search of a single hashtag, which threw up one of these otherwise innocent-looking photos.
“Users were asking to trade pictures of similar aged girls and they were commenting on her appearance and how attractive they found her,” he says. “Some of the comments did get very explicit.”
The pictures themselves are not pornographic, but Twitter’s guidelines are clear: child sexual exploitation is not tolerated.
Its policy on the issue states: “When we are made aware of links to images of or content promoting child sexual exploitation they will be removed from the site without further notice.” In addition, users face a permanent ban for promoting child sexual exploitation. Most of the posts that Cox found were later taken down by Twitter.
But the murky world of comments and replies is not the only exploitation problem on the social network. One American woman who spoke to BBC Trending said she uncovered a huge amount of child pornography on Twitter after reading rumours about it on Reddit.
“There was a minimum 14,000 accounts involved in the creation, distribution or retweeting of child porn,” says Molly (not her real name).
The victims? “Girls as young as five, and definitely under 15.”