Mainstream Contributors To Sexual Exploitation
The parental controls are tone deaf and the filters need to remove sexually explicit images and prevent predators from accessing kids.
Pornography on Spotify?? Child Predators on Spotify??
Yes, that’s right!
We too were surprised when a parent reached out to us, raising concerns about pornography being accessible to kids on Spotify. But upon conducting our own research, we realized it was true. And soon the problems on the streaming platform were hitting mainstream news outlets and being raised by numerous other child safety experts and influencers. Part of the purpose of the Dirty Dozen List is to highlight when harm is facilitated by an entity most people trust. This is the case for Spotify—a platform which seems to simply be used for music, educational podcasts, and the like, but which is actually harboring hidden dangers.
Pornography (including content that normalizes sexual violence, child sexual abuse, and incest) can be easily found on Spotify in the form of thumbnails graphically depicting sexual activity and nudity, as well as “audio pornography” (recordings of sex sounds or sexually explicit stories read aloud). Spotify claims to prohibit sexually explicit content, but clearly this policy is very poorly enforced.
In addition to this, there was a high-profile case of a child who was groomed and exploited through Spotify by predators who communicated with her by editing playlist titles, asking the child to upload sexually explicit material of herself as playlist thumbnails. NCOSE researchers easily found multiple profiles on Spotify that seemed to be dedicated to soliciting or sharing “nudes.”
Despite increased awareness of how the platform can be used by child predators, Spotify offers no clear, accessible reporting procedure for child sexual exploitation—something NCOSE researchers discovered when we were attempting to report apparent child sexual abuse material that we found on the platform.
Spotify assures parents: “We have designed Spotify to be appropriate for listeners 13+ years of age.” Clearly, they have much more work to do to turn this lie into a truth!
Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Spotify for more details.
Spotify’s Platform Rules prohibit “sexually explicit content,” which they say “includes but may not be limited to pornography or visual depictions of genitalia or nudity presented for the purpose of sexual gratification.” However, despite these policies, such content is easily found on Spotify.
See more examples in the PDF linked below.
Inexplicably, Spotify also surfaces designated children’s content next to results for pornography.
Newspapers and influencers have also been raising concerns about the pornographic content on Spotify, in thumbnails as well as in audio content.
Spotify offers an “explicit content filter” which they assure parents will block “content which may not be appropriate for teens.”
Content which is blocked by the filter is marked with an “E.” However, NCOSE researchers found that the majority of pornographic content on the platform is not marked explicit, and therefore not blocked by the filter.
For example, in one sampling of sexually explicit thumbnails, 8 out of 10 were not marked explicit.
In addition, audio pornography (recordings of sex sounds or sexually explicit stories being read aloud) is extremely prevalent on Spotify, and is often not marked as explicit.
We have demonstrated how thumbnails can often be sexually explicit or highly inappropriate for teens, and the same is true for descriptions, which can detail sexual encounters. See the below PDF for screenshot proof.
It is not difficult for kids to stumble on highly sexual content on Spotify. For example, NCOSE researchers stumbled on it when searching for songs popular with teens. In some cases, the researcher only needed to search the first innocuous word in the title before being served highly sexual results. It is therefore of the utmost importance that Spotify’s explicit content filter function to a high standard, to prevent children from being exposed.
Spotify’s Platform Rules prohibit “Advocating or glorifying sexual themes related to rape, incest, or bestiality.” However, NCOSE researchers were able to easily find content which normalized sexual violence, sexual abuse, and incest.
For example, we found a podcast called “The Rape Show With The #1 Rapist In The Country,” which included tips on how to commit rape and how not to get caught by the authorities.
Further content normalizing or trivializing rape can surface with unrelated searches, such as when a researcher searched for “rap,” indicating how easy it is to come across this horrific content.
NCOSE researchers also very easily found content on Spotify that participated in a trend of “step” incest, which has been normalized by pornography. A search for “stepbro” instantly pulls up an abundance of such content to scroll through.
In January 2023, a case of an 11-year-old girl being groomed and sexually exploited on Spotify made news headlines. Sexual predators communicated with the young girl via playlist titles and encouraged her to upload numerous sexually explicit photos of herself as the cover image of playlists she made.
Survivor-advocate Catie Reay recently released a TikTok showing evidence of grooming on Spotify, such as playlists that tag other users and ask them to send sexually explicit content of themselves.
NCOSE researchers very easily found a number of suspicious Spotify profiles which appeared to be dedicated to soliciting or sharing nudes. In one case, we found a profile that shared an email address and tagged other users in playlists, asking them to send nudes.
Further, while gathering proof of pornography on Spotify, NCOSE researchers came across apparent child sexual abuse material (CSAM) and consequently discovered that Spotify had no clear, accessible report procedure for CSAM.
Numerous Google searches attempting to determine how to report CSAM/child sexual exploitation to the platform failed to turn up instructions from Spotify. Instead, the search results simply turned up Spotify podcasts that talked about child sexual abuse, and news articles about cases of child sexual exploitation on Spotify.
Eventually, after trying many different search queries, the first result for a query of “report content on Spotify” was a Spotify page with instructions on how to report “infringing, illegal, or hate content.” However, when we followed these instructions to report the CSAM, they did not work. We eventually discovered that the reporting mechanism is not available for podcasts (the form in which NCOSE researchers found the majority of pornography on Spotify!).
Trying again, the second result for the query “report content on Spotify” finally took the NCOSE researcher to forms for reporting content. There was no category for reporting child sexual exploitation, nor any category for reporting illegal content. It was not obvious which category the researcher should report the CSAM under.
In addition to indisputable cases of real child sexual exploitation, NCOSE found a plethora of content for which it was not clear whether real child sexual abuse was involved, but which certainly normalized and fetishized child sexual abuse.
For example, NCOSE found an audio pornography podcast which depicted child sexual abuse (translation: “He’s not your son anymore! When the whole barracks attends the anal preparation of her child by her supervisor”). In a similar vein, our team also found plenty of “ABDL” (Adult Baby/Diaper Lover) content on Spotify. This is another term denoting the roleplaying of child sexual abuse, specifically the sexual abuse of infants.