Mainstream Contributors To Sexual Exploitation

Porn Is Too Easy to Spot on Spotify

The parental controls are tone-deaf: the filters need to remove sexually explicit images, and the platform must prevent predators from accessing kids.

Pornography on Spotify?? Child Predators on Spotify??  

Yes, that’s right! 

We too were surprised when a parent reached out to us, raising concerns about pornography being accessible to kids on Spotify. But upon conducting our own research, we realized it was true. And soon the problems on the streaming platform were hitting mainstream news outlets and being raised by numerous other child safety experts and influencers. Part of the purpose of the Dirty Dozen List is to highlight when harm is facilitated by an entity most people trust. This is the case for Spotify—a platform which seems to simply be used for music, educational podcasts, and the like, but which is actually harboring hidden dangers.

Pornography (including content that normalizes sexual violence, child sexual abuse, and incest) can be easily found on Spotify in the form of thumbnails graphically depicting sexual activity and nudity, as well as “audio pornography” (recordings of sex sounds or sexually explicit stories read aloud). Spotify claims to prohibit sexually explicit content, but clearly this policy is very poorly enforced. 

In addition, there was a high-profile case of a child who was groomed and exploited through Spotify: predators communicated with the child by editing playlist titles, asking the child to upload sexually explicit material of herself as playlist cover images. NCOSE researchers easily found multiple profiles on Spotify that seemed to be dedicated to soliciting or sharing “nudes.”

Despite increased awareness of how the platform can be used by child predators, Spotify offers no clear, accessible reporting procedure for child sexual exploitation, something NCOSE researchers discovered when attempting to report apparent child sexual abuse material that we found on the platform.

Spotify assures parents: “We have designed Spotify to be appropriate for listeners 13+ years of age.” Clearly, they have much more work to do to turn this lie into a truth! 

Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Spotify for more details.

See our Notification Letter to Spotify here.

Take Action

Our Requests for Improvement

Proof

Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

Spotify’s Platform Rules prohibit “sexually explicit content,” which they say “includes but may not be limited to pornography or visual depictions of genitalia or nudity presented for the purpose of sexual gratification.” However, despite these policies, such content is easily found on Spotify.

See more examples in the PDF linked below. 

Inexplicably, Spotify also surfaces designated children’s content next to results for pornography.

Newspapers and influencers have also been raising concerns about the pornographic content on Spotify, in thumbnails as well as in audio content.  


See the proof we’ve collected in this easy-to-view, downloadable PDF.

Spotify offers an “explicit content filter” which they assure parents will block “content which may not be appropriate for teens.”  

Content which is blocked by the filter is marked with an “E.” However, NCOSE researchers found that the majority of pornographic content on the platform is not marked explicit, and therefore not blocked by the filter.    

For example, in one sampling of sexually explicit thumbnails, 8 out of 10 were not marked explicit.

In addition, audio pornography (recordings of sex sounds or sexually explicit stories being read aloud) is extremely prevalent on Spotify, and is often not marked as explicit. 

We have demonstrated how thumbnails can often be sexually explicit or highly inappropriate for teens; the same is true for descriptions, which can detail sexual encounters. See the PDF below for screenshot proof.

It is not difficult for kids to stumble on highly sexual content on Spotify. For example, NCOSE researchers stumbled on it while searching for songs popular with teens. In some cases, the researcher only needed to type the first, innocuous word of a song title before being served highly sexual results. It is therefore of the utmost importance that Spotify’s explicit content filter function to a high standard, to prevent children from being exposed.

See the proof we’ve collected in this easy-to-view, downloadable PDF.

Spotify’s Platform Rules prohibit “Advocating or glorifying sexual themes related to rape, incest, or bestiality.” However, NCOSE researchers were easily able to find content that normalized sexual violence, sexual abuse, and incest.

For example, we found a podcast called “The Rape Show With The #1 Rapist In The Country,” which included tips on how to commit rape and how not to get caught by the authorities.

Further content normalizing or trivializing rape can surface in unrelated searches, such as when a researcher searched for “rap,” indicating how easy it is to come across this horrific content.

NCOSE researchers also very easily found content on Spotify that participated in a trend of “step” incest, which has been normalized by pornography. A search for “stepbro” instantly pulls up an abundance of such content to scroll through.

See the proof we’ve collected in this easy-to-view, downloadable PDF.

In January 2023, a case of an 11-year-old girl being groomed and sexually exploited on Spotify made news headlines. Sexual predators communicated with the young girl via playlist titles and encouraged her to upload numerous sexually explicit photos of herself as the cover image of playlists she made.


Survivor-advocate Catie Reay recently released a TikTok showing evidence of grooming on Spotify, such as playlists that tag other users and ask them to send sexually explicit content of themselves. 


NCOSE researchers very easily found a number of suspicious Spotify profiles that appeared to be dedicated to soliciting or sharing “nudes.” In one case, we found a profile that shared an email address and tagged other users in playlists, asking them to send nudes.

Further, while gathering proof of pornography on Spotify, NCOSE researchers came across apparent child sexual abuse material (CSAM) and consequently discovered that Spotify had no clear, accessible procedure for reporting it.

Numerous Google searches attempting to determine how to report CSAM/child sexual exploitation to the platform failed to turn up instructions from Spotify. Instead, the search results simply turned up Spotify podcasts that talked about child sexual abuse and news articles about cases of child sexual exploitation on Spotify.

Eventually, after trying many different search queries, the first result for the query “report content on Spotify” was a Spotify page with instructions on how to report “infringing, illegal, or hate content.” However, when we followed those instructions to report the CSAM, they did not work. We eventually discovered that the reporting mechanism is not available for podcasts (and the majority of the pornography NCOSE researchers found on Spotify was in podcast form!).

Trying again, the second result for the query “report content on Spotify” finally took the NCOSE researcher to forms for reporting content. There was no category for reporting child sexual exploitation, nor any category for reporting illegal content. It was not obvious which category the researcher should report the CSAM under. 

In addition to indisputable cases of real child sexual exploitation, NCOSE found a plethora of content for which it was unclear whether real child sexual abuse was involved, but which certainly normalized and fetishized child sexual abuse.

For example, NCOSE found an audio pornography podcast depicting child sexual abuse (translation: “He’s not your son anymore! When the whole barracks attends the anal preparation of her child by her supervisor”). In a similar vein, our team also found plenty of “ABDL” (Adult Baby/Diaper Lover) content on Spotify, a term denoting the roleplaying of child sexual abuse, specifically the sexual abuse of infants.

See the proof we’ve collected in this easy-to-view, downloadable PDF.

Fast Facts

Age ratings: Apple App Store: 12+; Google Play: Teen

Common Sense Spotify Review (suggested rating 15+)

68% of teens have used Spotify for streaming services over the last 6 months, with 44% of teens opting to subscribe/pay for Spotify services

30.5% of music streaming subscribers worldwide have a subscription with Spotify – almost double the share subscribed to Apple Music.

Similar to Spotify

According to this blog post by Bark, “nearly every music platform has pornography on it, including YouTube Music, Amazon Music, and Pandora.” Given that Spotify was in the news for children being groomed, holds the largest share of the music streaming market, and was used by 68% of teens in the latter part of 2022, we felt it was critical that parents be alerted about this popular platform. We’ll be looking into the other platforms throughout the year: stay tuned.

Updates

Stay up-to-date with the latest news and additional resources

Spot the Dangers

Inside Spotify ‘grooming network’ targeting children online and forcing kids as young as 11 to share nude photos

What’s Going On With the Hardcore Porn Images on Spotify?

Hardcore porn keeps showing up on Spotify even though it’s not allowed

People keep uploading nudes and hardcore porn images to the music site — even though it’s against Spotify’s rules.

Claims schoolgirl, 11, was groomed on Spotify

From BBC News: "An MP has demanded action after an 11-year-old schoolgirl's family told how she was groomed by paedophiles on the music streaming service Spotify."

Recommended Resources

Spotify Has A Porn Problem

Here's what parents need to know

Download this resource about the harms of pornography on children

Hentai and the Pornification of Childhood

How the Porn Industry Just Made the Case for Regulation

Learn about the Ten Ways Kids Find Porn, by Defend Young Minds

Share

Help educate others and demand change by sharing this on social media or via email.


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.