WASHINGTON, DC (March 20, 2023) – The National Center on Sexual Exploitation called out Snap for releasing an AI chat tool that may engage children in inappropriate conversations that could influence their behavior and lead to serious harm. Snap promotes the AI tool as a “friend” to users.
A Washington Post headline proclaimed, “Snapchat tried to make a safe AI. It chats with me about booze and sex.” After posing as a 13-year-old on Snap, the journalist reported, “In another conversation with a supposed 13-year-old, My AI even offered advice about having sex for the first time with a partner who is 31.” His conclusion: “Tech companies shouldn’t treat users as test subjects – especially young ones.”
“Snap must stop treating teens as a testing ground for experimental AI. My AI should be blocked for minors unless and until safety measures can be put into place. Instead of releasing new, dangerous products, Snap should prioritize stemming the extensive harms already being facilitated on its platform,” said Lina Nealon, vice president of the National Center on Sexual Exploitation.
“In my conversations with law enforcement, child safety experts, lawyers, survivors, and youth, I ask them what the most dangerous app is, and without fail, Snap is in the top two. Just in the past few months, three separate child protection agencies identified Snapchat as the top app, together with Instagram, for sextortion; one of the top three places children were most likely to view pornography outside of pornography sites; and the number one online site where children were most likely to have a sexual interaction, including with someone they believe to be an adult. Multiple grieving families are suing Snapchat over harms to, and even deaths of, their children resulting from sex trafficking, drug-related deaths, dangerous challenges, severe bullying leading to suicide, and other serious harms originating on the popular platform.
“Shortly after the news exposing the dangers of My AI became public, Snap announced Content Controls that ‘will allow parents to filter out Stories from publishers or creators that may have been identified as sensitive or suggestive.’ While this may seem like a good step toward child safety, the feature is only available through the recently released Family Center, only the caregiver can turn it on, and it applies to just one section of Snapchat. It is not on by default and is not available to any teen directly.
“As social media and gaming apps face long overdue and increasing scrutiny, apps like Snap are scrambling to assure policymakers, the public, and especially parents that their platforms have children’s best interests at heart. But the public needs to be aware that Snap’s My AI stands in stark contrast to the best interests of children, and that the recent tools Snap has provided parents do very little to keep all kids safe on one of the most dangerous apps,” Nealon said.
About National Center on Sexual Exploitation (NCOSE)
Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-partisan organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, and sex trafficking, and the public health harms of pornography.