VICTORY! Instagram taking more steps toward teen safety

We’re thrilled to share that Instagram has continued to adopt recommendations made by the National Center on Sexual Exploitation and our #WakeUpInstagram campaign allies to improve the safety of its youngest users, specifically around direct messaging, a primary tool predators use to groom minors.

In October 2020, we reported that Instagram had expanded its reporting system and added the option for users to opt out of receiving direct messages from strangers. Two weeks ago, our contacts at Instagram shared several additional changes they’re making to enhance teen safety. (Note: children aged 12 and under are not allowed on Instagram and should not be using it.)

Restricting direct messages (DMs) between teens and adults they don’t follow

One of the most common ways adults use Instagram and other apps to groom children is through direct messages (similar to a private inbox). Previously, adults could message children directly, thereby establishing contact and even sending sexually graphic pictures. The October changes gave all users the option to block direct messages from strangers. Now, by default, adults who have not been “friended” by a minor cannot send that minor a DM at all. Furthermore, if an unconnected adult tries to do so, they’ll receive a notification that DM’ing the teen isn’t an option.

Prompting teens to be more cautious about interactions in direct messages

One of the things we encourage all social media companies to do is include more in-app education for all users, especially minors. Not all children have caretakers who can be, or want to be, involved in their kids’ online activities, which includes understanding the risks themselves, teaching their kids, and monitoring online safety. We believe in-app education is a critical responsibility of the social media corporations providing these platforms to kids.

Instagram will now use prompts to encourage teens to be cautious in conversations with adults they’re already connected to. Instagram can also monitor adults for suspicious behavior and alert the teen. For example, if an adult is sending a large number of friend or message requests to teens, Instagram will alert the teen about this behavior within their DMs. The prompt will also give the teen the option to end the conversation, or to block or restrict the adult.

Making it more difficult for adults to find and follow teens

Instagram will also explore ways to make it more difficult for adults who have exhibited potentially suspicious behavior to interact with teens. According to our contacts, this may include restricting these adults from seeing teen accounts in ‘Suggested Users,’ preventing them from discovering teen content in Reels or Explore, and automatically hiding their comments on teens’ public posts.

Encouraging teens to make their accounts private

Instagram recently added a new step to sign-up: when someone under 18 creates an Instagram account and is given the choice between a public or private account, they’ll see a screen explaining what the different experiences mean. If the teen doesn’t choose ‘private,’ Instagram will send them a notification later explaining the benefits of a private account and reminding them to check their settings.


What we’d still like to see Instagram change:

Explicit discussion of grooming, sextortion and other risks in the Parent’s Guide

Instagram recently published a new Parent’s Guide. It covers the latest safety tools and privacy settings, as well as tips and conversation starters to help parents discuss their teens’ online presence with them. Glaringly absent, however, is any information or resources about the major safety risks we know are prevalent on Instagram, such as grooming, sex trafficking, child sex abuse material, and pornography. It is irresponsible for Instagram to leave these extreme dangers out of the resource most engaged parents would turn to first to understand how to keep their kids safe. Caretakers need to know about the very real risks to which their children may be exposed. Instagram should be forthcoming about those risks and use the opportunity to educate parents and provide resources to families who may not be aware of how predators use online platforms to abuse kids, or of what to do if their child is targeted. Social media apps are influencers in their own right, and they should use that status to inform and equip minors and parents. Hiding the truth is not only duplicitous, it’s dangerous.

Improved comment filtering

One of the ways predators access and groom children is through comments. Instagram users do have control over who can comment on photos and videos. Users can choose to allow comments from everyone, people they follow and those people’s followers, just the people they follow, or just their followers. They can also block people from commenting. Teens can even create their own list of words or emojis they don’t want to appear on their posts by using the “Manual Filter” feature in Comment Controls.

Furthermore, through its “offensive comments” filter, Instagram can automatically hide comments intended to bully or harass. However, we’re still seeing rampant sexualization of children on Instagram, despite algorithms meant to remove predatory comments and emojis. We hope Instagram continues developing algorithms that can identify and remove predatory behavior on minors’ accounts so that the onus is not on the kids to do so themselves.

Reconsider Instagram Kids

While we were thrilled to hear about these recent improvements by Instagram, our enthusiasm was dampened by the news that Instagram plans to roll out a version of Instagram for kids under the age of 13. Extensive evidence exists about the harms of social media, and of excessive screen time, on children and teens. We also know that where children play, predators prey, and we have seen time and time again that despite touted safety features and parental controls on social media apps (even ones specifically designed for young kids), predators still easily circumvent them to gain access to children (e.g., Facebook Messenger and YouTube Kids). We’ll have more on this later.


Thank Instagram for their recent improvements!

Thank you @Instagram for making your platform safer for teens. Keep up the positive trend!

And if you have kids in your life, know what social media apps they and their friends are using, turn on any controls that enhance privacy and safety (including ones for Instagram), and talk to them regularly about online risks. We have some great resources here.


TAKE ACTION: Want to do more to protect kids online?

Urge Congress to update child online privacy laws, which were last updated in 1998. It’s time to take greater measures for our kids!
