
Parental Control Progress on Meta’s Messenger, Instagram, and TikTok!


A slew of parental supervision improvements has been made on Meta’s Messenger and Instagram, and on TikTok. But is it enough to protect kids?

Parental involvement and supervision are important factors in online child safety, so NCOSE is glad to report on the improvements below, many of which answer recommendations NCOSE has made.

More tools and transparency are always steps of progress worth celebrating!

At the same time, NCOSE recognizes that, too often, multi-billion-dollar, multinational corporations push a disproportionate share of the responsibility for keeping children safe online onto parents. This leaves parents and guardians scrambling and exhausted as they try to keep up with the latest technology, risky online behavior trends, ever-changing multi-step control features that often still have significant limitations, difficult-to-find parent centers, and more. Importantly, this complex landscape leaves children without the privilege of tech-savvy, highly involved caregivers completely unprotected.

NCOSE is very pleased to see the below improvements, and thanks the companies for taking these steps forward. That being said, we continue to call on these powerful corporations to take more responsibility—to always have safety in mind as they build new product features, and to improve and automatically turn on high safety settings for minors. The burden of online safety cannot rest on parents alone. (Learn more about how to call for corporate accountability at DirtyDozenList.com)

Meta’s Messenger

Meta announced the release of Parental Supervision tools on Messenger, something NCOSE asked Meta to implement across all their platforms in 2022. While parental controls were already available on Messenger Kids (the app for 6-12 year-olds), parents and guardians of teens who use the main Messenger app can now use parental controls as well. However, these controls are only available if the teen chooses to opt in.

If the teen chooses to allow parental controls, parents and guardians will be able to:

  • View how much time their teen spends on Messenger
  • View and receive updates on their teen’s Messenger contacts list, as well as their teen’s privacy and safety settings
  • Be notified if their teen reports someone
  • View who can message their teen (only their friends, friends of friends, or no one) and see if their teen changes this setting
  • View who can see their teen’s Messenger stories and get notified if these settings change

Meta does not allow parents to see the contents of the teen’s messages.

Currently, Messenger’s Parental Supervision Tools are only available in the US, UK, and Canada, though Meta has stated it plans to expand to more countries soon.

Further improvements NCOSE wants to see on Messenger:

NCOSE applauds Meta for these improvements to Parental Supervision tools on Messenger. We are also asking them to consider the following next steps:

  1. Expand Parental Supervision tools so that parents and guardians can see who their teen can message in end-to-end encrypted chats (otherwise known as “secret conversations”), as well as the time spent using these chats.
  2. Have pin-protected Parental Supervision tools that are ON by default. Currently, both parents and teens need to opt in to the Parental Supervision tools, and the teen can opt out at any moment, leaving the parent helpless to protect them.
  3. Automatically blur all sexually explicit imagery sent through Messenger and send parents a notification that sexually explicit imagery was sent or received by their teen.

Instagram

Instagram (also owned by Meta) was named to the 2023 Dirty Dozen List for numerous failures to protect children and prioritize safety on their platform. Among the many recommendations NCOSE made to Meta, we asked them to improve parental controls and safety in Direct Messaging. Now they have released several improvements in these areas:

Improvements to Instagram’s Parental Supervision Tools:

Instagram recently added new features to their Parental Supervision tools, including:

  • A new notice to teens after they’ve blocked someone, encouraging them to add their parents to supervise their Instagram account as an extra layer of support.
  • The ability for parents to see how many friends their teen has in common with accounts they follow or are followed by. This may help parents understand how well their teen knows the people behind these accounts (though it is not a foolproof measure, as groomers have been known to befriend a child’s online friends, in order to gain trust and create the illusion of familiarity).
  • More ways for parents to customize which notifications from Parental Supervision on Instagram they want to receive and how often they receive them. 

Similar to Messenger, Instagram’s Parental Supervision Tools require both the teen and the parent to opt in, and the teen can turn off these tools at any time.

Increased Protections around Direct Messaging

Instagram has added features to limit how people can interact with and message others who don’t follow them:

  • Before being able to message someone who doesn’t follow them, people must now send an invitation to obtain that person’s permission to connect.
  • People can only send one invitation at a time and can’t send more until the recipient accepts. This will hopefully help limit groomers befriending large swaths of people they don’t know.
  • These invitations are limited to text only; people can’t send any photos, videos, or voice messages, or make calls, until the recipient has accepted the invitation to connect. This will hopefully help limit people receiving unsolicited sexually explicit images, or other unwanted photos, videos, or other types of media from people they don’t follow.

Encouraging Teens to Better Manage Time Spent on Social Media

In addition to the safety changes around Parental Supervision Tools and Direct Messaging, Meta has announced that it will soon be rolling out new features to help teens better manage the time they spend on Instagram, as well as Facebook:

  • Instagram’s “Quiet Mode,” which was rolled out in the US in January, will be available globally in coming weeks. Quiet Mode allows people to turn off notifications from the Instagram app, and changes their profile activity status to let people know they’re in Quiet Mode.
  • Teens will soon receive nudges after spending 20 minutes on Facebook, prompting them to close the app and set daily time limits. 
  • Meta is currently testing a new feature that would nudge teens to close the Instagram app if they are scrolling through Reels at night.

Further improvements NCOSE wants to see on Instagram:

NCOSE thanks Instagram for the changes they’ve made, and urges them to consider the following further improvements:

  1. Remove or limit high-risk features for 13–15-year-olds, such as removing direct messaging (as TikTok has done), removing Vanish Mode, and restricting all adults from seeing minors in “suggested user” or “discover people”
  2. Invest in greater proactive measures to identify online grooming; accounts or networks trading child sexual abuse materials; the trading of any explicit material where Instagram has not verified age and consent (including links to other platforms); and predatory sexualization of children (such as accounts that primarily feature children and garner sexualized comments).
  3. Invest in technology to scan for and block sexually explicit content in messages
  4. Have pin-protected Parental Supervision tools that are ON by default.

Join us in asking for these changes by completing the action form below! It will only take you a few seconds! (Keep reading about TikTok below the action form)

TikTok

NCOSE has been asking TikTok to provide parents and minors with more resources to manage their accounts and enhance safety, and to improve processes to find and block harmful content and hashtags. TikTok announced some recent changes which are steps forward in that direction!

Last year, TikTok launched a content filtering tool that allows people to block words or hashtags they want to avoid seeing in their For You or Following feeds. TikTok has now added this tool to Family Pairing, so that parents can use it to protect their teens from harmful content.

TikTok also announced the creation of a Youth Council, intended to help the platform build safety measures based on teens’ experiences and reports. TikTok stated that the council will launch later this year and will aim to “provide a more structured and regular opportunity for youth to provide their views.”

Further improvements NCOSE wants to see on TikTok:

While NCOSE recognizes the positive impacts of TikTok’s recent safety updates, the nature of these tools places unrealistic expectations on parents to stay up to speed with trending keywords and hashtags that may contain inappropriate content, including the workarounds people develop to avoid being caught by filters. TikTok should assume more of the burden of filtering harmful content for minor accounts, and should default minor accounts to the highest level of safety by automatically turning on current opt-in features like Restricted Mode, Duet Restrictions, and Comment Restrictions.

ACTION: Call for Corporate Accountability through the Dirty Dozen List!

Every year, NCOSE calls on 12 mainstream entities to change harmful policies and practices that are facilitating sexual exploitation. Please visit dirtydozenlist.com to learn more and to join us in asking these companies to change!
