KOSA “Duty of Care”: What Is It and Why Does It Matter?

“There were bad videos that gave me nightmares one time. Told me to hurt my family.” 

The above quote is from an 11-year-old girl talking about her experiences on YouTube.

Social media algorithms are addictive and predatory, and they are tormenting children on the Internet. For example, researchers at Common Sense Media gathered testimony from young girls; one said her feed was flooded with anxiety- and depression-related content after she watched only one or two such videos. Meanwhile, almost three-quarters of boys ages 11 to 17 are being exposed to so-called “hyper-masculine” content (e.g., about fighting and weapons) that leaves them with loneliness, low self-esteem, and a tendency to suppress their emotions.

Tech executives have shown us time and time again that they are not interested in protecting child users if it negatively affects their bottom line. This is why we must pass the Kids Online Safety Act.

The Kids Online Safety Act (KOSA) is a bill that would hold Big Tech to a bare-minimum standard of accountability for protecting child users online.

The most important feature of the Kids Online Safety Act is the “duty of care,” which essentially requires tech companies to design their platforms with a reasonable amount of care to protect child users.

This is a basic standard that every industry other than tech is already held to. For decades, the tech industry has gotten a free pass; KOSA would finally change that.

What KOSA’s “Duty of Care” Does and Does NOT Do

Despite promises of prioritizing safety, Big Tech has consistently made cuts to trust and safety staff and lobbied against legislation that would help keep kids safe online. Further, they have been spearheading a disinformation campaign on Capitol Hill, trying to scare lawmakers into thinking that passing KOSA would be an infringement on free speech.

But the fact is that KOSA actually has nothing to do with speech. KOSA focuses exclusively on platform design—specifically, the way harmful content is pushed on kids who are not looking for it.

Nothing in KOSA stops content from being published, nor does it stop someone from accessing content they are actively seeking out. It merely says that a platform cannot design its algorithms in a way that pushes harmful content on kids who don’t want it, or otherwise design its platform in a way that leads to harm.

Importantly, KOSA provides a clear and specific list of the harms it covers, so as to prevent the bill from being misused. This list is not controversial:

  1. Eating disorders, substance use disorders, and suicidal behaviors.
  2. Depressive disorders and anxiety disorders when such conditions have objectively verifiable and clinically diagnosable symptoms and are related to compulsive usage.
  3. Patterns of use that indicate compulsive usage.
  4. Physical violence or online harassment activity that is so severe, pervasive, or objectively offensive that it impacts a major life activity of a minor.
  5. Sexual exploitation and abuse of minors.
  6. Distribution, sale, or use of narcotic drugs, tobacco products, cannabis products, gambling, or alcohol.
  7. Financial harms caused by unfair or deceptive acts or practices (as defined in section 5(a)(4) of the Federal Trade Commission Act (15 U.S.C. 45(a)(4))).

The mere fact that a child experiences one of these harms as a result of using a platform is not enough to trigger liability. Liability applies only if the platform failed to exercise reasonable care to prevent the harm.

Further, KOSA can only be enforced by state attorneys general or the Federal Trade Commission. Individuals cannot litigate to enforce KOSA, which should eliminate fears of “frivolous lawsuits” brought in the name of the bill.

The standards outlined in KOSA are standards the FTC already applies to every other industry. Section 230 has given the tech industry a pass for too long.

The duty of care outlined in KOSA is essential in protecting child users on the Internet. Without it, KOSA is merely an appearance of action on child online safety, rather than actual change. It would just be more of the same.

KOSA Has Broad Bipartisan Support—But We Must Keep the Pressure On to Ensure It Becomes Law!

Despite the intensely divided political climate we are currently living in, Republicans and Democrats agree on this: Protecting children online must be a priority. In July 2024, KOSA passed the Senate 91-3, demonstrating massive bipartisan support. But sadly, due to Big Tech’s lobbying efforts, the House never brought KOSA up for a vote before the end of the last congressional session, putting us back at square one.

This congressional session, we have been hard at work pushing KOSA forward. The NCOSE Public Policy team has met with several congressional offices, helping to bring the number of KOSA co-sponsors up to 61, enough to secure Senate passage and to press leadership to bring the bill to a vote soon.

In partnership with Issue One, Common Sense Media, the Eating Disorders Coalition, Fairplay for Kids, and ParentsSOS, the National Center on Sexual Exploitation developed a letter of support for KOSA that will be sent to Congress, urging lawmakers to safeguard our children and pass the bill. The letter was signed by over 400 organizations.

Now, we need your voice too! Sign the quick action below to contact your Congressional Representatives and encourage them to support KOSA!

ACTION: Ask Your Representatives to Support KOSA!

