BILL ANALYSIS: Analysis of S. 1409, the Kids Online Safety Act of 2023

August 1, 2023

Key Takeaways

Children have more Internet access than ever, which leaves them more exposed to exploitation and to dangers to their mental health. KOSA (S. 1409) aims to address this problem while still allowing children to benefit from the freedoms that come with being online.

KOSA includes new privacy protections for children, broader restrictions on the collection of personal data, and stricter enforcement of consumer protection rules.

The bipartisan Kids Online Safety Act (KOSA) of 2023 was introduced in the U.S. Senate by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) on May 2, 2023, and referred to the Committee on Commerce, Science, and Transportation.

This bill would create accountability, promote transparency, and advance the development of tools to protect children online. The following analysis provides a closer look at some of the main provisions of this legislation and examines how it would effectively address some existing problems.

If Enacted, This Legislation Would:

Establish a Duty of Care to Prevent Harm to Minors.

Section 3(a) of the bill would require covered platforms to take reasonable measures to prevent and mitigate harm to minors from many different types of content and activities. These potential harms include suicidal behavior, addiction-like behavior, physical violence, bullying, harassment, and sexual exploitation/abuse, as well as content related to narcotics and predatory, unfair, or deceptive marketing practices or other financial harms.

Nothing in current law requires platforms to consider, prevent, or mitigate the harm their platforms could cause to minors. This section of the bill aims to address that gap.

Create Readily Accessible and Easy-to-Use Safeguards for Minors.

Section 4(a)(1) establishes a series of safeguards to protect minors online, including:

  • Limiting who can communicate with minors;
  • Preventing others from viewing a minor’s personal data;
  • Limiting existing features that encourage minors to stay on the platform for an extended period of time; and
  • Creating the ability to opt out of algorithms and limit the types of content with which a minor can interact.

Currently, platforms have the discretion to decide what safeguards minors have when using them. Section 4(a)(1) would address this inconsistency and make the bill's safeguards the baseline for all covered platforms.

Make "Privacy by Default" the Standard Setting for Minors' Online Accounts.

Section 4(a)(3) of the bill would mandate that covered platforms establish the most protective level of controls as the default setting for minors' accounts. This would still allow minors to change the settings if they want to, but when they create new accounts, the default settings would be as restrictive as possible.

"Privacy by Default" is a widely accepted and growing security philosophy, and this legislation would codify it into law for the online platform accounts of minors.

Empower Parents to Protect Their Kids Online.

Section 4(b) would provide parents with the tools to protect their kids online. This section would require platforms to give parents the following:

  • Access to view and manage the privacy settings of their child’s account;
  • The ability to restrict purchases from their child’s account; and
  • The capability to view the total time their child has spent on an online platform.

Section 4(c) would mandate that covered platforms allow parents, minors, and schools to submit reports of harm to minors. It would also require platforms to provide a platform point-of-contact and a means for reporters to track their reported incidents.

Section 5(a)(2) would require platforms to notify (or make a reasonable effort to notify) parents about certain information regarding their child's account.

Under current law, parents are not empowered with the tools to help protect their children online. These provisions would change that by returning power to parents and enabling them to have a more significant role in promoting their children's safety.

Promote Platform Transparency.

  • Third-Party Audits & Public Risk Assessments

Section 6(a) would require platforms to release an annual public report of a third-party audit that identifies the foreseeable risk of material harms the platform poses to minors. The report would also have to provide descriptions of the prevention and mitigation efforts the platform has taken to address the identified risks.

This knowledge would empower the American people to make an educated decision about whether the rewards of having a presence on an online platform outweigh the risks. Unfortunately, this transparency is lacking in the current system. Parents have often been unable to assess the risks and rewards thoroughly, and this measure would go a long way toward improving that situation.

  • Independent Research

Section 7(b) would establish a program for qualified researchers to access data sets from covered platforms to perform an independent analysis of the platform's harm to children.

This section would promote transparency and help hold online platforms accountable by having independent researchers examine their data sets.

Protect Children Through Age Verification.

Section 9 would direct the National Institute of Standards and Technology (NIST) to conduct a study with the goal of establishing best practices for developing systems that covered platforms could use to verify a user's age.

NIST has developed standards for the United States for more than 100 years and is "the world's leader in creating critical measurement solutions and promoting equitable standards." Age verification practices put forward by NIST would protect children online and force predators to verify their age rather than create false accounts to lure minors. For too long, predators have not been subject to age verification requirements, and Section 9 of the Act would help solve this problem.

Create the Kids Online Safety Council.

Section 10 of the bill would order the creation of the Kids Online Safety Council (the Council) within six months of the bill being enacted. The Council would have to consist of a wide range of stakeholders, including:

  • Academic experts;
  • Health professionals;
  • Civil society and privacy experts;
  • Parents and youth;
  • Covered platforms;
  • National Telecommunications and Information Administration (NTIA), National Institute of Standards and Technology (NIST), Federal Trade Commission (FTC), Department of Justice (DOJ), and Department of Health and Human Services (HHS) representatives;
  • State attorneys general; and
  • Representatives of socially disadvantaged communities.

The Council would be a step in the right direction toward ensuring that online safety for kids is not an afterthought but a continuous conversation. New problems requiring new solutions will arise, and this all-encompassing Council would be at the forefront of tackling those problems so we can avoid situations in the future that endanger America's youth.
