Breaking Down The KIDS Act: Part 1 – Creating The Parental Controls

Two children face a board with "The KIDS Act Part 1" and "Title I-IV Breakdown," set against a backdrop of the American flag.

The Kids Internet and Digital Safety (KIDS) Act, H.R. 7757, is an omnibus bill proposed by House Representative Brett Guthrie of Kentucky that condenses multiple bills into one. The goal of this series is to take the KIDS Act and break it down for ease of understanding. With so much packed into a single bill, condensing it into understandable measures and breakdowns is essential as we advance.

In this article, we’re exploring the KIDS Act by walking through Titles I–IV and the bills Congress bundled into each one. Below is how those titles appear in the bill.

  • Title I — Shielding Minors From Obscenity (SCREEN Act)
  • Title II — Online Platforms
    • Kids Online Safety Act (KOSA)
    • Safe Messaging for Kids (SMK) Act
    • Stop Profiling Youth and Kids (SPY Kids) Act
  • Title III — Social Gaming Platforms (Safer GAMING Act)
  • Title IV — Artificial Intelligence Chatbots (SAFE BOTs Act)

Title I — “Shielding Minors From Obscenity (SCREEN Act)”

The “SCREEN Act,” H.R. 1623 (with Senate companion bill S. 737 by Senator Mike Lee of Utah), is sponsored by House Representative Mary Miller of Illinois and was introduced in February 2025. It requires websites and apps that make available material appealing to a prurient interest in explicit or obscene content to verify every user’s age, not merely present a checkbox asking, “Are you 18?” Platforms must be transparent about the identity verification methods used to comply with the “SCREEN Act” while protecting each individual’s confidentiality.

The act’s purpose rests on Pew Research estimates cited by Rep. Miller showing that only 39% of parents use parental controls, meaning 61% of kids have unfettered access online outside of school-managed technology.

Title II — “Harms to Minors on Online Platforms”

Title II, “Harms to Minors on Online Platforms,” covers platforms such as websites, software, applications, or electronic services connected to the internet. These are publicly available to consumers and allow users to create usernames or identifiers that others can search for and follow on the platform. The title also covers platforms whose purpose is the sharing of and access to user-generated content through text, images, video, audio, or any other interactive medium, as well as platforms that include features promoting user engagement or that use personal information to advertise, market, or make content recommendations.

Subtitle A — “Kids Online Safety Act (KOSA)”

The “Kids Online Safety Act (KOSA)” (Sections 211-221) requires covered platforms to create and follow reasonable policies that address four specific harms to minors: 

  • Severe physical violence that could harm major life activities
  • Sexual exploitation or abuse
  • Promotion of illegal substances
  • Financial scams

Users known by the platform to be minors will have the safest privacy and safety settings turned on automatically. These include:

  • Limiting who can message them
  • No automatic friend suggestions to strangers
  • Reduced addictive features (endless scrolling, heavy notifications)
  • Controls for personalized recommendations and location sharing.

Parents are given easy-to-use tools to:

  • Manage their child’s settings
  • Restrict spending
  • Limit viewing time
  • Receive notices

Minors under 13 have even more protective measures, and parents must be involved.

Every platform must provide a “report harm” button and contact point, with the platform confirming the report and responding within 10 days (less for emergencies).

Large platforms must hire independent experts to audit their system every 18 months at first, then once a year. Audits must publicly report how many minors use the app, how much time they spend and how well the safety tools are working.

This section does not require age verification; it relies on parental tools and default safety settings. If a platform breaks the rules in this section, the FTC can treat it as an unfair or deceptive business practice and impose fines, while state attorneys general can also sue for refunds or fixes on behalf of families.

Subtitle B — “Safe Messaging for Kids (SMK) Act”

The “Safe Messaging for Kids (SMK) Act” applies to any covered platform (social apps or websites) that offers a direct message feature, where a message, image, video, or audio clip is sent privately to another user on the same platform. The act prohibits ephemeral or disappearing messages in private conversations with known minors. Platforms cannot allow any direct messages to minors under 13.

The “SMK Act” requires that parents be given easy-to-use tools needed to ensure that any teen covered user (age 13-17) is using the platform in a safe manner consistent with their wishes. 

Default settings will: 

  • Alert parents about requests from unapproved contacts
  • Alert parents if the child changes the listed age on their profile and the change would affect parental controls
  • Let parents manage contacts (approve or deny who can message their teen)
  • Let parents disable any direct messaging feature
  • Let parents block any user, group, or all users from starting or continuing messages
  • Enable the teen to set their own profile as hidden on search, so that:
    • Their profile cannot be found
    • Others cannot see their online/offline status
    • Others cannot initiate or continue direct messaging

Subtitle C — “Stop Profiling Youth and Kids (SPY Kids) Act”

The “Stop Profiling Youth and Kids (SPY Kids) Act” places a targeted but strict limit on research regarding known minors on platforms. It bans platforms from using minors’ personal information for market research, such as tracking their clicks to influence later purchases, and likewise bans the creation of behavioral profiles. Platforms may only perform research to improve safety features or comply with the law, such as identifying and tracking predators.

Title III — “Social Gaming Safeguards (Safer GAMING Act)”

The “Safer Guarding of Adolescents from Malicious Interactions on Network Games Act,” shortened to the “Safer GAMING Act,” targets social games that allow users to connect to the internet and communicate with others by voice, text, or visual messaging. It applies to companies that provide access to these games, such as consoles (PlayStation, Nintendo Switch, Xbox), digital stores, mobile platforms, and cloud gaming services. The law requires providers to offer parental controls that are easy to use and that turn on automatically at the strictest setting. Providers must also offer a single dashboard allowing a parent to control all of the following, and the child must be notified when safeguards are active:

  • Limit or block all in-game communications
  • Stop a child’s profile or personal info from being suggested to adult players
  • Limit or block purchases and financial transactions
  • Limit time spent in the game

States are barred from adding rules that conflict with these features of the KIDS Act.

Title IV — “Artificial Intelligence Chatbots (SAFE BOTs Act)”

The “Safeguarding Adolescents From Exploitative BOTs Act,” or “SAFE BOTs Act,” covers interactions with standalone AI chatbots such as Grok, Replika, and Gemini. Providers of AI chatbots must give clear disclosures when interacting with minors (under 17), informing them that the chatbot is not a real person and reiterating this during the conversation if necessary. The AI cannot pretend to be licensed as a doctor, therapist, lawyer, or any other licensed profession. If a child brings up suicide or self-harm, the chatbot is required to share resources for a crisis hotline.

There are two mandatory policies:

  • After 3 hours of continuous use, the bot must suggest that the minor take a break
  • Provider must have reasonable policies to limit minors’ easy access to:
    • Sexual material that is harmful to minors
    • Promotion of gambling that is illegal/restricted for minors
    • Promotion of illegal drugs, tobacco, or alcohol

What Titles I–IV Mean for the KIDS Act

Together, Titles I–IV outline the core framework of the KIDS Act, setting the stage for how Congress aims to mandate parental controls that help keep children safe on online platforms, in gaming spaces, and with AI tools. These sections form the backbone of the bill, and the remaining titles build on this foundation with additional enforcement and implementation details.