What Is the Kids Digital Safety Act and Why Does It Matter?

The Kids Digital Safety Act, packaged within the broader Kids Internet and Digital Safety (KIDS) Act, was advanced by the House Energy and Commerce Committee in March 2026 after bipartisan negotiations collapsed before the markup (The Hill, 2026). The package incorporates 12 bills, with the Kids Online Safety Act (KOSA) as its centerpiece, the product of a multi-year push that began in the Senate.
We've watched this legislative cycle before: broad public concern, a high-profile vote, and a final bill that delivers less than it promises. That pattern is worth naming upfront, because the kids online safety bill debate is following it almost exactly, and the public deserves to understand what's actually moving through Congress.
What Does the Kids Online Safety Act Do? A Clear Summary
The Kids Online Safety Act would establish core obligations for social media, video games, messaging, and streaming platforms used by or likely to reach users under 17 (Congress.gov, S.1748, 2025). At its core, the bill requires platforms to:
Duty of Care - Take reasonable steps to mitigate harms to minors, including exposure to content promoting eating disorders, self-harm, substance abuse, and sexual exploitation
Algorithmic Opt-Out Controls - Give minors the ability to opt out of recommendation systems, with default settings that prioritize protection over engagement
Data Minimization - Prohibit the use of minor user data for targeted advertising without clear consent
Parental Oversight Tools - Provide parents with supervisory controls over their child's account and experience
Annual Compliance Audits - Require independent audits and public reporting on platform compliance
FTC Enforcement - Grant the Federal Trade Commission authority to issue fines and compel platform changes
What Changed in the 2026 House Version
The Kids Online Safety Act Senate bill required platforms to "exercise reasonable care" to prevent harms including depression, anxiety, and compulsive use patterns. The Kids Online Safety Act House version narrowed that to requiring only "reasonable policies" (Washington Times, 2026), a shift that sounds minor but is legally significant.
The actual knowledge standard further limits platform obligations to cases of confirmed knowledge that a user is a minor. Rep. Kathy Castor noted the contradiction plainly: platforms actively design products to attract young users and profit from their presence (Roll Call, 2026).
We should be asking ourselves: if a platform builds its entire product around capturing adolescent attention, at what point does "we didn't know they were minors" stop being a legal defense and start being an admission?
How Would the Kids Online Safety Act Affect Social Media Platforms?
What Platforms Would Be Required to Do
The Kids Online Safety Act's compliance requirements would formally obligate platforms to:
Conduct internal risk assessments for products used by or likely to reach minors
Redesign default algorithmic recommendation settings toward maximum protective defaults
Implement age verification or assurance mechanisms at onboarding
Build and maintain parental dashboard and monitoring tools
File annual transparency reports with the FTC
Cooperate with independent auditors on algorithmic accountability
Where the Gap Lives
On paper, that sounds meaningful. In practice, we've seen how platforms respond to regulatory language that gives them room to interpret rather than mandates they must meet, and age verification is the clearest example.
Age verification requires document authentication, biometric assurance, and liveness detection that most platforms are neither equipped nor incentivized to build to the level that genuine protection of minors demands. We've watched companies treat "implement age verification" as a mandate to add a birthdate field. That's not verification. That's optics dressed as infrastructure.
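To make that gap concrete, here is a minimal sketch, written by us and not drawn from any bill text or platform codebase, of what a self-attested birthdate gate actually checks. Every value it evaluates is supplied by the very user it is supposed to be verifying.

```python
from datetime import date

def naive_age_gate(claimed_birthdate: date, minimum_age: int = 13) -> bool:
    """The 'birthdate field' approach: computes an age from whatever
    date the user types in. Nothing confirms the date is true."""
    today = date.today()
    # Standard age arithmetic; subtracting the boolean handles
    # birthdays that haven't occurred yet this year.
    age = today.year - claimed_birthdate.year - (
        (today.month, today.day) < (claimed_birthdate.month, claimed_birthdate.day)
    )
    return age >= minimum_age

# A 10-year-old who types an adult birth year passes instantly:
print(naive_age_gate(date(1990, 1, 1)))  # True, regardless of who is typing
```

Document authentication, biometric assurance, and liveness detection exist precisely because a check like this verifies nothing beyond the user's willingness to type a number.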
A coalition of advocacy groups that helped shape earlier KOSA drafts captured what happened to this bill directly: "Every provision we fought for has been stripped or disclaimed. Every loophole the companies sought has been written in" (Washington Times, 2026). When we've watched platforms fail to self-regulate with full documented knowledge of harms to children, a bill built on "reasonable policies" rather than binding accountability is unlikely to break that cycle.
Democrat Backlash: Why Frank Pallone and Others Are Pushing Back
The Democrat backlash to the Kids Digital Safety Act reflects substantive objections to what was stripped from the bill before the markup. Ranking Member Frank Pallone (D-NJ) was direct: "I believe that Republicans are handing Big Tech a giant gift," adding the legislation "would leave kids and their parents worse off than they are today" (House Energy and Commerce Committee Democrats, 2026).
Their objections fall into three areas we think deserve full scrutiny:
1. Kids Online Safety Act Preemption of State Laws
The Kids Online Safety Act's preemption clause would override state-level protections in California, Texas, Florida, and New York, many of which are more aggressive than this federal framework. Rep. Raul Ruiz (D-CA) framed it plainly: "Congress should set a federal floor for child safety, not erase state laws that are actively protecting kids today" (Washington Times, 2026).
We've consistently seen federal floors become ceilings once industry lobbying has done its work in implementation. That's the reasonable concern here, and it deserves a straight answer, not reassurances that a weaker national standard is somehow an improvement.
2. Kids Online Safety Act Duty of Care Removed
Rep. Kim Schrier (D-WA) put it directly: "This bill has not only eliminated the duty of care standard, it actually bans any duty of care at the federal or state level. How convenient, considering that duty of care is the basis of most lawsuits from parents whose children have died as a result of social media harms" (Washington Times, 2026).
That distinction determines whether grieving families can seek accountability in court. It's the difference between a law with teeth and a law with press coverage.
3. Industry Fingerprints
Rep. Jake Auchincloss (D-MA) called the bill a "false flag operation by Meta," arguing the actual knowledge standard and preemption language "nullifies everything that came before it" (Washington Times, 2026). Pallone added that preemption "could harm existing efforts to hold companies like Meta and Roblox accountable in the courts" (House Energy and Commerce Committee Democrats, 2026).
Whether or not you agree with that characterization, we should at minimum be asking why the companies with the most to lose from stronger standards ended up with a bill shaped so precisely around their preferences.
The Civil Liberties Case Against KOSA
We want to be careful here, because this argument gets flattened too quickly in most coverage. The ACLU and the broader coalition of civil liberties groups opposing KOSA are not defending platforms; they are defending the people platforms have historically harmed through over-removal of content.
Who Is Raising the Alarm
Over 90 organizations signed on to oppose this bill, including:
ACLU - Citing First Amendment and censorship concerns
Electronic Frontier Foundation (EFF) - Warning of surveillance infrastructure risks
GLAAD and GLSEN - Raising disproportionate harm to LGBTQ+ youth
American Library Association - Flagging threats to information access
Wikimedia Foundation - Opposing liability structures that chill open content
The Stop KOSA campaign, led by Fight for the Future, argued that KOSA would pressure platforms into automated filtering of LGBTQ+ content and reproductive health information to minimize regulatory exposure (CDT, 2023; StopKOSA.com, 2024). The ACLU stated the bill "compounds nationwide attacks on young peoples' right to learn and access information" and could restrict anything "objectionable to the government, from sexual health resources to information about gender identity" (ACLU, 2024).
The Problem the Bill Doesn't Solve
The Regulatory Pattern We Keep Repeating
We've seen this play out before in U.S. technology regulation: months of public concern, a committee vote, and a law that is either too narrow to compel real change or too broad to survive constitutional challenge. Similar state-level bills in Indiana, Mississippi, Texas, and Utah were struck down as unconstitutional (Wikipedia/KOSA, 2023). NetChoice litigation signals the same vulnerability at the federal level (Children and Screens, 2026).
The Scale of the Problem Being Left Unaddressed
The underlying architecture of harm remains untouched by either version of this bill. Consider what the data tells us:
Adolescents spending more than 3 hours daily on social media face double the risk of poor mental health outcomes
The average adolescent currently logs 3.5 hours daily on these platforms
Nearly 40% of children aged 8–12 are already on platforms that state a minimum age of 13 (U.S. Surgeon General's Advisory, 2023)
The parents who projected photos of children they say died from social media harms onto Meta's headquarters the night before the markup were not asking for dashboards (The Hill, 2026). They were asking for platforms not engineered to exploit adolescent psychology. This bill, as written, does not address that.
The Verification Infrastructure Gap Nobody Names
We also think there's a technical dimension to this problem that the legislative debate consistently avoids. The reason a 10-year-old can create a social media account under a false birthdate is a failure of verification infrastructure that companies have chosen to leave in place.
When platforms design systems that cannot meaningfully distinguish an adult from a child at onboarding, and then argue they had no "actual knowledge" the user was a minor, we are watching a manufactured loophole, not a genuine technical limitation.
Frequently Asked Questions
What does the Kids Online Safety Act do? KOSA requires social media platforms to implement protective policies for minors, restrict algorithmic targeting of users under 17, limit data collection, provide parental oversight tools, and submit to annual FTC compliance reporting.
What is the Kids Online Safety Act 2026 House version? The 2026 House version narrows the duty of care to "reasonable policies," adds the actual knowledge standard, and includes a preemption clause overriding many state-level kids online safety laws, changes that collapsed bipartisan support before the markup.
What is the difference between the Kids Online Safety Act Senate bill and the House version? The Senate bill passed 91–3 with a full duty of care and broader harm coverage. The House version strips that standard and adds sweeping preemption of state protections, drawing immediate opposition from Democrats and civil liberties groups.
Why do Democrats oppose the Kids Digital Safety Act? Democrats argue the bill weakens platform accountability by removing the duty of care, preempts stronger state laws, and adopts an actual knowledge standard that lets platforms claim ignorance of minor users they actively design products to attract.
Why do some LGBTQ+ groups oppose the Kids Online Safety Act? Over 90 LGBTQ+ and civil rights organizations argue KOSA's liability structure incentivizes platforms to over-remove LGBTQ+ content and reproductive health information to minimize regulatory risk, a pattern already documented under advertiser pressure alone.
What is the ACLU's opposition to KOSA? The ACLU argues KOSA is an internet censorship bill that gives the government power to determine appropriate content for young people, threatening access to mental health resources, gender identity information, and broader educational content.
What does the Kids Online Safety Act preemption clause mean? It would override existing state-level kids online safety laws, including California, Texas, and Florida frameworks stronger than the federal bill, replacing more aggressive state enforcement with a weaker federal standard.
How would the Kids Online Safety Act affect social media platforms? Platforms would face compliance requirements including policy reviews, default algorithm changes, data minimization, parental tools, and FTC reporting, likely satisfied procedurally while core engagement-maximization architectures remain intact.
What are the Kids Online Safety Act compliance requirements? Covered platforms must establish harm-reduction policies for minors, implement default privacy settings for under-17s, restrict targeted advertising, provide parental oversight tools, file annual transparency reports, and cooperate with FTC enforcement.
What is the Stop KOSA campaign? Stop KOSA is a campaign by Fight for the Future, the ACLU, and 90+ organizations opposing the bill on grounds that it will suppress LGBTQ+ content, expand teen surveillance, and cause the most harm to communities it claims to protect.
Where the Kids Internet and Digital Safety Act Stands Now
The Kids Internet and Digital Safety Act markup represents committee-level advancement, not passage. The Kids Online Safety Act Senate bill passed 91–3 in 2024 with an intact duty of care but has not moved through the Senate Commerce Committee in the 119th Congress despite 75+ co-sponsors (Children and Screens, February 2026). Floor votes, Senate reconciliation, and sustained platform lobbying remain ahead.
What We'll Be Watching
Whether the preemption clause is narrowed before any floor vote
Whether the duty of care standard is restored to its Senate version language
How the actual knowledge standard is ultimately defined in final text
Whether age verification mandates carry technical specificity or remain a general obligation platforms satisfy with a birthdate field
Whether FTC funding and enforcement authority in the accompanying provisions are sufficient to mean anything
Because right now, the gap between what this bill says and what it would actually require platforms to change is large enough to raise a fair question: is this legislation, or is this performance?
References
American Civil Liberties Union. (2024, July 30). ACLU slams Senate passage of Kids Online Safety Act, urges House to protect free speech. Retrieved from https://www.aclu.org
Center for Democracy & Technology. (2023, May 25). Press release: More than 90 human rights and LGBTQ groups sign letter opposing KOSA. Retrieved from https://cdt.org
Children and Screens. (2026, February). Policy update: February 2026. Retrieved from https://www.childrenandscreens.org
Congress.gov. (2025). S.1748 – 119th Congress (2025–2026): Kids Online Safety Act. Retrieved from https://www.congress.gov
House Energy and Commerce Committee Democrats. (2026, March 6). Pallone: Republicans' kids safety bills let big tech off the hook. Retrieved from https://democrats-energycommerce.house.gov
McPherson, L. (2026, March 5). House panel advances kids online safety bill, but path to passage appears grim. Washington Times. Retrieved from https://www.washingtontimes.com
Mollenkamp, A. (2026, March 6). Kids online safety bills move forward from Senate, House panel. Roll Call. Retrieved from https://rollcall.com
Neel, J. (2026, March 6). House panel advances slate of kids online safety bills along party lines. The Hill. Retrieved from https://thehill.com
Office of the Surgeon General. (2023). Social media and youth mental health: The U.S. Surgeon General's Advisory. U.S. Department of Health and Human Services. Retrieved from https://www.hhs.gov
Puig, E. (2026, March 7). House panel marks up kids digital safety act amid Democrat backlash. The Record from Recorded Future News. Retrieved from https://therecord.media
StopKOSA.com / Fight for the Future. (2024). Stop KOSA: Why this bill harms the communities it claims to protect. Retrieved from https://www.stopkosa.com
TIME Magazine. (2025, May). Kids Online Safety Act — What to know as KOSA is reintroduced. Retrieved from https://time.com
Wikipedia. (2023, updated 2026). Kids Online Safety Act. Retrieved from https://en.wikipedia.org
