
Protection or Censorship? – New Child Protection Law Sparks First Amendment Debate

July 31, 2024

By: Abbey Block

As this congressional session draws to a close, legislators are debating a new bill aimed at protecting kids and teens from the dangers of the Internet and social media. If enacted, the bill, known as the Kids’ Online Safety Act (“KOSA”), would impose a “duty of care” on online platforms, requiring them to take affirmative steps to mitigate harm to their users who are under the age of seventeen.

Under the law, covered “high impact online companies” must “exercise reasonable care in the creation and implementation of any design feature” to mitigate “harm” to minors. Although the statute does not define the term “harm,” it does provide examples of the kinds of content that may be harmful to minors, including the following:

  • Mental health disorders (e.g., anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors);
  • Patterns of use indicative of addiction-like behavior;
  • Physical violence, online bullying, and harassment of a minor;
  • Sexual exploitation and abuse of minors;
  • The advertisement of “illegal products” including narcotics, tobacco products, gambling, or alcohol to an individual it knows is a minor; and
  • Predatory, unfair, or deceptive marketing practices or financial harm.

The bill goes on to state that covered platforms are not required to prevent minors from “deliberately and independently” searching for or specifically requesting content that constitutes “evidence-informed information and clinical resources” for the prevention or mitigation of the aforementioned harms. Practically speaking, however, it seems difficult to square these two principles — that is, to implement measures that proactively block all “harmful” content while simultaneously preserving access to “resources” related to that same content. Adding further confusion, the bill does not specify what constitutes “evidence-informed information and clinical resources.”

Additionally, certain protective features must be implemented by default for minor users, such as parental controls and limitations on the minor’s personal information that is made publicly available (e.g., the minor’s geolocation and other personal data). Similarly, a covered platform is also prohibited from implementing design features that encourage or increase the frequency, time spent, or activity of minors on the platform. The bill provides examples of such features, which include auto-play, infinite scrolling, in-game purchases, rewards for engagement, and even the mere sending of notifications. These examples illustrate just how tricky it may be for online platforms to conform to the bill’s requirements, given that seemingly all features of online platforms are designed to increase engagement. This raises the question of whether the removal of these features would ultimately gut the platforms’ functionality.

The bill has garnered rare bipartisan support and was overwhelmingly approved by the Senate on Tuesday in a vote of 91-3. Now, it’s set to go back to the House of Representatives, where it may undergo further revisions before facing another vote.

Advocates of the bill believe it is a necessary first step to hold social media platforms accountable for prioritizing profit over the health and safety of the country’s kids. In support of this position, they point to the health advisory issued by the Surgeon General, which warned of the ways in which social media can negatively impact youth mental health. Last month, the parents of children who died as a result of their interactions on social media lobbied members of Congress and testified before the Senate Judiciary Committee, urging legislators to pass KOSA.

A law that seeks to protect children and hold formidable Big Tech giants accountable surely must be non-controversial, right? Not so.

Many First Amendment and privacy advocates oppose the bill, arguing that it would lead to censorship and surveillance of children online. For example, the Electronic Frontier Foundation argues that KOSA’s regulations will prevent teens from accessing vital mental health resources, given that even benevolent or resource-focused content addressing suicide, depression, anxiety, and eating disorders will be blocked. Similarly, many suspect that the easiest and cheapest way to ensure compliance with the statute’s provisions will be to simply block any and all content that could potentially be characterized as “harmful” to minors — resulting in widespread censorship.

This isn’t the first time the Government has tried to regulate speech in an effort to protect children. In the case of Bantam Books, Inc. v. Sullivan, 372 U.S. 58, 71 (1963), the Supreme Court contemplated the legality of a government commission designed to protect minors from offensive or obscene language and images in written publications. The commission was provided with authority to screen published materials for “objectionable” content and to investigate and recommend the prosecution of publishers if they refused to cease the distribution of any materials deemed objectionable. The Court held that the commission’s actions amounted to a prohibited prior restraint upon speech which “[fell] far short of the constitutional requirements of governmental regulation of obscenity.”

More recently, district courts throughout the country (from Arkansas to California) have blocked laws that seek to protect minors through the regulation of the design and features of online platforms.

So, what does this mean for KOSA? It seems inevitable that, if passed into law, KOSA will face significant legal challenges. Specifically, opponents will argue that it is a content-based regulation of speech, i.e., a restriction that directly targets speech based on its subject matter (here, “harmful” content). Content-based restrictions on speech are generally subject to the most stringent level of judicial scrutiny — strict scrutiny — and are presumptively unconstitutional.[1] A content-based restriction on speech will be permitted only if it is narrowly tailored to serve a compelling governmental interest.

Here, it will likely be argued that the government maintains a compelling interest in protecting minors from the well-documented harms of social media — an argument easily supported by accounts of children who have harmed themselves because of online bullying, harassment, or harmful message boards. Indeed, this argument is likely even more compelling in light of the upcoming election cycle. It seems unlikely that any legislator, from any state or political party, would be willing to publicly dispute that protecting minors from exposure to injurious content on the Internet is a laudable goal.

But KOSA is arguably not narrowly tailored to serve that governmental interest given that it restricts minors’ access to broad categories of content based solely on the content’s subject matter (e.g., depression, drugs, alcohol, gambling, suicide). This means that teens may be blocked from accessing totally innocuous content simply because it implicates topics that are deemed “harmful” under the statute. For example, under KOSA, a minor would likely be unable to access a website that provides helpful resources for victims of sexual assault; or, in a less serious scenario, a teen could be blocked from accessing an online cookbook if that publication happens to also feature cocktail recipes. Simply put, KOSA’s reach extends too far and limits speech that falls outside of the Government’s compelling interest in protecting children.

Given these circumstances, even if the bill is eventually enacted into law, it seems likely that the implementation of its restrictions will be significantly delayed as the government combats the inevitable legal challenges to come. While most agree that the protection of children is a noble objective, there is far less consensus about how to accomplish that goal. KOSA’s regulation of “design features” connected to “harm” is likely to fall apart under the First Amendment as a content-based regulation of speech. So, while the bill may provide a feel-good sugar high for a few moments, we are likely to end up with the same daunting challenge of protecting children from the harmful — and ever-changing — threats posed by social media platforms and unrestricted websites.

[1] It should be noted, however, that the court in the recent Arkansas case applied an intermediate level of scrutiny, requiring that the law be narrowly tailored to serve a significant governmental interest.

Abbey Block

Abbey Block found her path in law as a journalism major, coupling her passion for advocacy through writing with her litigation experience to create persuasive, effective arguments.

Prior to joining Ifrah Law, Abbey served as a judicial law clerk in Delaware’s Kent County Superior Court, where she was exposed to both trial and appellate court litigation. Her work included analyzing case law, statutes, pleadings, depositions and hearing transcripts to draft bench memoranda and provide recommendations to the judge.
