New California Law Establishes Broad Protections for Children’s Online Privacy – Exceeding Federal Requirements

October 4, 2022

By: Jake Gray

California made history in September as the first state to enact legislation that punishes technology companies for violations of minors’ privacy and for practices that jeopardize minors’ safety in an effort to prioritize “the privacy, safety, and well-being of children over commercial interests.” 

On September 15th, Governor Newsom signed the California Age-Appropriate Design Code Act (“the Act”) into law. The legislation, which was passed unanimously by both the State Senate and Assembly in August, requires businesses that provide online services, products, or features “likely to be accessed by children” to ensure “that those online products, services, or features are designed in a manner that recognizes the distinct needs of children at different age ranges.”[1] In the Governor’s press release, First Partner Jennifer Siebel Newsom expressed concern over how technology design impacts children:

“As a parent, I am terrified of the effects technology addiction and saturation are having on our children and their mental health. While social media and the internet are integral to the way we as a global community connect and communicate, our children still deserve real safeguards like AB 2273 to protect their wellbeing as they grow and develop.”

Law Becomes Effective July 2024; Enforced by California Privacy Protection Agency

The Act, which goes into effect July 1, 2024 in order to give businesses sufficient time to bring their online products and services into compliance with the Act’s standards, will be enforced by California’s Attorney General and implemented by the California Privacy Protection Agency, which was created by the California Privacy Rights Act of 2020 (CPRA). Businesses that receive notice from the Attorney General will have a 90-day period to cure any violations. While the Act does not create a private right of action, the California Attorney General can seek civil penalties of up to $2,500 per affected child for negligent violations and up to $7,500 per affected child for intentional ones.
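Because those penalties accrue per affected child rather than per incident, aggregate exposure scales directly with the size of a service’s child audience. The short sketch below is illustrative only; it uses the per-child amounts quoted above together with hypothetical user counts, and it is not legal guidance.

```python
# Illustrative arithmetic only: per-child penalty caps as described in this post.
NEGLIGENT_PER_CHILD = 2_500     # maximum per affected child, negligent violation
INTENTIONAL_PER_CHILD = 7_500   # maximum per affected child, intentional violation

def max_exposure(affected_children: int, intentional: bool = False) -> int:
    """Maximum civil penalty if each affected child is counted once."""
    rate = INTENTIONAL_PER_CHILD if intentional else NEGLIGENT_PER_CHILD
    return affected_children * rate

# Hypothetical example: a service found to have 10,000 affected child users.
print(max_exposure(10_000))                    # 25,000,000 (negligent)
print(max_exposure(10_000, intentional=True))  # 75,000,000 (intentional)
```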

Act Expands Online Protections, Exceeding Federal COPPA Law to Cover Minors up to Age 18

Modeled after the Age-Appropriate Design Code established in the United Kingdom, the Act goes beyond existing state and federal privacy and security laws protecting minors, such as the federal Children’s Online Privacy Protection Act (COPPA), California’s Parent’s Accountability and Child Protection Act, and the CPRA.

Broader Standards for Determining When Online Services Are Accessed by Children

COPPA’s primary requirements, for instance, oblige websites and online services to post a clear and comprehensive privacy policy and to provide parental notice and obtain parental consent before collecting information from children. However, while COPPA defines “children” as individuals under the age of 13, the Act defines “children” as consumers under the age of 18, a standard that significantly expands Internet privacy and safety protections to all minors. The Act also incorporates additional indicators, beyond those contained in COPPA, for determining whether an online product, service, or feature is “likely to be accessed by children”:

  1. The product is “directed to children” as defined (and applied) by COPPA, which entails that it is either a commercial website or online service targeted to children, or a portion of a commercial website or online service that is targeted to children. [2]
  2. The product is determined to be routinely accessed by a significant number of children based on audience composition as determined by competent and reliable evidence.
  3. Its advertisements are marketed to children.
  4. A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.
  5. Its design elements are known to be of interest to children, examples of which include, but are not limited to, games, cartoons, music, and celebrities who appeal to children.

These new indicators mean that the law’s protections extend not only to online products and services specifically “directed to” children and tailored for their consumption, as under COPPA, but to all products children are likely to access.

Applicability – Act Follows CPRA Coverage Standard

Not all technology companies are affected, however. Because the Act “furthers the purposes and intent of the [CPRA],” the new statute applies only to companies subject to the CPRA. CPRA-covered businesses are for-profit entities that meet one or more of the following criteria: they have $25 million or more in annual gross revenue, they annually buy, sell, or share the personal information of 100,000 or more consumers or households, or they derive 50% or more of their annual revenue from selling or sharing consumers’ personal information.
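For illustration, the applicability test can be thought of as an “any one of the three” threshold check. The sketch below encodes the CPRA criteria as summarized in this post; the function and parameter names are hypothetical, and the figures are paraphrased rather than quoted from the statute.

```python
def is_cpra_covered(
    annual_gross_revenue: float,
    consumers_data_bought_sold_or_shared: int,
    revenue_share_from_selling_or_sharing_pi: float,
) -> bool:
    """Return True if a for-profit entity meets at least one CPRA coverage threshold."""
    return (
        annual_gross_revenue >= 25_000_000
        or consumers_data_bought_sold_or_shared >= 100_000
        or revenue_share_from_selling_or_sharing_pi >= 0.50
    )

# Example: a $30M-revenue company is covered even if the other two criteria are not met.
print(is_cpra_covered(30_000_000, 5_000, 0.10))  # True
```

Under the Act, CPRA coverage is only the first step: a covered business must then determine whether each online service, product, or feature is “likely to be accessed by children” under the indicators discussed above.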

Summary of Key Requirements

In general, the California Age-Appropriate Design Code Act imposes both requirements and prohibitions on covered businesses. The requirements oblige covered businesses to do the following:

  1. Complete a Data Protection Impact Assessment (“DPIA”) for any online service, product, or feature likely to be accessed by children, maintain documentation of that assessment for as long as the online service, product, or feature is likely to be accessed by children, and review all DPIAs biennially. The DPIA must address, among other things, issues such as:
    • whether the design of the product could harm children, including by exposing them to, or targeting them with, harmful content or harmful contacts;
    • whether targeted advertisements or the use of algorithms could harm children;
    • whether the design is implemented in a way that could increase, sustain, or extend use of the product, including through the automatic playing of media, rewards for time spent, and notifications; and
    • whether the product collects or processes sensitive personal information of children.
    Based on the findings of the DPIA, the business must also create a timed plan to mitigate or eliminate each identified risk before the online service, product, or feature is accessed by children.
  2. Estimate the age of child users with a reasonable level of certainty.
  3. Provide children with default privacy settings that offer a high level of privacy unless the business can demonstrate that doing otherwise is in the best interest of children.
  4. Provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature.
  5. Provide an obvious signal to the child when the child is being monitored or tracked if the online service, product, or feature allows the child’s parent, guardian, or any other consumer to monitor their activity or track their location.
  6. Provide prominent, accessible, and responsive tools to help children, or if applicable their parents or guardians, exercise their privacy rights and report concerns.

Prohibited Acts under the New Law

The Act prohibits covered businesses from doing the following:

  1. Using the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child.
  2. Profiling a child by default unless (1) appropriate safeguards are in place to protect children and (2) (a) profiling is necessary for the requested online service, product, or feature, and only with respect to those features in which the child is actively and knowingly engaged, or (b) the business can demonstrate a compelling reason that profiling is in the best interests of children.
  3. Collecting, selling, sharing, retaining, or otherwise using any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, for any reason other than a reason for which that personal information was collected.
  4. Collecting, selling, or sharing any precise geolocation information of children by default unless the collection of that precise geolocation information is strictly necessary and then only for the limited time that the collection of precise geolocation information is necessary. Additionally, any precise geolocation information of a child cannot be collected without providing an obvious sign to the child for the duration of that collection that precise geolocation information is being collected.
  5. Using dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected, to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or well-being.
  6. Using or retaining any personal information collected to estimate age or age range longer than is necessary to estimate age.

Concerns from Businesses

Opponents of the bill expressed concerns early in the legislative process about how the rules would affect businesses. In a letter to legislators in April, for instance, TechNet and the California Chamber of Commerce claimed that the new rules would subject “far more websites and platforms than necessary” to the bill’s requirements and, further, that “the requirement that companies consider the ‘best interests’ of children is incredibly difficult to interpret.” TechNet is a “national, bipartisan network of technology CEOs and senior executives that promotes the growth of the innovation economy.” Its members include Amazon, Apple, Cisco, Google, Oracle, Pinterest, Snap, and Meta (formerly Facebook).

With the California legislation in place, the stakes for big tech giants become much higher when it comes to children’s privacy, especially if more states follow California’s example. 

In 2019, Google and YouTube paid a record-breaking $170 million settlement to the Federal Trade Commission (FTC) for COPPA violations. The lawsuit alleged that the companies illegally collected personal information from viewers of child-directed YouTube channels without parental notice and consent. [3] YouTube collected the information in the form of persistent identifiers—commonly known as “cookies”—which track users across the Internet, to deliver targeted advertisements, according to the original complaint by the FTC. As part of the settlement, the companies agreed “to create a mechanism so that channel owners can designate when the videos they upload to YouTube are – to use the words of COPPA – ‘directed to children,’” in order to ensure that content creators and YouTube comply with the law. [4]

While COPPA allows for civil penalties of up to $42,530 per violation, more than five times the maximum fine per intentional violation under the new California law, the number of potential violations under the California Age-Appropriate Design Code Act is far greater because the Act’s penalties accrue per affected child and its applicability is broad. [5]
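To put those figures side by side, the short calculation below (illustrative only, using the penalty amounts cited in this post) shows both the ratio between the two maximums and how quickly the Act’s per-child fines can overtake a single COPPA penalty.

```python
COPPA_MAX_PER_VIOLATION = 42_530        # federal COPPA maximum per violation
ACT_MAX_PER_CHILD_INTENTIONAL = 7_500   # Act maximum per affected child (intentional)

# COPPA's per-violation cap is more than five times the Act's per-child cap.
print(COPPA_MAX_PER_VIOLATION / ACT_MAX_PER_CHILD_INTENTIONAL)  # ~5.67

# But the Act's fines scale with the number of affected children: six affected
# children already exceed the maximum penalty for a single COPPA violation.
print(6 * ACT_MAX_PER_CHILD_INTENTIONAL)  # 45,000 > 42,530
```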

For guidance and implementation, the Act creates the “California Children’s Data Protection Working Group” to make prescribed recommendations on best practices; its ten members will be experts drawn from areas including children’s data privacy, physical health, mental health and well-being, computer science, and children’s rights. The working group is responsible for delivering a report to the California Legislature on best practices for implementing the Act and will receive input from a broad range of stakeholders, including academia, consumer advocacy groups, and small, medium, and large businesses.

A critical component will be determining which online services are, in fact, covered by the new law, particularly those that are not obviously directed to children. If an influencer creates content initially targeted to 20- to 25-year-olds, but older teens flock to the website and sign up for its services, is the site covered? Does the website need to monitor its users periodically? The resolution of these issues will be closely watched by industry and consumer groups.

[1] https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202120220AB2273

[2] 15 U.S.C. Sec. 6501 et seq.

[3] https://www.ftc.gov/news-events/news/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations-childrens-privacy-law 

[4] https://www.ftc.gov/business-guidance/blog/2019/11/youtube-channel-owners-your-content-directed-children 

[5] It should be noted that “the FTC considers a number of factors in determining the appropriate amount, including a company’s financial condition and the impact a penalty could have on its ability to stay in business.” See: https://www.ftc.gov/business-guidance/blog/2019/11/youtube-channel-owners-your-content-directed-children 

Jake Gray

Jake Gray is a graduate of Columbia University and an established technology researcher, currently working in the betting and futures space as a consultant to a variety of operators. He frequently writes about online gaming and sports betting laws.
