For the Children!: Children’s Online Safety Becomes Focus of State and Federal Law
By: Nicole Kardell
Have you seen the latest craze in babysitting? If you are ever out in public, you have. Think of the last time you were at a restaurant, stoplight, or airport and noticed a child, aged somewhere between tot and tween, fixated on his or her device. That’s the craze: the cheapest, most available babysitting option these days is a device that can entertain children for hours, any time, all the time, anywhere there is wifi. Using technology to occupy children’s time (and … to offload parenting duties) is an unfortunate choice that a large number of parents make, and one that has contributed to an “ongoing youth mental health crisis.”
Because so many parents have made technology their babysitter of choice, the job of protecting children online—from online predators, to age-inappropriate content, to unauthorized use of children’s personal information—has fallen on the backs of lawmakers and has become a very hot topic among legislators, regulators, and consumer advocates. The result is new legal frameworks and new compliance obligations for companies that collect and process children’s data. If you are such a company, be aware of these legal developments. Read on for some background and a cautionary tale of what can happen if you—like so many parents—don’t pay attention. While parents face the long-term consequences of adverse health effects on their children, companies may face their own unwanted babysitters.
Several states have passed laws to enhance protections for children online, including Arkansas, California, and Utah. Other states, including Maryland, Minnesota, Nevada, New Mexico, and Oregon, are deliberating legislation. California’s most recent law, the Age-Appropriate Design Code Act, dramatically expands on existing child protection laws and will require affected companies to (1) develop data protection impact assessments, (2) institute privacy-by-default practices, (3) tailor products based upon age, and (4) provide clear policies and terms that outline consumer rights and how those rights may be exercised. The California law takes effect July 1, 2024, so companies have a little over a year to assess and prepare.
At the federal level, several bills have been introduced in recent months. The proposed measures address a number of different aspects of children’s online safety: limits on children’s social media presence, more robust age verification, and an expanded scope for the Children’s Online Privacy Protection Act of 1998 (COPPA).
Meanwhile, regulators at the FTC (the agency charged with COPPA oversight and enforcement) have been busy furthering child-safety initiatives one company at a time, with a keen focus on Facebook (owner of the popular platforms Instagram and WhatsApp). Earlier this month, the FTC announced proposed changes to the agency’s 2020 consent order with the company. The proposed order would, among other things, expand protections for children using the platform.
The FTC’s proposed order would be the third iteration of such an order by the agency against Facebook, each one re-opening the former order, accusing Facebook of violating the prior order, and adding further requirements that go beyond statutory and regulatory mandates. In a press statement published May 3, 2023, the agency alleged that the company “misled parents about their ability to control with whom their children communicated through its Messenger Kids app.” Alleging violations of COPPA, the proposed order would impose strict limitations on Facebook’s ability to use information it collects from children and teens (i.e., users under the age of 18). Facebook would be permitted to use minors’ information only to provide the service and for security purposes. The company would be prohibited from monetizing children’s information, or using it for its own commercial gain, even after those children reach the age of majority.
Note that the FTC defines “child” in the proposed order as “under the age of 18.” The definition of “child” in COPPA, the law on which the FTC’s authority is based, is “under 13 years of age.” Moreover, the contemplated restrictions exceed current federal legal requirements. Unless and until amended, COPPA does not prohibit the use or disclosure of children’s data. Instead, it generally requires parental notice and consent. The FTC is effectively expanding the scope of child protections by decree.
While few people are going to balk at an agency’s overreach when the purpose is to protect children, the thrice-hit Facebook provides a cautionary tale: Facebook was initially tagged by the FTC in 2012 for its failure to honor its privacy policies. A decade later, the company faces strict oversight and restrictions that go beyond the letter of the law. If you land within the sights of an enforcement agency, be prepared to be kept at the end of a very short leash… indefinitely (or under a very tough Nanny McPhee).
Companies that collect and process children’s data — and by that we mean data of individuals under 18 years old — should look carefully at their data collection practices. If you collect data from children located in states with enacted or pending legislation, you should review your policies and practices for legal compliance. If you turn a blind eye to whose data you collect (as we as a society continue to abdicate our own social responsibilities, such as parental duties), you may find yourself in a long-term relationship with state or federal regulators — one you won’t love but cannot leave.