As we start 2024, one thing on the minds of many businesses is the current status of privacy laws in the United States that affect the collection and use of information from children. And if it is not on their minds, perhaps it should be. Historically, the children's privacy law of note in the United States was the federal Children's Online Privacy Protection Act (COPPA). It requires companies to obtain parental consent before collecting information from children online. The Federal Trade Commission is the primary enforcer, but state regulators can bring cases as well.
Dozens of cases have been brought over the years, but the law, now more than 25 years old, is showing its age. Some are concerned that the definition of "children" is limited to those under 13. Others worry that the law does too little: it requires parents to provide verifiable consent but does not, for example, significantly limit children's ability to use online platforms. While the FTC will be reviewing its COPPA Rule, it cannot change the underlying law. States have consequently begun to take matters into their own hands.
First, the new US state "comprehensive" privacy laws contain provisions that apply to children. For example, in California, businesses that have actual knowledge that a child is under 13 must obtain parental consent to sell or share that child's data, in addition to any COPPA consent requirements. For those between 13 and 16, companies under the California law must get the child's own consent before selling or sharing their information. Other states define children's information as "sensitive." Under these laws, companies must tell consumers (or, for minors, their parents) if they share sensitive information with third parties. They must also, in some cases, conduct data protection assessments before processing sensitive information.
Second, one state, California, passed a law (the California Age-Appropriate Design Code Act) that is similar to the UK's Children's Code (the Age Appropriate Design Code). The law applies to those that provide online products, services or features "likely to be accessed by children" (ie those under 18) and was due to go into effect on 1 July 2024. It was temporarily enjoined, however, at the end of 2023. If the law goes into effect as intended, companies will not be able to collect a child's geolocation (without a compelling reason) and will be limited to collecting only the personal information necessary to provide their services. In addition, notices and terms will need to be provided in age-appropriate language. Finally, companies will need to conduct an impact assessment before offering services accessible to children.
Third, both at a federal level (FERPA) and in approximately half of US states, there are laws that address student privacy. The main focus of these laws is what information schools can collect from students. However, they also affect how schools' business partners can interact with students. For example, under the state laws, schools' business partners must protect student personal information from unauthorized access, use, destruction, or disclosure. They must also delete student information when requested by the school.
Finally, worth keeping in mind are the new US state laws aimed at social media platforms, passed so far in Arkansas, Montana, Ohio, Texas, and Utah. All that were slated to go into effect have been stayed, however, following challenges arguing that the laws are unconstitutional on First Amendment free speech grounds. Generally, these laws would require individuals to provide their age when creating an account and would require parental consent before a child's account can be finalized. They would also limit the amount of information platforms can collect from children and the types of advertising children can be shown.
If you will be doing business in the US and collecting information from children, what steps can you take to prepare for the laws already in effect and those on the way? Thinking through that question now can help as companies look to the US market and consider what activities would be appropriate when interacting directly with children.