Sheppard Mullin's Liisa Thomas and Kathryn Smith look at what businesses operating in the US need to do to comply with current and incoming children's privacy laws.
As we start 2024, one thing on the minds of many businesses is the current status of privacy laws that impact the collection and use of information from children in the United States. And if it is not on their minds, perhaps it should be. Historically, the children's privacy law of note in the United States was the federal Children's Online Privacy Protection Act (COPPA). It requires that companies get verifiable parental consent before collecting personal information online from children under 13. The Federal Trade Commission is the primary enforcer, but state regulators can bring cases as well.
There have been dozens of cases brought over the years, but the law - now more than 25 years old - is showing its age. Some are concerned that the definition of “children” is limited to those under 13. Others are concerned that the law does too little: it requires that parents provide verifiable consent but does not, for example, significantly limit children’s ability to use online platforms. While the FTC will be reviewing its COPPA Rule, it cannot change the underlying law. States have consequently begun to take things into their own hands.
First, the new US state “comprehensive” privacy laws have provisions that apply to children. For example, in California, businesses with actual knowledge that a child is under 13 must obtain parental consent to sell or share that child’s data, in addition to meeting any COPPA consent requirements. For children who are at least 13 but under 16, companies under the California law must get the child’s own consent before selling or sharing their information. Other states define children’s information as “sensitive.” Under those laws, companies must tell consumers (or, for minors, their parents) if they share sensitive information with third parties. They must also, in some cases, conduct data protection assessments before processing sensitive information.
Second, one state – California - passed a law (the California Age-Appropriate Design Code Act) which is similar to the UK’s Children’s Code (the Age Appropriate Design Code). The law was due to take effect on 1 July 2024 and would have applied to those that provide online products, services or features “likely to be accessed by children” (ie those under 18). The law was preliminarily enjoined in 2023, however. If the law does go into effect as intended, companies will be prohibited from:
- using 'dark patterns'
- engaging in actions that are “materially detrimental to children”, or
- profiling children.
Further, companies will not be able to collect a child’s geolocation (without a compelling reason) and will be limited to collecting only the personal information necessary to provide their services. In addition, notices and terms will need to be provided in age-appropriate language. Finally, companies will need to conduct an impact assessment before offering services accessible to children.
Third, both at the federal level (FERPA) and in approximately half of US states, there are laws that address student privacy. The main focus of these laws is what information schools can collect from students. However, they also affect how schools’ business partners can interact with students. For example, under the state laws, schools’ business partners must protect student personal information from unauthorized access, use, destruction, or disclosure. They must also delete student information when the school requests it.
Finally, worth keeping in mind are new US state laws aimed at social media platforms, so far passed in Arkansas, Montana, Ohio, Texas, and Utah. All of those slated to take effect have been stayed, however, following challenges arguing that the laws are unconstitutional on First Amendment free-speech grounds. Generally, these laws would require individuals to provide their age when creating an account and would require parental consent before a child's account can be finalized. They would also limit the information platforms can collect from children and the types of advertising accessible to children.
If you will be doing business in the US and collect information from children, what steps can you take to prepare for the laws that are in effect, and those that are upcoming?
- Remember that you need consent if collecting personal information online from those under 13: This requirement has been around since 1998. Since then, what counts as “online” has changed, and our understanding of “personal information” has expanded. Keep in mind that online platforms include mobile apps and may include interactive toys as well. And personal information is more than just a name or email address - it includes images, voices and, according to the FTC, persistent identifiers as well.
- Consider principles of data minimization: Even if a law doesn’t specifically require you to minimize what you collect from children (though many do), the more you have, the more you have to protect. COPPA, where it applies, already restricts what can be collected to “what is needed to provide the service or feature,” and the new state laws will do the same.
- Evaluate whether your collection practices constitute 'dark patterns': The FTC, like the EDPB, has expressed concern about tricking or misleading consumers into giving more information than they otherwise would, or into agreeing to uses of their information to which, on reflection, they would not have agreed. These practices are called 'dark patterns', and among other recommendations, the FTC suggests testing programs from the perspective of the user. For kids, that might mean not only the child, but also the parent who might be giving consent for COPPA purposes.
- Keep in mind concepts of harm: Of particular concern for regulators are activities that might harm children. In the absence of clear direction from regulators, or case law under these laws, businesses should ask what a regulator might make of their platform. Could someone argue it exploits children, for example? What countervailing measures or interests justify the activities that involve children? What steps are being taken to minimize potential harm? In other words, engage in the thought process one follows when conducting a GDPR Data Protection Impact Assessment.
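For engineering teams putting the first two steps above into practice, the under-13 consent gate and a data-minimization allowlist can be sketched together in code. This is a minimal illustration under assumed requirements, not legal advice: the age threshold reflects COPPA's under-13 scope, but the field names (`username`, `parent_email`), the consent flag, and the error handling are all hypothetical choices for an imagined service.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13
# Hypothetical allowlist: the minimum fields this imagined service needs.
ALLOWED_FIELDS = {"username", "parent_email"}

def age_in_years(birthdate: date, today: date) -> int:
    """Whole-year age, adjusting if the birthday has not yet occurred this year."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def collect(profile: dict, birthdate: date, today: date,
            has_verifiable_parental_consent: bool) -> dict:
    """Gate collection on parental consent for under-13 users, then keep
    only allowlisted fields so extra data is never retained."""
    if (age_in_years(birthdate, today) < COPPA_AGE_THRESHOLD
            and not has_verifiable_parental_consent):
        raise PermissionError("verifiable parental consent required")
    # Data minimization: silently drop anything not on the allowlist.
    return {k: v for k, v in profile.items() if k in ALLOWED_FIELDS}
```

In this sketch, a submission for an under-13 user without consent is rejected outright, and even a consented submission sheds non-allowlisted fields (a geolocation value, say) before anything is stored.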
These steps can help as companies look to the US market, and consider what activities would be appropriate when interacting directly with children.