If you’ve been keeping up with the latest privacy regulations for children’s online safety, you know that protecting kids online is a fast-moving area of privacy law. Children’s privacy regulations are not new, but they’ve come a long way since the Family Educational Rights and Privacy Act (FERPA) passed in 1974 and the Children’s Online Privacy Protection Act (COPPA) in 1998.
Children have been using the internet daily to learn and entertain themselves for decades. Yet the necessary safeguards to protect kids’ privacy are missing from most websites, apps, and other technologies. Recently, the UK Age Appropriate Design Code Act 2021 set out to change this and paved the way for new legislation based on higher standards for applying data protection laws to children and digital services.
These standards apply not only to UK websites but to any website or app accessible by children in the UK – whether or not it’s meant for children to use. The UK Age Appropriate Design Code took effect on September 2, 2020, with a 12-month transition period.
Inspired by the UK, California was the first U.S. state to pass an Age-Appropriate Design Code Act (AB-2273) to protect children’s privacy. Like the UK code, it’s designed to ensure technology companies proactively take a privacy-by-design and default approach to protect children’s privacy and safety when creating or updating online services, products, or features that children will likely access.
The Gap in U.S. Children’s Privacy Regulations
Unlike the U.S. Children’s Online Privacy Protection Act (COPPA), which protects children under the age of 13, the CA Kids Code is designed to protect all children under 18 in California. There is currently no federal regulation protecting children between the ages of 13 and 18.
Despite existing legal protections, Meta continues its plans to let American and Canadian teenagers into its virtual reality app, Horizon Worlds. The app currently allows people 17 and up to physically interact with each other in virtual spaces resembling real life.
Meta claims it will use privacy by default guidelines to protect children, but this is one of many platforms to watch. Gaming platforms and apps have consistently violated children’s privacy regulations such as COPPA.
Protecting Kids Online: The Latest Design Codes and Regulations to Consider
There’s more happening with children’s privacy regulations than the two Acts in the UK and California. Regulators, including the FTC, have warned that they will focus on children’s privacy. All five FTC commissioners said they would begin to crack down on companies that illegally surveil children online.
While some jurisdictions, such as Australia and Canada, don’t distinguish between children and adults in their data protection laws, the GDPR applies its heightened protections for children only until they are 16 years old.
As recently as October 2022, members of Congress urged the FTC to make updating COPPA a priority. Although COPPA protects the privacy of children under 13, it hasn’t been updated since 2013 and does not cover information collected from adults that may pertain to children.
But because a COPPA update didn’t make it into Congress’s fiscal 2023 plan, many states have taken it upon themselves to enact legislation that better protects children’s privacy online. States with active children’s privacy regulations in 2023 include:
- South Carolina
- New Jersey
- New York
New legislation to protect children’s privacy was also attempted in New Mexico, West Virginia, and Virginia but didn’t pass.
Although talks of updating COPPA have stalled, there’s still much nationwide focus on protecting kids’ privacy online. President Joe Biden again called for a ban on online ads targeting children during his State of the Union address in February 2023. And beyond the President, Senate leader Chuck Schumer is seeking a June vote on children’s online protection legislation.
Other federal bills various representatives are still trying to bring to the table include the EARN IT Act, the Kids Online Safety Act (KOSA), and the Clean Slate for Kids Online Act. While we don’t know which will gain momentum, many members of Congress enthusiastically support new children’s privacy regulations.
Two States Pass Children’s Social Media Bills
In March 2023, the Governor of Utah signed the Utah Social Media Regulation Act, which requires minors to obtain the consent of a guardian before joining social media platforms. Effective March 2024, this is one of the most aggressive steps so far by U.S. lawmakers to protect kids online.
The Utah Act requires social media platforms to conduct age verification for all Utah residents, ban all ads for minors, and impose a curfew from 10:30 p.m. to 6:30 a.m. that makes the platforms off-limits to anyone under the age of 18. And despite much controversy, it also requires social platforms to give parents or guardians access to their teens’ accounts.
Within a month, Governor Sarah Huckabee Sanders approved similar legislation in Arkansas. The Arkansas law will apply to new accounts starting September 1, 2023. Senator Tyler Dees, the Arkansas bill’s sponsor, said the new law “sends a clear message that we want to partner with parents and empower them to protect our children.”
It is likely that other states will continue to pass similar legislation. However, privacy and free speech experts have raised concerns about the potential harms of surveillance and censorship due to proposed children’s online safety legislation.
At the same time, parents and guardians must consider whether the benefits outweigh the cost, as increased social media use has led to significant negative consequences for teenagers and young adults. The hope is that bills like these will successfully reduce harm to children from social media use.
Key Takeaways from the Latest Children’s Privacy Regulations
While each regulation has its nuances, in general, the following takeaways can help businesses design the appropriate systems to protect children online:
Practice Privacy by Default and Privacy by Design
Design new products and services to consider data privacy proactively. For existing services, set defaults that automatically offer users a high level of privacy. Require explicit consent for any data collection or processing activities. And don’t track users unless they opt in to tracking – especially users under 18 years of age.
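As a rough illustration of privacy by default (the class, field names, and age rule here are hypothetical, not drawn from any specific statute), every setting can start in its most protective state, with tracking and ad personalization off until a user actively opts in:

```python
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    """Hypothetical per-user settings; every default favors privacy."""
    tracking_enabled: bool = False      # off until the user explicitly opts in
    personalized_ads: bool = False      # no ad targeting by default
    profile_public: bool = False        # profiles start private
    share_geolocation: bool = False     # precise location never shared by default


def settings_for(age: int) -> PrivacySettings:
    """Return default settings; minors get the strictest configuration."""
    settings = PrivacySettings()
    if age < 18:
        # In this sketch, minors can never enable personalized ads at all.
        settings.personalized_ads = False
    return settings
```

The point of the pattern is that a user who never touches a settings screen ends up with the most private configuration, rather than having to hunt for opt-outs.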
Avoid Dark Patterns
“Dark patterns” is a term describing a variety of manipulative design choices that push users toward decisions they wouldn’t otherwise make. They include pre-selected options on forms, missing or hidden opt-out controls, repeated prompts to collect information or turn on tracking, and algorithms that nudge purchase decisions.
Using dark patterns is prohibited under the California Age-Appropriate Design Code Act and other data protection laws.
Conduct Data Protection Impact Assessments with Kids in Mind
Under California’s new child design law, businesses must conduct data protection impact assessments (DPIAs) for any new online service, product, or feature likely to be accessed by children, before it is offered to the public. The DPIAs must be maintained for as long as the online service, product, or feature is available.
DPIAs are also a key part of compliance with the GDPR and, thus, the UK Age Appropriate Design Code. The business must identify the purpose of its service, whether it collects children’s personal information and how it’s used, and the risks of harm to children that the data management practices of the business could cause.
Analyze any harmful content, potentially addictive features, or algorithms that children could access. Harm might include contact with predators, exposure to exploitation or other harmful content, and even exposure to ads, among many other things.
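One lightweight way to track the elements a DPIA must cover – purpose, children’s data use, and identified risks with mitigations – is a simple record per feature. This is a hypothetical sketch of such a record, not a legally prescribed format:

```python
from dataclasses import dataclass, field


@dataclass
class Risk:
    """One identified risk of harm to children, with its documented mitigation."""
    description: str
    mitigation: str = ""  # empty until a mitigation is documented


@dataclass
class DPIARecord:
    """Hypothetical DPIA record for a feature likely to be accessed by children."""
    feature: str
    purpose: str
    collects_childrens_data: bool
    risks: list[Risk] = field(default_factory=list)

    def unmitigated(self) -> list[Risk]:
        return [r for r in self.risks if not r.mitigation]

    def ready_for_launch(self) -> bool:
        # In this sketch, every identified risk needs a mitigation before launch.
        return not self.unmitigated()
```

A record like this also satisfies the maintenance requirement naturally: it lives alongside the feature for as long as the feature does.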
>>> Schedule an Assessment Manager demo today and learn how to streamline your data protection impact assessments to prepare for children’s privacy regulation enforcement.
Prioritize the Best Interests of Children
Overall, you should prioritize the best interests of children before business profits and goals. While this may sound counterintuitive, that is the framework these Acts use.
Consider the impact on the child’s physical health, mental health, and wellbeing. And avoid using data in ways that would have any negative impacts on those areas. If you can’t do so, you must find a way to block children from accessing your product or service or risk significant fines or penalties and ongoing compliance program requirements.
Minimize Data Collection
Reduce risk effectively by only collecting data that is absolutely necessary for business functions. Rather than collecting more data, focus on collecting the highest quality data with consent.
Don’t collect precise geolocation data (except when explicitly necessary). And review your data retention policies to see if there’s room to reduce how long or how much data is retained. As more data protection laws arrive for children and adults alike, data minimization can drastically simplify your data protection program.
Data Protection That Scales as Your Business Grows
No matter where you are in your data privacy journey, PrivacyCentral can help you evaluate, automate, and streamline the common controls of your current and future compliance across jurisdictions. Meet PrivacyCentral today.