You need to open your eyes to get children’s privacy right

Data privacy is a hot topic, but children’s privacy is often put on the back burner.

Why?

Complying with children’s data protection law is difficult because it involves determining the age of your visitors – and playing by the rules accordingly.

Although the Children’s Online Privacy Protection Act (COPPA) has become an adult at age 24 – it was passed in 1998 – the advertising industry continues to struggle with the challenge of age verification.

Speaking at an International Association of Privacy Professionals event in Washington, D.C., earlier this week, Virginia Lee, Cisco’s privacy officer for the United States, said: “It’s hard to determine how old someone is.”

“If you get it wrong, you’re liable,” Lee said. “And it’s very easy to get it wrong.”

Who knew?

Being on the right side of COPPA isn’t easy – and one mistake is very costly. Businesses can be fined more than $40,000 per violation.

There is a knowledge exception in COPPA, however, which means that general audience site or app operators are only subject to the law if they have actual knowledge that children under the age of 13 are sharing their personal information without parental consent.

Companies that claim ignorance about how old their audience is can use the actual knowledge standard as a loophole.

It’s a bit of a head scratcher. Some advocates argue that getting rid of the knowledge exception and imposing strict age verification obligations is exactly what COPPA needs to be more effective.

Otherwise, the incentives are out of whack. Companies can either err on the side of caution (which can hinder growth) or turn a blind eye (which is not good for children’s privacy).

But companies usually want to get this right. Donna Fraser, SVP of privacy initiatives at BBB National Programs, said it’s rare for a business to actively push back against child privacy guardrails.

There are exceptions, though.

Take TikTok. Before its 2018 rebrand, when the app was still called Musical.ly, parents sent thousands of requests demanding that the app delete their children’s data. When Musical.ly refused, those requests became complaints, and the following year TikTok was fined $5.7 million by the Federal Trade Commission.

But the story could have ended differently for TikTok, Fraser said, if it had heeded the feedback from unhappy parents and used it to better serve its young users.

And even if those complaints didn’t give Musical.ly actual knowledge that children under the age of 13 were using its service, the FTC found that the app’s operators should have had “constructive knowledge” of their users’ ages.

Actual knowledge is when a company specifically knows something, whereas constructive knowledge is something a company could reasonably be expected to know.

All Musical.ly needed to do was read the scores of press reports about its own app’s popularity among tweens and young teens.

Applying a constructive knowledge standard would help companies implement best practices and stay on the right side of the law before they have actual knowledge that their audience includes children under the age of 13.

“If you have a platform that is not designed for children, but then [discover] they are starting to use it, you have to [address] this,” said Cisco’s Lee.

Washington has begun to grasp the value of constructive knowledge. It made an appearance in the Kids PRIVCY Act, a bill introduced in Congress last year stating that sites and services must obtain parental consent if they have actual or constructive knowledge that they are processing children’s data. (PRIVCY stands for “Protecting the Information of our Vulnerable Children and Youth” – I kid you not.)

Ridiculously enthusiastic acronym aside, the bill’s emphasis on constructive knowledge could help close some of the loopholes in COPPA, Fraser said.

Self-regulation

But when it comes to fixing COPPA’s problems, children’s privacy bills have issues of their own.

The bill, for example, aims to update COPPA by repealing provisions that allow for industry self-regulation, Fraser said. But doing so would also prevent companies from proactively making improvements to their own online platforms with child safety in mind.

Prohibiting self-regulation “could have a chilling effect,” Lee said. Companies, seeking to avoid a wrong move, could stop providing online platforms that are meaningful for adolescent development, she said.

And that would be both a shame and a disservice to young people.

“There is harm in social media, but there are also good things that can [allow] adolescents to expand their worldviews rather than narrow them,” said Katherine Ferrara, associate general counsel at Unilever.

It is also difficult for legislation to be effective without incorporating the views of all parties involved.

Policies can “sound good and look good on paper,” Fraser said, “but what does that look like in reality? If you want something workable, you need to bring industry together and work with legislators on education.”

And proactive self-regulation is a force for good, she said.

“Self-regulation has helped [implement] COPPA over the last 20 years,” said Fraser, noting that BBB National Programs has conducted more than 200 investigations since the law was passed.

By comparison, she said, the FTC has managed fewer than 40.
