FTC to Turn a Blind Eye to Age Verification Privacy Violations

The FTC has issued a statement saying that it won’t enforce COPPA if the violations relate to age verification.

You really can’t make this stuff up.

One of the reasons people don’t like age verification is because it is a blatant effort by governments to harvest and profile online users in a bid to do away with online anonymity. This is no longer a purely theoretical concern, either. Discord pretty much proved that point twice over.

After Discord swore up and down that it doesn’t collect and store personal information, its age verification system suffered a data breach that compromised at least 70,000 users. Little did we know last year that this story was only just getting started.

Shortly after losing that personal information to third party hackers, Discord decided to roll out global age verification. You know, because they hadn’t pissed off enough people already. Still, they swore that this time was different and that your personal information would never even leave your device. Since people had already been burned by Discord once, some users started building a tool devoted to defeating the age verification system (not that such a task is particularly hard given how awful age verification truly is). Others responded by flocking to alternatives to the point that those services had to scramble to expand their server capacity in response.

Well, it was at that point that Discord started experimenting with Persona, a company known for having links to Palantir – a company known for having connections with ICE. You know, because if there is one company you want to send your personal information to, it’s the company with ties to an organization that is randomly murdering Americans on the city streets. In the process, though, the “experiment” ended up having Discord walk back its claims about your personal information never leaving your device. Instead, they admitted that they will, in fact, store your personal information for 7 days. Yeah, they went all Darth Vader and altered the deal.

For those praying the deal wouldn’t be altered further, don’t worry, they inadvertently altered it further. The Persona system suffered a massive data leak and, it turned out, the system was collecting data points across 269 distinct verification checks. Initially, it was unclear if this was happening on a government authorized server, but more and more sources are saying that it was – specifically a server authorized under the Federal Risk and Authorization Management Program (FedRAMP). Oh yeah, and that personal information is probably going to get stored for 3 years, so much for the 7 days originally promised, let alone that whole ‘information never leaves your device’ thing. Oopsie!

None of this makes either Discord or Persona look good. As a result, Discord cut ties with Persona and delayed its global age verification system. This as they basically admitted they screwed up while trying to manage a public relations nightmare that left them completely freaking out once they realized just how big a hole they had dug themselves into.

Here’s the thing in all of this, though: this story isn’t necessarily isolated to Discord or Persona. What this story did was pull back the curtain and showcase what really goes on with age verification in general when it comes to the collection and storage of personal information. Many had long and rightfully suspected this was going on, and the Discord age verification quagmire proved it.

After all, what I found notable throughout all of this is that the other vendors of age verification technology remained extremely quiet. This is important because if the industry standard really were to not collect, store, or disclose personal information, you would think the other vendors would throw Persona under the bus and tout their own systems as ones that don’t collect or disclose personal information in the first place. That would’ve been the smart thing to do. The problem is that if they did do that, I never saw it.

Instead, what I’m seeing today is the exact opposite reaction coming from the US government. A longstanding problem is that age verification technology violates various privacy laws. One relevant law in particular is the Children’s Online Privacy Protection Act (COPPA). COPPA, of course, restricts a website’s ability to collect the personal information of children without a parent’s authorization. The thing is, age verification technology is designed to collect such information in the first place. This is far from the only conflict age verification technology has with privacy laws, but it is, nevertheless, one of those conflicts.

So, what is an organization such as the Federal Trade Commission (FTC) to do in all of this? Apparently, the response is to turn a blind eye to these privacy violations. From TechDirt:

We’ve been pointing out the fundamental contradiction at the heart of mandatory age verification laws for years now. To verify someone’s age online, you have to collect personal data from them. If that someone turns out to be a child, congratulations: you’ve just collected personal data from a child without parental consent. Which is a direct violation of the Children’s Online Privacy Protection Act (COPPA)—the very law that’s supposed to be protecting kids.

So what happens when the agency charged with enforcing COPPA finally notices this obvious problem? If you guessed “they admit the conflict and then just promise not to enforce the law,” you’d be exactly right.

The FTC put out a policy statement last week that is remarkable in what it tacitly concedes:

The Federal Trade Commission issued a policy statement today announcing that the Commission will not bring an enforcement action under the Children’s Online Privacy Protection Rule (COPPA Rule) against certain website and online service operators that collect, use, and disclose personal information for the sole purpose of determining a user’s age via age verification technologies.

The FTC appears to be explicitly acknowledging that age verification technologies involve collecting personal information from users—including children—in a way that would otherwise trigger COPPA liability. If the technology didn’t create a COPPA problem, there would be no need for a policy statement promising non-enforcement. You don’t issue a formal announcement saying “we won’t sue you for this” unless “this” is something you could, in fact, sue people for.

The statement itself tries to dress this up by noting that age verification tech “may require the collection of personal information from children, prompting questions about whether such activities could violate the COPPA Rule.” But “prompting questions” is doing an awful lot of work in that sentence. The answer to those questions is pretty obviously “yes, collecting personal information from children without parental consent violates the rule that says you can’t collect personal information from children without parental consent.” The FTC just doesn’t want to say that part out loud, because then the follow-up question becomes: “so why are you encouraging companies to do it?”

Instead, they’ve decided to create an enforcement carve-out. Do the thing that violates the law, but pinky-promise you’ll only use the data to check the kid’s age, delete it afterward, and keep it secure. Then we won’t come after you. This is the FTC solving a legal contradiction not by asking Congress to fix the underlying law or admitting the technology is fundamentally flawed, but by deciding to selectively not enforce the law it’s supposed to be enforcing.

This alone should tell you everything you need to know about age verification. The collection, storage, and disclosure of personal information isn’t a bug in the age verification system, it is a feature. What happened with Discord was no accident; Discord just happens to be the company that inadvertently exposed what really goes on with age verification and personal information.

For those who are still insisting that age verification isn’t designed to collect, store, and/or disclose personal information: why isn’t the FTC instead choosing to go after Persona in a bid to nail their balls to the wall over what happened? If the collection and storage of personal information in age verification is not the intent, why aren’t there announcements of investigations and threats of fines being waved around by now? Where are the “see you in court” notices? After all, if the FTC can fine Facebook $5 billion over the Cambridge Analytica scandal, why can’t it hand out fines against a company like Persona for privacy violations?

Nope. Instead, the government’s reaction is to recognize that it has a conflict between mass data harvesting through age verification and a privacy law, and to “solve” it by basically refusing to enforce COPPA when it comes to age verification technology. In the process, the regulator has effectively admitted that the whole point of age verification is to harvest vast troves of highly sensitive personal information. Whether that information comes from adults or children, it doesn’t matter. Collect it all.

Drew Wilson on Mastodon, Twitter and Facebook.

