A second security incident has apparently prompted Discord to start rethinking their age verification strategy.
So you want to partake in the surveillance industrial complex.
It seems that the latest security incident has gotten Discord to finally start having second thoughts about their whole age verification push – at least with the approach they are taking anyway. Last year, as people were fearing that their personal information could fall into the wrong hands, Discord proved the point by suffering a data breach in which at least 70,000 accounts had their age verification information stolen. Hackers suggested the real number was actually in the millions, but regardless, it proved the point.
You would think that Discord learned their lesson at that point, but nope. Mere months later, Discord announced that they were rolling out a global age verification system. Since it looked like Discord learned nothing from the last incident, people didn’t take too kindly to this move. In response, someone produced a tool dedicated to defeating this age verification, while others said “screw this” and flocked to alternatives to the point where some of those alternatives were struggling to keep up with the sudden influx of users.
While all of that was happening, Discord sought to reassure the public that their age verification system was safe. Initially, they promised users that their information would totally be safe and secure. Then, while experimenting with Persona, they admitted that they do, in fact, gather your personal information, but don’t worry, it is deleted within 7 days. That’s when things went from bad to worse as an investigation found that Persona was linked to Palantir and Peter Thiel. Palantir, as many know, is assisting ICE in its efforts to build a massive online surveillance system as Trump continues his campaign to crack down on thought crimes on the internet. This is especially troubling given that Trump’s overall crackdown on the American public has proven to be deadly. To make matters even worse, Trump stepped up those efforts by issuing hundreds of subpoenas to major platforms in a bid to track down the people who posted those thought crimes so that they can be arrested and, presumably, deported and/or tortured.
While all of this was really bad, things escalated into the absurdly bad when Discord’s Persona went on to suffer a massive data leak, exposing the front end of their surveillance program. It exposed a system that stored people’s age verification information for three whole years while performing 269 distinct verification checks on the users submitting that information. This includes checking people’s addresses, phone numbers, social media presence, “adversarial media” relations, and a whole lot more. It proved that the system wasn’t just checking people’s ages, but building a massive profile of their online presence and linking it to their real world identity. Suffice it to say, things went nuclear with this story. Not only were people’s worst fears realized, but for many, the situation was far, FAR worse than anyone could have imagined.
Today, we are learning about two updates on the matter. First, Discord has apparently severed ties with Persona in the wake of this massive scandal. From Fortune:
And the information was openly available. “We didn’t even have to write or perform a single exploit, the entire architecture was just on the doorstep,” wrote the researchers in their blog, adding they found 53 megabytes of data on a Federal Risk and Authorization Management Program (FedRAMP) government endpoint that also “tags reports with codenames from active intelligence programs.”
Discord has since announced it is cutting ties with Persona. The AI software, partially funded by Palantir cofounder Peter Thiel’s venture firm Founders Fund, continues to provide age verification services for OpenAI, Lime, and Roblox.
Both Persona and Discord confirmed to Fortune their partnership lasted for less than a month and has since dissolved. According to Discord, only a small number of users were part of this test, in which any information submitted could be stored for up to seven days before it would be deleted.
The fact that others are still using Persona even after all of that is pretty frightening.
The other update we are learning is that it looks like Discord is delaying their rollout. From Discord:
Let me be upfront: we knew this rollout was going to be controversial. Any time you introduce something that touches identity and verification, people are going to have strong feelings. Rightfully so. In hindsight, we should have provided more detail about our intentions and how the process works.
The way this landed, many of you walked away thinking we’re requiring face scans and ID uploads from everyone just to use Discord. That’s not what’s happening, but the fact that so many people believe it tells us we failed at our most basic job: clearly explaining what we’re doing and why. That’s on us.
On top of that, many of you are worried that this is just another big tech company finding new ways to collect your personal data. That we’re creating a problem to justify invasive solutions. I get that skepticism. It’s earned, not just toward us, but toward the entire tech industry. But that’s not what we’re doing.
We heard you, and we want to get this right. So here’s what’s happening:
We’re delaying our global rollout to the second half of 2026. Where we have legal obligations, we will continue to meet them, but we will only expand globally after we’ve done the following:
- Adding more verification options. We already had alternatives in development, including credit card verification. We’ll complete and expand those before scaling globally so you have more options you’re comfortable with.
- Vendor transparency. We’ll document every verification vendor and their practices on our website, and make it clear in the product who each vendor is. We’ve also set a new requirement: any partner offering facial age estimation must perform it entirely on-device. If they don’t meet that bar, we won’t work with them.
- A new spoiler channel option. We know many communities use age-restricted channels not for adult content, but for topics people prefer to engage with on their own terms: spoilers, politics, and heavier conversations. We’re building a dedicated spoiler channel option so communities don’t have to age-gate their server just to give members that choice.
- A technical blog post before global launch. We’ll publish a detailed post explaining how our automatic age determination systems work, including the signal categories and privacy constraints. So you can evaluate our approach for yourselves.
- Age assurance data in our transparency reports. We’ll include how many users were asked to verify, what methods they used, and how often our automated systems handled it without any user action.
This is, plain as day, Discord not only screwing up, but admitting that they screwed up here. I don’t know how many hearts and minds they’ll win over with their comments about how they didn’t clearly enough convey how they are storing that information. After all, there was that conflict between the whole ‘information never leaves the device’ line and the ‘oh yeah, we do collect that information, but it’ll only be stored for 7 days’ line. This goes well beyond just two conflicting statements. In some countries (i.e. Canada), there is that promise that the information will get destroyed, but mysteriously, there aren’t any repercussions if companies don’t follow through. That is a very well known problem on Canada’s side of things that lawmakers, thus far, have flatly refused to fix. There are other jurisdictions that swear up and down that the information is stored securely or not collected at all. Yet, not only do we have clear evidence that this is not the actual practice, but evidence that some vendors out there are just shoehorning in invasive, privacy busting systems to build highly detailed profiles of people online. Discord’s conflicting messages didn’t initiate this concern, but it absolutely magnified it to a very big degree.
The other thing, too, is the fact that there is going to be a huge number of people out there who are going to look at this message and simply respond with the usual “I don’t believe you” gif from Anchorman. For the small fraction of users that don’t know what I’m talking about, it’s this one:
That would be a very understandable reaction. After all, who is to say that Discord won’t put out a technical blog post detailing exactly what they say they are doing and then, behind the scenes, do a few things conveniently different? With so much distrust in Discord right now, I think there are a number of people out there thinking that right about now. So, in terms of restoring trust, I think Discord has an uphill battle to fight on that one. That’s not to say it’s going to be the end of Discord, but that trust has been, at minimum, badly damaged across portions of the community.
Ditching Persona because they are damaged goods (at bare minimum, as far as the public is concerned) was a good step. Offering transparency moving forward, well, only time will tell if that will be enough to restore that trust for users.
One point I will agree with Discord on is that this is an industry wide problem. Discord is by no means the only platform using age verification, and Persona is a vendor that is still being used by other platforms. Discord is not exclusively to blame for the age verification mess we are in. I would argue a good deal of the blame is also shared by the shady politicians who were pushing these laws in the first place. Without those politicians meddling in the affairs of online platforms, we wouldn’t even be discussing this in the first place, yet here we are. There are other platforms doing this that are doing what they can to fly under the radar as this nuclear blast from Discord goes on.
I’ve said it before and I’ll say it again, the best thing that can happen right about now is lawmakers scrapping these age verification mandates. They were a mistake. They were predicted to be a mistake and they are proving that they well and truly are a mistake. The technology is nowhere near what the law requires and countless people are going to end up being hurt by this. Scrapping these laws is literally the only solution to cleaning up this mess at this point in time. It won’t happen any time soon due to the cult-like mentality of those pushing these laws, but it’s the only way that I’m aware of that will ultimately begin to clean things up.
Drew Wilson on Mastodon, Twitter and Facebook.
I would argue that the only way this would teach a lesson is for the business to just go bankrupt or otherwise be the end of it. That would definitely discourage anyone else from going the age gate, population control, profiling route.
It definitely would send a powerful signal. Probably needed given how many companies and platforms were so eager to adopt these policies in the first place.