NPL Study: Facial Recognition Fails Spectacularly For Non-White People

Facial recognition has long been known to get non-white people’s faces wrong. The UK’s Home Office is now confirming this problem.

Age verification has long been known as broken technology: it can be circumvented by those unwilling to submit all of their personal information to a nebulous third party, or breached by attackers looking to steal all of that highly sensitive information afterwards.

While the technology has long been broken at a fundamental level, age verification apologists and anti-privacy advocates have treated these fundamental problems as a public relations (PR) problem rather than the technological problem that it truly is. So, they’ve spent years trying to flood the zone with marketing buzzwords like “age assurance”, “facial recognition”, and “double blind” (with the last one going down in flames a few months back).

While “age assurance” has been used frequently to confuse the debate, “facial recognition” has been another deeply problematic term for quite some time. While security and privacy have both been major problems with the technology, accuracy has long plagued it as well. In 2023, a NIST study found that if you are not white, the technology was 100 times more likely to misidentify you. The findings helped push to the forefront the fact that the technology is perpetuating racial biases found in modern society.

Age verification apologists have desperately tried to dismiss the NIST study, claiming that it is old and outdated. They argue that facial recognition technology has improved since then and that the NIST findings should be taken with a grain of salt. The reality, however, is that, much like the debate about AI being a broken technology, approximately sweet fuck all has changed with the technology and its accuracy since then. A new study by the NPL has looked into those very findings, and the results are pretty much the same as what the NIST study found: facial recognition is still terrible at properly identifying people who are not white. That has left the UK’s Home Office admitting that the technology is problematic. From The Guardian:

Ministers are facing calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings.

Following the latest testing conducted by the National Physical Laboratory (NPL) of the technology’s application within the police national database, the Home Office said it was “more likely to incorrectly include some demographic groups in its search results”.

Police and crime commissioners said publication of the NPL’s finding “sheds light on a concerning inbuilt bias” and urged caution over plans for a national expansion.

The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.

Oops! Yeah, you know that had to be politically inconvenient. What’s more, this nukes from orbit the argument that the NIST findings are out of date and can’t be trusted. When the science can be replicated, that is a pretty strong sign that the original findings can be trusted after all.

Of course, as the article suggests, facial recognition isn’t just being shoved into age verification as a terrible Band-Aid solution to that technology’s deep flaws. It is also being thrown into the mix of policing. Police here in Canada have been busted playing around with the tech in their body cameras away from the prying eyes of privacy commissioners, this after the RCMP’s experiment in 2020 was found to have violated Canadian privacy laws.

This adds another dimension to policing’s turn to such technology as a silver bullet. While facial recognition sounds like a great idea, and it makes a great premise for science fiction and television dramas where the technology can be written as a perfectly accurate means of identification (such as in Person of Interest), a version that actually boasts a high degree of accuracy remains firmly in the grips of science fiction.

Drew Wilson on Mastodon, Twitter and Facebook.
