NIST Reports More False Positives for Minority Face-Scanning

Most facial recognition algorithms show evidence of “demographic differentials,” or racial bias, the National Institute of Standards and Technology reported Thursday. For one-to-one matching, the study found higher rates of false positives for Asian and African American faces “relative to images of Caucasians,” NIST said. “While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” said NIST computer scientist Patrick Grother. The study evaluated 189 software algorithms from 99 developers, “a majority of the industry.”