Federal study finds bias in facial recognition

Many facial recognition systems misidentify people of color at higher rates than white people, according to a federal study released Thursday.

The research from the National Institute of Standards and Technology (NIST), a federal agency within the Department of Commerce, comes amid pushback from lawmakers and civil rights groups over the software, which scans faces to quickly identify individuals.

The researchers reviewed 189 algorithms from 99 developers, a group NIST said represents a majority of the industry. They found that in one-to-one matching, the mode normally used for verification, Asian and African American people were up to 100 times more likely to be misidentified than white men.

In one-to-many matching, which law enforcement uses to identify people of interest, the faces of African American women returned more false positives than those of other groups.

“In a one-to-one search, a false negative might be merely an inconvenience — you can’t get into your phone, but the issue can usually be remediated by a second attempt,” Patrick Grother, a NIST computer scientist and the report’s primary author, said in a statement.

“But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny.”
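To make that distinction concrete, here is a minimal sketch of the two matching modes, not NIST's evaluation code: the embeddings, threshold value, and function names are all hypothetical, and real systems tune thresholds per algorithm and per use case.

```python
# Illustrative sketch of one-to-one vs. one-to-many face matching.
# A recognition system maps each face image to a numeric vector
# (an "embedding"); matching compares vectors with a similarity
# score against a tuned threshold.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score in [-1, 1]; higher means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray,
           threshold: float = 0.6) -> bool:
    """One-to-one matching: does the probe match one enrolled face?
    A false negative here locks the rightful owner out (e.g. a phone)."""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.6) -> list[tuple[str, float]]:
    """One-to-many matching: rank every gallery face against the probe.
    A false positive here puts the wrong person on the candidate list."""
    scores = [(name, cosine_similarity(probe, emb))
              for name, emb in gallery.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
print(verify(probe, gallery["person_0"]))  # one-to-one decision
print(identify(probe, gallery))            # one-to-many candidate list
```

The asymmetry Grother describes falls out of the structure: a one-to-one check produces a single yes-or-no answer the user can retry, while a one-to-many search produces a ranked list, so any face that scores above the threshold gets flagged for human scrutiny.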

Grother concluded that NIST found “empirical evidence” that the majority of facial recognition systems have “demographic differentials” that can degrade their accuracy depending on a person’s age, gender or race.