Fact sheet

Fact Check: Is facial recognition technology worse at identifying darker-skinned faces than lighter ones?

Biometrics Artificial Intelligence (AI)

Human Rights Commissioner Edward Santow is critical of facial recognition technology, claiming that it is prone to errors, "particularly with people with darker skin, where it's much less accurate".

Mr Santow's claim that facial recognition technology can be less accurate at identifying darker-skinned faces than lighter ones is correct.

One study found that three leading software systems correctly identified white men 99 per cent of the time, but the darker the skin, the more often the technology failed. Darker-skinned women were the most misidentified group, with an error rate of nearly 35 per cent in one test. The study said algorithms trained on biased data have resulted in algorithmic discrimination. Another study found that six different face recognition algorithms were less accurate in identifying females, black people and 18-to-30-year-olds.

Toby Walsh, Scientia Professor of Artificial Intelligence at the University of NSW, told RMIT ABC Fact Check that despite more representative training data being used increasingly over the past 10 years, the technology was still less accurate in identifying darker-skinned faces. "We're pretty certain it's not reliable at the minute," he said.
Verdict: Correct

Publication Details
License type:
All Rights Reserved