Photo: Dave and Les Jacobs (Getty)
Facial recognition is in right now; unlocking your phone with your fingerprint is so last year. You know what it’s all about now? Your face. There’s only one problem: current face recognition software works best when it’s identifying men with light skin.
A new review also shows that the software fails outright when it comes to women with dark skin.
“Overall, male subjects were more accurately classified than female subjects, replicating previous findings (Ngan et al., 2015), and lighter subjects were more accurately classified than darker individuals,” said Joy Buolamwini, an MIT Media Lab researcher and computer scientist who tested three commercial gender classifiers offered as part of face recognition services. “An intersectional breakdown reveals that all classifiers performed worst on darker female subjects.”
The software misidentified the gender of dark-skinned women 35 percent of the time. By contrast, the error rate for light-skinned men was less than one percent.
The results mirror previous findings about the failures of face recognition software when identifying women and individuals with darker skin. As noted by Georgetown University’s Center for Privacy and Technology, in the context of airport facial scans these gender and racial disparities could make women and minorities more likely to be singled out for invasive processing like manual fingerprinting.
All face recognition software is trained by scanning thousands upon thousands of images in a dataset, refining its ability to extract useful data points and ignore what isn’t useful. As Buolamwini notes, many of these datasets are themselves biased. Adience, one gender classification benchmark, uses subjects that are 86 percent light-skinned. Another dataset, IJB-A, uses subjects that are 79 percent light-skinned.
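To make the “intersectional breakdown” concrete, here is a minimal sketch (not the study’s actual code, and using made-up toy numbers chosen only to mirror the disparity described above) of how one might compute a classifier’s error rate per skin-type-and-gender subgroup:

```python
# Illustrative sketch: per-subgroup error rates for a gender classifier.
# The data below is hypothetical, constructed to echo the reported gap.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (skin, gender, predicted_gender) tuples.
    Returns the error rate for each (skin, gender) subgroup."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for skin, gender, predicted in records:
        group = (skin, gender)
        totals[group] += 1
        if predicted != gender:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy sample: 100 darker-skinned women (35 misclassified),
# 100 lighter-skinned men (1 misclassified).
sample = (
    [("darker", "female", "male")] * 35
    + [("darker", "female", "female")] * 65
    + [("lighter", "male", "male")] * 99
    + [("lighter", "male", "female")] * 1
)
rates = error_rates_by_group(sample)
print(rates[("darker", "female")])  # 0.35
print(rates[("lighter", "male")])   # 0.01
```

Breaking errors out by subgroup like this, rather than reporting one overall accuracy number, is what exposes disparities that an aggregate figure would hide.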
So yes, this software performs better on male faces than female faces, and better on lighter skin. Of course you can’t blame the technology itself; it has no idea what it’s doing. The blame lies with the people who assembled these biased datasets. It’s time the folks in lab coats started meeting people from all walks of life, so that this AI technology can actually work for everyone someday.