A new study shows that without the bias imposed upon it by political correctness, objective facial recognition software refuses to be fooled by a substantial percentage of “trans” people trying to appear as the opposite sex.
When “dude looks like a lady,” it doesn’t fool the facial recognition software. In reality, it doesn’t fool the rest of us, either. It’s just that we’re programmed by society to lie about it, while facial recognition software is programmed to speak the truth.
Researchers at the University of Colorado Boulder recently tested 2,450 photographs crowd-sourced from transgender people and found that facial recognition software is not always fooled by wigs, make-up, or other transgender accoutrements. It knows the difference between a man and a woman based upon unchangeable features like brow lines, cheekbones, jawlines, and the shape of the head.
Whereas the software correctly identified the sex of “cisgender” people (a made-up term for someone who identifies with the genitalia God gave them) 98% of the time, it was only 70% “accurate” in identifying transsexuals by their preferred gender. Put another way, the failure rate to identify real gender was just 2% among cisgender people but 30% among “transgender” people: no matter the disguise, roughly a third of “transgender” people can’t fool computer software.
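The arithmetic behind those figures is straightforward. This is an illustrative sketch only: it reproduces the error rates implied by the article’s reported accuracies (98% and 70%); the article does not give per-group photo counts, so only the rates themselves are computed.

```python
# Illustrative arithmetic only, using the accuracy figures reported above.
# Per-group photo counts are not given in the article, so only rates are shown.

cis_accuracy = 0.98    # software matched sex for "cisgender" subjects 98% of the time
trans_accuracy = 0.70  # software matched preferred gender 70% of the time

cis_error = round(1 - cis_accuracy, 2)      # -> 0.02, the 2% failure rate
trans_error = round(1 - trans_accuracy, 2)  # -> 0.3, the 30% failure rate

# How many times higher the "transgender" failure rate is than the "cisgender" one
ratio = trans_error / cis_error

print(f"cisgender error rate:   {cis_error:.0%}")
print(f"transgender error rate: {trans_error:.0%}")
print(f"ratio: {ratio:.0f}x")
```

The 30% failure rate works out to fifteen times the 2% baseline, which is the gap the final paragraph comments on.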
A full 100 percent of the agender, genderqueer, and nonbinary individuals were not identified by their preferred status. There is no way, currently, to program the software to see anything besides “male” and “female.” Likewise, “agender, genderqueer, and nonbinary” are things that don’t actually exist, and so it would be impossible to spot imaginary designations anyway.
The facial recognition software used in the study included Amazon’s Rekognition, IBM’s Watson, Microsoft’s Azure and Clarifai.
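For a concrete sense of what these services output, here is a minimal sketch of reading the gender label from a response shaped like Amazon Rekognition’s DetectFaces result. The `sample_response` dictionary is a hand-made stand-in mimicking the API’s documented response shape, not data from the study; a real caller would get this structure back from boto3’s `detect_faces`. Note that the `Gender` attribute only ever reports “Male” or “Female”, which is exactly the binary limitation described above.

```python
# Minimal sketch of parsing a response shaped like Amazon Rekognition's
# DetectFaces output. The sample_response below is a hand-made stand-in,
# not study data; a real caller would obtain it via boto3, e.g.
#   rekognition.detect_faces(Image={...}, Attributes=["ALL"])

def extract_genders(response):
    """Return (value, confidence) for each detected face.

    Rekognition's Gender attribute reports only "Male" or "Female";
    there is no third label the software can emit.
    """
    return [
        (face["Gender"]["Value"], face["Gender"]["Confidence"])
        for face in response.get("FaceDetails", [])
    ]

# Hand-made example mimicking the API's documented response structure.
sample_response = {
    "FaceDetails": [
        {"Gender": {"Value": "Male", "Confidence": 96.5}},
        {"Gender": {"Value": "Female", "Confidence": 71.2}},
    ]
}

for value, confidence in extract_genders(sample_response):
    print(f"{value} ({confidence:.1f}% confidence)")
```

The other services in the study (IBM Watson, Microsoft Azure, Clarifai) return differently structured responses, but the same point holds: the label set is binary.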
The transgender advocates who performed the study publicized it as a great victory that the software was fooled 70% of the time. Compared to the software’s accuracy for “cisgender” people, however, it hardly seems like a victory at all.