News article by Heather Murphy, published in The New York Times.
Michal Kosinski felt he had good reason to teach a machine to detect sexual orientation.
An Israeli start-up had started hawking a service that predicted terrorist proclivities based on facial analysis. Chinese companies were developing facial recognition software not only to catch known criminals — but also to help the government predict who might break the law next.
And all around Silicon Valley, where Dr. Kosinski works as a professor at Stanford Graduate School of Business, entrepreneurs were talking about faces as if they were gold waiting to be mined.
Few seemed concerned. So to call attention to the privacy risks, he decided to show that it was possible to use facial recognition analysis to detect something intimate, something “people should have full rights to keep private.”
After considering atheism, he settled on sexual orientation.
Whether he has now created “A.I. gaydar,” and whether that’s even an ethical line of inquiry, has been hotly debated over the past several weeks, ever since a draft of his study was posted online. [ . . . ]
Media Headlines
- Advances in AI are used to spot signs of sexuality
- GLAAD and HRC call on Stanford University & responsible media to debunk dangerous & flawed report claiming to identify LGBTQ people through facial recognition technology
- HRC and GLAAD release a silly statement about the ‘gay face’ study
- Using AI to determine queer sexuality is misconceived and dangerous
- Artificial Intelligence Discovers Gayface. Sigh.
- The invention of AI ‘gaydar’ could be the start of something much worse
- Do algorithms reveal sexual orientation or just expose our stereotypes?
The Research Study
Deep neural networks are more accurate than humans at detecting sexual orientation from facial images.
Peer-reviewed article by Michal Kosinski and Yilun Wang. Journal of Personality and Social Psychology. February 2018, Vol. 114, Issue 2, Pages 246-257
We show that faces contain much more information about sexual orientation than can be perceived or interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 71% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.
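The classification pipeline the abstract describes — features extracted by a deep network, fed into a logistic regression that outputs a per-image probability — can be sketched in outline. The snippet below is purely illustrative: it uses random vectors as stand-ins for the face embeddings (the study used a pretrained face-recognition network to produce them) and fits a plain logistic regression by gradient descent; the data, dimensions, and learning rate are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for deep-network face embeddings.
# In the study, each image was mapped to a feature vector by a
# pretrained deep neural network; here we just draw random vectors.
n_images, n_features = 200, 16
X = rng.normal(size=(n_images, n_features))

# Invented ground-truth labels generated from a hidden linear rule
# plus noise, so the problem is learnable but not trivially separable.
w_true = rng.normal(size=n_features)
y = (X @ w_true + rng.normal(scale=0.5, size=n_images) > 0).astype(float)

# Logistic regression trained with batch gradient descent on the
# log-loss: p = sigmoid(Xw), gradient = X^T (p - y) / n.
w = np.zeros(n_features)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * (X.T @ (p - y)) / n_images

# Per-image classification: threshold the predicted probability at 0.5.
probs = 1.0 / (1.0 + np.exp(-(X @ w)))
accuracy = ((probs > 0.5) == y.astype(bool)).mean()
```

The abstract's observation that accuracy rises when five images per person are available corresponds, in this framing, to averaging the per-image probabilities for one person before thresholding, which reduces the noise of any single image.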