THE GAY GENE: New AI can guess whether gay or straight from photograph

New AI can guess whether you're gay or straight from a photograph

An algorithm deduced the sexuality of people on a dating site with up to 91% accuracy, raising tricky ethical questions

archive.is/2017.09.08-023213/https://www.theguardian.com/technology/2017/sep/07/new-artificial-intelligence-can-tell-whether-youre-gay-or-straight-from-a-photograph

The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.

The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website. The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using “deep neural networks”, meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.
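The pipeline the article describes — a deep network turns each face photo into a feature vector, and a simple classifier is trained on those vectors — can be sketched roughly as follows. This is a minimal illustration, not the authors' actual code: the embedding dimension, the synthetic "embeddings", and the use of logistic regression are all assumptions standing in for a real pretrained face network and the researchers' classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(n, label):
    # Stand-in for a deep neural network's face embeddings. The two classes
    # differ only slightly in mean per dimension, mimicking a weak signal
    # spread across many features (all numbers here are illustrative).
    return rng.normal(loc=0.15 * label, scale=1.0, size=(n, 128))

# Synthetic "dataset": 500 embeddings per class, with binary labels.
X = np.vstack([extract_features(500, 0), extract_features(500, 1)])
y = np.array([0] * 500 + [1] * 500)

# Train a logistic-regression classifier on the embeddings by gradient descent.
w, b = np.zeros(128), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * (p - y).mean()                # gradient step on bias

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
```

Even with a per-dimension signal too weak for a human to notice, a linear classifier over many such dimensions reaches accuracy well above chance — which is the qualitative point the study makes about machine versus human judges.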

The research found that gay men and women tended to have “gender-atypical” features, expressions and “grooming styles”, essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.

Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.
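The jump in accuracy from one image to five is what you'd expect from averaging independent noisy predictions. A small simulation makes the effect concrete — the signal strength and noise model below are assumptions chosen for illustration, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_people = 20000

# One true binary label per person; each photo yields a noisy score
# whose sign weakly reflects the label (signal strength is illustrative).
labels = rng.integers(0, 2, n_people)
signal = np.where(labels == 1, 0.6, -0.6)

def accuracy(n_photos):
    # Simulate n_photos noisy per-image scores per person, average them,
    # and classify by the sign of the averaged score.
    scores = signal[:, None] + rng.normal(size=(n_people, n_photos))
    return ((scores.mean(axis=1) > 0) == labels).mean()

acc1 = accuracy(1)  # single photo per person
acc5 = accuracy(5)  # five photos per person, predictions averaged
```

Averaging five photos shrinks the noise on the combined score by a factor of √5, so the same weak per-image signal yields a markedly higher hit rate — the same qualitative pattern as the article's 81% → 91% figures.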

The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid. While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people’s sexual orientation without their consent.

Other urls found in this thread:

en.wikipedia.org/wiki/Homosexual_behavior_in_animals
en.wikipedia.org/wiki/List_of_mammals_displaying_homosexual_behavior
archive.rebeccablacktech.com/g/thread/62148160/#62159174

So it flagged men who try to look like women, and the rest of this is clickbait. Great article, OP.


Beta men turn to faggotry out of desperation. We've seen this with Cosmo, Anthony Burch, Jake Rapp, and Chris chan.

So AI tells us faggots are all so shallow that their existence can be boiled down to how they talk?
Sounds about right tbh

Thanks for confirming this is a slide thread you stupid fuck.

Nice cherrypick.
Abnormal pre-natal testosterone (low for men, high for women) correlates with faggotry, and this AI found it through facial analysis. 80%+ is nothing to sneeze at.


There's nothing to slide here. You're on the wrong board.

Oy vey!

One step closer to day of the slay the gay.

Source? Seeing as you've apparently solved the conundrum science is still internally debating.

What narrow-minded bullshit.
What if I told you that Alan Turing was gay? Or the Queen singer?

2/10, had me.