Sunday, September 10, 2017

Algayrithm!

"What? Pfft... no, Diane, I didn't write the
algorithm because you wouldn't go out with
me... Although this does say you are gay."

-Some creepy researcher
Say, did you see this thing about a Stanford study saying that a new computer algorithm can detect gayface? Scienceticians on the very cutting edge of ridiculous and potentially offensive new uses for technology fed a bunch of photos from dating sites into their fancy new thinking machine and found that after it knew what to look for it could correctly identify gays 81% of the time and lesbians 74% of the time. It's called a deep neural network and it really is super-sophisticated and seems like it could have countless real world applications. Playing spot the gay doesn't seem like one of them.

I mean, for one thing, these are dating sites, so you'd think you wouldn't need an algorithm to work out who's gay and who's not, but here we are.
See? Gay. If the pic is on Grindr and mostly abs:
Gay. Don't even need the stupid algorithm.
Although being an angry, vitriol-spewing,
homophobic hate-monger is totally a choice.
But whatever, the software exists so a logical question is how the shit does it do it? With science it turns out. The computer is just analyzing faces and correlating data, but with enough photos and data about the people in them, patterns emerge and apparently there are certain subtle physical traits associated with sexuality. The authors of the study theorize that certain hormones affect a person's development in utero and the results can be detected visually in adulthood. So on the one hand it's another reason that people who think being gay is a choice should shut the hell up. 
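For the morbidly curious, the pattern-finding step is just statistics. Here's a minimal sketch of the general idea, emphatically not the authors' actual method: the "facial measurement" features below are entirely made up, and one of them is given a small artificial shift between two groups to stand in for the kind of subtle population-level difference the study claims to detect. A plain logistic regression then learns to exploit it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "facial morphology" features: each row is one face,
# each column one made-up measurement. These are NOT real features
# from the study; one column gets a small shift between groups to
# mimic a subtle, population-level correlation.
n = 1000
group_a = rng.normal(0.0, 1.0, size=(n, 5))
group_b = rng.normal(0.0, 1.0, size=(n, 5))
group_b[:, 0] += 0.8  # the subtle shift the classifier will find

X = np.vstack([group_a, group_b])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Plain gradient-descent logistic regression: learn weights that
# correlate the features with the group labels.
w = np.zeros(5)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * np.mean(p - y)                # gradient step on bias

acc = np.mean((p > 0.5) == y)
print(f"training accuracy: {acc:.2f}")  # well above chance, far below certainty
```

The point of the toy example: even a dumb linear model beats coin-flipping once there's any systematic difference in the data, which is roughly why the study's numbers land where they do, impressively above 50% but nowhere near certainty. The actual study reportedly fed dating-site photos through a pretrained deep neural network to extract facial features, a far more powerful (and far creepier) version of the same statistical trick.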

"Homosexuals detected, cease all
cake baking activity immediately!"
But on the other hand, is there a single use for the technology that isn't terrifying? Sure, the America of 2017 is in many ways more progressive than, say, the America of 1950... wait, sorry, let me rephrase: the America of 2016 was more progressive; the America of 2017 is bananas foster. Remember as recently as Friday when we were talking about that idiot baker refusing to make wedding cakes for gay couples and getting support from the Department of Justice? I'm not saying the Trump administration is going to start building Sentinels anytime soon, but there are literal Nazis marching in the streets, so I don't think it's unreasonable to be concerned.

Really? 28? Well, I'm sure we're
making progress in the courts and-huh?
Oh, right, everything is terrible now...
So what's the deal? Why would Michal Kosinski and Yilun Wang, the co-authors of the study, put this out there? The technology already exists, so is it that all someone has to do to recreate their work is start plugging in OkCupid profiles? Eh, it's apparently more complicated than that, and they're not saying which sites they used or how they did it, but still, this technology sounds like a kinda serious threat to privacy. Did you know that it's still legal to fire someone for being LGBTQ in 28 states?

"You'll thank me someday...you know,
after you get over the flu I just gave you."
-Some hypothetical jerk
Yeah, 28 states. Can you believe it? Anyway, the authors insist that publishing the study was a way to point out the potential dangers inherent in a technology that profiles people the way this one does. OK, it's cool that they're watching out for the public, that's great and all, but as well-intentioned as they might be, and as difficult as it may be to imitate their research, it's not impossible. So publishing this study as a cautionary tale still sounds a little like someone with the flu coughing all over people in the subway and then explaining that the flu already exists and it's, like, super dangerous, so they're just spreading it around to raise awareness of the importance of flu prevention.

It just seems like maybe not telling everybody that it's possible to detect sexual orientation using facial recognition software would be a more effective way to not inspire people to detect sexuality with facial recognition software. But what do I know? I'm not a scientist and can't possibly know what it's like to find oneself with a dilemma like the one Kosinski and Wang faced. You know, to publish and risk a potentially dangerous technology falling into the wrong hands or don't publish and not get interviewed by The Economist. Tough choice.
On the up side, a decent portion of the readership is old people waiting for work on their car to be finished, so unless any of them have advanced degrees in computer science and access to a neural network, the world should be reasonably safe... for now.
