
AUTOMATIC GENDER RECOGNITION TECH IS DANGEROUS, SAY CAMPAIGNERS: IT’S TIME TO BAN IT

Updated: May 11

Simplistic gender binaries infringe on the right to self-expression



Dangers posed by facial recognition, such as mass surveillance and mistaken identity, have been widely discussed in recent years. But digital rights groups say an equally insidious use case is currently sneaking under the radar: using the same technology to predict someone’s gender and sexual orientation. Now, a new campaign has launched to ban these applications in the EU.


Trying to predict someone’s gender or sexuality from digitized clues is fundamentally flawed, says Os Keyes, a researcher who’s written extensively on the topic. This technology tends to reduce gender to a simplistic binary and, as a result, is often harmful to individuals like trans and nonbinary people who might not fit into these narrow categories. When the resulting systems are used for things like gating entry to physical spaces or verifying someone’s identity for an online service, that misclassification leads to discrimination.


“Identifying someone’s gender by looking at them and not talking to them is sort of like asking what does the smell of blue taste like,” says Keyes. “The issue is not so much that your answer is wrong as your question doesn’t make any sense.”


These predictions can be made using a variety of inputs, from analyzing someone’s voice to aggregating their shopping habits. But the rise of facial recognition has given companies and researchers a new data input they believe is particularly authoritative.


THESE SYSTEMS DON’T JUST FAIL TO RECOGNIZE THAT TRANS PEOPLE EXIST. THEY LITERALLY CAN’T RECOGNIZE THAT TRANS PEOPLE EXIST.


Commercial facial recognition systems, including those sold by big tech companies like Amazon and Microsoft, frequently offer gender classification as a standard feature.
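To show how routine this feature is, here is a minimal sketch (a hypothetical example, not any campaigner’s or vendor’s workflow) of the kind of request a developer can send to Amazon Rekognition’s DetectFaces API, which returns a binary “Gender” attribute alongside other face metadata; the image file name is a placeholder.

```python
# Minimal sketch: querying a commercial face-analysis API that exposes
# gender classification as a standard attribute (Amazon Rekognition here).
# "photo.jpg" is a placeholder; credentials and region come from AWS config.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # "ALL" includes the Gender attribute
    )

for face in response["FaceDetails"]:
    gender = face.get("Gender", {})
    # The API only ever answers "Male" or "Female", plus a confidence score.
    print(gender.get("Value"), gender.get("Confidence"))
```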


Predicting sexual orientation from the same data is much rarer, but researchers have still built such systems, most notably the so-called “AI gaydar” algorithm. There’s strong evidence that this technology doesn’t work even on its own flawed premises, but that wouldn’t necessarily limit its adoption.


“Even the people who first researched it said, yes, some tinpot dictator could use this software to try and ‘find the queers’ and then throw them in a camp,” says Keyes of the algorithm to detect sexual orientation. “And that isn’t hyperbole. In Chechnya, that’s exactly what they’ve been doing, and that’s without the aid of robots.”


In the case of automatic gender recognition, these systems generally rely on narrow and outmoded understandings of gender. With facial recognition tech, if someone has short hair, they’re categorized as a man; if they’re wearing makeup, they’re a woman. Similar assumptions are made based on biometric data like bone structure and face shape. The result is that people who don’t fit easily into these two categories — like many trans and nonbinary individuals — are misgendered. “These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist,” says Keyes.
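To make that structural point concrete, the sketch below is a deliberately simplified caricature of the binary assumption described above, not how any real vendor implements classification: whatever signals go in, the output space contains only two labels, so a nonbinary or “unknown” answer is impossible by construction.

```python
# Caricature of the binary assumption described above -- purely illustrative.
LABELS = ("man", "woman")

def classify_gender(face_features: list[float]) -> str:
    # Stand-in for a trained model's decision function. However sophisticated
    # the real model, its output is still confined to the two labels above.
    score = sum(face_features)
    return LABELS[0] if score >= 0 else LABELS[1]

# No input can produce a nonbinary or "prefer not to say" result: the failure
# is baked into the output space, independent of the model's accuracy.
print(classify_gender([0.2, -0.7, 1.1]))  # always "man" or "woman"
```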


Current applications of this gender recognition tech include digital billboards that analyze passersby to serve them targeted advertisements; digital spaces like “girls-only” social app Giggle, which admits people by guessing their gender from selfies; and marketing stunts, like a campaign to give discounted subway tickets to women in Berlin to celebrate Equal Pay Day that tried to identify women based on facial scans. Researchers have also discussed much more potentially dangerous use cases, like deploying the technology to limit entry to gendered areas like bathrooms and locker rooms.


