
Automatic gender recognition tech is dangerous, say campaigners: it’s time to ban it

Simplistic gender binaries infringe on the right to self-expression


Illustration by Alex Castro / The Verge

The dangers posed by facial recognition, such as mass surveillance and mistaken identity, have been widely discussed in recent years. But digital rights groups say an equally insidious use case is currently sneaking under the radar: using the same technology to predict someone’s gender and sexual orientation. Now, a new campaign has launched to ban these applications in the EU.

Trying to predict someone’s gender or sexuality from digitized clues is fundamentally flawed, says Os Keyes, a researcher who’s written extensively on the topic. This technology tends to reduce gender to a simplistic binary and, as a result, is often harmful to individuals like trans and nonbinary people who might not fit into these narrow categories. When the resulting systems are used for things like gating entry for physical spaces or verifying someone’s identity for an online service, it leads to discrimination.

“Identifying someone’s gender by looking at them and not talking to them is sort of like asking what does the smell of blue taste like,” Keyes tells The Verge. “The issue is not so much that your answer is wrong as your question doesn’t make any sense.”

These predictions can be made using a variety of inputs, from analyzing someone’s voice to aggregating their shopping habits. But the rise of facial recognition has given companies and researchers a new data input they believe is particularly authoritative.

“These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist.”

Commercial facial recognition systems, including those sold by big tech companies like Amazon and Microsoft, frequently offer gender classification as a standard feature. Predicting sexual orientation from the same data is much rarer, but researchers have still built such systems, most notably the so-called “AI gaydar” algorithm. There’s strong evidence that this technology doesn’t work even on its own flawed premises, but that wouldn’t necessarily limit its adoption.
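To give a sense of how routine the feature is, here is a minimal sketch of what a call to one of these commercial face-analysis APIs can look like. It is modeled on the publicly documented shape of AWS Rekognition’s DetectFaces response, but treat the exact field names and attribute set as assumptions; they vary by vendor and over time.

```python
# Sketch only: querying a commercial face-analysis service that returns
# a binary gender label alongside other face attributes. Modeled on the
# documented shape of AWS Rekognition's DetectFaces response; field
# names are assumptions here and differ between vendors.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request the full attribute set, including Gender
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 99.1}
    print(gender["Value"], gender["Confidence"])
    # Note: the service only ever returns one of two labels; there is no
    # "unknown" or nonbinary option, which is the critics' core objection.
```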

“Even the people who first researched it said, yes, some tinpot dictator could use this software to try and ‘find the queers’ and then throw them in a camp,” says Keyes of the algorithm to detect sexual orientation. “And that isn’t hyperbole. In Chechnya, that’s exactly what they’ve been doing, and that’s without the aid of robots.”

In the case of automatic gender recognition, these systems generally rely on narrow and outmoded understandings of gender. With facial recognition tech, if someone has short hair, they’re categorized as a man; if they’re wearing makeup, they’re a woman. Similar assumptions are made based on biometric data like bone structure and face shape. The result is that people who don’t fit easily into these two categories — like many trans and nonbinary individuals — are misgendered. “These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist,” says Keyes.
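As an illustration only, following the examples above rather than any vendor’s actual model (real systems learn these correlations statistically from training data rather than as hand-written rules), a classifier built on those assumptions reduces to something like this:

```python
# Deliberately caricatured sketch of the assumptions described above.
# Real systems learn such correlations from data rather than encoding
# them as rules, but the output space is the same: two labels, no more.
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    hair_length_cm: float
    wearing_makeup: bool
    jaw_width_mm: float  # stand-in for "bone structure"

def classify_gender(face: FaceFeatures) -> str:
    # Short hair -> "man"; makeup -> "woman"; otherwise fall back to a
    # crude biometric cutoff. There is no third option, and no way to
    # answer that the question does not apply to this person.
    if face.hair_length_cm < 5:
        return "man"
    if face.wearing_makeup:
        return "woman"
    return "man" if face.jaw_width_mm > 120 else "woman"
```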

Current applications of this gender recognition tech include digital billboards that analyze passersby to serve them targeted advertisements; digital spaces like “girls-only” social app Giggle, which admits people by guessing their gender from selfies; and marketing stunts, like a campaign to give discounted subway tickets to women in Berlin to celebrate Equal Pay Day that tried to identify women based on facial scans. Researchers have also discussed much more potentially dangerous use cases, like deploying the technology to limit entry to gendered areas like bathrooms and locker rooms.

Giggle is a “girls-only” social app that attempts to verify that users are female using selfies.
Image: Giggle

Being rejected by a machine in such a scenario has the potential to be not only humiliating and inconvenient, but to also trigger an even more severe reaction. Anti-trans attitudes and hysteria over access to bathrooms have already led to numerous incidents of harassment and violence in public toilets, as passersby take it upon themselves to police these spaces. If someone is publicly declared by a seemingly impartial machine to be the “wrong” gender, it would only seem to legitimize such harassment and violence.

Daniel Leufer, a policy analyst at digital rights group Access Now, which is leading the campaign to ban these applications, says this technology is incompatible with the EU’s commitment to human rights.

“If you live in a society committed to upholding these rights, then the only solution is a ban,” Leufer tells The Verge. “Automatic gender recognition is completely at odds with the idea of people being able to express their gender identity outside the male-female binary or in a different way to the sex they were assigned at birth.”

Automatic gender recognition is incompatible with self-expression, say campaigners

Access Now, along with more than 60 other NGOs, has sent a letter to the European Commission, asking it to ban this technology. The campaign, which is supported by international LGBT+ advocacy group All Out, comes as the European Commission considers new regulations for AI across the EU. A draft white paper that circulated last year suggested a complete ban on facial recognition in public spaces was being considered, and Leufer says this illustrates how seriously the EU is taking the problem of AI regulation.

“There’s a unique moment right now with this legislation in the EU where we can call for major red lines, and we’re taking the opportunity to do that,” says Leufer. “The EU has consistently framed itself as taking a third path between China and the US [on AI regulation] with European values at its core, and we’re attempting to hold them to that.” 

Keyes points out that banning this technology should be of interest to everyone, “regardless of how they feel about the centrality of trans lives to their lives,” as these systems reinforce an extremely outdated mode of gender politics.

“When you look at what these researchers think, it’s like they’ve time-traveled from the 1950s,” says Keyes. “One system I saw used the example of advertising cars to males and pretty dresses to females. First of all, I want to know who’s getting stuck with the ugly dresses? And secondly, do they think women can’t drive?”

Gender identification can be used in unrelated systems, like facial recognition tech used to verify identity at borders.
Photo by Joe Raedle / Getty Images

The use of this technology can also be much more subtle than simply delivering different advertisements to men and women. Often, says Keyes, gender identification is used as a filter to produce outcomes that have nothing to do with gender itself.

For example, if a facial recognition algorithm is used to bar entry to a building or country by matching an individual to a database of faces, it might narrow down its search by filtering results by gender. Then, if the system misgenders the person in front of it, it will produce an invisible error that has nothing to do with the task at hand.
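A minimal sketch of how such a filter could behave in practice, under stated assumptions: the database, the gender classifier, the distance function, and the threshold below are all illustrative stand-ins, not drawn from any real vendor’s system.

```python
# Hypothetical sketch of a face-matching pipeline that pre-filters its
# candidate list by predicted gender. Every name, value, and threshold
# here is illustrative; no real vendor's system is being described.
from dataclasses import dataclass

@dataclass
class Enrolled:
    name: str
    gender_label: str  # binary label stored at enrollment
    embedding: tuple   # toy face-embedding vector

def predict_gender(embedding) -> str:
    """Stand-in for an automatic gender classifier; like the real thing,
    it can simply be wrong about the person in front of the camera."""
    return "M" if embedding[0] > 0.5 else "F"

def distance(a, b) -> float:
    """Toy embedding distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def find_match(probe, database, threshold=0.3):
    predicted = predict_gender(probe)
    # The gender filter shrinks the search space...
    candidates = [p for p in database if p.gender_label == predicted]
    # ...but if the prediction is wrong, the true match was discarded
    # before matching even began: an invisible error.
    best = min(candidates, key=lambda p: distance(probe, p.embedding), default=None)
    if best and distance(probe, best.embedding) < threshold:
        return best.name
    return None  # reported as "no match", even though the person is enrolled

# A person enrolled with label "F" whose face the classifier reads as "M"
# is never compared against their own record:
db = [Enrolled("Sam", "F", (0.9, 0.1))]
print(find_match((0.9, 0.1), db))  # -> None
```

The point of the sketch is that the failure surfaces as a generic “no match,” with nothing in the output indicating that a gender prediction was the step that went wrong.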

Algorithmic transparency would be needed to enforce a ban

Keyes says this sort of application is deeply worrying because companies don’t share details of how their technology works. “This may already be ubiquitous in existing facial recognition systems, and we just can’t tell because they are entirely black-boxed,” they say. In 2018, for example, trans Uber drivers were kicked off the company’s app because of a security feature that asked them to verify their identity with a selfie. Why these individuals were rejected by the system isn’t clear, says Keyes, but it’s possible that faulty gender recognition played a part.

Ultimately, technology that tries to reduce the world to binary classifications based on simple heuristics is always going to fail when faced with the variety and complexity of human expression. Keyes acknowledges that gender recognition by machine does work for a large number of people but says the underlying flaws in the system will inevitably hurt those who are already marginalized by society and force everyone into narrower forms of self-expression.

“We already live in a society which is very heavily gendered and very visually gendered,” says Keyes. “What these technologies are doing is making those decisions a lot more efficient, a lot more automatic, and a lot more difficult to challenge.”