aiaTranslations: The Language of Life Sciences Blog

The truth behind Google Translate's sexism

Written by Alysa Salzberg | Apr 12, 2021 4:00:00 AM

In recent weeks, the internet has been ablaze with yet another scandal: Google Translate is sexist. This may sound like an exaggeration or an example of cancel culture, but the accusers do have a point.

Why is Google Translate being accused of sexism?

A few weeks ago, people began to notice that if you type gender-neutral phrases from another language into Google Translate, the English translation will ascribe a gender to the subjects in question.

This intriguing longform article by Bored Panda writers Jonas Grinevičius and Austėja Akavickaitė, for instance, shows how Twitter user Dora Vargha typed in a few phrases in Hungarian, a language that does not use gendered pronouns. When translated into English, each statement was automatically ascribed a gender. These included: “He is clever. He reads. She washes the dishes.” Or “He’s researching. She is raising a child.”

Twitter user bani haykal found similar results when translating from Malay into English: “He is a doctor.” “She is a nurse.”
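If you’re curious and a bit technical, you can run this kind of check yourself rather than pasting phrases into the web interface. The sketch below uses Google’s Cloud Translation API via the google-cloud-translate Python client; the Hungarian sentences are my own illustrative reconstructions of the sort Vargha used (Hungarian “ő” is a single, gender-neutral third-person pronoun), credentials setup is assumed, and the API’s output may differ from what translate.google.com shows on any given day.

```python
# Minimal sketch: see how gender-neutral Hungarian sentences come out in English.
# Assumes `pip install google-cloud-translate` and that the
# GOOGLE_APPLICATION_CREDENTIALS environment variable points to a service
# account key. Sentences are illustrative examples, not the exact tweets.
from google.cloud import translate_v2 as translate

client = translate.Client()

# "Ő" is Hungarian's only third-person singular pronoun; it carries no gender.
sentences = [
    "Ő okos.",            # "He/she is clever."
    "Ő olvas.",           # "He/she reads."
    "Ő mosogat.",         # "He/she washes the dishes."
    "Ő kutat.",           # "He/she is researching."
    "Ő gyereket nevel.",  # "He/she is raising a child."
]

for sentence in sentences:
    result = client.translate(sentence, source_language="hu", target_language="en")
    print(f"{sentence!r} -> {result['translatedText']!r}")
```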

For centuries, women and men have been put into stereotypical roles. When a translation of a language with genderless pronouns systematically results in phrases featuring stereotypical gender roles, you could say that the translation platform is supporting or reinforcing this way of thinking.

What causes Google Translate’s sexism?

Commenters on all sides have pointed out that systems like Google Translate learn from data. The AI has analyzed hundreds of thousands, or even millions, of documents generated by humans of all sorts, and from them it produces the most common translation of a phrase, right down to gender.

This could be a way to turn a mirror to ourselves. But as some have pointed out, many of the texts used to train these systems are decades or even centuries old, created long before advances in women’s rights, gender equality, and education and job opportunities for women. From what the bot behind Google Translate has learned, women aren’t doctors, and men aren’t nurses.
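To make that point concrete, here is a toy illustration, and only that: Google Translate actually uses neural machine translation, not this kind of counting. But the underlying issue is the same: if you pick whatever appears most often in the training text, you reproduce whatever associations that text contains.

```python
# Toy illustration: a frequency-based "translator" that picks a pronoun for a
# genderless source pronoun by copying whichever pronoun most often appears
# alongside the target word in its (tiny, deliberately skewed) corpus.
from collections import Counter

# A made-up corpus that over-represents stereotyped role assignments.
corpus = [
    "he is a doctor", "he is a doctor", "he is a doctor",
    "she is a doctor",
    "she is a nurse", "she is a nurse", "she is a nurse",
    "he is a nurse",
]

def most_likely_pronoun(word: str) -> str:
    """Return the pronoun that most often co-occurs with `word` in the corpus."""
    counts = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        if word in tokens:
            counts[tokens[0]] += 1  # first token is the pronoun in this toy corpus
    return counts.most_common(1)[0][0]

print(most_likely_pronoun("doctor"))  # -> "he"  (3 vs. 1 in the toy data)
print(most_likely_pronoun("nurse"))   # -> "she" (3 vs. 1 in the toy data)
```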

Is this a new problem?

Unfortunately, this isn’t a problem that’s come to us along with the advent of machine translation. Whether it’s done by machines or humans, translation has a gender bias.

Sometimes, the problem is due to the rules of a particular language. For instance, in French, gender is present in both nouns and pronouns. Even collective terms like “they” or “them” follow gender rules; if a group is made up of women with just one man among them, the group is referred to with the masculine ils rather than the feminine elles.

In recent years, people in various cultures and linguistic groups have found workarounds or even created entirely new gender-neutral pronouns, but these aren’t always universally accepted or acknowledged.

For instance, using “they” as a singular, gender-neutral pronoun in English dates to the 14th century, but many modern-day people say they find it confusing to refer to an individual this way.

What is Google doing about this?

Gender bias in translation has been a problem for far longer than Google Translate has existed. Still, Google is a powerful company, full of geniuses. They should be able to do something about this, right?

Actually, they have, at least sometimes. According to the Bored Panda article, ZombieTron discovered that when you now use Google Translate with a language that doesn’t have gendered pronouns, you will see two results in English: one with “he” and one with “she”.

Unfortunately, other users pointed out that this is only for individual phrases, not for translations of longer texts.

Why doesn’t Google just use the pronoun “they”?

It comes down to clarity. Many people use Google Translate in the hopes of a quick, accurate translation. This isn’t guaranteed, but it’s understandable that Google would do anything in their power to facilitate it. Grinevičius and Akavickaitė explain that the word “they” can cause confusion out of context because a user unfamiliar with the source language may think the original sentence refers to a group of people, rather than a single person.

What can the rest of us do?

Ultimately, the situation shows that both machines and humans are still trapped by translation’s inherent gender bias. This may not ever go away, since some of it is a result of particular languages’ rules and makeup. But at least Google is now giving an option for “He” or “She” with simple phrases. That’s a start.

It’s hard to think that a single person could dramatically change a language or topple cultural ways of thinking all on their own. Still, little changes over time can have an impact, and not just in the virtual world. When you make general statements or give examples, try to think about the gender you’re associating with them, especially if you have a job or hobby that means you’re producing content online that could be analyzed by AI.

For instance, my son, who goes to school in France, often receives worksheets where women are shown taking care of children and men are shown going to work. Why not flip those things around now and then, so that we can teach machines – and ourselves – that gender shouldn’t limit who we are and how we live?

