On December 6, 2018, Google announced a step toward removing gender bias from its Google Translate service, part of a broader effort to make the web more equitable. Here’s more information on this news.
What was the initial issue with gender bias?
The issue surfaced after Google was specifically called out for biases in Translate and autocomplete. Back in February, Forbes reported that examples of gender bias in Translate were popping up on social media.
Previously, Google Translate used machine learning to provide a single translation, which often “inadvertently” resulted in a gender-biased result, James Kuczmarski, Google Translate product manager, said in a blog post.
So words like “strong” or “doctor” would receive masculine translations, said Kuczmarski, while words like “nurse” or “beautiful” would receive feminine translations, even though both masculine and feminine forms exist. With the new Google Translate feature, you’ll get both translations, according to Kuczmarski.
Basically, if you type “o bir doktor” in Turkish, you’ll now get both “she is a doctor” and “he is a doctor”. Earlier, you would have gotten only the masculine translation; that is no longer the case.
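The idea behind the new behavior can be sketched as a toy lookup: when the source pronoun is gender-neutral, return every gendered rendering rather than defaulting to one. This is purely an illustration with hypothetical dictionaries, not Google’s actual neural translation system:

```python
# Toy illustration of returning all gendered variants for a
# gender-neutral source pronoun (hypothetical dictionaries; not
# Google's real NMT pipeline).

# Turkish "o" is gender-neutral, so it maps to both English pronouns.
GENDER_NEUTRAL_PRONOUNS = {"o": ["he", "she"]}

# Tiny Turkish-to-English noun dictionary for the example sentences.
NOUNS = {"doktor": "doctor", "hemşire": "nurse"}

def translate_pronoun_sentence(subject: str, predicate: str) -> list[str]:
    """Render the Turkish pattern '<subject> bir <predicate>' in English,
    producing one sentence per possible pronoun gender."""
    noun = NOUNS.get(predicate, predicate)
    pronouns = GENDER_NEUTRAL_PRONOUNS.get(subject, [subject])
    return [f"{p} is a {noun}" for p in pronouns]

# "o bir doktor" yields both gendered translations instead of
# silently picking the masculine one.
print(translate_pronoun_sentence("o", "doktor"))
```

A real system makes this decision inside the translation model itself, but the toy version shows the user-facing contract: one gender-neutral input, multiple gender-specific outputs.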
More Insight –
The solution given by Google Translate –
The Google Translate service will stop defaulting to the masculine option for some languages, says Engadget, offering both a masculine and a feminine translation when both are available. The new feature is currently only available when translating from English to French, Italian, Portuguese, or Spanish. It is also available when translating from Turkish to English, according to Engadget.
This isn’t the first move Google has made to make its products better and more neutral. In November, Google stopped predicting people’s genders in its Smart Compose tool in Gmail, according to CNET. Google decided to make the change after one of its research scientists typed, “I am meeting an investor next week”, and Smart Compose suggested, “Do you want to meet him?”
Google’s Paul Lambert told CNET that this was the kind of gender bias Google simply did not want to see in its AI, so the company eliminated gender predictions from the tool.
As TechCrunch says, “Down the road, Google says it does plan to address non-binary-gender in translations. Google will also eventually bring this to its iOS and Android apps, and address gender biases in auto-complete.”
What does Google have to say about this?
“Over the course of this year, there’s been an effort across Google to promote fairness and reduce bias in machine learning. Our latest development in this effort addresses gender bias by providing feminine and masculine translations for some gender-neutral words on the Google Translate website.
Google Translate learns from hundreds of millions of already-translated examples from the web. Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like “strong” or “doctor,” and feminine for other words, like “nurse” or “beautiful.”
Now you’ll get both a feminine and masculine translation for a single word—like “surgeon”—when translating from English into French, Italian, Portuguese or Spanish. You’ll also get both translations when translating phrases and sentences from Turkish to English. For example, if you type “o bir doktor” in Turkish, you’ll now get “she is a doctor” and “he is a doctor” as the gender-specific translations.
In the future, we plan to extend gender-specific translations to more languages, launch on other Translate surfaces like our iOS and Android apps, and address gender bias in features like query auto-complete. And we’re already thinking about how to address non-binary gender in translations, though it’s not part of this initial launch.
– James Kuczmarski (Product Manager, Google Translate)