Human language can reflect gender biases, and so can artificial intelligence. Google acknowledges that algorithms can replicate discrimination after assimilating and reworking data from the real world, and the company says it works to reduce these shortcomings when developing and improving services such as its automatic translator.
The company's latest step in this direction is the rollout of paired feminine and masculine translations, in some languages, for words that are gender-neutral in English. So, for example, if you enter the noun surgeon in the box for the text to be translated, the tool returns both Spanish forms, cirujano and cirujana.
"Google's translator learns from hundreds of millions of already translated examples found on the web," James Kuczmarski, product manager of the service, said in a statement published Thursday. "As a result, the model inadvertently replicated gender biases that already existed when producing a translation," he adds.
Kuczmarski explains that the translator used to generate a single translation for each request, even when both a feminine and a masculine version existed. "For example, it would have provided a masculine translation for words like doctor [doctor, doctora] and a feminine one for words like nurse [enfermero, enfermera]," the Google manager explains.
For now, the tool generates double translations of gender-neutral adjectives, nouns and pronouns into Spanish, French, Italian and Portuguese. So, for example, for the personal pronoun they, both ellos and ellas will appear as suggestions in Spanish. The service also translates short phrases from Turkish (a language without grammatical gender) into English: if the phrase entered is o bir doktor, English generates both the masculine he is a doctor [él es un doctor] and the feminine she is a doctor [ella es una doctora].
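The behavior described above can be sketched with a toy dictionary: for a gender-neutral English word, both target-language variants are returned instead of one. The word lists, the function name and the return format here are illustrative assumptions, not Google's actual implementation.

```python
# Toy English-to-Spanish dictionary of gender-neutral words.
# Entries and structure are illustrative assumptions only.
GENDERED_ES = {
    "surgeon": {"f": "cirujana", "m": "cirujano"},
    "doctor": {"f": "doctora", "m": "doctor"},
    "they": {"f": "ellas", "m": "ellos"},
}

def translate_with_gender(word):
    """Return both gendered Spanish variants of a gender-neutral
    English word, or None if the word is not in the toy dictionary."""
    entry = GENDERED_ES.get(word.lower())
    if entry is None:
        return None
    # Both variants are offered, rather than silently picking one.
    return [entry["f"], entry["m"]]

print(translate_with_gender("surgeon"))  # ['cirujana', 'cirujano']
```

The point of the sketch is the contrast with the old behavior Kuczmarski describes: a single, often biased, default translation versus an explicit pair.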
Google shows the gender distinction in the web version of the tool, available in the Chrome and Firefox browsers. The new feature is not yet available in the mobile app. "Going forward, we plan to extend gender-specific translations to more languages. We also want to launch them in the apps for the iOS and Android operating systems," says Kuczmarski.
The display order of the two versions depends on the label that marks the gender in the language into which the translator is being used (in Spanish, the feminine version appears before the masculine one because "f" precedes "m" in the alphabet). For languages "without an alphabet," Google adds, "gender-specific translations are displayed in each language's standard indexing order."
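That ordering rule amounts to sorting the variants alphabetically by their gender label in the target language. A minimal sketch, assuming hypothetical (label, translation) pairs:

```python
def order_variants(variants):
    """Sort (gender_label, translation) pairs alphabetically by label,
    so in Spanish 'femenino' precedes 'masculino' (f before m)."""
    return sorted(variants, key=lambda pair: pair[0])

# Illustrative input; labels and variants are assumptions for the sketch.
spanish = [("masculino", "cirujano"), ("femenino", "cirujana")]
print(order_variants(spanish))
# [('femenino', 'cirujana'), ('masculino', 'cirujano')]
```

For scripts without an alphabetical order, the article says Google falls back to each language's standard indexing order, which would replace the sort key here.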
The tool is open to user contributions. Anyone with a Google account can join the translators' community, choose the languages they want to work in, and suggest new translations or corrections to existing ones. The company notes that it is not yet possible to add gender-specific translations for languages where the feature has not been rolled out.
Efforts to avoid bias
Kuczmarski says other challenges the team is working on include bringing the gender distinction to the automatic suggestions that complete a query as it is typed, and "how to address the issue of non-binary gender in translations."
This is not the first time the company has faced the problem of avoiding gender bias in the services it offers. As Reuters reported at the end of November, Google decided to block gender-marked pronoun suggestions, such as him and her, in its new Smart Compose system, a tool that suggests ways to complete sentences in emails, currently available in English.
The company found that the tool generated biased suggestions and tried different ways to prevent it, but in the end it had to give up. "The only reliable technique has been to be conservative," said Prabhakar Raghavan, one of those responsible for its development.