April 14, 2021

More expensive credit for women and different product offers



Algorithms are increasingly taking center stage across sectors, including finance, where assessing the risk of an operation is of key importance. These programs collect data and tailor a bank’s offers and products based on a customer’s history and other market circumstances. Far from being objective, however, they carry the gender biases present in society, which translate into different conditions when granting a loan or launching offers depending on whether the customer is a man or a woman, according to the conclusions of a forum organized this Monday by the consumer association Asufin.

“A well-trained algorithm can make life easier because of its capacity to process data at scale; algorithms are fantastic if we use them well. But that is not always the case,” said Gemma Galdón, a public policy analyst specializing in the impact of technologies that rely heavily on personal data and in the auditing of algorithms. “There are signs of malfunction, in many cases in the banking sector,” she added, noting that there is “evidence” of gender-based discrimination in financial activity.

Galdón traces the origin of this discrimination to the historical data, in which women are underrepresented because traditionally it was men who dealt with the banks. “If the algorithm is not warned of this underrepresentation, it assigns more risk,” she said. “The discrimination of the past revictimizes us now. If that risk is not corrected, the algorithm reproduces it,” stressed the consultant, who points out that the consequences extend not only to the granting of credit but also to the commercial policies of financial institutions. “Investment products are offered less to women because we have fewer assets associated with our profiles. There are products that are never offered to us,” she added.
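To make the mechanism Galdón describes concrete, the sketch below (an illustration built on made-up numbers, not the article’s or any bank’s actual model) trains a toy credit-risk classifier on synthetic historical data in which women are underrepresented and were historically granted loans on worse terms; scoring two otherwise identical applicants then shows the old penalty resurfacing as a higher predicted risk for the woman.

```python
# Illustrative sketch only: a toy credit-risk model trained on biased history.
# All names and numbers are assumptions invented for this example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def historical_loans(n, is_woman, extra_rate):
    """Synthetic past loans: defaults depend on income and the interest rate charged.
    'extra_rate' encodes the worse terms historically applied to women."""
    income = rng.normal(30_000, 8_000, n)
    rate = 0.05 + extra_rate + rng.normal(0, 0.005, n)
    # Higher rates and lower incomes -> higher chance of default.
    logit = -5.0 + 60 * rate - (income - 30_000) / 10_000
    default = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
    return np.column_stack([income, np.full(n, is_woman)]), default

# Men are plentiful in the archive; women are scarce and were charged more.
X_m, y_m = historical_loans(9_000, is_woman=0, extra_rate=0.00)
X_w, y_w = historical_loans(500, is_woman=1, extra_rate=0.03)
X, y = np.vstack([X_m, X_w]), np.concatenate([y_m, y_w])

model = LogisticRegression(max_iter=1_000).fit(X, y)

# Two applicants identical except for gender: the model reproduces the
# historical penalty, assigning the woman a higher predicted default risk.
for is_woman, label in [(0, "man"), (1, "woman")]:
    risk = model.predict_proba([[30_000, is_woman]])[0, 1]
    print(f"Predicted default risk, {label}: {risk:.3f}")
```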

Verónica López, director of the AFI Foundation, recalled a case of “sexist bias” in the financial sector: Apple’s credit card with Goldman Sachs, which drew complaints because it granted men higher credit limits than women. “The algorithm made discriminatory decisions based on gender; algorithms are not impartial,” argued the consultant at the applied economics firm Analistas Financieros Internacionales. “The data is what it is, but it depends on the glasses one puts on in front of a set of information that is available to everyone,” she pointed out.

Both agreed that many of these algorithms are designed by teams that are usually made up only of men and, moreover, of profiles trained in engineering, without the presence of other disciplines. “Any sociologist, for example, could point out that there are dynamics of historical discrimination,” said Galdón, who defended the presence of women and of other professional profiles in these teams in order to build “better algorithms” that “do not serve to expand discriminatory structures.”

The problem, Galdón pointed out, is that there are still no documented cases of algorithmic discrimination in Spain, but experience in the US confirms this bias in the banking sector. The consultant recalled that the law already requires that, when an algorithm is believed to have made an unfair decision, the decision must be explained and the final call must be made by a person. “The problem is that this legal guarantee is not being honored,” she said. “For the moment we have no evidence that banks have begun to audit their algorithms.”

López, from AFI, warned that, beyond the “injustice” of the discrimination these practices entail, banks must also pay attention to the “reputational risks” these algorithms carry. “I would hope that all entities have an improvement of these tools on the table,” she said, confident that the European regulations now being drafted will help push these changes. “We have laws but they are not complied with; there are algorithms that have been struck down in different countries, and in Spain one has been challenged [referring to the model used to calculate the bono social eléctrico, the subsidized electricity tariff],” says Galdón.

Even so, the AFI representative notes that, although the Government intends to study these biases, as in the Ministry of Economy’s 2025 Agenda, “there are no concrete actions” that would make the technological future “more egalitarian.” López recalled that the structures of the financial sector itself already show significant gender gaps, for example in women’s access to positions of power. “Women have become directors in the financial sector in territorial divisions, not in the centers where decisions are made,” she pointed out.

Galdón says she has had “high-level” conversations with financial organizations about improving the algorithms they use, although she acknowledges that those improvements have yet to reach the banks’ day-to-day operations. “It is difficult for us to reach the intermediate structures,” she added. For the expert, the answer to this type of bias should not be “new biases” in the form of “positive discrimination,” but rather taking the customer’s “personal history” into account instead of the sector’s historical data. “In historically sexist areas such as credit history or medicine, reinforcing data from one’s own experience would help,” she emphasized.
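As a rough illustration of what “reinforcing data from one’s own experience” might look like, the sketch below scores an applicant purely on behaviour observed in her own account history rather than on demographic proxies inherited from the sector’s past; every feature and weight here is an assumption invented for the example, not a real scorecard.

```python
# Illustrative sketch only: a risk score built solely from the applicant's own record.
from dataclasses import dataclass

@dataclass
class PersonalHistory:
    months_on_book: int         # how long the customer has banked with the entity
    on_time_payments: int       # payments made on schedule
    late_payments: int          # payments missed or delayed
    avg_monthly_balance: float  # average balance over the observation window

def personal_risk_score(h: PersonalHistory) -> float:
    """Return a 0-1 risk score using only the applicant's own behaviour."""
    total = h.on_time_payments + h.late_payments
    late_ratio = h.late_payments / total if total else 0.5  # no record -> neutral
    thin_file_penalty = 0.2 if h.months_on_book < 12 else 0.0
    balance_relief = min(h.avg_monthly_balance / 10_000, 1.0) * 0.2
    return max(0.0, min(1.0, 0.3 + 0.5 * late_ratio + thin_file_penalty - balance_relief))

# Example: a solid personal record yields a low score regardless of how the
# applicant's demographic group was treated in the bank's historical portfolio.
print(personal_risk_score(PersonalHistory(36, 34, 1, 4_500)))
```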
