Catalonia uses an algorithm to decide which prisoners to grant parole

Artificial intelligence (AI) shapes people’s lives far more than most imagine. These systems are used on social networks to predict human behavior, but also to determine whether a bank should grant a loan or whether a prisoner can be released on parole. That is what happens in Catalonia’s prisons, where an algorithm evaluates the granting of prison permits. Although this protocol has been in operation for 12 years, several experts warn of the risk that this evaluation mechanism discriminates against the prison population.

In 2009 the Department of Justice launched RisCanvi, a protocol that uses a predictive algorithm to calculate each prisoner’s risk of recidivism. Inmates in Catalan prisons are interviewed every six months. Their responses are scored and used by the algorithm to determine whether there is a high, medium, or low risk that they will commit a new crime after leaving prison. This personalized report is used by officials to inform their recommendations on parole, as well as on prison leave permits and inmate transfers.

Although its operation is little known to the public, this tool has been deeply embedded in the Catalan prison system for more than a decade, since it was launched to try to curb the high rate of violent recidivism among prisoners. A screening version of RisCanvi with 10 questions is applied to the 7,873 prisoners currently held in Catalonia, and a more exhaustive 43-question version is applied to those with a high risk of recidivism. Since 2009, a total of 37,897 inmates have been evaluated with this software, according to data from the Generalitat. Up to 750 professionals use the program daily, from psychologists to social workers and criminologists.
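The two-stage protocol described above, a short screening pass followed by the full assessment only for flagged inmates, can be sketched as follows. The item names, scoring scale, and cut-offs are hypothetical assumptions for illustration; the actual RisCanvi items and thresholds are not public.

```python
# Illustrative sketch of a two-stage screening protocol.
# The real RisCanvi items, scoring, and cut-offs are not public;
# the 0-2 item scale and all thresholds below are assumptions.

def screening_score(answers):
    """Sum the 10 screening items (each assumed scored 0-2)."""
    assert len(answers) == 10
    return sum(answers)

def full_assessment(answers):
    """Sum the 43 full-protocol items, applied only to flagged inmates."""
    assert len(answers) == 43
    return sum(answers)

def evaluate(screening_answers, full_answers=None, cutoff=10):
    """Screen first; escalate to the 43-item version only above the cutoff."""
    if screening_score(screening_answers) < cutoff:
        return "low"                      # screened out: no full assessment
    if full_answers is None:
        return "pending full assessment"  # flagged for the 43-item version
    return "high" if full_assessment(full_answers) > 50 else "medium"
```

The design point of such a funnel is economy: most of the prison population answers only the short form, and the costly exhaustive interview is reserved for the minority the screening flags.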

Effective or biased system?

The key to any predictive system is the accuracy of its forecasts, all the more so when human lives are involved. The only public data on RisCanvi’s performance, from an official 2014 report, show that the system correctly predicted a high or medium risk of recidivism in 77.15% of cases, and was correct in 52.26% of cases classified as low risk. The false negatives, prisoners who reoffended without the algorithm predicting it, were only 4.6% of those considered low risk, while the false positives, those predicted to reoffend who have not done so, were 42.7% of the non-recidivists.
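The two rates quoted above are standard confusion-matrix quantities: false negatives as a share of those classified low risk, and false positives as a share of actual non-recidivists. A worked illustration with invented counts (these are not the Generalitat’s figures) makes the definitions concrete:

```python
# Worked illustration of the two rates quoted above, using invented
# counts for 1,000 hypothetical released prisoners.
reoffenders     = {"flagged": 90,  "predicted_low": 10}
non_reoffenders = {"flagged": 300, "predicted_low": 600}

# False negatives: reoffenders the system classified as low risk,
# as a share of everyone classified low risk.
classified_low = reoffenders["predicted_low"] + non_reoffenders["predicted_low"]
fn_rate = reoffenders["predicted_low"] / classified_low

# False positives: non-reoffenders the system flagged as medium/high
# risk, as a share of all non-recidivists.
fp_rate = non_reoffenders["flagged"] / sum(non_reoffenders.values())

print(f"false-negative rate: {fn_rate:.1%}")   # 1.6% with these counts
print(f"false-positive rate: {fp_rate:.1%}")   # 33.3% with these counts
```

Note that the two rates use different denominators (predicted low-risk versus actual non-recidivists), which is why the reported 4.6% and 42.7% are not directly comparable to each other.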

The psychologist Antonio Andrés Pueyo, one of the creators of RisCanvi, has stated on several occasions that the system is “free of bias.” “We have a human review system that allows us to correct the algorithm’s predictions to remove any cultural or ethnic biases it may have,” says Jordi Camps, head of the Rehabilitation Service of the Generalitat of Catalonia. A team from Pompeu Fabra University (UPF) led by researcher Carlos Castillo is studying the inner workings of the Catalan system. “So far we have not found any indication of bias,” he says, noting that “it works well, but that does not guarantee it will continue to do so.”

However, other experts point in the opposite direction. “RisCanvi can contribute to socio-economic discrimination by overestimating the dangerousness of certain prisoners, who usually come from disadvantaged groups, and contribute to keeping them in prison,” explains Gemma Galdon, founder and executive director of ETICAS. Among the parameters the algorithm uses to assess each prisoner’s risk are their history of violence or misconduct, but also their lack of economic resources, employment or family support, drug addiction, level of education, and poor adaptation during childhood. Four factors are always evaluated: age, gender, nationality, and whether the prisoner has been convicted or is awaiting trial. It is unknown what weight each of these parameters carries in the algorithm’s final report.
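Galdon’s critique is easiest to see in a weighted-sum risk score, the simplest form such an assessment can take: socio-economic factors that enter the sum with positive weights raise the score directly, regardless of conduct. The factor names below follow the article; the weights and thresholds are invented, since, as the article notes, the real ones are not public.

```python
# Illustrative weighted-sum risk score. Factor names follow the article;
# the weights and cut-offs are invented assumptions, not RisCanvi's.

WEIGHTS = {
    "history_of_violence":       3.0,
    "prison_misconduct":         2.0,
    "drug_addiction":            2.0,
    # Socio-economic factors: with positive weights, poverty alone
    # pushes the score upward, which is the discrimination concern.
    "lacks_economic_resources":  1.0,
    "unemployed":                1.0,
    "lacks_family_support":      1.0,
    "low_education":             0.5,
    "poor_childhood_adaptation": 0.5,
}

def risk_score(factors):
    """factors: dict mapping each factor name to a 0/1 indicator."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

def risk_level(score, low_cut=3.0, high_cut=6.0):
    """Map the numeric score onto the protocol's three risk tiers."""
    if score < low_cut:
        return "low"
    return "medium" if score < high_cut else "high"
```

In this toy version, an inmate with no violent history who is poor, unemployed, and lacks family support already scores 3.0 and leaves the “low” tier, which is exactly the pattern critics say can overestimate the dangerousness of disadvantaged prisoners.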

Lack of transparency

Researchers denounce the system’s lack of transparency. “It is worrying that RisCanvi is not monitored every year and has not undergone an algorithmic impact assessment to determine whether it produces social biases,” explains Castillo. The algorithm has never been audited by an independent body, and the last public report on its performance, from 2014, does not break down prisoners by group, which makes it impossible to know whether its predictions affect inmates differently depending on nationality, migrant origin, or sex. “The Generalitat was a pioneer, but this was done at a time when there were not so many regulatory standards,” he adds.

The RisCanvi algorithm does not make an automated final decision; rather, it serves as a guiding resource for officials. That allows its use to go unregulated. “Legally, it is as if it did not exist,” says Alejandro Huergo, professor of administrative law at the University of Oviedo. Although he considers this system “more objective than the official’s clinical eye,” he warns that the lack of regulation means the process has “fewer guarantees.” The adoption of the European Union’s regulation on AI would classify RisCanvi as a “high risk” system and would force the adoption of a supervisory mechanism.

Legal defenselessness of prisoners

This system can also leave persons deprived of liberty in a degree of legal defenselessness. What happens if the algorithm penalizes a prisoner for being poor? Could that be appealed? It is judges who grant permits, not the algorithm. However, problems arise if the RisCanvi recommendations are given too much weight, since it is unknown which variables the system values most and whether it amplifies discrimination. “That makes it very difficult for us to challenge the algorithm,” explains criminal lawyer Mireia Balaguer. “There are factors that can lead to a result that does not reflect reality; in those situations we must analyze each case and document variables such as the prisoner’s positive evolution.” Even so, this specialist sees RisCanvi as “a good guiding tool for prevention” and notes that the process has guarantees, since the judge who rules on permits bases the decision on the case and does not place “blind faith” in what the system recommends.

There are also conflicting opinions on this protocol’s impact on inmates. “They feel completely subjected to being permanently judged on their habits,” explains Iñaki Rivera, director of the Observatory of the Penal System and Human Rights (OSPDH). “They are asked very intrusive questions that, in my opinion, conflict with the right to privacy.” Camps, however, believes the system is key to rehabilitation: “the data extracted is very practical for helping the inmate and reducing his risks.”

In 2008, a year before RisCanvi was introduced, the recidivism rate was 40.3%. The latest public data, from 2014, place it at 30.2%. The Generalitat is working on another report with 2018 data.
