April 15, 2021

Amnesty International campaigns against facial recognition technology


A facial recognition solution from Intel. (EP)

The human rights organization Amnesty International launched a global campaign on Monday to push for a ban on facial recognition technology, which it says "amplifies racist police surveillance and threatens the right to protest."

"There is a risk that facial recognition will be used as a weapon by law enforcement agencies against marginalized communities all over the world," said Matt Mahmoudi, Amnesty International's Artificial Intelligence and Human Rights researcher, in a press release.

“From New Delhi to New York, this invasive technology turns our identities against us and undermines human rights,” he added.

Amnesty International's campaign, called "Ban the Scan," kicks off in New York and will later turn to the use of this type of identification elsewhere in the world during 2021.

The group maintains that facial recognition systems are a form of mass surveillance that violates the right to privacy and threatens the rights to freedom of peaceful assembly and expression.

In addition, the organization points out, the technology exacerbates systemic racism, since it disproportionately affects people of color, who already suffer discrimination and violations of their rights at the hands of law enforcement.

"New Yorkers should be able to go out and go about their lives without being tracked by facial recognition. Other major cities in the US have already banned facial recognition, and New York should do the same," Mahmoudi urged.

In the Big Apple, Amnesty International is collaborating in this effort with other organizations such as AI for the People, the New York Civil Liberties Union, and the New York Privacy Coalition.

"Police use of facial recognition technology puts New Yorkers in a perpetual police lineup and violates our right to privacy. Facial recognition is ubiquitous, unregulated, and should be banned," said Mutale Nkonde, founder and CEO of AI for the People, in a statement.

This type of technology can be built by copying profile images from social networks or driver's licenses without individuals' permission, after which a computer program analyzes images captured by security cameras and attempts to match the faces in them against those copied images.
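The matching step described above can be sketched, purely for illustration, as a nearest-neighbor comparison of face "embeddings" (the numeric vectors a recognition model produces for each face). Everything below — the vectors, the names, and the similarity threshold — is invented for the sketch and does not reflect any real deployed system:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical "gallery" of embeddings derived from scraped profile photos.
# In a real system these vectors come from a neural network, not by hand.
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.3],
}

# Hypothetical embedding extracted from a security-camera frame.
probe = [0.88, 0.15, 0.25]

THRESHOLD = 0.95  # invented cutoff; real systems tune this empirically

def identify(probe, gallery, threshold=THRESHOLD):
    # Find the gallery entry most similar to the probe; report a match
    # only if the similarity clears the threshold.
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score

name, score = identify(probe, gallery)
```

The sketch also shows why critics worry about error: the outcome hinges entirely on an arbitrary threshold, and a face just below it is "nobody" while one just above it is "identified."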

Amnesty International also noted that, although cities such as Boston, Portland, and San Francisco have banned its use, the New York Police continue to use it to "intimidate" citizens, "as seen in the Black Lives Matter protests last year."

For example, the group notes, the activist Dwreck Ingram was filmed at one of these demonstrations in New York in June 2020, and on August 7 dozens of police officers attempted to enter his apartment, accusing him of having assaulted an officer.

The organization says one of the officers involved in the attempted arrest was photographed holding a document suggesting that a facial recognition system had been used. It also claims that the police incorrectly informed Ingram of his rights, tried to question him without a lawyer present, and surrounded his residence.

"The police left only after Dwreck broadcast the event live on social media," says Amnesty International, adding that the police used his Instagram profile photo to identify him; the photo was later used in a "wanted" poster that security forces posted around his neighborhood.

“Activists are specifically targeted by these technologies because of what we are protesting and because we are trying to dismantle a system that the Police are a part of,” Ingram said.

The "Ban the Scan" campaign will begin with the launch of a website where New York residents can submit comments on the use of facial recognition technology and later file requests to find out exactly where it is being used.

The site will then be expanded in May, when an "army of decoders" will help geolocate devices capable of carrying out facial recognition in New York.
