A plan to monitor women’s expressions with facial recognition technology to prevent street harassment in a northern Indian city will lead to intrusive checks and breaches of privacy, digital rights experts have warned.
In Lucknow, capital of Uttar Pradesh, India’s most populous state, police have identified some 200 harassment hotspots frequented by women and where most complaints are reported, said Police Commissioner DK Thakur.
“We will install five AI-based cameras that will be able to send an alert to the nearest police station,” he said, referring to the technology based on artificial intelligence.
“These cameras will become active as soon as the expressions of a woman in distress change,” he told reporters this week, without giving more details on which expressions would trigger an alert.
Facial recognition technology is increasingly being deployed in airports, train stations and cafes across India, with plans for nationwide systems to modernize the police force and its processes for information gathering, reporting and criminal identification.
But technology analysts and privacy experts say the benefits are unclear, and that the systems could infringe on people’s privacy or lead to heightened surveillance, with little clarity on how the technology works, how data is stored and who can access that data.
“The idea that cameras will monitor women’s expressions to see if they are in distress is absurd,” said Anushka Jain, associate counsel at the digital rights nonprofit Internet Freedom Foundation.
“What is the expression of a person in distress – is it fear, anger? I could talk to my mom on the phone, get angry and pull a face – will that trigger an alert and send a cop?”
A more feasible solution would be to increase the number of police patrols, Jain told the Thomson Reuters Foundation, adding that the technology has not been tested and could lead to over-surveillance and harassment of women who trigger alerts.
India is one of the most dangerous places in the world for women, with one rape every 15 minutes, according to government data.
Uttar Pradesh is the least safe state for women, with the highest number of crimes against women reported in 2019.
Police often turn away women who try to file complaints, or take no action on them, said Roop Rekha Verma, a women’s rights activist in Lucknow.
“And they want us to believe that they will act by looking at our facial expressions,” she said.
India launched a series of legal reforms after a fatal gang rape in 2012, including simpler mechanisms for reporting sex crimes, fast-track courts and a tougher rape law with the death penalty, but conviction rates remain low.
Amid a growing backlash against facial recognition technology in the United States and Europe, Indian officials have said it is needed to strengthen a severely under-policed country, stop criminals and locate missing children.
But digital rights activists say its use is problematic in the absence of a data protection law, and threatens the right to privacy, which the Supreme Court declared a fundamental right in a landmark 2017 decision.
“The police are using technology to solve a problem without seeing that it is just a new form of surveillance, a new form of exercising power over women,” said Vidushi Marda, a researcher at the human rights group Article 19.
“AI is not a silver bullet, and no ‘fancy’ technology can solve societal problems,” she said.