“The data is then stored and shared proportionally with other retailers, creating a larger watchlist that everyone benefits from,” says a spokesperson for Facewatch. The company’s website claims it operates the “ONLY National Shared Facial Recognition Watchlist,” which works by linking multiple private facial recognition networks. The spokesperson adds that since the Southern Co-op trial, Facewatch has started a trial with another division of Co-op.
Facewatch refuses to say who all of its customers are, citing confidentiality, but its website includes case studies of petrol stations and other stores in the UK. Last year the Financial Times reported that Humber Prison uses its technology, as do police and retailers in Brazil. Facewatch has said its technology will be used in 550 stores in London, which means large numbers of people could have their faces scanned. In Brazil, 2.75 million faces were captured by the technology in December 2018, with the company’s founders telling the FT that it had reduced crime “globally by 70%”. (The report also stated that a Co-op grocery store near London Victoria station was using the technology.)
However, civil liberties advocates and regulators are wary of the expansion of private facial recognition networks, raising concerns about how they are regulated and whether their use is proportionate.
“Once someone walks into a Co-op store, they will be subjected to facial recognition scans … which could deter people from entering stores during a pandemic,” says Edin Omanovic, advocacy director at the NGO Privacy International, whose work focuses on facial recognition. The group has written to Co-op, regulators and law enforcement about the use of the technology. His colleague Ioannis Kouvakas, a lawyer at Privacy International, argues that the use of Facewatch’s technology raises legal issues. “It’s unnecessary and disproportionate,” Kouvakas says.
Facewatch and Co-op both rely on legitimate business interests under GDPR and data protection law as the legal basis for scanning people’s faces. They say the use of facial recognition technology allows them to minimize the impact of crime and improve staff safety.
“It must always be necessary and proportionate. Using extremely intrusive technology to scan people’s faces without them being 100% aware of the consequences, and without giving them the choice to provide explicit, freely given, informed and unambiguous consent, is prohibited,” says Kouvakas.
This is not the first time that Facewatch’s technology has been called into question. Other legal scholars have cast doubt on whether there is a substantial public interest in using facial recognition technology. Britain’s data protection regulator, the Information Commissioner’s Office (ICO), says companies need clear evidence of a legal basis for using these systems.
“Public support for police using facial recognition to catch criminals is high, but it is lower when it comes to the private sector operating the technology in a quasi-policing capacity,” an ICO spokesperson says. The ICO is examining where live facial recognition is used in the private sector and hopes to publish its findings early next year.
“The investigation includes assessing the compliance of a number of private companies that have used or are currently using facial recognition technology,” the ICO spokesperson says. “Facewatch is one of the organizations being considered.”
Part of the ICO’s investigation into the use of facial recognition in the private sector includes cases where police forces are involved. There are growing concerns about how police and law enforcement officers can access images captured by private surveillance systems.
In the United States, Amazon’s smart doorbells, which include motion tracking and facial recognition, have been configured to provide data to the police in certain circumstances. And London’s Metropolitan Police were forced to apologize after handing over images of seven people to a controversial private facial recognition system in Kings Cross in October 2019.