Simon MacKenzie, a security officer at the discount retailer QD Stores outside London, was out of breath. He had just chased down three shoplifters who had fled with several packages of laundry detergent. Before the police arrived, he sat down at a table in a back room to do something important: capture the faces of the thieves.
On an aging desktop computer, he pulled up security camera footage, pausing to zoom in and save a photo of each shoplifter. He then logged into Facewatch, the facial recognition program his store uses to identify shoplifters. The next time those people enter any store within a few miles that uses Facewatch, employees will receive an alert.
Mr. MacKenzie said, “It’s like having someone with you saying, ‘That person you caught last week has just come back.’”
The use of facial recognition technology by the police has been heavily scrutinized in recent years, but its application by private businesses has received less attention. Now, as the technology improves and its costs fall, the systems are reaching further and further into people’s lives. No longer the preserve of government agencies, facial recognition is increasingly being used to identify shoplifters, problematic customers and legal adversaries.
Facewatch, a British company, is used by retailers across the country who are frustrated by petty crime. For as little as £250 a month, or about $320, Facewatch offers access to a customized watchlist that nearby stores share. When Facewatch spots a flagged face, an alert is sent to a smartphone at the store, where employees decide whether to keep a close eye on the person or ask the person to leave.
Mr. MacKenzie adds one or two new faces every week, he said, mainly people who steal diapers, groceries, pet supplies and other low-cost items. He said the economic hardship behind the thefts made him sympathetic, but the volume of incidents had become so high that facial recognition was necessary. Usually at least once a day, Facewatch alerts him that someone on the watchlist has entered the store.
Facial recognition technology is spreading rapidly as Western countries grapple with advances in artificial intelligence. The European Union is drafting rules that would ban many uses of facial recognition, while New York City Mayor Eric Adams has encouraged retailers to try the technology to fight crime. MSG Entertainment, the owner of Madison Square Garden and Radio City Music Hall, has used automated facial recognition to deny entry to lawyers whose firms have sued the company.
Among democracies, Britain is at the forefront of using live facial recognition, with courts and regulators signing off on its use. Police in London and Cardiff are using the technology to identify wanted criminals as they walk down the street. In May, it was used to scan the crowds at the coronation of King Charles III.
But its use by retailers has been criticized as a disproportionate response to minor offences. Individuals have little way of knowing whether they are on a watchlist or how to appeal. In a legal complaint last year, Big Brother Watch, a civil society group, called it “Orwellian in the extreme.”
Fraser Sampson, Britain’s biometrics and surveillance camera commissioner, who advises the government on policy, said there was “nervousness and hesitation” around facial recognition technology because of privacy concerns and algorithms that have performed poorly in the past.
“But I think facial recognition technology could be, you know, literally a game changer in some areas, in terms of speed, scale, accuracy and cost,” he said. “That means its arrival and deployment is probably inevitable. It’s just a matter of when.”
‘You can’t expect the police to come’
Facewatch was founded in 2010 by Simon Gordon, the owner of a popular 19th-century wine bar in central London known for its cellar-like interior and its popularity among pickpockets.
At the time, Mr. Gordon hired software developers to create an online tool for sharing security camera footage with the authorities, which he hoped would save the police time in filing incident reports and lead to more arrests.
Interest was limited, but Mr. Gordon’s fascination with security technology grew. He followed the development of facial recognition and hit on the idea of a watchlist that retailers could share and contribute to. It was like the photos of shoplifters that stores used to keep next to the register, but supercharged into a collective database for identifying offenders in real time.
By 2018, Mr. Gordon felt the technology was ready for commercial use.
“You have to help yourself,” he said in an interview. “You can’t expect the police to come.”
Facewatch, which licenses facial recognition software made by RealNetworks and Amazon, is now installed in around 400 stores across Britain. Trained on millions of images and videos, the system reads the biometric information of a person’s face as he or she enters a store and checks it against a database of flagged people.
Facewatch’s watchlist continues to grow as stores upload photos of shoplifters and problematic customers. Once added, a person stays on the list for a year before being removed.
‘Mistakes are rare but do happen’
Every time Facewatch’s system identifies a shoplifter, a notification goes to a person who has passed a test to become a “super recognizer,” someone with a special talent for remembering faces. Within seconds, the super recognizer must confirm the match against the Facewatch database before an alert is sent.
But while the company has policies in place to prevent mistaken identity and other errors, mistakes do happen.
In October, a woman buying milk at a supermarket in Bristol, England, was confronted by an employee and ordered to leave. She was told that Facewatch had flagged her as a prohibited shopper.
The woman, who asked to remain anonymous because of privacy concerns and whose account was corroborated by material provided by her lawyer and Facewatch, said there must have been a mistake. When she contacted Facewatch a few days later, the company apologized, saying it had been a case of mistaken identity.
After the woman threatened legal action, Facewatch dug into its records. It found that she had been added to the watchlist because of an incident 10 months earlier involving £20 of merchandise, about $25. Facewatch said the system had “worked perfectly.”
But while the technology had correctly identified the woman, it left little room for human discretion. Neither Facewatch nor the store where the incident occurred contacted her to tell her she was on the watchlist and ask what had happened.
The woman said she did not remember the incident and had never knowingly left a store without paying. She said she may not have realized that her debit card payment had failed at the self-checkout kiosk before she walked out.
Madeleine Stone, legal and policy officer for Big Brother Watch, said Facewatch was “normalizing airport-style security checks for everyday activities like buying a pint of milk.”
Mr Gordon declined to comment on the Bristol incident.
In general, he said, mistakes are rare but do happen. “If this happens, we acknowledge our error, apologise, remove any relevant data to prevent a recurrence and offer proportionate compensation,” he added.
‘Approved by the privacy office’
Civil liberties groups have raised concerns about Facewatch and suggested that its deployment to prevent petty crime may be illegal under British privacy law, which requires a “substantial public interest” for uses of biometric technology.
Britain’s Information Commissioner’s Office, the privacy regulator, conducted a yearlong investigation into Facewatch. The office concluded in March that Facewatch’s system was permissible under the law, but only after the company made changes to how it operated.
Stephen Bonner, the office’s deputy commissioner for regulatory supervision, said in an interview that the investigation had led Facewatch to change its policies: it would put up more signage in stores, share information between stores only about serious and violent offenders, and send alerts only about repeat offenders. That means people will not be added to the watchlist after a single minor offence, as happened to the woman in Bristol.
“This reduces the amount of personal data held, makes individuals less likely to be erroneously added to such a list and is more likely to be accurate,” Mr. Bonner said. The technology, he said, is “no different than simply having very good security guards.”
Liam Ardern, operations manager at Lawrence Hunt, which owns 23 Spar convenience stores using Facewatch, estimates the technology has saved the company more than £50,000 since 2020.
He said the privacy risks of facial recognition were exaggerated. The only case of mistaken identity he could recall was when a man was confused with his identical twin, who had stolen from one of the stores. Critics, he said, overlook the fact that stores like his operate on thin profit margins.
“It’s easy for them to say, ‘No, it’s against human rights,’” Mr. Ardern said. But if shoplifting is not reduced, he said, his stores will have to raise prices or cut staff.