Simon Mackenzie, a security guard at discount retailer QD Stores outside London, was out of breath. He had just chased down three shoplifters who had made off with several packs of laundry detergent. Before the police arrived, he was sitting at a desk in a back room doing something important: capturing the faces of the perpetrators.
On an outdated desktop computer, he pulled up footage from security cameras, pausing to zoom in and save a photo of each thief. He then logged into a facial recognition program, Facewatch, which his store uses to identify shoplifters. The next time those people enter a store within a few miles that uses Facewatch, store staff will get an alert.
“It’s like having someone with you saying, ‘That person you bagged last week just came back,'” Mr. Mackenzie said.
Police use of facial recognition technology has come under heavy scrutiny in recent years, but its adoption by private companies has received less attention. As the technology improves and costs fall, the systems are creeping into people’s everyday lives. No longer the preserve of government agencies, facial recognition is increasingly used to identify shoplifters, problem customers and legal opponents.
Facewatch, a British company, is used by shopkeepers across the country frustrated by petty crime. For just £250 a month, or about $320, Facewatch offers access to a customized watchlist that nearby stores share. When Facewatch spots a flagged face, an alert is sent to a smartphone in the store, where employees decide whether to keep a close eye on the person or ask them to leave.
Mr. Mackenzie adds one or two new faces every week, he said, mostly of people who steal diapers, groceries, pet supplies and other low-cost goods. He said their economic hardship made him sympathetic, but the thefts had gotten so out of control that facial recognition was needed. Usually, Facewatch notifies him at least once a day that someone on the watchlist has entered the store.
Facial recognition technology is advancing rapidly as Western countries grapple with advances brought on by artificial intelligence. The European Union is drafting rules that would ban many uses of facial recognition, while New York City Mayor Eric Adams has encouraged retailers to try the technology to fight crime. MSG Entertainment, the owner of Madison Square Garden and Radio City Music Hall, has used automatic facial recognition to deny entry to lawyers whose firms have sued the company.
Among democracies, Britain is leading the way in the use of live facial recognition, with courts and regulators approving its use. Police in London and Cardiff are experimenting with the technology to identify wanted criminals as they walk down the street. In May it was used to scan the crowd at King Charles III’s coronation.
But its use by retailers has drawn criticism as a disproportionate remedy for petty crimes. Individuals have little way of knowing they are on the watchlist or how to appeal. In a legal complaint last year, Big Brother Watch, a civil society group, called it “Orwellian in the extreme.”
Fraser Sampson, the UK’s biometrics and surveillance camera commissioner, who advises the government on policy, said there was “nervousness and hesitation” around facial recognition technology due to privacy concerns and underperforming algorithms in the past.
“But I think in terms of speed, scale, accuracy and cost, facial recognition technology could literally be a game changer in some areas,” he said. “That means that its arrival and deployment is probably inevitable. It’s just a matter of when.”
‘You can’t expect the police to come’
Facewatch was founded in 2010 by Simon Gordon, the owner of a popular 19th-century wine bar in central London, known for its cellar-like interior and popularity among pickpockets.
At the time, Mr. Gordon hired software developers to create an online tool to share security camera footage with authorities, hoping it would save police time filing incident reports and result in more arrests.
There was limited interest, but the project awakened Mr. Gordon’s fascination with security technology. He followed developments in facial recognition and had the idea for a watchlist that retailers could share and contribute to. It was like the photos of shoplifters that stores keep next to the checkout counter, but loaded into a collective database to identify repeat offenders in real time.
In 2018, Mr. Gordon believed the technology was ready for commercial use.
“You have to help yourself,” he said in an interview. “You can’t expect the police to come.”
Facewatch, which licenses facial recognition software from RealNetworks and Amazon, is now in nearly 400 stores across the UK. Trained on millions of photos and videos, the system reads a face’s biometric information as the person walks into a store and compares it to a database of flagged people.
The Facewatch watchlist is constantly growing as stores upload photos of shoplifters and problematic customers. Once added, a person will stay there for a year before being removed.
‘Mistakes are rare, but they happen’
Every time Facewatch’s system identifies a shoplifter, a notification goes to a person who has passed a test to be a “super recognizer” — someone with a special talent for remembering faces. Within seconds, the super recognizer must confirm the match against the Facewatch database before an alert is sent.
But while the company has policies in place to prevent misidentification and other errors, mistakes do happen.
In October, a woman buying milk at a supermarket in Bristol, England, was confronted by an employee and ordered to leave. She was told that Facewatch had marked her as a barred shoplifter.
The woman, who asked that her name be kept secret due to privacy concerns and whose story was corroborated by material from her lawyer and Facewatch, said there must have been a mistake. When she contacted Facewatch a few days later, the company apologized and said it was a case of mistaken identity.
After the woman threatened legal action, Facewatch dove into its records. It found the woman had been added to the watchlist due to an incident 10 months earlier involving £20 worth of merchandise, about $25. The system “worked perfectly,” Facewatch said.
But while the technology correctly identified the woman, it didn’t leave much room for human discretion. Neither Facewatch nor the store where the incident took place contacted her to let her know she was on the watch list and to ask what had happened.
The woman said she had no recollection of the incident and had never shoplifted. She said she may have walked out after not realizing that her debit card payment at a self-checkout kiosk had failed.
Madeleine Stone, Big Brother Watch’s legal and policy officer, said Facewatch “normalized airport-style security checks for everyday activities like buying a pint of milk.”
Mr. Gordon declined to comment on the Bristol incident.
Overall, he said, “Mistakes are rare, but they do happen.” He added: “If this happens, we will acknowledge our mistake, apologize, delete all relevant data to prevent recurrence and offer commensurate compensation.”
Approved by the privacy agency
Civil liberties groups have raised concerns about Facewatch, suggesting that its use to prevent petty crime may be illegal under UK privacy law, which requires biometric technologies to be of “substantial public interest”.
The British Information Commissioner’s Office, the privacy regulator, has been investigating Facewatch for a year. The agency concluded in March that Facewatch’s system was legal, but only after the company made changes to the way it worked.
Stephen Bonner, the office’s deputy commissioner for regulatory oversight, said in an interview that an investigation led Facewatch to change its policy: it would place more signage in stores, only share information about serious and violent offenders among stores, and only send warnings about repeat offenders. That means people are not put on the watch list after one minor offense, as happened to the woman in Bristol.
“That reduces the amount of personal data held, reduces the likelihood of individuals being wrongly added to these types of lists and makes them more likely to be accurate,” said Mr. Bonner. The technology, he said, is “no different than just having really good security guards.”
Liam Ardern, the operations manager of Lawrence Hunt, owner of 23 Spar convenience stores that use Facewatch, estimates the technology has saved the company more than £50,000 since 2020.
He called the privacy risks of facial recognition exaggerated. The only example of misidentification he recalled was when a man was mistaken for his identical twin brother, who had shoplifted. Critics overlook the fact that stores like his operate on thin profit margins, he said.
“It’s easy for them to say, ‘No, it’s against human rights,'” said Mr. Ardern. If shoplifting is not reduced, he said, his stores will have to raise prices or cut staff.