Scanned, tackled, arrested: how live facial recognition was piloted on the streets of Croydon
Police got several matches during trial in London borough – but where some see progress on crime, others see violation of privacy
It happened in a flash outside Barclays in Croydon town centre. A digital trap snapped shut around one of Britain’s thousands of wanted criminals. In little over a minute, a combination of high-definition cameras, automated AI face scanning and half a dozen police officers had run a wanted man to ground.
After the handcuffs clicked shut, the Metropolitan police’s controversial live facial recognition (LFR) cameras had chalked up another arrest: the fifth in 45 minutes on a regular Thursday morning.
The arrest was one of hundreds made during a six-month Met police pilot of LFR cameras on vans and fixed to lamp-posts, as also seen across cities in China, the UAE, India and Israel. Critics have called the technology invasive, unregulated and anti-democratic, cited studies suggesting racial bias and called for it to be scrapped. But the Met police commissioner, Mark Rowley, has said it is “gamechanging” and keeps the public safe.
The trap was set at 10am, when the clusters of surveillance cameras mounted high on pillars at the junction of Church Street and North End were switched on. Standing nearby was Kevin Brown, a plain-clothes police sergeant. The wanted man unwittingly walked past one of the cameras and instantaneously Sgt Brown’s handheld beeped furiously and flashed up the suspect’s pin-sharp live photo, alongside a previous custody photo, his name, the suspected crime and warnings of any weapons or drugs risk.
Every face passing the cameras – as many as 5,000 an hour – was being scanned and its biometric data streamed live to a police operations room five miles away in Sydenham. There, an AI-powered system, supplied by the Japanese tech company NEC, checked it instantly against photos of wanted suspects and people under court orders.
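The article does not describe NEC's proprietary system, but live facial recognition pipelines generally work by reducing each detected face to a numeric embedding vector and comparing it against stored watchlist embeddings using a similarity score and a fixed threshold. The sketch below is purely illustrative: the function names, the toy three-dimensional embeddings and the 0.85 threshold are all assumptions for demonstration, not details of the Met's deployment.

```python
# Illustrative sketch only -- NOT NEC's actual system. LFR pipelines
# typically turn each face into an embedding vector, then compare it
# against a watchlist using a similarity score and a tuned threshold.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(live_embedding, watchlist, threshold=0.85):
    """Return the best watchlist match above the threshold, or None.

    `watchlist` maps a suspect ID to a stored embedding. The threshold
    (0.85 here is an arbitrary example) sets the trade-off between
    false alerts and missed matches; real deployments tune it carefully.
    """
    best_id, best_score = None, threshold
    for suspect_id, stored in watchlist.items():
        score = cosine_similarity(live_embedding, stored)
        if score >= best_score:
            best_id, best_score = suspect_id, score
    return best_id

# Toy example: a live face very close to suspect A's stored embedding
watchlist = {"A": [0.9, 0.1, 0.4], "B": [0.1, 0.9, 0.2]}
print(check_against_watchlist([0.88, 0.12, 0.41], watchlist))  # prints "A"
```

The threshold is the key operational dial: the Met's reported accuracy figures depend on where Scotland Yard sets it, since raising it reduces false alerts at the cost of missing genuine matches.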
Brown had a match. Uniformed officers standing across the road received the same alert and they all dashed to converge on the suspect, who made a vain run for it and then began to fight hard. A pair of officers jumped on his back and several more came in on top as they brought him down, amid shouting and consternation from bystanders. One officer emptied the man’s pockets: a lighter and what looked like a pair of scissors. He was subdued, arrested and taken away in a police van.
Street signs warned that the system was scanning the face of every pedestrian, but the suspect was one of many passersby oblivious to the torrent of their personal data being captured and checked, creating a digital dragnet.
Within minutes Brown’s device was buzzing again with matches at the other end of the street, and on it went. On this weekday morning the AI-enabled system triggered 19 alerts, resulting in nine arrests for crimes including rape, shoplifting and breach of court orders. Another man was stopped, not because he was wanted for a crime, but to check he was adhering to the terms of a court order, and another was held for a while, he said, because he had a criminal record and was on probation.
“This is a bit over the top, isn’t it?” he said as three officers crowded him against a wall to check he was adhering to his probation terms. He had had no idea LFR was in operation and said: “I’m not coming to Croydon again. It’s mad. I am all registered.”
Scotland Yard has trumpeted the effectiveness of the technology at catching people wanted for violence against women and girls, with 2,100 such arrests made with the help of facial recognition since the start of 2024, as well as more than 100 sex offenders. In one case the LFR cameras detected a registered sex offender, who was required not to be around children, alone with a six-year-old girl. He was arrested and jailed for two years.
A widespread public concern is the risk of racial bias, after early models showed concerning results. But the Met has said independent testing by the National Physical Laboratory found that, at the threshold Scotland Yard sets to determine a match, the system was accurate and balanced with regard to ethnicity and gender.
It said that in 2025 there were just 12 false alerts out of more than 3 million faces captured.
Public opinion in Croydon town centre ran the gamut from strong opposition to strong support.
“It’s not good to have your face scanned,” said Maleek Ife, 36, a delivery driver. “It’s a violation of privacy. To scan everyone? I’m not happy with that. It’s not what I believe the UK is about. You should respect people’s privacy. You should be able to walk around freely. I don’t think we should become a surveillance state. We are going to become like the [countries around the world] we criticise.”
“It’s good,” said Sam Mensah, 53, a supermarket worker. “If someone is doing something bad you can catch that person immediately. I am not worried about being scanned. I have no issue [to hide].”
Owen Brown, 63, a carer, saw the cameras as just another part of a wider slide into digital tracking. “The way life is moving now they track you through your phone anyway,” he said. “There’s nowhere you can go without being scanned or looked at. It’s invasive, but what can you do about it? It’s part of life now.”