The Centre for Retail Research has produced an overview of crime statistics in the UK which shows a continued increase in the level and type of crime facing the retail sector.

https://www.retailresearch.org/crime-costs-uk.html

There are several official and unofficial surveys of retail crime in the UK, including the Home Office’s Crime Against Business 2018, the British Retail Consortium’s Retail Crime Survey 2018, the Association of Convenience Stores’ The Crime Report 2019, as well as Sensormatic’s Global Shrink Index and Checkpoint’s Global Retail Theft Barometer. Each one has a different methodology and serves a slightly different purpose.

We have consolidated this information into a simple infographic focusing on the major crime and violence issues facing UK retailers.

Download the infographic guide:

FaceWatch_Infographic_A5 v6-2


As the world’s first case against the controversial technology concluded, two leading judges dismissed the case brought by human rights campaign group Liberty on behalf of Ed Bridges, a Cardiff resident whose face was scanned by South Wales Police during a trial of facial recognition.

Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that South Wales Police’s use of live facial recognition “met the requirements of the Human Rights Act”. In a three-day hearing in May, Mr Bridges’ lawyers had argued that South Wales Police violated his human right to privacy by capturing and processing an image taken of him in public.

The judges also ruled that existing data protection law offered sufficient safeguards for members of the public whose faces were scanned by facial recognition cameras, and that South Wales Police had considered the implications.

Liberty lawyer Megan Goulding said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

A spokesperson for the Information Commissioner’s Office said: “We will be reviewing the judgment carefully.

“We welcome the court’s finding that the police use of Live Facial Recognition (LFR) systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”

Published in The Observer, by Tom Chivers, Sun 4 Aug 2019 09.00 BST

The technology is helping to combat crimes police no longer deal with, but its use raises concerns about civil liberties

Paul Wilks runs a Budgens supermarket in Aylesbury, Buckinghamshire. Like most retail owners, he’d had problems with shoplifting – largely carried out by a relatively small number of repeat offenders. Then a year or so ago, exasperated, he installed something called Facewatch. It’s a facial-recognition system that watches people coming into the store; it has a database of “subjects of interest” (SOIs), and if it recognises one, it sends a discreet alert to the store manager. “If someone triggers the alert,” says Paul, “they’re approached by a member of management, and asked to leave, and most of the time they duly do.”

Facial recognition, in one form or another, is in the news most weeks at the moment. Recently, a novelty phone app, FaceApp, which takes your photo and ages it to show what you’ll look like in a few decades, caused a public freakout when people realised it was made by a Russian company and decided it was using their faces for surveillance. (It appears to have been doing nothing especially objectionable.) More seriously, the city authority in San Francisco has banned the use of facial-recognition technologies by the police and other government agencies, and the House of Commons Science and Technology Committee has called for British police to stop using it as well until regulation is in place, though the then home secretary (now chancellor), Sajid Javid, said he was in favour of trials continuing.

Paul Wilks, owner of Wilks Budgens

The Wilks Budgens store

There is a growing demand for the technology in shops, with dozens of companies selling retail facial-recognition software – perhaps because, in recent years, it has become pointless to report shoplifting to the police. Budgets for policing in England have been cut in real terms by about 20% since 2010, and a change in the law in 2014, whereby shoplifting of goods below a value of £200 was made a summary offence (i.e. less serious, not to be tried by a jury), meant police directed time and resources away from shoplifting. The number of people being arrested and charged has fallen dramatically, and less than 10% of shoplifting incidents are now reported. The British Retail Consortium trade group estimates that £700m is lost annually to theft. Retailers are looking for other methods, and the rapid improvement in AI technologies, together with the dramatic fall in their cost, means facial recognition is now viable as one of them.

“The systems are getting better year on year,” says Josh Davis, a psychologist at the University of Greenwich who works on facial recognition in humans and AIs. The US National Institute of Standards and Technology assesses the state of facial recognition every year, he says, and the ability of the best algorithms to match a new image to a face in a database improved 20-fold between 2014 and 2018. And, analogously with Moore’s law – the observation that computer processing power doubles roughly every two years – the cost falls annually as well.

In ideal environments such as airport check-ins, where the face is straight on and well lit and the camera is high-quality, AI face recognition is now better than human, and has been since at least 2014. In the wild – with the camera looking down, often poorly lit and lower-definition – it’s far less effective, says Prof Maja Pantic, an AI researcher at Imperial College London. “It’s far from the 99.9% you get with mugshots,” she says. “But it is good, and moving relatively fast forward.”


Each algorithm is different, but fundamentally, they work the same way. They are given large numbers of images of people and are told which ones are the same people; they then analyse those images to pick out the features that identify them. Those features are not things like “size of ear” or “length of nose”, says Pantic, but something like textures: the algorithm assesses faces by gradients of light and dark, which allow it to detect points on the face and build a 3D image. “If you grow a beard or gain a lot of weight,” she says, “very often a passport control machine cannot recognise you, because a large part of the texture is different.”
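To make the matching step concrete, here is a minimal Python sketch of how a recognition pipeline compares faces once those learned features have been extracted. The embed_face() function it assumes, the cosine-similarity measure and the 0.6 threshold are all illustrative choices, not details of any particular product.

```python
import numpy as np

# A hypothetical embed_face(image) -> np.ndarray is assumed: a learned
# mapping from an aligned face image to a fixed-length feature vector,
# trained on large labelled datasets as described above.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray,
               database: dict[str, np.ndarray],
               threshold: float = 0.6) -> str | None:
    """Return the identity of the closest enrolled embedding, or None
    if nothing is similar enough to count as a match."""
    best_id, best_score = None, threshold
    for identity, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```

Because the whole comparison reduces to distances between vectors, a beard or significant weight gain changes the extracted features and can push a genuine match below the threshold, which is the failure mode Pantic describes.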

But while the algorithms are understood at this quite high level, the specific things that they use to identify people are not and cannot be known in detail. It’s a black box: the training data goes into the algorithm, sloshes around a bit, and produces very effective systems, but the exact way it works is not clear to the developer. “We don’t have theoretical proofs of anything,” says Pantic. The problem is that there is so much data: you could go into the system and disentangle what it was doing if it had looked at a few tens of photos, perhaps, or a few hundred, but when it has looked at millions, each containing large amounts of data itself, it becomes impossible. “The transparency is not there,” she says.

Still, neither she nor Davis is unduly worried about the rise of facial recognition. “I don’t really see what the big issue is,” Pantic says. Police prosecutions at the moment often rely on eyewitnesses, “who say ‘sure, that’s him, that’s her’, but it’s not”: at least facial recognition, she says, can be more accurate. She is concerned about other invasions of privacy, of intrusions by the government into our phones, but, she says, facial recognition represents a “fairly limited cost of privacy” given the gains it can provide, and given how much privacy we’ve already given up by having our phones on us all the time. “The GPS knows exactly where you are, what you’re eating, when you go to the office, whether you stayed out,” she says. “The faces are the cherry on top of the pie, and we talk about the cherry and forget about the pie.”

As with all algorithmic assessment, there is reasonable concern about bias. No algorithm is better than its dataset, and – simply put – there are more pictures of white people on the internet than there are of black people. “We have less data on dark-skinned people,” says Pantic. “Large databases of Caucasian people, not so large on Chinese and Indian, desperately bad on people of African descent.” Davis says there is an additional problem, that darker skin reflects less light, providing less information for the algorithms to work with. For these two reasons algorithms are more likely to correctly identify white people than black people. “That’s problematic for stop and search,” says Davis. Silkie Carlo, the director of the not-for-profit civil liberties organisation Big Brother Watch, describes one situation where an 18-year-old black man was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.

That said, the Facewatch facial-recognition system is, at least on white men under the highly controlled conditions of their office, unnervingly good. Nick Fisher, Facewatch’s CEO, showed me a demo version; he walked through a door and a wall-mounted camera in front of him took a photo of his face; immediately, an alert came up on his phone (he’s in the system as an SOI, so he can demonstrate it). I did the same thing, and it recognised me as a face, but no alert was sent and, he said, the face data was immediately deleted, because I was not an SOI.
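The behaviour of that demo can be described as a simple alert-or-delete pipeline. The sketch below is a guess at the flow under the description above, reusing the hypothetical match_face() from the earlier sketch; none of these names come from Facewatch.

```python
def process_capture(face_image, watchlist, notify_manager):
    """Sketch of the flow described above: match a captured face
    against the store's SOI watchlist, alert the manager on a match,
    and retain nothing for anyone who is not a subject of interest."""
    probe = embed_face(face_image)          # assumed embedding function
    identity = match_face(probe, watchlist)
    if identity is not None:
        notify_manager(identity)            # discreet alert to staff
    # probe is discarded when the function returns; no biometric data
    # is stored for non-SOIs, matching the behaviour described above.
```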

Facewatch are keen to say that they’re not a technology company themselves – they’re a data management company. They provide management of the watch lists in what they say is compliance with the European General Data Protection Regulation (GDPR). If someone is seen shoplifting on camera or by a staff member, their image can be stored as an SOI; if they are then seen in that shop again, the shop manager will get an alert. GDPR allows these watch lists to be shared in a “proportionate” way; so if you’re caught on camera like this once, it can be shared with other local Facewatch users. In London, says Fisher, that would be an eight-mile radius. If you’re seen stealing repeatedly in many different cities, it could proportionately be shared nationwide; if you’re never seen stealing again, your face is taken off the database after two years.
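As a rough illustration of those retention and sharing rules, a watch-list entry might be modelled as below; the field names, the sharing tiers and the two-year lapse logic are assumptions drawn only from the description above, not Facewatch’s actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

RETENTION = timedelta(days=2 * 365)   # entries lapse after two years

@dataclass
class SubjectOfInterest:
    face_id: str
    enrolled_at: datetime
    # Times and rough locations of reported incidents.
    incidents: list[tuple[datetime, str]] = field(default_factory=list)

    def expired(self, now: datetime) -> bool:
        """An entry lapses two years after the last reported incident."""
        last = max((t for t, _ in self.incidents), default=self.enrolled_at)
        return now - last > RETENTION

    def sharing_radius_miles(self) -> float:
        """'Proportionate' sharing: a single incident is shared locally
        (e.g. an eight-mile radius in London); repeat incidents in
        several places could be shared nationwide."""
        places = {place for _, place in self.incidents}
        return float("inf") if len(places) > 1 else 8.0
```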

Carlo is not reassured: she says that it involves placing a lot of trust in retail companies and their security staff to use this technology fairly. “We’re not talking about police but security staff who aren’t held to the same professional standards. They get stuff wrong all the time. What if they have an altercation [with a customer] or a grievance?” The SOI database system, she says, subverts our justice system. “How do you know if you’re on the watch list? You’re not guilty of anything, in the legal sense. If there’s proof that you’ve committed a crime, you need to go through the criminal justice system; otherwise we’re in a system of private policing. We’re entering the sphere of pre-crime.”

Fisher and Facewatch, though, argue that it is not so unlike the age-old practice of shops and bars having pictures up in the staff room of regular troublemakers. The difference, they say, is that it is not relying on untrained humans to spot those troublemakers, but a much more accurate system.

The problem is that, at the moment, there is very little regulation – other than GDPR – governing what you can and can’t do with a facial-recognition system. Facewatch say, loudly and often, that they want regulation, so they know what they are legally allowed to do. On the other hand, Carlo and Big Brother Watch, along with other civil liberties groups, want an urgent moratorium and a detailed democratic debate about the extent to which we are happy with technologies like these in our lives. “Our politicians don’t seem to be aware that we’re living through a seismic technological revolution,” she says. “Jumping straight to legislation and ‘safeguards’ is to short-circuit what needs to be a much bigger exercise.”

Either way, it needs to happen fast. In Buckinghamshire, Paul Wilks is already using the technology in his Budgens, and is finding it makes life easier. Before he introduced the system, his shop would have things stolen every day or two; since then, it has become much less common. “There’s definitely been a reduction in unknown losses, and a reduction in disruptive incidents,” he says. As well as the financial gain, his staff feel safer, especially late at night, “which is good for team morale”. If enough retailers start using facial-recognition technology before the government takes notice, then we may find that the democratic discussion has been short-circuited already.


Sajid Javid, the Home Secretary, has backed trials of face recognition by the Metropolitan Police. The trials will be used to test AFR (automatic facial recognition) to help in the fight against child abuse.

Speaking at the launch of new computer technology aimed at helping police fight online child abuse, Mr Javid said it was right for forces “to be on top of the latest technology”.

He added:

“I back the police in looking at technology and trialling it and… different types of facial recognition technology is being trialled especially by the Met at the moment and I think it’s right they look at that.”

The full report can be read on the BBC website.

The independent panel that advises City Hall on the ethics of policing has set out new guidelines on how facial recognition technology should be used by the Met Police in the capital.