It’s a perfect time for our customers and Accredited Installers to watch all the latest news face-to-face on Facewatch TV.

The first series of video updates is delivered by our technical guru and all-round Facewatch expert, George Gordon.

Broadcasting from his home, George will bring you weekly tech tips and information about installing, running and getting the best from your installation.



By Retail Week, 18 March 2020

As the government takes increasingly stringent action to combat the ongoing coronavirus outbreak, retail staff have found themselves on the front line in the face of panic buying and contagion fears.

  • Labour MP Alex Norris calls on government to pass legislation protecting shopworkers from violence, abuse and assault
  • Morrisons CEO David Potts introduces numerous measures to protect staff, including statutory sick pay
  • Several retailers have called on consumers to treat colleagues with respect in the face of growing concerns around stockpiling

The number of confirmed cases of coronavirus in the UK is rising daily. The government by its own admission has begun imposing “draconian” measures and many consumers have ignored pleas to the contrary and cleared shelves of some products in a stockpiling frenzy.

While panic buying has mostly taken place in supermarkets and grocery convenience stores, instances of consumers hoarding hand sanitiser, soap and over-the-counter medicines have also hit health and beauty retailers.

Social media, Twitter in particular, has started to fill up over the last few days with stories of frontline retail staff working longer hours and coming face to face with fraught and sometimes abusive members of the public, all the while trying to do their best to keep shelves stocked and consumers happy.

On Monday, Labour MP Alex Norris stood up in Parliament and put forward legislation to protect shopworkers from rising levels of violence, abuse and assault.

Norris said the shopworkers were “on the front line” of abuse and crime, and those worries were likely to be exacerbated amid the growing panic about coronavirus, given the “significance retail workers have in our lives, particularly during this period”.

“With the current coronavirus crisis we would argue that retail staff are essential workers”

Paddy Lillis, Usdaw

Shopworkers’ union Usdaw has backed the call and said “retail staff are essential to our communities, particularly during the coronavirus crisis”.

Usdaw general secretary Paddy Lillis says: “We have always made the case that retail staff are at the heart of our communities, but with the current coronavirus crisis we would argue that they are essential workers.

“Usdaw members across the retail supply chain and in stores are working hard to keep shelves filled and serve customers. We understand this is a stressful time and remind customers that shopworkers deserve respect and that no level of abuse is ever acceptable. It should never be a part of the job.”

The BRC says it is working with police and partners to “keep retail sites running as smoothly as possible” and that “when circumstances are difficult, retailers are well-versed in providing effective security measures”.

In a statement issued to all Morrisons’ stakeholders yesterday, chief executive David Potts agreed and called on consumers to “treat our colleagues on the front line with the greatest respect”.


Morrisons boss David Potts asked customers to ‘treat our colleagues on the front line with the greatest respect’

Potts also called on customers to “please consider others even more so everyone can buy what they need, especially those who are most vulnerable in our society”.

A spokesman for another national grocer said there had been a handful of incidents across its estate, but it had not needed to take on extra security guards.

While some retail staff have faced abuse from consumers, others are also struggling with worries about getting the disease themselves – either from customers or from colleagues.

It is becoming a growing concern for businesses that frontline staff, as well as those working in key supply chain roles such as warehouse workers and delivery drivers, will fall sick or be forced to self-isolate as the virus continues to spread.

Earlier this week, in a call between representatives of Defra, supermarket chains and the wider food industry, the possibility was raised of seconding thousands of hospitality workers to food supply chains, according to BuzzFeed News. While such a move could carry its own risk of spreading the virus, it would at least safeguard vital jobs and supply lines in the sector.

‘Amazing group of people’

Protecting staff from spreading the disease is becoming a top priority. The managing director of one high street food and beverage operator told Retail Week his staff are deep cleaning premises three times a day. Under normal circumstances, they would be deep cleaned twice in a month.

A spokeswoman from the Co-op says it has taken “immediate steps” to safeguard staff including building in “additional working hours for store colleagues to undertake more frequent handwashing throughout the day”.

Morrisons and Boots are among those to have implemented measures designed to enhance hygiene and staff safety.

A Boots spokeswoman says it has been “making sure that our stores, pharmacies, opticians and hearing care facilities are all clean and hygienic”. She also says teams in-store “have access to handwashing facilities and sanitiser”.


Boots has ensured staff have access to handwashing facilities and sanitiser

Morrisons yesterday announced a slew of measures designed to protect staff. In order to reduce the handling of cash by shopworkers, the grocer has asked customers to pay by card or smartphone “where possible”.

The grocer has “been issuing hand sanitiser” to all checkout workers in-store, significantly increased cleaning on surfaces that consumers and staff touch and redeployed staff “who are vulnerable to the virus”.

The retailer has also taken measures to protect staff who either fall ill from the virus or are forced to self-isolate and therefore can’t work by creating a ‘colleague hardship fund’. This will ensure all staff affected by the virus receive sick pay “whether or not they would normally be eligible”.

As the retail sector waits to see what measures will be brought in by the government next, many in the industry are rallying around frontline workers in these uncertain times.

Timpson chief executive James Timpson today described employees as “an amazing group of people who I’m going to need to lean on heavily over the coming weeks and months to keep the show on the road”. Other retail leaders will heartily agree and be doing their best on behalf of their people.

Less than a month ago, Vista CCTV became the newest partner with Facewatch to distribute this game-changing technology to Vista Priority Partners (VPPs). In this series of Face to Face videos, Nick Fisher, CEO of Facewatch, and Dean Kernot, Sales and Marketing Manager at Vista CCTV, go face to face in conversation to explore the opportunity.

In a wide-ranging discussion, with probing questions from Dean and straight-talking answers from Nick, this huge change to the security landscape is explored. Fundamentally, Facewatch facial recognition is fast becoming the acceptable, affordable and compliant solution for any business wanting to deter crime and anti-social behaviour while providing a safer and more customer-orientated environment for visitors and staff.

The full Face to Face film:


The Facewatch story


The Problem Facewatch Solves


The Vista VPP program


The Accredited Partner Program


How do watch lists work?


Why Facewatch?



The Centre for Retail Research has produced an overview of crime statistics in the UK which shows a continued increase in the level and type of crime facing the retail sector.

https://www.retailresearch.org/crime-costs-uk.html

There are several official and unofficial surveys of retail crime in the UK, including the Home Office Crime Against Business 2018, the British Retail Consortium’s Retail Crime Survey 2018, the Association of Convenience Stores’ The Crime Report 2019, as well as Sensormatic’s Global Shrink Index and Checkpoint’s Global Retail Theft Barometer. Each one has a different methodology and covers a slightly different purpose.

We have consolidated this information into a simple infographic focusing on the major crime and violence issues facing UK retailers.

Download the infographic guide:

FaceWatch_Infographic_A5 v6-2

David Davies, technical director of DVS, goes on camera to explain the merits of Facewatch to their customers. DVS was appointed by Facewatch just a few weeks ago to distribute Facewatch to the UK CCTV and security reseller market.

DVS will provide sales, support and training for Facewatch under the newly announced Facewatch Accredited Solutions Partner (ASP) programme. This ensures that CCTV and security resellers will be able to install and support Facewatch in retail stores across the UK with total confidence:



For more information on how to become a Facewatch ASP, please call DVS or the Facewatch team:

www.dvs.co.uk

By Charles Hymas

Home Affairs Editor (The Daily Telegraph)

https://www.telegraph.co.uk/authors/charles-hymas/


When you enter Paul Wilks’ supermarket in Aylesbury, a facial recognition camera by the door snaps your image and then checks it against a “watchlist” of people previously caught shoplifting or abusing staff.

If you are on it, managers receive a “ping” alert on their mobile phones from Facewatch, the private firm that holds the watchlists of suspects, and you will be asked to leave or monitored if you decide you want to walk around the store.

This is not some Big Brother vision of the future but Budgens in Buckinghamshire in Britain 2019.

It is also stark evidence of the way that Artificial Intelligence (AI) technology is spreading without regulation potentially intruding on our personal privacy.

For Mr Wilks, it has been a success. Since he introduced it at his 3,000 square foot store a year ago, he says shoplifting has fallen from ten to two incidents a week. The thousands he has saved have more than paid for the technology.

“As well as stopping people, it’s a deterrent. Shoplifters know about it,” says Mr Wilks, who has a prominent poster warning customers they will be subject to facial recognition. “As retailers, we have to find ways to counteract what is going on.”

As the retail sector loses £700 million a year to thefts, Facewatch gives store owners a “self-help” solution to the reluctance of police to investigate petty shoplifting.

It is the brainchild of businessman Simon Gordon, whose own London wine bar, Gordon’s, was plagued by pickpockets. Using AI technology provided by Intel, a US multinational, he has bold ambitions to have 5,000 high-resolution facial recognition cameras in operation across the UK by 2022.

His firm is close to a deal with a major UK supermarket chain and already has cameras being used or trialled in 100 stores, garages and other retail outlets.

The lenses are mounted by entry doors to catch a full clean facial image, which is sent to a computer that extracts biometric information to compare it to faces in a database of suspects.

Facewatch says there must be a 99 per cent match before an alert is sent to store staff and, in consultation with the Information Commissioner, Elizabeth Denham, it has introduced privacy safeguards, including the immediate automatic deletion of images of “innocent” faces.

“When CCTV first came out 25 or 30 years ago, people thought it was the end of the world, Big Brother and 1984,” says Stuart Greenfield, a Facewatch executive. “Now there are six million cameras in London. People either think they are not working or are there to stop terrorists. No-one really worries about it. Facial recognition is the same. Facebook, Instagram and the airports are using it. It is here to stay but it has to be regulated. Everything needs to be controlled because every technology can be used negatively.”

And there’s the rub. MPs, experts and watchdogs, such as the Information Commissioner, Elizabeth Denham, and the biometrics commissioner, Paul Wiles, are concerned facial recognition technology is becoming established, if not widespread, with little public debate or regulatory scrutiny. They point to critical questions yet to be resolved.

When should facial technology surveillance be used, in what circumstances and under what conditions? And should consent be required before it is deployed?

Judges in a test case against its use by South Wales Police ruled taking a biometric image of a face is as intrusive as a fingerprint or DNA swab. More significantly, unlike with fingerprints or a swab, people have no choice about whether, where or when their biometric image is snapped.

South Wales Police are thought to have scanned more than 500,000 faces since first deploying facial recognition cameras during the Champions League Final at Cardiff’s Millennium Stadium in June 2017. The Met Police and Leicestershire police have scanned thousands more in their “trials.”

The test case in South Wales was brought by Ed Bridges, a former LibDem councillor, who noticed the cameras when he went out to buy a lunchtime sandwich. He said taking “biometric information without his consent” was a breach of his privacy when he was acting lawfully in a public place.

The judges, however, ruled use of the technology was “proportionate.” They said it was deployed in an “open and transparent” way, for a limited time to identify particular individuals of “justifiable interest” to the police and with publicity campaigns to alert the public in advance.

However, Mr Wiles, the biometrics commissioner, is not convinced that this test case alone should be taken as sufficient justification for a roll-out of police use of facial recognition. “I am not disagreeing with the South Wales Police judgement. What South Wales Police did was lawful,” he says.

“Some uses of Automated Face Recognition in public places when highly targeted – for example scanning the faces of people going into a football match against watchlists of those known to cause trouble in football matches in the past – that is arguably in the public interest.

“However, scanning everyone walking down the street against a watchlist of people you would like to arrest seems to be a bit more difficult because it gets near mass surveillance. I don’t think in this country we have ever really wanted to see police using mass surveillance. That’s what the Chinese are doing with facial recognition. There are some lines between legitimate use to protect people who have committed crimes against a rather different use. It is not for me to say where the line is. Nor should it be the police who say where it is. That’s the debate we are not having. I feel it is frustrating that ministers are not talking about it. And before we ask Parliament to decide, we need to have a public debate.”

Cases have already emerged where Mr Wiles’s line appears to have been crossed. Last year, the Trafford Centre in Manchester had to stop using live facial recognition cameras after the Surveillance Camera Commissioner intervened. Up to 15 million people were scanned during the operation.

At Sheffield’s Meadowhall shopping centre, some two million people are thought to have been scanned in secret police trials last year, according to campaign group Big Brother Watch.

The privately owned King’s Cross estate in London has also had to switch off its facial recognition cameras after their use became public. It later emerged the Met Police had shared images of suspects with the property firm without anyone’s consent, and apparently without senior officers or the mayor’s office knowing.

Liverpool’s World Museum scanned visitors with facial recognition cameras during its exhibition, “China’s First Emperor and the Terracotta Warriors” in 2018, while Millennium Point conference centre in Birmingham – a scene of demonstrations by trade unionists, football fans and anti-racism campaigners – has deployed it “at the request of law enforcement.”

The Daily Telegraph has revealed Waltham Forest council in London has trialled facial recognition cameras on its streets without residents’ consent and even that AI and facial expression technology is being used for the first time in job interviews to identify the best UK candidates.

As its use widens, one key issue is its reliability and accuracy. The success of the technology’s algorithms in matching a face is improving and is good when there is a high-quality image – as at UK passport control – but less effective with CCTV images that do not always give a clear view.

The US National Institute of Standards and Technology which assesses the ability of algorithms to match a new image to a face in a database estimates it has improved 20-fold between 2014 and 2018.

However, it is by no means infallible. In South Wales, police started in 2017 at a Champions League match with a “true positive” rate – where it got an accurate match with a “suspect” on its database – of just three per cent. This rose to 46 per cent when deployed at the Six Nations rugby last year.

Across all events where deployed, there were 2,900 possible matches of suspects generated by the facial recognition system, of which only 144 were confirmed “true positives” by operators; 2,755 were “false positives,” according to the analysis by Cardiff University. Eighteen people were arrested.
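Taken at face value, the Cardiff University counts quoted above imply that only about one alert in twenty was a genuine match. A quick check of the arithmetic (using only the figures reported above):

```python
# Figures from the Cardiff University analysis quoted above.
possible_matches = 2900   # alerts generated by the facial recognition system
true_positives = 144      # confirmed as correct by operators
false_positives = 2755    # rejected by operators

# Precision: what fraction of alerts were real matches?
precision = true_positives / possible_matches
print(f"precision: {precision:.1%}")  # roughly 5%

# Equivalently, the number of false alerts generated per genuine one.
ratio = false_positives / true_positives
print(f"false positives per true positive: {ratio:.1f}")
```

(The reported counts of 144 and 2,755 sum to 2,899 rather than 2,900, presumably a rounding artefact in the original analysis.)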

The researchers found performance fell as light faded and was less accurate if faces were obscured by clothing, glasses or jewellery. They concluded it should be viewed as “assisted” rather than “automated” facial recognition, as the decision on whether there was a match was a police officer’s.

Professor Peter Fussey, from Essex University, who reviewed the Met Police trials of the technology, said only eight of the 42 “matches” that they saw thrown up by the technology were accurate.

Sixteen were instantly rejected in the van as it was clear they were the “wrong ethnicity, wrong sex or ten years younger,” he said. Four were then lost in the crowd, which left 22 suspects who were approached by a police officer in the street to show their ID or be fingerprinted on a mobile device.

Of these, 14 were inaccurate and just eight were correct.

In the febrile world of facial recognition, how you measure success is a source of debate. It could be argued you have high 90 per cent-plus accuracy given the cameras scan thousands of faces to pick out the 42. Or you can measure it according to the ratio of “false positives” to accurate matches.
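The gap between the two framings is easy to see with a worked example. Using the Met figures quoted earlier (42 alerts, 8 correct) and a purely hypothetical total of 10,000 faces scanned (a number not given in the reporting), and assuming everyone the system did not flag was correctly left alone:

```python
scanned = 10_000   # hypothetical total faces scanned (illustrative assumption)
alerts = 42        # matches thrown up by the technology
correct = 8        # alerts that turned out to be accurate

false_alerts = alerts - correct        # 34 people wrongly flagged
true_negatives = scanned - alerts      # assumes no missed suspects among the rest

# Framing 1: overall accuracy across all scans -- dominated by the
# thousands of people correctly not flagged, so it looks near-perfect.
accuracy = (correct + true_negatives) / scanned
print(f"accuracy:  {accuracy:.1%}")   # 99.7%

# Framing 2: precision -- of the people actually flagged, how many
# were right? This is the number the person stopped in the street cares about.
precision = correct / alerts
print(f"precision: {precision:.1%}")  # 19.0%
```

Both numbers describe the same deployment; the debate is over which one honestly represents its performance.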

On human rights grounds, Professor Fussey said his concern was consent and legitimacy. In Berlin and Nice, the trials have been conducted where volunteers have signed up to be “members of the public” to test the facial recognition technology.

By contrast, in the Met police trial in Romford, he saw one young man targeted after being seen to avoid the facial recognition cameras which were signposted as in operation. “He was stopped and searched and had a small quantity of cannabis on him and was arrested,” he said.

He may have acted suspiciously by trying to avoid the camera but he was not a suspect on any “watch list,” nor had he consented to take part in the “trial.”

“For me, one of the big issues is around the definition of ‘wanted’,” said Professor Fussey. “There is ‘wanted by the courts’ where someone has absconded and there is judicial oversight. Then there is ‘wanted by the police’ which is wanted for questioning or wanted on the basis of suspicion.”

South Wales police was careful to prepare four watchlists: red – those who posed a risk to public safety, amber – offenders with previous convictions, green – those whose presence did not pose any risk, blue – police officers’ faces to test the system.

However, human rights campaigners cite as an example police databases that hold the images of 21 million innocent people who have been arrested or in custody but never convicted.

It has been ruled such databases are illegal but so great is the task of processing and deleting them that progress on doing so has stalled.

So concerned is the Commons science and technology committee that in its recent report on face recognition it called for a moratorium on its use until rules on its deployment are agreed.

Professor Wiles sums it up:

“There are some uses that are not in the public interest. What that raises is who should make that decision about those uses. The one thing I am clear about is that the people who want to use facial recognition technology should not be the people who make that decision. It ought to be decided by a body that represents the public interest and the most obvious one is Parliament. There should be governance backed by legislation. Parliament should decide, yes, this is in the public interest provided these conditions are met. We have done that with DNA. Parliament said it’s in the public interest that the police can derive profiles of individuals from DNA but it’s not in the public interest that police could keep the samples. You can tell a lot more about a person from a sample. It’s important because if you get it wrong, the police will lose public trust and in Britain, we have a historic tradition of policing by consent.”

Ms Denham, the Information Commissioner, is to publish the results of her investigation into facial recognition, which she says should only be deployed where there is “demonstrable evidence” that it is “necessary, proportionate and effective.”

She has demanded police and other organisations using facial technology must ensure safeguards are in place including assessments of how it will affect people before each deployment and a clear public policy on why it is being used.

“There remain significant privacy and data protection issues that must be addressed and I remain deeply concerned about the rollout of this technology,” she says.

‘We are hurtling towards a surveillance state’: the rise of facial recognition technology

It can pick out shoplifters, international criminals and lost children in seconds. But as the cameras proliferate, who’s watching the watchers?

 ‘If you’ve got something to be worried about, you should probably be worried.’ Photograph: Lol Keegan/The Guardian. Cameras supplied by dynamic-cctv.com

Gordon’s wine bar is reached through a discreet side-door, a few paces from the slipstream of London theatregoers and suited professionals powering towards their evening train. A steep staircase plunges visitors into a dimly lit cavern, lined with dusty champagne bottles and faded newspaper clippings, which appears to have had only minor refurbishment since it opened in 1890. “If Miss Havisham was in the licensing trade,” an Evening Standard review once suggested, “this could have been the result.”

The bar’s Dickensian gloom is a selling point for people embarking on affairs, and actors or politicians wanting a quiet drink – but also for pickpockets. When Simon Gordon took over the family business in the early 2000s, he would spend hours scrutinising the faces of the people who haunted his CCTV footage.

“There was one guy who I almost felt I knew,” he says. “He used to come down here the whole time and steal.” The man vanished for a six-month stretch, but then reappeared, chubbier, apparently after a stint in jail. When two of Gordon’s friends visited the bar for lunch and both had their wallets pinched in his presence, he decided to take matters into his own hands. “The police did nothing about it,” he says. “It really annoyed me.”

Gordon is in his early 60s, with sandy hair and a glowing tan that hints at regular visits to Italian vineyards. He makes an unlikely tech entrepreneur, but his frustration spurred him to launch Facewatch, a fast-track crime-reporting platform that allows clients (shops, hotels, casinos) to upload an incident report and CCTV clips to the police. Two years ago, when facial recognition technology was becoming widely available, the business pivoted from simply reporting into active crime deterrence. Nick Fisher, a former retail executive, was appointed Facewatch CEO; Gordon is its chairman.

Gordon installed a £3,000 camera system at the entrance to the bar and, using off-the-shelf software to carry out facial recognition analysis, began collating a private watchlist of people he had observed stealing, being aggressive or causing damage. Almost overnight, the pickpockets vanished, possibly put off by a warning at the entrance that the cameras are in use.

The company has since rolled out the service to at least 15 “household name retailers”, which can upload photographs of people suspected of shoplifting, or other crimes, to a centralised rogues’ gallery in the cloud. Facewatch provides subscribers with a high-resolution camera that can be mounted at the entrance to their premises, capturing the faces of everyone who walks in. These images are sent to a computer, which extracts biometric information and compares it to faces in the database. If there’s a close match, the shop or bar manager receives a ping on their mobile phone, allowing them to monitor the target or ask them to leave; otherwise, the biometric data is discarded. It’s a process that takes seconds.
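The capture-embed-compare-alert loop described above can be sketched in a few lines. This is an illustrative reconstruction, not Facewatch's actual code: the cosine-similarity comparison, the embedding representation and the 0.99 threshold (taken from the "99 per cent match" figure Facewatch has cited) are all assumptions.

```python
from __future__ import annotations

import numpy as np

MATCH_THRESHOLD = 0.99  # assumed from the "99 per cent match" figure


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_visitor(embedding: np.ndarray, watchlist: dict[str, np.ndarray]):
    """Return the watchlist ID of the best match above threshold, else None.

    A None result models the behaviour described above: non-matching
    biometric data is discarded immediately, with nothing stored.
    """
    best_id, best_score = None, 0.0
    for subject_id, stored in watchlist.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_id, best_score = subject_id, score
    if best_score >= MATCH_THRESHOLD:
        return best_id  # would trigger the "ping" alert to a manager's phone
    return None  # no match: embedding is dropped, visitor is not tracked
```

In a real system the embeddings would come from a trained face-recognition model rather than raw pixels, and the threshold would be tuned against the false-positive rates discussed later in this piece.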

Facewatch HQ is around the corner from Gordon’s, brightly lit and furnished like a tech company. Fisher invites me to approach a fisheye CCTV camera mounted at face height on the office wall; he reassures me that I won’t be entered on to the watchlist. The camera captures a thumbnail photo of my face, which is beamed to an “edge box” (a sophisticated computer) and converted into a string of numbers. My biometric data is then compared with that of the faces on the watchlist. I am not a match: “It has no history of you,” Fisher explains. However, when he walks in front of the camera, his phone pings almost instantly, as his face is matched to a seven-year-old photo that he has saved in a test watchlist.

“If you’re not a subject of interest, we don’t store any images,” Fisher says. “The argument that you walk in front of a facial recognition camera, and it gets stored and you get tracked is just …” He pauses. “It depends who’s using it.”

While researching theft prevention, Fisher consulted a career criminal from Leeds who told him that, for people in his line of work, “the holy grail is, don’t get recognised”. This, he says, makes Facewatch the ultimate deterrent. He tells me he has signed a deal with a major UK supermarket chain (he won’t reveal which) and is set to roll out the system across their stores this autumn. On a conservative estimate, Fisher says, Facewatch will have 5,000 cameras across the UK by 2022.

The company also has a contract with the Brazilian police, who have used the platform in Rio de Janeiro.

“We caught the number two on Interpol’s most-wanted South America list, a drug baron,” says Fisher, who adds the system also led to the capture of a male murderer who had been on the run for several years, spotted dressed as a woman at the Rio carnival. I ask him whether people are right to be concerned about the potential of facial recognition to erode personal privacy.

“My view is that, if you’ve got something to be worried about, you should probably be worried,” he says. “If it’s used proportionately and responsibly, it’s probably one of the safest technologies today.”

Unsurprisingly, not everyone sees things this way. In the past year, as the use of facial recognition technology by police and private companies has increased, the debate has intensified over the threat it could pose to personal privacy and marginalised groups.

The cameras have been tested by the Metropolitan police at Notting Hill carnival, at a Remembrance Sunday commemoration, and at the Westfield shopping centre in Stratford, east London. This summer, the London mayor, Sadiq Khan, wrote to the owners of a private development in King’s Cross, demanding more information after it emerged that facial recognition had been deployed there for unknown purposes.

In May, Ed Bridges, a public affairs manager at Cardiff University, launched a landmark legal case against South Wales police. He had noticed facial recognition cameras in use while Christmas shopping in Cardiff city centre in 2018. Bridges was troubled by the intrusion. “It was only when I got close enough to the van to read the words ‘facial recognition technology’ that I realised what it was, by which time I would’ve already had my data captured and processed,” he says. When he noticed the cameras again a few months later, at a peaceful protest in Cardiff against the arms trade, he was even more concerned: it felt like an infringement of privacy, designed to deter people from protesting. South Wales police have been using the technology since 2017, often at major sporting and music events, to spot people suspected of crimes, and other “persons of interest”. Their most recent deployment, in September, was at the Elvis Festival in Porthcawl.

“I didn’t wake up one morning and think, ‘I want to take my local police to court’,” Bridges says. “The objection I had was over the way they were using the technology. The police in this country police by consent. This undermines trust in them.”

 Nick Fisher, CEO of Facewatch, a UK facial-recognition firm that started life as a way to track pickpockets in a London wine bar. Photograph: Karen Robinson/The Guardian

During a three-day hearing, lawyers for Bridges, supported by the human rights group Liberty, alleged the surveillance operation breached data protection and equality laws. But last month, Cardiff’s high court ruled that the trial, backed by £2m from the Home Office, had been lawful. Bridges is appealing, but South Wales police are pushing forward with a new trial of a facial recognition app on officers’ mobile phones. The force says it will enable officers to confirm the identity of a suspect “almost instantaneously, even if that suspect provides false or misleading details, thus securing their quick arrest”.

The Metropolitan police have also been the subject of a judicial review by the privacy group Big Brother Watch and the Green peer Jenny Jones, who discovered that her own picture was held on a police database of “domestic extremists”.

In contrast with DNA and fingerprint data, which normally have to be destroyed within a certain time period if individuals are arrested or charged but not convicted, there are no specific rules in the UK on the retention of facial images. The Police National Database has snowballed to contain about 20m faces, a large proportion of which belong to people who have never been charged with or convicted of an offence. Unlike DNA and fingerprints, this data can also be acquired without a person’s knowledge or consent.

“I think there are really big legal questions,” says Silkie Carlo, director of Big Brother Watch. “The notion of doing biometric identity checks on millions of people to identify a handful of suspects is completely unprecedented. There is no legal basis to do that. It takes us hurtling down the road towards a much more expansive surveillance state.”

Some countries have embraced the potential of facial recognition. In China, which has about 200m surveillance cameras, it has become a major element of the Xue Liang (Sharp Eyes) programme, which ranks the trustworthiness of citizens and penalises or credits them accordingly. Cameras and checkpoints have been rolled out most intensively in the north-western region of Xinjiang, where the Uighur people, a predominantly Muslim ethnic minority, account for nearly half the population. Face scanners at the entrances of shopping malls, mosques and at traffic crossings allow the government to cross-reference with photos on ID cards to track and control the movement of citizens and their access to phone and bank services.

At the other end of the spectrum, San Francisco became the first major US city to ban police and other agencies from using the technology in May this year, with supervisor Aaron Peskin saying: “We can have good policing without being a police state.”

Meanwhile, the UK government has faced harsh criticism from its own biometrics commissioner, Prof Paul Wiles, who said the technology is being rolled out in a “chaotic” fashion in the absence of any clear laws. Brexit has dominated the political agenda for the past three years; while politicians have looked the other way, more and more cameras are being allowed to look at us.

Facial recognition is not a new crime-fighting tool. In 1998, a system called FaceIt, comprising a handful of CCTV cameras linked to a computer, was rolled out to great fanfare by police in the east London borough of Newham. At one stage, it was credited with a 40% drop in crime. But these early systems only worked reliably in the lab. In 2002, a Guardian reporter tried in vain to get spotted by FaceIt after police agreed to add him to their watchlist. He compared the system to a fake burglar alarm on the front of a house: it cuts crime because people believe it works, not because it does.

However, in the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.

The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.

In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution.

“The performance is just incredible,” says Maja Pantic, research director at Samsung AI Centre, Cambridge, and a pioneer in computer vision. “Deep [learning] solved some of the long-standing problems in object recognition, including face recognition.”

Recognising faces is something like a game of snap – only with millions of cards in play rather than the standard deck of 52. As a human, that skill feels intuitive, but it turns out that our brains perform this task in a surprisingly abstract and mathematical way, which computers are only now learning to emulate. The crux of the problem is this: if you’re only allowed to make a limited number of measurements of a face – 100, say – what do you choose to measure? Which facial landmarks differ most between people, and therefore give you the best shot at distinguishing faces?

A deep-learning program (sometimes referred to more ominously as an “agent”) solves this problem through trial and error. The first step is to give it a training data set, comprising pairs of faces that it tries to match. The program starts out by making random measurements (for example, the distance from ear to ear); its guesses will initially be little better than chance. But at each attempt, it gets feedback on whether it was right or wrong, meaning that over millions of iterations it figures out which facial measurements are the most useful. Once a program has worked out how to distil faces into a string of numbers, the algorithm is packaged up as software that can be sent out into the world, to look at faces it has never seen before.
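To make the process above concrete, here is a minimal, hypothetical sketch in Python – not Facewatch’s or NEC’s actual code. A stand-in “embedding” function distils a face image into a short vector of numbers, and two faces are declared a match when their vectors point in nearly the same direction (the real difficulty, skipped here, is learning a projection under which two different photographs of the same person still land close together):

```python
import numpy as np

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """Reduce a face image to a short vector of measurements.

    A real system uses a trained deep network here; this stand-in
    centres the pixels and applies a fixed random projection, purely
    to illustrate the shape of the computation.
    """
    rng = np.random.default_rng(0)  # fixed seed, so every call agrees
    projection = rng.standard_normal((face_pixels.size, 128))
    vec = (face_pixels.flatten() - face_pixels.mean()) @ projection
    return vec / np.linalg.norm(vec)  # normalise to unit length

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match when the cosine similarity clears a threshold."""
    return float(a @ b) >= threshold
```

Identical images embed to identical vectors (similarity 1.0), while unrelated images land nearly at right angles, so their similarity falls below the threshold.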

The performance of facial recognition software varies significantly, but the most effective algorithms available, such as Microsoft’s, or NEC’s NeoFace, very rarely fail to match faces using a high-quality photograph. There is far less information, though, about the performance of these algorithms using images from CCTV cameras, which don’t always give a clear view.

Recent trials reveal some of the technology’s real-world shortcomings. When South Wales police tried out their NeoFace system for 55 hours, 2,900 potential matches were flagged, of which 2,755 were false positives and just 18 led to arrests (the number charged was not disclosed). One woman on the watchlist was “spotted” 10 times – none of the sightings turned out to be of her. This led to claims that the software is woefully inaccurate; in fact, police had set the threshold for a match at 60%, meaning that faces do not have to be rated as that similar to be flagged up. This minimises the chance of a person of interest slipping through the net, but also makes a lot of false positives inevitable.
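The arithmetic behind those headline numbers is worth spelling out. Using the trial figures quoted above, a short calculation (illustrative only; the crowd size and per-face error rate in the second half are hypothetical) shows why scanning a vast crowd for a handful of suspects makes false positives almost inevitable:

```python
flagged = 2900            # potential matches raised over 55 hours
false_positives = 2755    # sightings that turned out to be wrong
true_positives = flagged - false_positives   # 145 genuine sightings

# Precision: of everything the system flagged, what fraction was right?
precision = true_positives / flagged
print(f"precision = {precision:.1%}")

# Even a tiny per-face false-match rate, multiplied across hundreds of
# thousands of scanned faces, yields thousands of flags.
faces_scanned = 500_000    # hypothetical crowd size
per_face_error = 0.005     # hypothetical 0.5% false-match rate
expected_false_alarms = faces_scanned * per_face_error
print(f"expected false alarms = {expected_false_alarms:.0f}")
```

On the trial’s own figures, precision works out at 5%: for every genuine sighting, nineteen innocent people were flagged.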

In general, Pantic says, the public overestimates the capabilities of facial recognition. In the absence of concrete details about the purpose of the surveillance in London’s King’s Cross this summer, newspapers speculated that the cameras could be tracking shoppers and storing their biometric data. Pantic dismisses this suggestion as “ridiculous”. Her own team has developed, as far as she is aware, the world’s leading algorithm for learning new faces, and it can only store the information from about 50 faces before it slows down and stops working. “It’s huge work,” she says. “People don’t understand how the technology works, and start spreading fear for no reason.”

This week, the Met police revealed that seven images of suspects and missing people had been supplied to the King’s Cross estate “to assist in the prevention of crime”, after earlier denying any involvement. Writing to the London Assembly, the deputy London mayor, Sophie Linden, said she “wanted to pass on the [Metropolitan police service’s] apology” for failing to previously disclose that the scheme existed, and announced that similar local image-sharing agreements were now banned. The police did not disclose whether any related arrests took place.

Like many of those working at the sharp end of AI, Pantic believes the controversy is “super overblown”. After all, she suggests, how seriously can we take people’s concerns when they willingly upload millions of pictures to Facebook and allow their mobile phone to track their location? “The real problem is the phones,” she says – a surprising statement from the head of Samsung’s AI lab. “You are constantly pushed to have location services on. [Tech companies] know where you are, who you are with, what you ate, what you spent, wherever you are on the Earth.”

Concerns have been raised that facial recognition has a diversity problem, after widely cited research by MIT and Stanford University found that software supplied by three companies misassigned gender in 21% to 35% of cases for darker-skinned women, compared with just 1% for light-skinned men. However, based on the top 20 algorithms, Nist found that there is an average difference of just 0.3% in accuracy between performance for men, women, light- and dark-skinned faces. Even so, says Carlo of Big Brother Watch, the technology’s impact could still be discriminatory because of where it is deployed and whose biometric data ends up on databases. It’s troubling, she says, that for two years, Notting Hill carnival, the country’s largest celebration of Caribbean and black British culture, was seen as an “acceptable testing ground” for the technology.

‘The real problem is phones’: Maja Pantic, research director at Samsung’s AI Centre, photographed at Imperial College, London. Photograph: Karen Robinson/The Guardian

I ask Fisher about the risk of racial profiling: the charge that some groups may be more likely to fall under suspicion, say, when a shop owner is faced with ambiguous security footage. He dismisses the concern. Facewatch clients are required to record the justification for their decision to upload a picture on to the watchlist and, in a worst-case scenario, he argues, a blameless individual might be approached by a shopkeeper, not thrown into jail.

“You’re talking about human prejudices, you can’t blame the technology for that,” he says.

After our interview, I email several times to ask for a demographic breakdown of the people on the watchlist, which Fisher had offered to provide; Facewatch declines.

Bhuwan Ribhu grew up in Delhi, in a small apartment with his parents, his sister Asmita, and many children who had been rescued from slavery and exploitation. Like Gordon, Ribhu followed his parents into the family business – in his case, tracking down India’s missing children, who have been enticed, forcibly taken or sold by their parents to traffickers, and end up working in illegal factories, quarries, farms and brothels. His father is the Nobel Peace laureate Kailash Satyarthi, who founded the campaign Bachpan Bachao Andolan (Save Childhood Movement) in 1980, after realising that he could not accommodate all of the children being rescued in the family home.

The scale of the challenge is almost incomprehensible: 63,407 child kidnappings were reported to Indian police in 2016, according to the National Crime Records Bureau. Many children later resurface, but the sheer numbers involved mean it can take months or years to reunite them with their families. “About 300,000 children have gone missing over the last five or six years, and 100,000 children are housed in various childcare institutions,” says Ribhu. “For many of those, there is a parent out there looking for their child. But it is impossible to manually go through them all.”

He describes the case of Sonu, a boy from India’s rural Bihar region, 1,000km from Delhi. When Sonu was 13, his parents entrusted him to a factory owner who promised him a better life and money. But they quickly lost track of their son’s whereabouts and began to fear for his safety. Eventually they contacted Bachpan Bachao Andolan for help. Sonu was tracked down after about two years, hundreds of miles from home. “We found the child after sending out his photo to about 1,700 childcare institutions across India,” Ribhu says. “One of them called us back and said they might have the child. People went and physically verified it. We were looking for one child in a country of 1.3 billion.”

Ribhu had read a newspaper article about the use of facial recognition to identify terrorists at airports and realised it could help. India has created two centralised databases in recent years: one containing photos of missing children, and the other containing photos of children housed in childcare institutions. In April last year, a trial was launched to see whether facial recognition software could be used to match the identities of missing and found children in the Delhi region. The trial was quickly hailed a success, with international news reports suggesting that “nearly 3,000 children” had been identified within four days. This was an exaggeration: the 3,000 figure refers to potential matches flagged by the software, not verified identifications, and it has proved difficult to find out how many children have been returned to parents. (The Ministry of Women and Child Development did not respond to questions.) But Ribhu says that, since being rolled out nationally in April, there have been 10,561 possible matches and the charity has “unofficial knowledge” of more than 800 of these having been verified. “It has already started making a difference,” he says. “For the parents whose child has been returned because of these efforts, for the parents whose child has not gone missing because the traffickers are in jail. We are using all the technological solutions available.”

Watching footage of Sonu being reunited with his parents in a recent documentary, The Price Of Free, it is hard to argue against the deployment of a technology that could have ended his ordeal more quickly. Nonetheless, some privacy activists say such successes are used to distract from a more open-ended surveillance agenda. In July, India’s Home Ministry put out a tender for a new Automated Facial Recognition System (AFRS) to help use real-time CCTV footage to identify missing children – but also criminals and others, by comparing the footage with a “watchlist” curated from police databases or other sources.

Real-time facial recognition, if combined with the world’s largest biometric database (known as Aadhaar), could create the “perfect Orwellian state”, according to Vidushi Marda, a legal researcher at the human rights campaign group Article 19. About 90% of the Indian population are enrolled in Aadhaar, which allocates people a 12-digit ID number to access government services, and requires the submission of a photograph, fingerprints and iris scans. Police do not currently have access to Aadhaar records, but some fear that this could change.

“If you say we’re finding missing children with a technology, it’s very difficult for anyone to say, ‘Don’t do it’,” Marda says. “But I think just rolling it out now is more dangerous than good.”

Debates about civil liberties are often dictated by instinct: ultimately, how much do you trust law enforcement and private companies to do the right thing? When searching for common ground, I notice that both sides frequently reference China as an undesirable endpoint. Fisher thinks that the recent disquiet about facial recognition stems from the paranoia people feel after reading about its deployment there. “They’ve created digital prisons using facial recognition technology. You can’t use your credit card, you can’t get a taxi, you can’t get a bus, your mobile phone stops working,” he says. “But that’s China. We’re not China.”

Groups such as Liberty and Big Brother Watch say the opposite: since facial recognition, by definition, requires every face in a crowd to be scanned to identify a single suspect, it will turn any country that adopts it into a police state. “China has made a strategic choice that these technologies will absolutely intrude on people’s liberty,” says biometrics commissioner Paul Wiles. “The decisions we make will decide the future of our social and political world.”

For now, it seems that the question of whether facial recognition will make us safer, or represents a new kind of unsafe, is being left largely to chance. “You can’t leave [this question] to people who want to use the technology,” Wiles says. “It shouldn’t be the owners of the space around King’s Cross, it shouldn’t be Facewatch, it shouldn’t be the police or ministers alone – it should be parliament.”

After leaving the Facewatch office, I walk along the terrace of Gordon’s, where a couple of lunchtime customers are enjoying a bottle of red in the sunshine, and past the fisheye lens at the entrance to the bar, which I now know is beaming my face to the computer cloud. I think back to a winky promise I’ve read on the Gordon’s website: “Make your way to the cellar to your rickety candlelit table – anonymity is guaranteed!”

Out in the wider world, anonymity is no longer guaranteed. Facial recognition gives police and companies the means of identifying and tracking people of interest, while others are free to go about their business. The real question is: who gets that privilege?

 

Nick Fisher, CEO, of Facewatch talks Face2Face with customers, partners and the public about the real issues of retail crime and how Facewatch is facing up to the challenge of providing a proven and powerful tool which deters and prevents retail crime and violence.

Transcription below:

The Full Film:


Hi, I’m Nick Fisher and I’m the CEO of Facewatch. I’m coming on camera because I want to put the record straight about a lot of the things I’ve been reading about facial recognition that are a little bit frustrating in some cases, and need to be commented on by somebody who runs a business that, at its heart, is about facial recognition and data sharing. So bear with me while I express my views and tell you a little bit about Facewatch, and why I think there’s a clear point of difference between how you may have read or understood facial recognition works, and how it’s used when it’s used in a very responsible fashion, in the way that we use it today. I think there’s a real place for facial recognition technology and the sharing of data, providing it’s done responsibly and very transparently, and Facewatch has lobbied for transparency and for a clear understanding within government as to how we should be using this technology going forward.

But it certainly shouldn’t be driven underground. It’s a phenomenal tool if used correctly. I think it’s really important that we embrace new technology, specifically against the backdrop of increasing crime. There are 23,000 fewer officers than there were in 2010, and over the same period of time, 35% more types of crime. If you then take what gets published in the public domain, with different forces across the UK saying that they’re not coming out to low-level crime any more, it’s quite understandable: they don’t have the resources. Now, I spent 20-plus years as a shopkeeper, so I understand what happens in retail, and all that does is build apathy – apathy about reporting crime. What’s the point? Nobody’s coming out. Who are we educating at the end of all this? We’re educating the thief, who knows there aren’t many police around.

The police have gone public saying they’re not coming out to this any more and, by the way, the shopkeepers have said there’s no point in reporting. Happy days. And so the statistics all bear this out: if you look at crime reported over the last five, six, seven years, it just keeps going up, and a recent independent report put it at £11 billion in the UK. Now, we’ve got to do something about that, because when you’re talking about those sorts of numbers, they have a material impact on the operating profit and net profit of businesses. We’ve been doing this for three years, and what a transition there has been in the market in terms of the quality of algorithm production, from when I first started in facial recognition to the point where we’ve only really released our technology in the last nine to 12 months – because before then, quite frankly, I didn’t think it was good enough to use. It’s a different story now, though: this is a really powerful tool, and we’ve got to embrace new technology if we’re going to help fight crime.

 

Facewatch: a really powerful tool


We’re commoditising facial recognition: you need a standard HD camera and a licence from Facewatch, and away you go, at the same price as CCTV technology. So a retailer – let’s just take a corner shop – can have facial recognition for the same price as a good CCTV system. Now, the difference between facial recognition and CCTV: CCTV fundamentally records crime, and therefore you have to report that crime to the police, and the police have said they don’t have enough resources to deal with low-level crime. Facial recognition provided by Facewatch is proven to deter and prevent crime, and we’ve got some fantastic evidence from subscribers who have used Facewatch and seen significant reductions in crime within just 90 days of deployment.

 

So who benefits from Facewatch?


Well, in my opinion, everybody benefits: the store has a lot less negative activity on-site and less loss, and therefore greater profitability. The feedback we’ve had from employees where we’ve deployed Facewatch is that they feel much safer, and I’m sure it makes communities and environments safer too. In fact, our YouGov survey found that, in general, people welcome facial recognition. CCTV essentially records the crime as it happens; Facewatch and facial recognition technology prevent and deter crime before it happens.

 

Facts about Facewatch


So, let me deal with some of the facts and some of the misreporting of facial recognition. But let me specifically talk about Facewatch. Number one, we do not record and store data of innocent people. Number two, we do not track innocent people. Number three, we only operate on private property. We do not use facial recognition technology in public space, and this is wholly different from how the police and other organisations have been using facial recognition in public places.

 

How Facewatch works day to day


So here’s my message to you as store owners: this is how you can make Facewatch work for you in your business. We’ve commoditised the proposition: you need a standard HD camera and a Facewatch licence. But how does it work on a day-to-day basis? Well, it captures the image of everybody who walks into your environment. It takes an image, converts it into a biometric data string, sends it to the Facewatch watchlist and looks for a match. If there’s no match, the image is deleted; as I said before, we do not store and hold data of innocent people. However, if there is a match, it will send an alert to a mobile device, or any device you choose, in less than two seconds, with the image that you’ve got on file (or that we hold for you) and the image of the subject of interest who has just walked through the door, for you to match and compare. Then it’s up to you how you handle that incident. As a retailer, it should be non-invasive. Typically, when I was in retail, we approached nearly all customers and said, “Hi, can I help you?” If you’re an innocent person, you say, “I’m just having a look,” and most people respond that way. Then you can say, “I’ll be right behind you if you need any help.” That means two different things to two different people. To a thief, it means “you’re watching me”. To an innocent customer, it means “that sounds very helpful”.
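The flow described here – capture, convert, compare against a watchlist, delete on no match, alert on match – can be sketched in a few lines of Python. This is a hypothetical illustration of that described pipeline, not Facewatch’s actual implementation; the embeddings are assumed to be unit-length vectors produced by some face-recognition model:

```python
import numpy as np

WATCHLIST: dict[str, np.ndarray] = {}   # subject name -> stored embedding
MATCH_THRESHOLD = 0.6                   # similarity needed to raise an alert

def check_visitor(embedding: np.ndarray):
    """Compare one visitor's face embedding against the watchlist.

    Returns (name, score) for the best match above the threshold,
    which would trigger an alert to staff; returns None otherwise,
    in which case the embedding is simply discarded, not stored.
    """
    best_name, best_score = None, -1.0
    for name, stored in WATCHLIST.items():
        score = float(embedding @ stored)   # cosine similarity (unit vectors)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        return best_name, best_score
    return None
```

Lowering MATCH_THRESHOLD catches more genuine subjects but also flags more innocent visitors – the same trade-off that produced the false-positive counts in the police trials reported earlier in this piece.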

 

Facewatch and data management


Today, everyone’s worried about data, and so are we. We spent five years working with government bodies and authorities to ensure that we are fully GDPR-compliant with regard to managing and holding this special category data – facial recognition data – on behalf of our clients. We manage, store and share that data proportionately and thematically with subscribers to Facewatch only, and with no one else, and we give you the support and the training so that you and your teams can manage the system appropriately. All our data is secured in the cloud at tier-three level, and Trustwave performs penetration tests on our database frequently. So you can be confident that Facewatch knows how to manage and look after your data, as we will be your data controller, mitigating your risk.

 

So how effective is Facewatch?


Well, let me give you some examples of clients who have used Facewatch recently. We are working with a big convenience food retailer who had quite a material problem of theft in a couple of their stores – quite substantial numbers, with no watchlist in place. Facewatch deployed the service on their sites, and within 90 days they saw a greater than 25% reduction in theft across both. That’s led them to order another 18 licences and fit Facewatch in 18 more of their stores. Since we deployed the technology, they have not reported one crime. We’ve also gone into a small convenience store that reported losing circa £25,000 a year and has seen a greater than 30% reduction in crime, all in less than 90 days of deployment. These are all running and are now into contract. These people are using it, and they’re becoming the advocates of Facewatch. It really does prevent and deter crime.

 

So where does Facewatch work best?


Well, we’re working with independent convenience stores – mom-and-pop stores – who have deployed it because it’s highly affordable and has a great impact. We’re working with a national food retailer who now wants to consolidate and share data across a specific geography, so it’s very powerful. It works well in petrol forecourts, which are essentially now convenience stores that sell food. It works fantastically in shopping malls, creating very safe environments: we’ve got a huge deployment with a shopping mall with a 70,000 footfall a day, and they’re using it to really deter crime in their stores – down 70% year on year. It works absolutely everywhere. It’s a fantastic opportunity to be deployed in your business, because it’s really affordable technology and it is proven to work.

 

Facewatch and self-help

So Facewatch is about self-help. It’s about helping yourself, and we work in partnerships – not just with our customers; you have to work in partnership with your customers too. One of the things we insist on is absolute transparency: that means putting signage in your store telling people you’re deploying facial recognition. One of the things that Facewatch will do is hold, store and share that data proportionately with other businesses in your area who subscribe to Facewatch. You then start to build a really powerful tool to prevent and deter crime in your geography. Statistics reported by the ACS recently have shown that violent crime, specifically in retail, is on the increase, and at quite significant levels. These are people typically feeding drug habits or drink problems. We’ve had some fantastic feedback from subscribers to Facewatch saying that they feel safer in an environment where Facewatch is deployed, and that it is a deterrent keeping these people away from their stores. The stores are very transparent – the signage says there’s facial recognition deployed here – and these people certainly don’t want to be caught on film. As a consequence, it’s easier not to go into an environment where you might be seen.

Final Thoughts

So I think facial recognition technology is like all technologies. I remember when CCTV first came out and I was a junior retailer, and everybody said this was Big Brother watching us and we wouldn’t be able to go out! There are nearly 6 million cameras in the UK, and about half a million in London alone. It’s technology that eventually becomes part of society: people might not embrace it, but they just accept that it’s there. I think that’s part of the problem in the retail landscape – people just take it for granted that CCTV cameras are here now and that they’re everywhere, and therefore, to my mind, they’re not much of a deterrent. I think facial recognition will change the landscape. I personally believe it will be a commonplace technology, and within the next three to five years the market is forecast to be somewhere around the £8 billion mark.

So I think there’s a lot of investment going into it by companies, and I think society needs it as a tool. There are clearly not enough police officers around, and there’s going to be a huge challenge in recruiting the 20,000 officers that have been mentioned by the government. I think we’re migrating towards a world of self-help, and technology is a key partner in the evolution of helping people. I just fundamentally believe facial recognition is a fabulous tool, but it has to be used with some form of very clear guidance and governance. It has to be used in a very transparent way, and it has to be deployed by responsible organisations – people who take that responsibility seriously – with very clear guidance and training for end users.

And I think it will be a huge deterrent for crime going forward. I think the initial subscriber base will see the primary benefit, but as adoption grows, I think this will have a material impact on crime. What I’ve learned from criminals is that they definitely don’t want to be recognised. A technology that creates a quick alert when someone who commits crime walks in might be just the start of having a material impact on reducing crime across the UK. Our objective is to make Facewatch affordable as a technology: we’ve aimed it at the retail sector and we’ve priced it at the same price as a good-quality CCTV system. So this is my sales pitch: if you’re interested, if you like what I’ve said, if you think this is a product for you, if you have experienced retail crime and you want an affordable system where we manage the data for you – put the technology in and it will have a material impact, and we’ve got some fabulous case studies that we would be happy to share with you.

Get in touch with us. We’re a very friendly organisation, we’re here to help you, and we’ll guide you through the process. Thanks for listening.