
Guardian Newspaper reports on the rise of facial recognition technology


‘We are hurtling towards a surveillance state’: the rise of facial recognition technology


It can pick out shoplifters, international criminals and lost children in seconds. But as the cameras proliferate, who’s watching the watchers?

Hannah Devlin @hannahdev

Sat 5 Oct 2019 10.00 BST. Last modified on Sat 5 Oct 2019 13.17 BST

 ‘If you’ve got something to be worried about, you should probably be worried.’ Photograph: Lol Keegan/The Guardian. Cameras supplied by dynamic-cctv.com

Gordon’s wine bar is reached through a discreet side-door, a few paces from the slipstream of London theatregoers and suited professionals powering towards their evening train. A steep staircase plunges visitors into a dimly lit cavern, lined with dusty champagne bottles and faded newspaper clippings, which appears to have had only minor refurbishment since it opened in 1890. “If Miss Havisham was in the licensing trade,” an Evening Standard review once suggested, “this could have been the result.”

The bar’s Dickensian gloom is a selling point for people embarking on affairs, and actors or politicians wanting a quiet drink – but also for pickpockets. When Simon Gordon took over the family business in the early 2000s, he would spend hours scrutinising the faces of the people who haunted his CCTV footage.

“There was one guy who I almost felt I knew,” he says. “He used to come down here the whole time and steal.” The man vanished for a six-month stretch, but then reappeared, chubbier, apparently after a stint in jail. When two of Gordon’s friends visited the bar for lunch and both had their wallets pinched in his presence, he decided to take matters into his own hands. “The police did nothing about it,” he says. “It really annoyed me.”

Gordon is in his early 60s, with sandy hair and a glowing tan that hints at regular visits to Italian vineyards. He makes an unlikely tech entrepreneur, but his frustration spurred him to launch Facewatch, a fast-track crime-reporting platform that allows clients (shops, hotels, casinos) to upload an incident report and CCTV clips to the police. Two years ago, when facial recognition technology was becoming widely available, the business pivoted from simply reporting into active crime deterrence. Nick Fisher, a former retail executive, was appointed Facewatch CEO; Gordon is its chairman.

Gordon installed a £3,000 camera system at the entrance to the bar and, using off-the-shelf software to carry out facial recognition analysis, began collating a private watchlist of people he had observed stealing, being aggressive or causing damage. Almost overnight, the pickpockets vanished, possibly put off by a warning at the entrance that the cameras are in use.

The company has since rolled out the service to at least 15 “household name retailers”, which can upload photographs of people suspected of shoplifting, or other crimes, to a centralised rogues’ gallery in the cloud. Facewatch provides subscribers with a high-resolution camera that can be mounted at the entrance to their premises, capturing the faces of everyone who walks in. These images are sent to a computer, which extracts biometric information and compares it to faces in the database. If there’s a close match, the shop or bar manager receives a ping on their mobile phone, allowing them to monitor the target or ask them to leave; otherwise, the biometric data is discarded. It’s a process that takes seconds.
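The match-or-discard pipeline described above can be sketched in a few lines of Python. This is not Facewatch's actual code: the embedding values, the watchlist entry and the distance threshold are all invented for illustration, on the assumption that each face is reduced to a short vector of numbers and compared by distance.

```python
import math

def euclidean(a, b):
    """Distance between two face-embedding vectors (smaller = more alike)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def check_visitor(embedding, watchlist, max_distance=0.4):
    """Compare a captured embedding against every watchlist entry.
    Returns the matched subject id (so the manager's phone can be pinged),
    or None -- in which case the biometric data is simply discarded."""
    for subject_id, stored in watchlist.items():
        if euclidean(embedding, stored) <= max_distance:
            return subject_id
    return None  # no close match: store nothing

# Invented example data: one watchlist entry, two visitors.
watchlist = {"subject_17": [0.2, 0.9, 0.4]}
print(check_visitor([0.21, 0.88, 0.41], watchlist))  # close match -> "subject_17"
print(check_visitor([0.9, 0.1, 0.7], watchlist))     # unknown visitor -> None
```

The whole decision rests on a single distance threshold, which is why (as the trials discussed later in the article show) where that threshold is set determines how many false alarms operators see.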

Facewatch HQ is around the corner from Gordon’s, brightly lit and furnished like a tech company. Fisher invites me to approach a fisheye CCTV camera mounted at face height on the office wall; he reassures me that I won’t be entered on to the watchlist. The camera captures a thumbnail photo of my face, which is beamed to an “edge box” (a sophisticated computer) and converted into a string of numbers. My biometric data is then compared with that of the faces on the watchlist. I am not a match: “It has no history of you,” Fisher explains. However, when he walks in front of the camera, his phone pings almost instantly, as his face is matched to a seven-year-old photo that he has saved in a test watchlist.

“If you’re not a subject of interest, we don’t store any images,” Fisher says. “The argument that you walk in front of a facial recognition camera, and it gets stored and you get tracked is just …” He pauses. “It depends who’s using it.”

While researching theft prevention, Fisher consulted a career criminal from Leeds who told him that, for people in his line of work, “the holy grail is, don’t get recognised”. This, he says, makes Facewatch the ultimate deterrent. He tells me he has signed a deal with a major UK supermarket chain (he won’t reveal which) and is set to roll out the system across their stores this autumn. On a conservative estimate, Fisher says, Facewatch will have 5,000 cameras across the UK by 2022.

The company also has a contract with the Brazilian police, who have used the platform in Rio de Janeiro.

“We caught the number two on Interpol’s most-wanted South America list, a drug baron,” says Fisher, who adds that the system also led to the capture of a murderer who had been on the run for several years, spotted dressed as a woman at the Rio carnival. I ask him whether people are right to be concerned about the potential of facial recognition to erode personal privacy.

“My view is that, if you’ve got something to be worried about, you should probably be worried,” he says. “If it’s used proportionately and responsibly, it’s probably one of the safest technologies today.”

Unsurprisingly, not everyone sees things this way. In the past year, as the use of facial recognition technology by police and private companies has increased, the debate has intensified over the threat it could pose to personal privacy and marginalised groups.

The cameras have been tested by the Metropolitan police at Notting Hill carnival, a Remembrance Sunday commemoration, and at the Westfield shopping centre in Stratford, east London. This summer, the London mayor, Sadiq Khan, wrote to the owners of a private development in King’s Cross, demanding more information after it emerged that facial recognition had been deployed there for unknown purposes.

In May, Ed Bridges, a public affairs manager at Cardiff University, launched a landmark legal case against South Wales police. He had noticed facial recognition cameras in use while Christmas shopping in Cardiff city centre in 2018. Bridges was troubled by the intrusion. “It was only when I got close enough to the van to read the words ‘facial recognition technology’ that I realised what it was, by which time I would’ve already had my data captured and processed,” he says. When he noticed the cameras again a few months later, at a peaceful protest in Cardiff against the arms trade, he was even more concerned: it felt like an infringement of privacy, designed to deter people from protesting. South Wales police have been using the technology since 2017, often at major sporting and music events, to spot people suspected of crimes, and other “persons of interest”. Their most recent deployment, in September, was at the Elvis Festival in Porthcawl.

“I didn’t wake up one morning and think, ‘I want to take my local police to court’,” Bridges says. “The objection I had was over the way they were using the technology. The police in this country police by consent. This undermines trust in them.”

 Nick Fisher, CEO of Facewatch, a UK facial-recognition firm that started life as a way to track pickpockets in a London wine bar. Photograph: Karen Robinson/The Guardian

During a three-day hearing, lawyers for Bridges, supported by the human rights group Liberty, alleged the surveillance operation breached data protection and equality laws. But last month, the high court in Cardiff ruled that the trial, backed by £2m from the Home Office, had been lawful. Bridges is appealing, but South Wales police are pushing forward with a new trial of a facial recognition app on officers’ mobile phones. The force says it will enable officers to confirm the identity of a suspect “almost instantaneously, even if that suspect provides false or misleading details, thus securing their quick arrest”.

The Metropolitan police have also been the subject of a judicial review by the privacy group Big Brother Watch and the Green peer Jenny Jones, who discovered that her own picture was held on a police database of “domestic extremists”.

In contrast with DNA and fingerprint data, which normally have to be destroyed within a certain time period if individuals are arrested or charged but not convicted, there are no specific rules in the UK on the retention of facial images. The Police National Database has snowballed to contain about 20m faces, a large proportion belonging to people who have never been charged with or convicted of an offence. Unlike DNA and fingerprints, this data can also be acquired without a person’s knowledge or consent.

“I think there are really big legal questions,” says Silkie Carlo, director of Big Brother Watch. “The notion of doing biometric identity checks on millions of people to identify a handful of suspects is completely unprecedented. There is no legal basis to do that. It takes us hurtling down the road towards a much more expansive surveillance state.”

Some countries have embraced the potential of facial recognition. In China, which has about 200m surveillance cameras, it has become a major element of the Xue Liang (Sharp Eyes) programme, which ranks the trustworthiness of citizens and penalises or credits them accordingly. Cameras and checkpoints have been rolled out most intensively in the north-western Xinjiang province, where the Uighur people, a Muslim and minority ethnic group, account for nearly half the population. Face scanners at the entrances of shopping malls, mosques and at traffic crossings allow the government to cross-reference with photos on ID cards to track and control the movement of citizens and their access to phone and bank services.

At the other end of the spectrum, San Francisco became the first major US city to ban police and other agencies from using the technology in May this year, with supervisor Aaron Peskin saying: “We can have good policing without being a police state.”

Meanwhile, the UK government has faced harsh criticism from its own biometrics commissioner, Prof Paul Wiles, who said the technology is being rolled out in a “chaotic” fashion in the absence of any clear laws. Brexit has dominated the political agenda for the past three years; while politicians have looked the other way, more and more cameras are being allowed to look at us.

Facial recognition is not a new crime-fighting tool. In 1998, a system called FaceIt, comprising a handful of CCTV cameras linked to a computer, was rolled out to great fanfare by police in the east London borough of Newham. At one stage, it was credited with a 40% drop in crime. But these early systems only worked reliably in the lab. In 2002, a Guardian reporter tried in vain to get spotted by FaceIt after police agreed to add him to their watchlist. He compared the system to a fake burglar alarm on the front of a house: it cuts crime because people believe it works, not because it does.

However, in the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.

The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.

In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution.

“The performance is just incredible,” says Maja Pantic, research director at Samsung AI Centre, Cambridge, and a pioneer in computer vision. “Deep [learning] solved some of the long-standing problems in object recognition, including face recognition.”

Recognising faces is something like a game of snap – only with millions of cards in play rather than the standard deck of 52. As a human, that skill feels intuitive, but it turns out that our brains perform this task in a surprisingly abstract and mathematical way, which computers are only now learning to emulate. The crux of the problem is this: if you’re only allowed to make a limited number of measurements of a face – 100, say – what do you choose to measure? Which facial landmarks differ most between people, and therefore give you the best shot at distinguishing faces?

A deep-learning program (sometimes referred to more ominously as an “agent”) solves this problem through trial and error. The first step is to give it a training data set, comprising pairs of faces that it tries to match. The program starts out by making random measurements (for example, the distance from ear to ear); its guesses will initially be little better than chance. But at each attempt, it gets feedback on whether it was right or wrong, meaning that over millions of iterations it figures out which facial measurements are the most useful. Once a program has worked out how to distil faces into a string of numbers, the algorithm is packaged up as software that can be sent out into the world, to look at faces it has never seen before.
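The trial-and-error idea can be made concrete with a toy version of one training step. Everything here is invented for illustration (the measurement names, values and scoring rule come from no real system): the program scores each candidate measurement by how well it separates pairs of photos of the same person from pairs of different people, which is essentially the feedback a deep-learning system receives millions of times over.

```python
def measurement_score(pairs, key):
    """How useful is one measurement? Average gap on different-person pairs
    minus average gap on same-person pairs: a good measurement varies a lot
    between people and very little between photos of the same person."""
    same = [abs(a[key] - b[key]) for a, b, label in pairs if label == "same"]
    diff = [abs(a[key] - b[key]) for a, b, label in pairs if label == "different"]
    return sum(diff) / len(diff) - sum(same) / len(same)

# Hand-made training pairs; the measurement names and values are invented.
alice1 = {"ear_to_ear": 14.0, "eye_spacing": 6.2}
alice2 = {"ear_to_ear": 14.1, "eye_spacing": 6.3}  # same person, re-photographed
bob    = {"ear_to_ear": 14.2, "eye_spacing": 8.9}  # ear-to-ear barely differs

pairs = [(alice1, alice2, "same"),
         (alice1, bob, "different"),
         (alice2, bob, "different")]

scores = {k: measurement_score(pairs, k) for k in ("ear_to_ear", "eye_spacing")}
best = max(scores, key=scores.get)
print(best)  # eye_spacing separates these two people far better than ear_to_ear
```

A real system learns hundreds of such measurements at once, and the "measurements" it discovers are abstract combinations of pixels rather than anything as nameable as ear-to-ear distance, but the feedback loop is the same.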

The performance of facial recognition software varies significantly, but the most effective algorithms available, such as Microsoft’s, or NEC’s NeoFace, very rarely fail to match faces using a high-quality photograph. There is far less information, though, about the performance of these algorithms using images from CCTV cameras, which don’t always give a clear view.

 

People don’t understand how the technology works, and start spreading fear for no reason

Recent trials reveal some of the technology’s real-world shortcomings. When South Wales police tried out their NeoFace system for 55 hours, 2,900 potential matches were flagged, of which 2,755 were false positives and just 18 led to arrests (the number charged was not disclosed). One woman on the watchlist was “spotted” 10 times – none of the sightings turned out to be of her. This led to claims that the software is woefully inaccurate; in fact, police had set the threshold for a match at 60%, meaning that faces do not have to be rated as that similar to be flagged up. This minimises the chance of a person of interest slipping through the net, but also makes a lot of false positives inevitable.
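The arithmetic behind those figures is worth spelling out. Using only the numbers reported above, with the true-positive count inferred from them:

```python
flagged = 2900          # potential matches raised over 55 hours
false_positives = 2755  # flagged faces that turned out not to be the person
true_positives = flagged - false_positives  # 145 genuine watchlist sightings

precision = true_positives / flagged  # fraction of alerts worth acting on
print(f"{true_positives} genuine matches out of {flagged} alerts "
      f"({precision:.1%} precision)")
```

A lower match threshold raises the chance of catching a listed face, but, as here, it can leave roughly 19 in every 20 alerts pointing at the wrong person.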

In general, Pantic says, the public overestimates the capabilities of facial recognition. In the absence of concrete details about the purpose of the surveillance in London’s King’s Cross this summer, newspapers speculated that the cameras could be tracking shoppers and storing their biometric data. Pantic dismisses this suggestion as “ridiculous”. Her own team has developed, as far as she is aware, the world’s leading algorithm for learning new faces, and it can only store the information from about 50 faces before it slows down and stops working. “It’s huge work,” she says. “People don’t understand how the technology works, and start spreading fear for no reason.”

This week, the Met police revealed that seven images of suspects and missing people had been supplied to the King’s Cross estate “to assist in the prevention of crime”, after earlier denying any involvement. Writing to the London Assembly, the deputy London mayor, Sophie Linden, said she “wanted to pass on the [Metropolitan police service’s] apology” for failing to previously disclose that the scheme existed, and announced that similar local image-sharing agreements were now banned. The police did not disclose whether any related arrests took place.

Like many of those working at the sharp end of AI, Pantic believes the controversy is “super overblown”. After all, she suggests, how seriously can we take people’s concerns when they willingly upload millions of pictures to Facebook and allow their mobile phone to track their location? “The real problem is the phones,” she says – a surprising statement from the head of Samsung’s AI lab. “You are constantly pushed to have location services on. [Tech companies] know where you are, who you are with, what you ate, what you spent, wherever you are on the Earth.”

Concerns have been raised that facial recognition has a diversity problem, after widely cited research by MIT and Stanford University found that software supplied by three companies misassigned gender in 21% to 35% of cases for darker-skinned women, compared with just 1% for light-skinned men. However, based on the top 20 algorithms, Nist found that there is an average difference of just 0.3% in accuracy between performance for men, women, light- and dark-skinned faces. Even so, says Carlo of Big Brother Watch, the technology’s impact could still be discriminatory because of where it is deployed and whose biometric data ends up on databases. It’s troubling, she says, that for two years, Notting Hill carnival, the country’s largest celebration of Caribbean and black British culture, was seen as an “acceptable testing ground” for the technology.

 ‘The real problem is phones’: Maja Pantic, research director at Samsung’s AI Centre. Photograph: Karen Robinson/The Guardian

I ask Fisher about the risk of racial profiling: the charge that some groups may be more likely to fall under suspicion, say, when a shop owner is faced with ambiguous security footage. He dismisses the concern. Facewatch clients are required to record the justification for their decision to upload a picture on to the watchlist and, in a worst-case scenario, he argues, a blameless individual might be approached by a shopkeeper, not thrown into jail.

“You’re talking about human prejudices, you can’t blame the technology for that,” he says.

After our interview, I email several times to ask for a demographic breakdown of the people on the watchlist, which Fisher had offered to provide; Facewatch declines.

Bhuwan Ribhu grew up in Delhi, in a small apartment with his parents, his sister Asmita, and many children who had been rescued from slavery and exploitation. Like Gordon, Ribhu followed his parents into the family business – in his case, tracking down India’s missing children, who have been enticed, forcibly taken or sold by their parents to traffickers, and end up working in illegal factories, quarries, farms and brothels. His father is the Nobel Peace laureate Kailash Satyarthi, who founded the campaign Bachpan Bachao Andolan (Save Childhood Movement) in 1980, after realising that he could not accommodate all of the children being rescued in the family home.

The scale of the challenge is almost incomprehensible: 63,407 child kidnappings were reported to Indian police in 2016, according to the National Crime Records Bureau. Many children later resurface, but the sheer numbers involved mean it can take months or years to reunite them with their families. “About 300,000 children have gone missing over the last five or six years, and 100,000 children are housed in various childcare institutions,” says Ribhu. “For many of those, there is a parent out there looking for their child. But it is impossible to manually go through them all.”

He describes the case of Sonu, a boy from India’s rural Bihar region, 1,000km from Delhi. When Sonu was 13, his parents entrusted him to a factory owner who promised him a better life and money. But they quickly lost track of their son’s whereabouts and began to fear for his safety. Eventually they contacted Bachpan Bachao Andolan for help. Sonu was tracked down after about two years, hundreds of miles from home. “We found the child after sending out his photo to about 1,700 childcare institutions across India,” Ribhu says. “One of them called us back and said they might have the child. People went and physically verified it. We were looking for one child in a country of 1.3 billion.”

Ribhu had read a newspaper article about the use of facial recognition to identify terrorists at airports and realised it could help. India has created two centralised databases in recent years: one containing photos of missing children, and the other containing photos of children housed in childcare institutions. In April last year, a trial was launched to see whether facial recognition software could be used to match the identities of missing and found children in the Delhi region. The trial was quickly hailed a success, with international news reports suggesting that “nearly 3,000 children” had been identified within four days. This was an exaggeration: the 3,000 figure refers to potential matches flagged by the software, not verified identifications, and it has proved difficult to find out how many children have been returned to their parents. (The Ministry of Women and Child Development did not respond to questions.) But Ribhu says that, since the system was rolled out nationally in April, there have been 10,561 possible matches, and the charity has “unofficial knowledge” of more than 800 of these having been verified. “It has already started making a difference,” he says. “For the parents whose child has been returned because of these efforts, for the parents whose child has not gone missing because the traffickers are in jail. We are using all the technological solutions available.”

China has created digital prisons with this technology: you can’t use your credit card, your phone. But we are not China

Watching footage of Sonu being reunited with his parents in a recent documentary, The Price Of Free, it is hard to argue against the deployment of a technology that could have ended his ordeal more quickly. Nonetheless, some privacy activists say such successes are used to distract from a more open-ended surveillance agenda. In July, India’s Home Ministry put out a tender for a new Automated Facial Recognition System (AFRS) to help use real-time CCTV footage to identify missing children – but also criminals and others, by comparing the footage with a “watchlist” curated from police databases or other sources.

Real-time facial recognition, if combined with the world’s largest biometric database (known as Aadhaar), could create the “perfect Orwellian state”, according to Vidushi Marda, a legal researcher at the human rights campaign group Article 19. About 90% of the Indian population are enrolled in Aadhaar, which allocates people a 12-digit ID number to access government services, and requires the submission of a photograph, fingerprints and iris scans. Police do not currently have access to Aadhaar records, but some fear that this could change.

“If you say we’re finding missing children with a technology, it’s very difficult for anyone to say, ‘Don’t do it’,” Marda says. “But I think just rolling it out now is more dangerous than good.”

Debates about civil liberties are often dictated by instinct: ultimately, how much do you trust law enforcement and private companies to do the right thing? When searching for common ground, I notice that both sides frequently reference China as an undesirable endpoint. Fisher thinks that the recent disquiet about facial recognition stems from the paranoia people feel after reading about its deployment there. “They’ve created digital prisons using facial recognition technology. You can’t use your credit card, you can’t get a taxi, you can’t get a bus, your mobile phone stops working,” he says. “But that’s China. We’re not China.”

Groups such as Liberty and Big Brother Watch say the opposite: since facial recognition, by definition, requires every face in a crowd to be scanned to identify a single suspect, it will turn any country that adopts it into a police state. “China has made a strategic choice that these technologies will absolutely intrude on people’s liberty,” says biometrics commissioner Paul Wiles. “The decisions we make will decide the future of our social and political world.”

For now, it seems that the question of whether facial recognition will make us safer, or represents a new kind of unsafe, is being left largely to chance. “You can’t leave [this question] to people who want to use the technology,” Wiles says. “It shouldn’t be the owners of the space around King’s Cross, it shouldn’t be Facewatch, it shouldn’t be the police or ministers alone – it should be parliament.”

After leaving the Facewatch office, I walk along the terrace of Gordon’s, where a couple of lunchtime customers are enjoying a bottle of red in the sunshine, and past the fisheye lens at the entrance to the bar, which I now know is beaming my face to the computer cloud. I think back to a winky promise I’ve read on the Gordon’s website: “Make your way to the cellar to your rickety candlelit table – anonymity is guaranteed!”

Out in the wider world, anonymity is no longer guaranteed. Facial recognition gives police and companies the means of identifying and tracking people of interest, while others are free to go about their business. The real question is: who gets that privilege?

 

6th October 2019, by Facewatch

Face2Face with Nick Fisher, CEO, Facewatch – putting the record straight on facial recognition


Nick Fisher, CEO of Facewatch, talks Face2Face with customers, partners and the public about the real issues of retail crime, and how Facewatch is facing up to the challenge of providing a proven and powerful tool that deters and prevents retail crime and violence.

Transcription below:

The Full Film:


Hi, I’m Nick Fisher and I’m the CEO of Facewatch. I’m coming on camera because I want to put the record straight about a lot of the things I’ve been reading about facial recognition, which are a little bit frustrating in some cases and need to be commented on by somebody who runs a business that, at the heart of it, is about facial recognition and data sharing. So bear with me while I express my views and tell you a little bit about Facewatch, and why I think there’s a clear point of difference between how you may have read or understood how facial recognition works and how it’s used when it’s used in a very responsible fashion, in the way that we use it today. I think there’s a real place for facial recognition technology and the sharing of data, provided it’s done responsibly and very transparently, and Facewatch has lobbied for transparency and for a clear understanding within government as to how we should be using this technology going forward.

But it certainly shouldn’t be driven underground. It’s a phenomenal tool if used correctly. I think it’s really important that we embrace new technology, specifically against the backdrop of increasing crime. There are 23,000 fewer police officers than there were in 2010 and, over the same period of time, 35% new types of crime. If you then take what gets published in the public domain, with different forces across the UK saying they’re no longer coming out to low-level crime, it’s quite understandable: they don’t have the resources. Now, I spent 20-plus years as a shopkeeper, so I understand what happens in retail, and all that does is build apathy about reporting crime. What’s the point? Nobody’s coming out. Who are we educating at the end of all this? The thief, who knows there aren’t many police around.

The police have gone public saying they’re not coming out to this anymore. And, by the way, the shopkeepers say there’s no point in reporting. Happy days. And so the statistics bear this out: if you look at crime reported over the last five, six, seven years, it just keeps going up, and a recent independent report put its cost at £11 billion in the UK. We’ve got to do something about that, because numbers like those have a material impact on the operating profit and net profit of businesses. We’ve been doing this for three years, and what a transition there has been in the market in terms of the quality of the algorithms, from when I first started in facial recognition to the point that we’ve only really released our technology in the last nine to 12 months, because before then, quite frankly, I didn’t think it was good enough to use. It’s a different story now, though: this is a really powerful tool, and we’ve got to embrace new technology if we’re going to help fight crime.

 

Facewatch: a really powerful tool


We’re commoditising facial recognition. You need a standard HD camera and a licence from Facewatch, and away you go, at the same price as CCTV technology. So a retailer, let’s just take a corner shop, can have facial recognition for the same price as a good CCTV system. Now, the difference between facial recognition and CCTV: CCTV fundamentally records crime, and therefore you have to report that crime to the police, and the police have said they don’t have enough resources to deal with low-level crime. Facial recognition provided by Facewatch is proven to deter and prevent crime, and we’ve got some fantastic evidence from subscribers who have used Facewatch and seen significant reductions in crime within just 90 days of deployment.

 

So who benefits from Facewatch?


Well, in my opinion, everybody benefits: the store has a lot less negative activity on-site and less loss on-site, and therefore greater profitability. The feedback we’ve had from employees where we’ve deployed Facewatch is that they feel much safer, and I’m sure it makes communities and the wider environment safer too. In fact, our YouGov survey found that, in general, people welcome facial recognition. CCTV essentially records the crime as it happens; Facewatch and facial recognition technology prevent and deter crime before it happens.

 

Facts about Facewatch


So, let me deal with some of the facts and some of the misreporting around facial recognition, and let me talk specifically about Facewatch. Number one, we do not record and store data of innocent people. Number two, we do not track innocent people. Number three, we only operate on private property. We do not use facial recognition technology in public spaces, and this is wholly different from how the police and other organisations have been using facial recognition in public places.

 

How Facewatch works day to day


So here’s my message to you as store owners: this is how you can make Facewatch work for you in your business. We’ve commoditised the proposition: you need a standard HD camera and a Facewatch licence. But how does it work on a day-to-day basis? Well, it captures an image of everybody who walks into your environment, converts that image into a biometric template, sends it to the Facewatch watch-list and looks for a match. If there’s no match, the image is deleted; as I said before, we do not store and hold data on innocent people. However, if there is a match, it will send an alert to a mobile device, or any device you choose, in less than two seconds, with the image we hold on file for you and the image of the subject of interest who has just walked through the door, for you to compare, and then it’s up to you how you handle that incident. As a retailer, your response should be non-invasive. Typically, when I was in retail, we would approach nearly all customers and say, “Hi, can I help you?” Most people respond with “I’m just having a look”, and you can say, “I’ll be right behind you if you need any help.” That means two different things to two different people. To a thief, it means “you’re watching me”. To an innocent customer, it just sounds very helpful.
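The flow described above (capture an image, convert it to a template, check the watch-list, delete on no match, alert on a match) can be sketched roughly as follows. This is purely illustrative: Facewatch’s actual internals are not public, and the template format, similarity measure, threshold and all names here are invented for the example.

```python
# Hypothetical sketch of a match-then-delete watch-list check.
# Templates are toy vectors; a real system would use a face-embedding model.
import math

WATCH_LIST = {
    "subject-001": [0.12, 0.80, 0.55],  # enrolled template (illustrative numbers)
}
MATCH_THRESHOLD = 0.95  # similarity needed to raise an alert (assumed value)


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def process_visitor(template):
    """Compare a visitor's template against the watch-list.

    Returns an alert dict on a match; otherwise returns None, and the
    caller discards the template, so nothing is retained for non-matches.
    """
    for subject_id, enrolled in WATCH_LIST.items():
        if cosine_similarity(template, enrolled) >= MATCH_THRESHOLD:
            return {"subject_id": subject_id, "enrolled": enrolled, "live": template}
    return None  # no match: the template is deleted by the caller
```

The key design point in the description is the no-match branch: deleting the template immediately when nothing matches is what keeps data on everyone else out of storage.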

 

Facewatch and data management


Today, everyone’s worried about data, and so are we. We spent five years working with government bodies and authorities to ensure that we are fully GDPR compliant in managing and holding this special category data, which is what facial recognition data is, on behalf of our clients. We manage, store and share that data proportionately, and only with subscribers to Facewatch and no one else, and we give you the support and training so that you and your teams can manage the system appropriately. All our data is secured in the cloud at Tier 3 level, and Trustwave runs frequent penetration tests on our database. So you can be confident that Facewatch knows how to manage and look after your data: we will be your data controller, mitigating your risk.

 

So how effective is Facewatch?


Well, let me give you some examples of clients who have used Facewatch recently. We are working with a big convenience food retailer who had a material theft problem in a couple of their stores: quite substantial numbers, with no watch-list in place. Facewatch deployed the service on their sites, and within 90 days they saw a greater than 25% reduction in both. That led them to order another 18 licences and fit Facewatch in 18 more of their stores. Since we deployed the technology, they have not reported one crime. We’ve also gone into a small convenience store that reported losing circa £25,000 a year and has seen a greater than 30% reduction in crime, all in less than 90 days of deployment. These deployments are all running and are now under contract. These people are using it and becoming advocates of Facewatch. It really does prevent and deter crime.

 

So where does Facewatch work best?


Well, we’re working with independent convenience stores, the mom-and-pop shops, who have deployed it because it’s highly affordable and has a great impact. We’re working with a national food retailer who now wants to consolidate and share data across a specific geography, which is very powerful. It works well in petrol forecourts, which are essentially now convenience stores that sell food. And it works fantastically in shopping malls, creating very safe environments: we have a huge deployment in a shopping mall with a footfall of 70,000 a day, where they’re using it to deter crime in their stores, down 70% year on year. It works absolutely everywhere. It’s a fantastic opportunity for your business, because it’s really affordable technology and it is proven to work.

 

Facewatch and self-help

So Facewatch is about self-help, about helping yourself, and we work in partnership, not just with our customers; you also have to work in partnership with your customers. One of the things we insist on is absolute transparency. That means putting signage in your store telling people you’re deploying facial recognition. One of the things Facewatch will do is hold, store and share that data proportionately with other businesses in your area who subscribe to Facewatch. You then start to build a really powerful tool to prevent and deter crime across your geography. Statistics recently reported by the ACS show that violent crime, specifically in retail, is increasing, and at quite significant levels. These offenders are typically feeding drug habits or drink problems. We’ve had some fantastic feedback from subscribers saying that they feel safer in an environment where Facewatch is deployed, and that it is a deterrent that keeps these people away from their stores. The stores are very transparent: the signage says there is facial recognition deployed here. Offenders certainly don’t want to be caught on camera, and as a consequence it’s easier for them not to go into an environment where they might be recognised.

Final Thoughts

So I think facial recognition technology is like all technologies. I remember when CCTV first came out; I was a junior retailer, and everybody said this is Big Brother watching us and we won’t be able to go out. There are now nearly six million cameras in the UK, and about half a million in London alone. Technology like this eventually becomes part of society: people might not embrace it, but they accept that it’s there. I think that’s part of the problem in the retail landscape; people accept that CCTV cameras are now everywhere, and so, to a criminal’s mind, they’re not much of a deterrent. I think facial recognition will change that landscape. I personally believe it will be a commonplace technology within five years, with the market forecast to be worth somewhere around £8 billion in the next three years.

So I think there’s a lot of investment going into it by companies, and I think society needs it as a tool. There are clearly not enough police officers around, and there’s going to be a huge challenge recruiting the 20,000 officers the government has mentioned. I think we’re migrating towards a world of self-help, and technology is a key partner in that evolution. I just fundamentally believe facial recognition is a fabulous tool, but it has to be used with very clear guidance and governance, in a very transparent way, and deployed by responsible organisations: people who take that responsibility seriously and provide very clear guidance and training to end users.

And I think it will be a huge deterrent to crime going forward. The initial subscriber base will be the primary beneficiary, but as adoption grows I think this will have a material impact on crime. What I’ve learned from criminals is that they definitely don’t want to be recognised. This is a technology that creates a quick alert when someone who commits crime walks in, and it might just be the start of a material reduction in crime across the UK. Our objective is to make Facewatch affordable as a technology. We’ve aimed it at the retail sector and priced it at the same level as a good quality CCTV system. So this is my sales pitch: if you like what I’ve said, if you think this is a product for you, if you have experienced retail crime and you want an affordable system where we manage the data for you, then put the technology in and it will have a material impact. We’ve got some fabulous case studies that we would be happy to share with you.

Get in touch with us. We’re here to help; we’re a very friendly organisation and we’ll guide you through the process. Thanks for listening.

26th September 2019/by Facewatch

2020 Olympics in Tokyo will use facial recognition technology

News

BY STEPHEN SHANKLAND

SEPTEMBER 11, 2019 

Hundreds of terminals from Intel and NEC will scan the faces of athletes, sponsors, volunteers and other accredited people at the next summer Olympics.

 

(Credit: Walden Kirsch/Intel Corporation)

If you’re an athlete, sponsor, journalist or volunteer at the 2020 Olympics in Tokyo, you’ll be using a facial recognition system from Japanese electronics giant NEC and chipmaker Intel to get where you need to be.

Intel is collaborating with NEC to provide “a large-scale face recognition system for the Olympics,” said Ricardo Echevarria, general manager of Intel’s Olympics program. The system is designed to let Olympics organizers “ensure smoothly secure verification for the over 300,000 people at the games who are accredited,” he said. People using it will register with photos from government-issued IDs, he added.

Facial recognition has grown by leaps and bounds with the arrival of the sophisticated pattern-matching abilities of modern artificial intelligence technology called neural networks. But many are alarmed about pervasive computer surveillance, leading cities like Somerville, Massachusetts, and San Francisco and Oakland, California, to bar police from using the technology.

Intel didn’t comment on the privacy or data retention aspects of the technology, and NEC said that’s the purview of the Tokyo Olympics organizers. Those organizers didn’t immediately respond to a request for comment.

NEC will deploy hundreds of facial recognition systems around the Olympics facilities, a move that should speed up ID checks for accredited people, Echevarria said. It’s the first time the Olympics have used that facial recognition technology.

It won’t be a wholesale replacement for the old ways: Accredited personnel at the Olympics will still have to wear traditional ID lanyards, Intel and NEC said. But the facial recognition system will be required: if someone loses their lanyard or tries to get access with one that’s stolen, the facial recognition system will block them, NEC said.

“Facial recognition improves security and efficiency by being able to confirm a picture ID against the face of the person seeking to enter a facility with greater speed and accuracy than human staff,” NEC said.

Intel will be involved in other Olympic-related moves, too:

  • It’s helped develop a technology called 3DAT (3D Athlete Tracking) that broadcasters can use to boost instant-replay videos with data about player movements. An AI system processes video data rapidly to generate the overlay graphics.
  • Intel also is helping to run a global esports gaming competition in parallel with the Olympics in Tokyo. Players from an initial group of 20 countries will compete in the videogame event, which also includes participation from gaming companies Capcom and Epic Games.
  • It’s building virtual reality training realms that athletes and organizers can use to visualize arenas and other facilities.

 

12th September 2019/by Facewatch

The risks and rewards of facial recognition in retail – Retail Week reports

News, Police, Retail

By Grace Bowden 27 August 2019

Facial recognition technology burst into the headlines this month following an exposé in the Financial Times about its use in London’s King’s Cross.

The Information Commissioner’s Office has launched an investigation into the use of the technology, which scanned pedestrians’ faces across the 67-acre site comprising King’s Cross and St Pancras stations and nearby shopping areas, without their knowledge.

It is the latest controversy to embroil the technology. Manchester’s Trafford Centre was ordered to stop using it by the Surveillance Camera Commissioner, who works for the Home Office.

Information commissioner Elizabeth Denham said after details of the King’s Cross scheme emerged that she was “deeply concerned about the growing use of facial recognition technology in public spaces”.

“Scanning people’s faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all”

Elizabeth Denham, information commissioner

“Scanning people’s faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all,” she maintained.

“That is especially the case if it is done without people’s knowledge or understanding. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

The European Commission is also understood to be planning new regulation that will give EU citizens explicit rights over the use of their facial recognition data as part of an update of artificial intelligence laws.

What’s it for?

So what does that mean for retailers that are either already deploying or are considering a roll-out of facial recognition technology in their stores?

Given the level of concern and scrutiny from regulators and public alike about how such technology is used, can retailers deploy it in a way that adds value to their business and without risking alienating customers?

Innovation agency Somo’s senior vice-president of product management Tim Johnson says: “There’s a very wide range of things [facial recognition] could potentially be used for. It is a very significant technology and a really seamless process that provides a strong form of identification, so it is undeniably big news.

“But at the moment it is a big muddle in terms of what it is for, whether it is useful or too risky and in what ways. We’ll look back on where we are now as an early stage of this technology.”

One area where facial recognition technology has been piloted by retailers is in-store to crack down on shoplifting and staff harassment.

“The only information held is on those who are known to have already committed a crime in the store previously”

Stuart Greenfield, Facewatch

According to the BRC, customer theft cost UK retailers £700m last year, up 31% year on year, while 70% of retail staff surveyed described police response to retail crime as poor or very poor.

Against that backdrop, retailers such as Budgens have rolled out tech from facial recognition provider Facewatch to stores across the South and Southeast, after a trial in an Aylesbury shop resulted in a 25% crime reduction.

Facewatch marketing and communications director Stuart Greenfield explains that clear signage is displayed throughout any store where the platform’s technology is used, and any data is held in Facewatch’s own cloud platform, not by the retailers.

“The only information held is on those who are known to have already committed a crime in the store previously; anyone whose face is scanned by the system and does not correspond against our existing database is deleted immediately,” says Greenfield.

He believes it is the “combination of marketing, in-store signage and the system itself” which acts as a deterrent to shoplifting and staff harassment in stores where Facewatch’s technology is used.

Shopping centre operator Westfield has teamed up with digital signage firm Quividi, which analyses passersby’s faces to estimate their age, gender and mood and determine which adverts are displayed, as a means of driving customer engagement and sales. Shoe specialist Aldo and jeweller Pandora also work with Quividi overseas.

Quividi chief marketing officer Denis Gaumondie argues that the platform’s technology is not facial recognition – rather it is facial analysis, because it does not store any data on passersby and would therefore not recognise a repeat customer, or link their data to purchases.

He adds that it is the responsibility of Quividi’s retail partners to inform shoppers that the technology is in use.

Hot potato

However, DWF partner Ben McLeod, who specialises in commercial and technology law, says even using facial recognition or analysis technology in-store as described above could land retailers in hot water.

“There is a general prohibition on processing special category data [which may, for instance, include racial or ethnic origin] unless a specific exception applies,” he points out. “Many of the exceptions relate to the public interest which doesn’t really apply to retailers, particularly where the primary purpose for the use of the technology is marketing or to prevent stock loss.”

“Processing is possible where the data subject [the customer] has given explicit consent, but in practice, this will be difficult to demonstrate, as merely alerting customers to the use of facial recognition technology will not suffice.”

“Given that the basis on which the police are using surveillance technology is also currently subject to legal challenge, retailers are advised to tread carefully,” he cautions.

Opting in

Facial recognition technology is prompting controversy

Facial recognition has also been tried out by the Co-op to verify the purchase of age-restricted products such as alcohol at self-service checkouts. Customers found to be over 30 were allowed to complete the purchase without the need for verification by a member of staff.

Johnson believes such use of facial recognition technology would be welcomed by many customers because it would require their specific consent to use it, as was the case with the Co-op, as would verification of the purchase of a whole shopping basket using biometric data.

“People are comfortable with using facial identification on their own device [such as Apple’s Face ID], so using it as a means of verifying purchases in-store feels like a logical next step. It would speed up the check-out experience.”

Capgemini principal consultant Bhavesh Unadkat also points to the roll-out of Amazon Go stores in the US, which verify shoppers’ purchases and link them to their Amazon account using biometric data including facial recognition technology.

He explains that shoppers who download the Amazon Go app and then go into one of the checkout-free stores understand what technology is being used, and how it is benefiting them by providing an efficient shopping experience. The trade-off is clear and there is an “opt-in” to use the technology.

“I don’t think [retailers] can ask customers to opt out of facial recognition technology being used in-store, or just alert them to it being there,” he says.

“They need to ask shoppers to opt in and sell them the benefits they would get, such as a cashless checkout, more rewards, personalised offers to your mobile as you enter the store. Don’t go down the route of assuming people will never opt in and not communicating effectively, because if you get it wrong then the trust is broken.

“Right now we are making a mess of [facial recognition technology] because people are already paranoid about sharing information online and now feel like they are being victimised in a bricks-and-mortar environment as well.”

McLeod concurs with that view.

He says: “Amazon Go is the kind of thing where people are making a choice upfront by downloading the app. That is different from walking into a shopping centre or having the technology foisted upon you in a way that isn’t transparent.

“It becomes far more pervasive in that setting, but the more fundamental issue is there isn’t a strong legal grounding for the use of the technology.”

Right side of the law

Greenfield emphasises that Facewatch is working with the ICO to ensure its technology remains compliant with current and incoming regulations.

“We are pushing like mad for legislation as quickly as possible,” he says. “We want to do everything that is good for the technology because the reality is we cannot put the genie back in the bottle; [facial recognition] is out there and it will be used by someone, so we should have legislation to ensure it is used properly.”

Johnson advises retailers to collaborate closely with engaged suppliers and legislators, and tread carefully when deploying facial recognition technology, but does not believe that current controversies should deter retailers from using it for good.

He says: “I absolutely think [retailers] should still be exploring it. The current environment should make them fully aware of the risks, but it isn’t going away and the potential rewards are large, from crime prevention to age verification and flagging relevant products to customers.

“We’ll hopefully see a period of innovation which shows people what [facial recognition] is useful for.”

https://www.retail-week.com/technology/analysis-the-risks-and-rewards-of-facial-recognition-in-retail/7032744.article

9th September 2019/by Facewatch

Crime against retailers and wholesalers continues to rise- New Gov report says

News, Retail

By Gaelle Walker 5 September 2019 Convenience store magazine

Workers in the retail and wholesale industries continue to suffer from the highest levels of crime out of all key business sectors, with retailers who experience crime being targeted more often than in previous years, new Home Office data shows.

Shoplifting

The crime rate in the retail and wholesale sector has risen every year since 2015, from 12,400 incidents per 1,000 premises to 27,400 incidents per 1,000 premises in 2018, the latest Commercial Victimisation Survey (CVS) reveals.

The number of assaults and threats has also continued to rise year on year, up to 1,600 incidents per 1,000 premises in 2018, a marginal increase on 2017 but significantly up from 500 incidents per 1,000 premises in 2016.

Theft accounted for 82% of all incidents reported in 2018, and almost three-quarters (71%) of all incidents of theft were theft by customers, with 19,300 incidents per 1,000 premises in 2018.

Theft of food or groceries accounted for over a quarter of stolen items in 2018.

The repeat victimisation rate for theft specifically has almost doubled in recent years, from 49 incidents per victim in 2012 to 92 incidents per victim in 2018.

The overall rate of repeat victimisation has also risen from 32 incidents per premises in 2012 to 69 per premises in the 2018 survey.

The Association of Convenience Stores (ACS) said the survey highlighted the need for a more targeted approach to dealing with repeat offenders.

ACS chief executive James Lowman said: “These findings show that businesses are being repeatedly targeted by criminals that are not only committing thefts, but are also being abusive and violent towards retailers and their staff.

“We need targeted action to deal with repeat offenders who are currently being all but ignored by the justice system.

“The increase in the number of assaults and threats is especially concerning, as no one should have to face violence or abuse in their work but it is being seen as just part of the job for many in the sector.

“We continue to urge retailers and their staff to report every incident when it occurs to ensure that the police are aware of the full extent of the problem.”

Figures from the 2019 ACS Crime Report show that retailers believe 79% of crimes are committed by repeat offenders, with around half of those offenders being motivated by a drug or alcohol addiction.

Read reports here:

Full report

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/829399/crime-against-businesses-2018-hosb1719.pdf

PDF overviews

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/828765/crime-against-businesses-infographic-2018.pdf

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/828766/crime-against-businesses-factsheet-wholesale-retail-2018.pdf

9th September 2019/by Facewatch

The High Court said the use of facial recognition tech by police was legal

Legislation, News, Police

Reported by:Rowland Manthorpe

Technology correspondent @rowlsmanthorpe Wednesday 4 September 2019 12:45, UK Sky News

Facial recognition can be legally used by police forces in the UK, judges have ruled.

As the world’s first case against the controversial technology concluded, two leading judges dismissed the case brought by human rights campaign group Liberty on behalf of Ed Bridges, a Cardiff resident whose face was scanned by South Wales Police during a trial of facial recognition.

Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that South Wales Police’s use of live facial recognition “met the requirements of the Human Rights Act”. In a three-day hearing in May, Mr Bridges’ lawyers had argued that South Wales Police violated his human right to privacy by capturing and processing an image taken of him in public.

The judges also ruled that existing data protection law offered sufficient safeguards for members of the public whose faces were scanned by facial recognition cameras, and that South Wales Police had considered the implications.

Speaking in the High Court, Lord Justice Haddon-Cave said “the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate” – the facial recognition technology used by South Wales Police.

Liberty lawyer Megan Goulding said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

A spokesperson for the Information Commissioner’s Office said: “We will be reviewing the judgment carefully.

“We welcome the court’s finding that the police use of Live Facial Recognition (LFR) systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”

4th September 2019/by Facewatch

Facewatch exhibit at RETAIL RISK conference and exhibition in Leicester on 3rd October

News, Retail

The Facewatch team will be exhibiting at the Retail Risk conference in Leicester on 3rd October.

Taking place at the Leicester City Football Club stadium, it is a free conference and exhibition for retailers.

To arrange a meeting with Geoff Gritton please email: [email protected] or call him on Mobile 07711 756754

 

 

Don’t Miss The Biggest Day on The Risk Management Calendar…

Retail Risk – Leicester, followed by the Fraud Awards Gala Dinner the same evening, promises to be an outstanding day of networking, round tables and main stage presentations.

Last year, key decision-makers and influencers from most of Europe’s top 250 physical and online retailers attended, making it one of the very best places to meet new business contacts and catch up with old friends. And this year promises an excellent roster of speakers and round table hosts too.

The conference also boasts one of Europe’s largest exhibitions of risk management solutions for both physical and online stores, including the rapidly growing area of distribution centre and logistics security. So it is a great place to discover new solutions vital to your business.

If you want to be part of the most eagerly anticipated day on the risk management calendar, it is quick, simple and easy to register.

And remember to bring the whole of your team and others from your organisation whose remit includes the ever-evolving risk management brief.

 

What makes Retail Risk the world’s No 1 Risk and Loss Prevention conference series?

More executives, whose work involves risk and loss prevention, attend our conference series than any other in the world. Here’s why…

  • Be inspired by fresh thinking from international speakers, many of whom only speak at Retail Risk events
  • Take away case studies from others who have “learnt the hard way” and use their experiences in your business
  • Free Access All Areas VIP Delegate Passes for retailers, academics and law enforcement personnel
  • Enjoy unlimited refreshments and a delicious hot lunch at a superb hotel – all complimentary
  • Vendor numbers are limited, so our delegates don’t get overwhelmed by unsolicited sales approaches
  • Carefully constructed networking opportunities with peers as well as our international experts
  • Workshops are held under the Chatham House rule, so you can be assured of complete confidentiality
  • Potential for personal international profile development
  • Opportunity to participate on future steering committees and influence agendas
  • International publicity for all speakers through www.retailrisk.com

4th September 2019/by Facewatch

THE £1.9BN COST OF RETAIL CRIME – BRC

News, Retail

 

Violence remains a key issue this year. On average, 115 retail employees were attacked every day.

The combined cost of spending on crime prevention and losses from crime to the industry is a staggering £1.9 billion.

Over £700 million was lost to customer theft alone, a rise of 31% on the previous year.

 


 

The British Retail Consortium’s (BRC) annual Retail Crime Survey has revealed the vast cost of crime to people and businesses up and down the country.

The total cost of crime and crime prevention for retailers was £1.9 billion last year, up 12% from the previous year (£1.7bn). This was made up of £900 million direct cost from retail crime, and £1 billion spent in efforts to prevent crime.

The direct costs of crime included a £700 million loss arising from customer theft, a 31% rise on the previous year. The total cost of crime, at £1.9bn, is equivalent to approximately 20% of the estimated profits of the entire retail industry.

The human cost of criminal enterprise was also laid bare as the survey revealed that 115 retail employees were attacked at work every day. The use of knives by assailants was highlighted as an issue of significant concern.

Approximately 70% of respondents described the police response to retail crime as poor or very poor. And while the police response was rated somewhat better for violent incidents than for customer theft or fraud, only 20% of respondents considered it good or excellent.

Helen Dickinson OBE, Chief Executive of the British Retail Consortium, said:

“Violence against employees remains one of the most pressing issues retailers face, yet once again we have seen an increase in the overall number of incidents. Such crimes harm not just hardworking employees, but also their families and communities. No one should go to work fearing threats and abuse.

“The spiralling cost of retail crime – both in losses and the cost of prevention – is a huge burden to a retail sector that is already weighed down by the twin challenges of skyrocketing business costs and Brexit uncertainty.

“We hope this report will act as a catalyst for Police and Crime Commissioners around the country to take action. Retail crime should be explicitly addressed by Police and Crime Plans. Furthermore, Parliament must play its part in stemming this tide of crime by creating a specific criminal offence to protect retail employees from assault at work, as has been done for emergency workers.”

Retailers are spending 17% more on cyber-security than last year (£162 million), and nearly 80% of the retailers surveyed have seen an increase in the number of cyber attacks.

Clare Gardiner, the National Cyber Security Centre’s Director of Engagement, said:

“The NCSC is committed to helping to improve the UK’s cyber security, which is why we have worked in partnership with the British Retail Consortium to produce the BRC Cyber Security Toolkit.

“Cyber attacks can have a huge impact, but to help potential victims pro-actively defend themselves we have published a range of easy-to-implement guidance on our website.

“Organisations can also share threat intelligence in a confidential way through the NCSC’s online Cyber Information Sharing Partnership (CiSP), which increases awareness of dangers and reduces the impact on UK businesses.”

The BRC is working with a number of organisations to campaign for greater protections for retail workers.

Paddy Lillis – Usdaw General Secretary said:

“Life on the frontline of retail can be pretty tough for many shopworkers and there is still a lot to do to help protect them. We launched our Freedom From Fear Campaign in the face of growing concerns amongst retail staff about violence, threats and abuse. The campaign works with employers to promote respect and make shops safer for staff.

“It is time for the Government to act by providing stiffer penalties for those who assault workers; a simple stand-alone offence that is widely recognised and understood by the public, police, CPS, the judiciary and, most importantly, criminals. Shopworkers are on the frontline of helping to keep our communities safe; they have a crucial role that must be valued and respected.”

DOWNLOAD THE BRC’S ANNUAL CRIME SURVEY

20th August 2019/by Facewatch

The Observer – Facial recognition… coming to a supermarket near you…

Legislation, News, Retail

Published in The Observer, by Tom Chivers, Sun 4 Aug 2019 09.00 BST

The technology is helping to combat crimes police no longer deal with, but its use raises concerns about civil liberties

Paul Wilks runs a Budgens supermarket in Aylesbury, Buckinghamshire. Like most retail owners, he’d had problems with shoplifting – largely carried out by a relatively small number of repeat offenders. Then a year or so ago, exasperated, he installed something called Facewatch. It’s a facial-recognition system that watches people coming into the store; it has a database of “subjects of interest” (SOIs), and if it recognises one, it sends a discreet alert to the store manager. “If someone triggers the alert,” says Paul, “they’re approached by a member of management, and asked to leave, and most of the time they duly do.”

Facial recognition, in one form or another, is in the news most weeks at the moment. Recently, a novelty phone app, FaceApp, which takes your photo and ages it to show what you’ll look like in a few decades, caused a public freakout when people realised it was made by a Russian company and decided it was using their faces for surveillance. (It appears to have been doing nothing especially objectionable.) More seriously, the city authority in San Francisco has banned the use of facial-recognition technologies by the police and other government agencies; and the House of Commons Science and Technology Committee has called for British police to stop using it as well until regulation is in place, though the then home secretary (now chancellor), Sajid Javid, said he was in favour of trials continuing.

Paul Wilks, owner of Wilks Budgens, Aylesbury

There is a growing demand for the technology in shops, with dozens of companies selling retail facial-recognition software – perhaps because, in recent years, it has become pointless to report shoplifting to the police. Budgets for policing in England have been cut in real terms by about 20% since 2010, and a change in the law in 2014, whereby shoplifting of goods below a value of £200 was made a summary offence (ie less serious, not to be tried by a jury), meant police directed time and resources away from shoplifting. The number of people being arrested and charged has fallen dramatically, with less than 10% of shoplifting now reported. The British Retail Consortium trade group estimates that £700m is lost annually to theft. Retailers are looking for other methods. The rapid improvement in AI technologies, and the dramatic fall in cost, mean that it is now viable as one of those other methods.

“The systems are getting better year on year,” says Josh Davis, a psychologist at the University of Greenwich who works on facial recognition in humans and AIs. The US National Institute of Standards and Technology assesses the state of facial recognition every year, he says, and the ability of the best algorithms to match a new image to a face in a database improved 20-fold between 2014 and 2018. And analogously with Moore’s law – the observation that computer processing power doubles every couple of years – the cost falls annually as well.

In ideal environments such as airport check-ins, where the face is straight on and well lit and the camera is high-quality, AI face recognition is now better than human, and has been since at least 2014. In the wild – with the camera looking down, often poorly lit and lower-definition – it’s far less effective, says Prof Maja Pantic, an AI researcher at Imperial College London. “It’s far from the 99.9% you get with mugshots,” she says. “But it is good, and moving relatively fast forward.”

 

Each algorithm is different, but fundamentally, they work the same way. They are given large numbers of images of people and are told which ones are the same people; they then analyse those images to pick out the features that identify them. Those features are not things like “size of ear” or “length of nose”, says Pantic, but something like textures: the algorithm assesses faces by gradients of light and dark, which allow it to detect points on the face and build a 3D image. “If you grow a beard or gain a lot of weight,” she says, “very often a passport control machine cannot recognise you, because a large part of the texture is different.”
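As a toy illustration of the matching step (this is not Facewatch’s actual pipeline – the embedding values, names and similarity threshold below are all invented), a system like this reduces each face image to a numeric vector and then compares vectors by similarity; a probe face is “recognised” only if its best match clears a threshold:

```python
import math

# Hypothetical 4-dimensional embeddings; real systems use vectors with
# hundreds of dimensions produced by the trained network.
ENROLLED = {
    "alice": [0.9, 0.1, 0.3, 0.2],
    "bob":   [0.1, 0.8, 0.5, 0.1],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, threshold=0.95):
    """Return the best-matching enrolled identity, or None if below threshold."""
    best_name, best_score = None, 0.0
    for name, embedding in ENROLLED.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

print(identify([0.88, 0.12, 0.28, 0.21]))  # close to "alice"'s embedding
print(identify([0.5, 0.5, 0.5, 0.5]))      # ambiguous face -> no match
```

The hard, opaque part – turning pixels into the embedding vector – is exactly the black box Pantic describes; the comparison step shown here is the simple, inspectable end of the system.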

But while the algorithms are understood at this quite high level, the specific things that they use to identify people are not and cannot be known in detail. It’s a black box: the training data goes into the algorithm, sloshes around a bit, and produces very effective systems, but the exact way it works is not clear to the developer. “We don’t have theoretical proofs of anything,” says Pantic. The problem is that there is so much data: you could go into the system and disentangle what it was doing if it had looked at a few tens of photos, perhaps, or a few hundred, but when it has looked at millions, each containing large amounts of data itself, it becomes impossible. “The transparency is not there,” she says.

Still, neither she nor Davis is unduly worried about the rise of facial recognition. “I don’t really see what the big issue is,” Pantic says. Police prosecutions at the moment often rely on eyewitnesses, “who say ‘sure, that’s him, that’s her’, but it’s not”: at least facial recognition, she says, can be more accurate. She is concerned about other invasions of privacy, of intrusions by the government into our phones, but, she says, facial recognition represents a “fairly limited cost of privacy” given the gains it can provide, and given how much privacy we’ve already given up by having our phones on us all the time. “The GPS knows exactly where you are, what you’re eating, when you go to the office, whether you stayed out,” she says. “The faces are the cherry on top of the pie, and we talk about the cherry and forget about the pie.”

As with all algorithmic assessment, there is reasonable concern about bias. No algorithm is better than its dataset, and – simply put – there are more pictures of white people on the internet than there are of black people. “We have less data on dark-skinned people,” says Pantic. “Large databases of Caucasian people, not so large on Chinese and Indian, desperately bad on people of African descent.” Davis says there is an additional problem, that darker skin reflects less light, providing less information for the algorithms to work with. For these two reasons algorithms are more likely to correctly identify white people than black people. “That’s problematic for stop and search,” says Davis. Silkie Carlo, the director of the not-for-profit civil liberties organisation Big Brother Watch, describes one situation where an 18-year-old black man was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.

That said, the Facewatch facial-recognition system is, at least on white men under the highly controlled conditions of their office, unnervingly good. Nick Fisher, Facewatch’s CEO, showed me a demo version; he walked through a door and a wall-mounted camera in front of him took a photo of his face; immediately, an alert came up on his phone (he’s in the system as an SOI, so he can demonstrate it). I did the same thing, and it recognised me as a face, but no alert was sent and, he said, the face data was immediately deleted, because I was not an SOI.

Facewatch are keen to say that they’re not a technology company themselves – they’re a data management company. They provide management of the watch lists in what they say is compliance with the European General Data Protection Regulation (GDPR). If someone is seen shoplifting on camera or by a staff member, their image can be stored as an SOI; if they are then seen in that shop again, the shop manager will get an alert. GDPR allows these watch lists to be shared in a “proportionate” way; so if you’re caught on camera like this once, it can be shared with other local Facewatch users. In London, says Fisher, that would be an eight-mile radius. If you’re seen stealing repeatedly in many different cities, it could proportionately be shared nationwide; if you’re never seen stealing again, your face is taken off the database after two years.
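The retention and sharing rules described above can be sketched in a few lines – purely as a reading of the article, not Facewatch’s implementation; the class, the two-year cutoff applied per entry and the way “proportionate” sharing widens with repeat sightings are all illustrative assumptions:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=2 * 365)   # faces dropped two years after the last incident
LOCAL_SHARE_MILES = 8                 # Fisher's example sharing radius for London

class WatchlistEntry:
    """One subject of interest (SOI) on a GDPR-managed watch list."""

    def __init__(self, face_id, last_incident, sites_seen):
        self.face_id = face_id
        self.last_incident = last_incident  # when they were last seen offending
        self.sites_seen = sites_seen        # distinct stores where theft was reported

    def is_active(self, now):
        # No new incidents within the retention window -> entry expires.
        return now - self.last_incident < RETENTION

    def sharing_radius_miles(self):
        # A single incident justifies only local sharing; repeat offences
        # across many sites could "proportionately" justify wider sharing.
        return LOCAL_SHARE_MILES if self.sites_seen <= 1 else float("inf")

now = datetime(2019, 8, 4)
entry = WatchlistEntry("soi-123", datetime(2018, 1, 10), sites_seen=1)
print(entry.is_active(now), entry.sharing_radius_miles())  # still active, shared locally
```

Carlo’s objection maps directly onto this sketch: nothing in the data model records *proof* of an offence, only that a security guard decided to enrol the face.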

Carlo is not reassured: she says that it involves placing a lot of trust in retail companies and their security staff to use this technology fairly. “We’re not talking about police but security staff who aren’t held to the same professional standards. They get stuff wrong all the time. What if they have an altercation [with a customer] or a grievance?” The SOI database system, she says, subverts our justice system. “How do you know if you’re on the watch list? You’re not guilty of anything, in the legal sense. If there’s proof that you’ve committed a crime, you need to go through the criminal justice system; otherwise we’re in a system of private policing. We’re entering the sphere of pre-crime.”

Fisher and Facewatch, though, argue that it is not so unlike the age-old practice of shops and bars having pictures up in the staff room of regular troublemakers. The difference, they say, is that it is not relying on untrained humans to spot those troublemakers, but a much more accurate system.

The problem is that, at the moment, there is very little regulation – other than GDPR – governing what you can and can’t do with a facial-recognition system. Facewatch say, loudly and often, that they want regulation, so they know what they are legally allowed to do. On the other hand, Carlo and Big Brother Watch, along with other civil liberties groups, want an urgent moratorium and a detailed democratic debate about the extent to which we are happy with technologies like these in our lives. “Our politicians don’t seem to be aware that we’re living through a seismic technological revolution,” she says. “Jumping straight to legislation and ‘safeguards’ is to short-circuit what needs to be a much bigger exercise.”

Either way, it needs to happen fast. In Buckinghamshire, Paul Wilks is already using the technology in his Budgens, and is finding it makes life easier. When he started, his shop would have things stolen every day or two, but since he introduced the system, it’s become less common. “There’s definitely been a reduction in unknown losses, and a reduction in disruptive incidents,” he says. As well as a financial gain, his staff feel safer, especially late at night, “which is good for team morale”. If enough retailers start using facial-recognition technology before the government takes notice, then we may find that the democratic discussion has been short-circuited already.

 

5th August 2019/by Facewatch

ACS crime report – The biggest concerns for retailers are violence, theft and verbal abuse

News, Retail

Crime Report 2019

The ACS 2019 Crime Report shows that crimes committed against the convenience sector cost an estimated £246m over the last year, equivalent to over £5,300 for every store in the UK, or what amounts to a 7p tax on every transaction.
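Those headline figures are mutually consistent, as a quick back-of-envelope check shows (the store and transaction counts below are implied by the report’s numbers, not quoted from it):

```python
total_cost = 246_000_000        # £246m annual cost of crime to the convenience sector
cost_per_store = 5_300          # "over £5,300 for every store in the UK"
cost_per_transaction = 0.07     # the "7p tax on every transaction"

implied_stores = total_cost / cost_per_store            # ~46,000 convenience stores
implied_transactions = total_cost / cost_per_transaction  # ~3.5bn transactions a year

print(f"{implied_stores:,.0f} stores")
print(f"{implied_transactions / 1e9:.1f}bn transactions")
```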

The single biggest trigger for violence and abuse was shop theft. ACS estimates that there have been over a million incidents of theft over the last year, with retailers reporting that the vast majority of thefts committed against their business (79%) are by repeat offenders who are not being dealt with by local police forces.

Key findings from this year’s Crime Report include:

The three biggest concerns for retailers are violence against staff, theft by customers and verbal abuse against staff

The report estimates that there were almost 10,000 incidents of violence in the sector over the last twelve months

Of crimes committed where a weapon was present, the most commonly used weapon was a knife (68% of incidents)

The report also shows a clear link between retailers upholding the law as part of their jobs and being subjected to abuse. The top three triggers for aggressive or abusive behaviour are: (1) challenging shop thieves; (2) enforcing age restrictions, for example refusing a sale to someone without ID; and (3) refusing to serve drunks.

Download the 2019 Crime Report

Watch video highlights here

29th July 2019/by Facewatch
