27 August 2019

Facial recognition technology burst into the headlines this month following an exposé in the Financial Times about its use in London’s King’s Cross.

The Information Commissioner’s Office has launched an investigation into the use of the technology, which scanned pedestrians’ faces across the 67-acre site comprising King’s Cross and St Pancras stations and nearby shopping areas, without their knowledge.

It is the latest controversy to embroil the technology. Manchester’s Trafford Centre was ordered to stop using it by the Surveillance Camera Commissioner, an independent role appointed by the Home Office.

Information commissioner Elizabeth Denham said after details of the King’s Cross scheme emerged that she was “deeply concerned about the growing use of facial recognition technology in public spaces”.


“Scanning people’s faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all,” she maintained.

“That is especially the case if it is done without people’s knowledge or understanding. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

The European Commission is also understood to be planning new regulation that would give EU citizens explicit rights over the use of their facial recognition data, as part of an update to artificial intelligence laws.

What’s it for?

So what does that mean for retailers that are either already deploying or are considering a roll-out of facial recognition technology in their stores?

Given the level of concern and scrutiny from regulators and public alike about how such technology is used, can retailers deploy it in a way that adds value to their business and without risking alienating customers?

Innovation agency Somo’s senior vice-president of product management Tim Johnson says: “There’s a very wide range of things [facial recognition] could potentially be used for. It is a very significant technology and a really seamless process that provides a strong form of identification, so it is undeniably big news.

“But at the moment it is a big muddle in terms of what it is for, whether it is useful or too risky and in what ways. We’ll look back on where we are now as an early stage of this technology.”

One area where facial recognition technology has been piloted by retailers is in-store to crack down on shoplifting and staff harassment.


According to the British Retail Consortium (BRC), customer theft cost UK retailers £700m last year, up 31% year on year, while 70% of retail staff surveyed described the police response to retail crime as poor or very poor.

Against that backdrop, retailers such as Budgens have rolled out tech from facial recognition provider Facewatch to stores across the South and Southeast, after a trial in an Aylesbury shop resulted in a 25% crime reduction.

Facewatch marketing and communications director Stuart Greenfield explains that clear signage is displayed throughout any store where the platform’s technology is used, and any data is held in Facewatch’s own cloud platform, not by the retailers.

“The only information held is on those who are known to have already committed a crime in the store previously; anyone whose face is scanned by the system and does not correspond to our existing watchlist is deleted immediately,” says Greenfield.

He believes it is the “combination of marketing, in-store signage and the system itself” which acts as a deterrent to shoplifting and staff harassment in stores where Facewatch’s technology is used.
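The data flow Greenfield describes can be pictured as a simple "match or delete" check: each scanned face is compared against a watchlist of known offenders, and anything that fails to match is discarded at once. A minimal sketch in Python, assuming a toy similarity measure; all names and the threshold are illustrative, not Facewatch's actual implementation:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "match-or-delete" watchlist check as described
# in the article; names and threshold are invented for illustration.
MATCH_THRESHOLD = 0.8  # similarity score above which a face counts as a match


@dataclass
class Watchlist:
    # embeddings of people previously known to have offended in the store
    known_offenders: dict[str, list[float]] = field(default_factory=dict)

    def similarity(self, a: list[float], b: list[float]) -> float:
        # toy similarity: 1 minus mean absolute difference (real systems
        # use learned face embeddings and e.g. cosine similarity)
        return 1 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    def check(self, scanned_embedding: list[float]):
        for person_id, stored in self.known_offenders.items():
            if self.similarity(scanned_embedding, stored) >= MATCH_THRESHOLD:
                return person_id  # match: alert staff
        # no match: the scan is deleted immediately, nothing is retained
        return None
```

The key property, as Greenfield presents it, is in the final branch: a non-matching scan produces no stored record at all.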

Shopping centre operator Westfield has teamed up with digital signage firm Quividi, whose technology analyses passersby’s faces for age, gender and mood to determine which adverts are displayed, as a means of driving customer engagement and sales. Shoe specialist Aldo and jeweller Pandora also work with Quividi overseas.

Quividi chief marketing officer Denis Gaumondie argues that the platform’s technology is not facial recognition – rather it is facial analysis, because it does not store any data on passersby and would therefore not recognise a repeat customer, or link their data to purchases.

He adds that it is the responsibility of Quividi’s retail partners to inform shoppers that the technology is in use.
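Facial analysis of the kind Gaumondie describes can be thought of as a stateless mapping from momentary attributes to an ad choice, with nothing written to storage. A rough sketch under that assumption; the categories and campaign names are invented for illustration, not Quividi's actual system:

```python
# Hypothetical, stateless ad selection from momentary facial analysis.
# Because nothing about the passerby is stored, no individual can be
# recognised on a return visit or linked to a purchase.
AD_RULES = {
    ("18-34", "happy"): "new-trainers-campaign",
    ("18-34", "neutral"): "streetwear-campaign",
    ("35-plus", "happy"): "holiday-campaign",
}
DEFAULT_AD = "brand-awareness-campaign"


def pick_ad(age_band: str, mood: str) -> str:
    # a pure function of the instantaneous analysis; no data retained
    return AD_RULES.get((age_band, mood), DEFAULT_AD)
```

The distinction the company draws between "recognition" and "analysis" rests on exactly this statelessness: the function keeps no record of who it has seen.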

Hot potato

However, DWF partner Ben McLeod, who specialises in commercial and technology law, says even using facial recognition or analysis technology in-store as described above could land retailers in hot water.

“There is a general prohibition on processing special category data [which may, for instance, include racial or ethnic origin] unless a specific exception applies,” he points out. “Many of the exceptions relate to the public interest which doesn’t really apply to retailers, particularly where the primary purpose for the use of the technology is marketing or to prevent stock loss.”

“Processing is possible where the data subject [the customer] has given explicit consent, but in practice, this will be difficult to demonstrate, as merely alerting customers to the use of facial recognition technology will not suffice.”

“Given that the basis on which the police are using surveillance technology is also currently subject to legal challenge, retailers are advised to tread carefully,” he cautions.

Opting in

Facial recognition technology is prompting controversy

Facial recognition has also been tried out by the Co-op to verify the purchase of age-restricted products such as alcohol at self-service checkouts. Customers found to be over 30 were allowed to complete the purchase without the need for verification by a member of staff.
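The logic of such an age check is essentially a threshold gate on an estimated age. A hypothetical sketch, assuming the system only produces an age estimate; the function name and return values are illustrative, not the Co-op's system:

```python
AGE_OVERRIDE_THRESHOLD = 30  # estimated age above which no staff check is needed


def self_checkout_age_gate(estimated_age: float) -> str:
    """Hypothetical sketch of the age gate described above.

    The face scan yields only an age estimate; customers judged to be
    over the threshold complete the purchase unaided, while everyone
    else is referred to a member of staff for a conventional ID check.
    """
    if estimated_age > AGE_OVERRIDE_THRESHOLD:
        return "approved"           # purchase completes at the self-checkout
    return "staff_verification"     # a colleague checks ID as normal
```

Setting the threshold well above the legal purchase age gives headroom for estimation error: only a staff member can approve borderline cases.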

Johnson believes such uses of facial recognition technology would be welcomed by many customers because they require specific consent, as was the case with the Co-op. The same would be true of verifying the purchase of a whole shopping basket using biometric data.

“People are comfortable with using facial identification on their own device [such as Apple’s Face ID], so using it as a means of verifying purchases in-store feels like a logical next step. It would speed up the check-out experience.”

Capgemini principal consultant Bhavesh Unadkat also points to the roll-out of Amazon Go stores in the US, which verify shoppers’ purchases and link them to their Amazon account using biometric data including facial recognition technology.

He explains that shoppers who download the Amazon Go app and then go into one of the checkout-free stores understand what technology is being used, and how it is benefiting them by providing an efficient shopping experience. The trade-off is clear and there is an “opt-in” to use the technology.

“I don’t think [retailers] can ask customers to opt out of facial recognition technology being used in-store, or just alert them to it being there,” he says.

“They need to ask shoppers to opt in and sell them the benefits they would get, such as a cashless checkout, more rewards, personalised offers to your mobile as you enter the store. Don’t go down the route of assuming people will never opt in and not communicating effectively, because if you get it wrong then the trust is broken.

“Right now we are making a mess of [facial recognition technology] because people are already paranoid about sharing information online and now feel like they are being victimised in a bricks-and-mortar environment as well.”

McLeod concurs with that view.

He says: “Amazon Go is the kind of thing where people are making a choice upfront by downloading the app. That is different from walking into a shopping centre or having the technology foisted upon you in a way that isn’t transparent.

“It becomes far more pervasive in that setting, but the more fundamental issue is there isn’t a strong legal grounding for the use of the technology.”

Right side of the law

Greenfield emphasises that Facewatch is working with the ICO to ensure its technology remains compliant with current and incoming regulations.

“We are pushing like mad for legislation as quickly as possible,” he says. “We want to do everything that is good for the technology because the reality is we cannot put the genie back in the bottle; [facial recognition] is out there and it will be used by someone, so we should have legislation to ensure it is used properly.”

Johnson advises retailers to collaborate closely with engaged suppliers and legislators, and tread carefully when deploying facial recognition technology, but does not believe that current controversies should deter retailers from using it for good.

He says: “I absolutely think [retailers] should still be exploring it. The current environment should make them fully aware of the risks, but it isn’t going away and the potential rewards are large, from crime prevention to age verification and flagging relevant products to customers.

“We’ll hopefully see a period of innovation which shows people what [facial recognition] is useful for.”


In the conclusion of the world’s first legal case against the controversial technology, two leading judges dismissed the claim brought by human rights campaign group Liberty on behalf of Ed Bridges, a Cardiff resident whose face was scanned by South Wales Police during a trial of facial recognition.

Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that South Wales Police’s use of live facial recognition “met the requirements of the Human Rights Act”. In a three-day hearing in May, Mr Bridges’ lawyers had argued that South Wales Police violated his human right to privacy by capturing and processing an image taken of him in public.

The judges also ruled that existing data protection law offered sufficient safeguards for members of the public whose faces were scanned by facial recognition cameras, and that South Wales Police had considered the implications.

Liberty lawyer Megan Goulding said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

A spokesperson for the Information Commissioner’s Office said: “We will be reviewing the judgment carefully.

“We welcome the court’s finding that the police use of Live Facial Recognition (LFR) systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”

Sajid Javid, the Home Secretary, has backed trials of face recognition by the Metropolitan Police. The trials will be used to test AFR (automatic facial recognition) to help in the fight against child abuse.

Speaking at the launch of new computer technology aimed at helping police fight online child abuse, Mr Javid said it was right for forces “to be on top of the latest technology”.

He added: “I back the police in looking at technology and trialling it and… different types of facial recognition technology is being trialled especially by the Met at the moment and I think it’s right they look at that.”

A full report from the BBC may be read on its website

“I actually believe facial recognition technology, properly overseen, properly thought about, properly circumscribed, is something that our public would expect us to be doing”