There is something deeply concerning about the increased use of facial recognition technology (FRT) by private retailers in the UK.
Sainsbury’s has been trialling this technology in its stores since September last year. Several other retailers have embarked on similar paths, including Asda, Sports Direct and Flannels.
Facial recognition is a computer vision technique that identifies individuals based on a scan of their face. FRT is advanced enough to perform this even at a distance, enabling it to be integrated with camera systems set up inside retail stores and supermarkets.
FRT typically consists of several algorithms and techniques that perform four main steps:

1. Detecting a human face in a given picture or frame of video footage
2. Cropping the face from the input picture or video frame and converting it into a standardised form (e.g., resizing and converting to greyscale) to enable more accurate localisation of relevant features for analysis
3. Extracting and analysing features from the facial image (such as the eyes, nose and mouth), codified as a feature vector (essentially a list of numbers representing the facial features)
4. Checking for a match by comparing the captured feature vector against a database of feature vectors, with a match determined by the level of similarity between the two vectors
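To make the final matching step concrete, here is a minimal, illustrative Python sketch. It is not any retailer's actual system: the four-dimensional vectors and the 0.9 threshold are invented toy values standing in for the high-dimensional embeddings and tuned thresholds a real deployment would use, and cosine similarity is just one common way of scoring how close two feature vectors are.

```python
import numpy as np

def cosine_similarity(a, b):
    # Score how similar two feature vectors are, in the range [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, watchlist, threshold=0.9):
    # Step 4: compare the captured (probe) vector against every enrolled
    # vector and report the best match, but only if it clears the threshold.
    best_id, best_score = None, threshold
    for person_id, enrolled in watchlist.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy 4-dimensional "embeddings" standing in for real facial features.
watchlist = {
    "person_a": np.array([0.9, 0.1, 0.3, 0.5]),
    "person_b": np.array([0.1, 0.8, 0.6, 0.2]),
}

probe = np.array([0.88, 0.12, 0.31, 0.49])  # nearly identical to person_a
print(match_face(probe, watchlist))
```

Note that the threshold is doing real work here: set it too low and innocent shoppers are flagged (misidentification); set it too high and the system misses genuine matches. That trade-off is decided by whoever configures the system.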

The surface-level concerns relate to the privacy and data protection implications. FRT involves the processing of biometric data, information that is typically seen as having more serious consequences for individuals when collected and used. The UK GDPR, for example, treats biometric data as ‘special category data’ (when used for the purpose of uniquely identifying an individual), the processing of which is prohibited by default unless an appropriate legal basis and exception apply.1
Then there are the risks arising from the processing operation behind FRT itself. Discrimination and bias, misidentification and lack of transparency are but a few.
In February of this year, a man was forced to leave a Sainsbury’s store after staff mistakenly identified him as an offender flagged by its facial recognition system. Other such instances have been reported before.
But these things are offshoots of a broader issue at play.
It is important to understand the underlying incentives motivating the choices being made here, and why retailers are deciding to use FRT in the first place.
In England and Wales, shoplifting offences have been rising.2 The Home Office has found that tackling this crime is made easier with more accessible evidence from, for instance, CCTV footage. Surveys show a high rate of verbal abuse and threats against shop workers.
The incentive structure is clear. With criminal activity on the rise and video surveillance proving valuable in combating it, retailers will naturally converge on this technological solution in response to the problem. And an increasing number are turning to FRT.
Traditionally, if shops had a problem with a shoplifter or other malefactors, staff would intervene, the police would be called and give the person a warning or make an arrest, and the matter would progress through the appropriate legal processes. It is a slow-working mechanism, but the advantage is that each step has procedural protections, evidentiary standards and accountability mechanisms.
FRT’s proliferation across UK retailers changes the landscape quite significantly, in a way that gives rise to various issues, with privacy and data protection among them.
An important aspect of the modern use of FRT is the watchlist. With FRT, the system needs a database to compare captured faces against for a match to be identified. This database, or watchlist, is central to how the system works.
But these watchlists are not merely a technical component for a computer vision system. They form part of the enforcement architecture deployed and managed by private enterprises.
Watchlists essentially function like a police database. Retailers add to these lists the individuals they want to be alerted to should they enter a store, so that staff can eject them or report them to the police. How someone ends up on a watchlist is essentially determined by the retailers themselves.
Function creep can easily come into play here. The watchlist might start as “people suspected of theft”, but it could cover a variety of different things. It could expand to “people who have been aggressive to staff” or “people who have made staff uncomfortable.” And so on.3
Once the infrastructure exists and the incentives are there, function creep almost becomes inevitable.
Shops represent a prominent public space for many people. They offer tranquil moments to survey the various products on offer as well as opportunities to commune with others. They function both as a means of acquiring new goods and as a place for building relationships with shop workers, locals and others. They are a way of connecting with the wider world, forming an integral part of one’s daily life.
But FRT turns these spaces into permanent enforcement zones that dampen the experience. With the knowledge of being monitored, people may start to move with purpose, avoid eye contact with cameras, maybe even think twice about going back. Shopping suddenly becomes a risk to manage rather than a privilege to enjoy.
This fits within a broader trend where private actors absorb more functions the state would typically perform - in this case, the identification, monitoring and apprehension of offenders. Commercial actors are increasingly governing conduct in public-facing spaces using tools that the state would legally struggle to deploy itself.
It means that shoppers are being policed by entities that do not face all the constraints of law enforcement agencies. Retailers and the developers of FRT are focused on solving the shoplifter problem. But in doing so they are also building a self-contained surveillance and enforcement infrastructure deployed and operated by private actors driven by closed-source incentives. One facial recognition camera at a time.
1. See Article 9(2) UK GDPR.
3. Sometimes people are added to watchlists for seemingly minor offences, such as disputes over 39p of paracetamol.