
UK Goes Full-On Big Brother, Employs Facial Recognition Technology to Expedite School Lunch Queues


The UK government’s love affair with tech-enabled surveillance knows no bounds.

[This story is a little dated, having first surfaced in the Financial Times on Monday. But on that day I decided to write a piece on what I thought was an even more pressing issue: Italy’s “no jab, no job” vaccine mandate, which threatens to render millions of people unemployed. But this story from the UK is such an outrageous example of creeping surveillance in the so-called “liberal” West that I thought it still worth sharing]

As the pink paper reported, nine schools in the Scottish council area of North Ayrshire have started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US). The BBC later reported that two schools in England were also piloting the system. At a time when many schools in the UK are facing crippling budget cuts, this speaks volumes about the local councils’ educational priorities.

In response to the revelations, the Information Commissioner’s Office issued a weak-tea statement, encouraging schools to “carefully consider the necessity and proportionality of collecting biometric data before they do so.”

A statement from children’s digital rights group Defend Digital Me packed a meatier punch: “Biometrics should never be used for children in educational settings — no ifs, no buts. It’s not necessary. Just ban it.”

Normalising Biometric Surveillance

In its defence, North Ayrshire council said it had sent out a flyer explaining the technology to the children’s parents ahead of enrolment. That flyer included this lovely little nugget: “With Facial Recognition, pupils simply select their meal, look at the camera and go, making for a faster lunch service whilst removing any contact at the point of sale.”

Apparently a whopping 97% of the school children or their parents consented to be enrolled in the pilot scheme. It seems that the council believes that preteens and teenagers are adequately equipped to decide for themselves whether or not the installation of facial recognition technologies in the school canteen infringes their privacy.   

Similar facial recognition systems have been in use in the United States for years, though usually as a security measure. In the case of the schools in Ayrshire, this is all about ease, speed and efficiency. Or so we are told.

“It’s the fastest way of recognising someone at the till,” said David Swanston, the managing director of CRB Cunninghams, the company that provided the system. Swanston added that the average transaction time using the system was five seconds per pupil: “In a secondary school you have about a 25-minute period to serve potentially 1,000 pupils. So we need fast throughput at the point of sale”.
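A quick back-of-the-envelope calculation (mine, not the company’s) puts those figures in context: at five seconds per transaction, a single till can process roughly 300 pupils in a 25-minute service window (25 × 60 ÷ 5 = 300), so a school serving 1,000 pupils would still need three or four tills running in parallel to get everyone fed.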

One wonders how school cafeterias were able to cope with demand for so long without digital and biometric payment technologies. But critics argue that these pilot schemes have a much darker purpose than expediting school lunch queues; they are about conditioning children to the widespread use of facial recognition and other biometric technologies. 

“It’s normalising biometric identity checks for something that is mundane,” Silkie Carlo of the UK campaign group Big Brother Watch told the FT. “You don’t need to resort to airport style [technology] for children getting their lunch.”

Checkpoint Britain

The UK has on average one surveillance camera for every 6.5 people, according to a 2019 analysis by IHS Markit. That’s a higher density than in any other country in the world except China (one camera per 4.1 people), the US (one per 4.6) and Taiwan (one per 5.5). The data featured in a 2019 CBS article warning about the US’ increasing adoption of surveillance technologies:

“During the past few years, coverage of the surveillance market has focused heavily on China’s massive deployments of cameras and artificial intelligence technology. What’s received far less attention is the high level of penetration of surveillance cameras in the United States,” report author Oliver Philippou, an analyst at IHS Markit, said in a note. “With the U.S. nearly on par with China in terms of camera penetration, future debate over mass surveillance is likely to concern America as much as China.”

Like their US counterparts, UK authorities have been trialling live facial recognition (LFR) surveillance in public places for several years. Many of the trials were monitored by the activist group Big Brother Watch. In a 2019 article for Yahoo, Silkie Carlo wrote that to watch “these live facial recognition trials is to watch your civil liberties slip away before your eyes.” She recounted an anecdote from an LFR trial in Romford, east London. When a passing pedestrian called John (not his real name) noticed the police cameras, he pulled his jumper over his chin. It was, Carlo says, “a small act of resistance to encroaching surveillance in his town”, for which he ended up paying a price:

I watched a plainclothes officer who had been loitering near us radio through to uniformed officers, instructing them to stop him. John was then surrounded and grabbed by officers, pushed to a wall and questioned. They demanded to know why he was covering his face.
“If I want to cover my face, I’ll cover my face,” he said. “Don’t push me over when I’m walking down the street.”
The plainclothes officer then took a photo of him on a mobile device anyway “for facial recognition.” Police had not told us, or anyone, that they had not only the capability to scan faces with fixed cameras but to “point and shoot” with handheld, mobile devices. He was made to hand over his ID as well. After police aggravated him and threatened to handcuff him, they issued him with a £90 ($115) fine for disorderly behaviour.

The New Frontier of Facial Recognition Surveillance

Since the Covid-19 pandemic began, police forces around the world have been given much broader surveillance powers. In August this year, the Mayor of London quietly green-lit a controversial proposal that will permit the Metropolitan Police, the UK’s biggest police force, to use Retrospective Facial Recognition (RFR) as part of a £3 million deal with Japanese tech firm NEC Corporation, reports Wired magazine:

The system examines images of people obtained by the police before comparing them against the force’s internal image database to try and find a match.
“Those deploying it can in effect turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” says Ella Jakubowska, policy advisor at European Digital Rights, an advocacy group. Jakubowska says the technology can “suppress people’s free expression, assembly and ability to live without fear”.
The purchase of the system is one of the first times the Met’s use of RFR has been publicly acknowledged. Previous versions of its facial recognition web page on the Wayback Machine show references to RFR were added at some stage between November 27, 2020, and February 22, 2021. The technology is currently used by six police forces in England and Wales, according to a report published in March. “The purchase of a modern, high-performing facial recognition search capability reflects an upgrade to capabilities long used by the Met as well as a number of other police forces,” a spokesperson for the Met says.
Critics argue that the use of RFR encroaches on people’s privacy, is unreliable and could exacerbate racial discrimination. “In the US, we have seen people being wrongly jailed thanks to RFR,” says Silkie Carlo, director of civil liberties group Big Brother Watch. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless and rights-abusive police technologies.”

The Backlash Begins (in Brussels)

In September, the Geneva-based Human Rights Council (HRC) published a report recommending that the protection of human rights be placed front and centre in the development of AI-based systems. While the report conceded that AI “can be a force for good,” it also flagged concerns about how data is stored, what it is used for, and how it might be misused.

“AI technologies can have negative, even catastrophic, effects if they are used without sufficient regard to how they affect people’s human rights,” Michelle Bachelet, the United Nations High Commissioner for Human Rights, said in a statement.

One of the biggest problems is the prevailing legal vacuum that makes it all but impossible to guarantee the safety of all the biometric data being harvested. As I reported for NC a few weeks ago, the facial recognition data of over 16 million UK residents, handed over to the NHS App, is being managed by undisclosed companies. The NHS appears to be sharing some of that facial recognition data with law enforcement bodies, according to The Guardian. The data is also likely to be of interest to UK and foreign intelligence services.

Yesterday (Oct. 21), continental rail operator Eurostar unveiled a new pilot scheme for a biometric identity verification technology that promises seamless travel across borders. The system allows travellers to upload an image of their face and their passport details ahead of travel for touch-free passage through border checks. It’s fast, smooth, convenient and easy. One just has to hope that one’s facial recognition data is in safe hands.

The European Parliament does not seem convinced. In a recent unanimous vote, MEPs called for a ban not only on police use of facial recognition technology in public places but also on private facial recognition databases. The vote was non-binding, of course, and some EU Member States are desperate to deploy facial recognition technologies to fortify their security apparatuses, but at least it’s a step in the right direction.
