Automated recognition software: your rights in the public space

This is the text of my summer 2021 column for BECTU’s Stage, Screen & Radio, slightly extended and with added links. Sometimes the column – especially when published several months later – gets overtaken by events; occasionally concurrent events give it added relevance, and that’s the case with this one, with news this week that the Information Commissioner is stepping in over the use of facial recognition technology in Ayrshire schools ‘to speed up the lunch queue’; and with Eurostar testing the same to give ‘seamless travel across borders’ and a ‘touch-free journey through border checks’ (under plans originally announced last summer). As always, the language is interesting, focusing on the upsides with little consideration of the (considerable) downsides. Passport checks – which already incorporate biometric technology – are one thing; whether school children are in a position to give informed consent for something as quotidian as school lunches is another thing entirely.

Anyway, on with the column…

The European Data Protection Supervisor (EDPS) – an independent EU authority which upholds data protection and privacy standards – has called for a ban on the use of ‘automated biometric identification in public space’. This covers a range of technologies which, for simplicity, we’ll call here ABI: most obviously facial recognition, but also the recognition of gait, voice, keystrokes and our other biometric or behavioural signals.

The EDPS is not concerned with the use of AI to unlock your smartphone, but it is concerned about the public space: law enforcement and also the wider commercial and administrative environments in which it might be deployed – for example ‘smart’ advertising hoardings and billboards, attendance at sporting and other mass events, airport screening, or wherever users access public services.

The call for a ban is clearly serious – but so is the context in which it was made: the European Commission’s legislative proposal for an Artificial Intelligence Act. This, the EDPS noted, did not address its earlier calls for a moratorium on the use of ABI in public, however welcome the initiative otherwise was.

The UK has of course left the EU, but the Information Commissioner’s Office (ICO) – the UK’s own data protection and information rights authority – is also concerned about these issues. A reference to facial recognition technology appeared very early in the ICO’s 2019/20 Annual Report, and the Office issued an Opinion on the use of facial recognition technology in law enforcement in October 2019. It also intervened in a judicial review of the use of such technology by South Wales Police – a review which the police lost on human rights and data protection grounds.

We know – and have done for some time – of the problems ABI has in distinguishing between people: it has a much lower accuracy record in correctly matching people of colour, women and those aged 18-30. Partly, this speaks to the lack of diversity both amongst those developing ABI software and amongst those on whom it is tested; in either case, were the base more representative, its accuracy record might well be better.

This, in turn, speaks to the need for software development standards also to be more representative and more inclusive, and to take serious account of tightly-drawn codes of ethics.

(Whatever the comical faults of the LinkedIn jobs algorithm, it is AI that is responsible for targeting job advertisements in ways which reproduce existing occupational segregation – sending grocery delivery jobs to women and pizza delivery jobs to young men – and which may contravene sex discrimination laws.)

Furthermore, the EDPS spoke specifically of its concern that AI ‘presents extremely high risks of deep and non-democratic intrusion into individuals’ private lives’. The ICO is similarly exercised – expressly, and in very similar language – about the technology’s potential for ‘unnecessary intrusion into individuals’ daily lives’. Together, these statements indicate a worry among regulatory authorities that there are unsettling data privacy and state surveillance aspects to the use of ABI in this way.

ABI works on the basis of matching scanned images against a ‘watchlist’, deleting those where there is no match and otherwise prompting human intervention. What the authorities are concerned about is whether an individual could anticipate, and understand, their image (or data) being processed in this way; and whether this is both a necessary and a proportionate response. What you and I might be concerned about is how someone could put us on a watchlist – was it because we went on strike, perhaps, or demonstrated against racism? – and how the authorities would then be allowed to track us wherever we go without us knowing.
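To make those mechanics concrete, here is a minimal sketch of that matching loop in Python. It is purely illustrative – the feature vectors, the similarity measure, the threshold value and every name in it are hypothetical stand-ins rather than any vendor’s actual system – but it shows the basic shape the regulators are describing: compare a scanned image’s biometric signature against each watchlist entry, discard non-matches, and refer anything over the threshold to a human operator.

```python
import numpy as np

# Illustrative only: real systems derive "signatures" from trained
# face-embedding models; here they are just fixed-length vectors.

MATCH_THRESHOLD = 0.8  # arbitrary value for the sketch


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric signatures (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def screen_image(scan: np.ndarray, watchlist: dict[str, np.ndarray]):
    """Compare one scanned signature against every watchlist entry.

    Returns (identity, score) for human review if the best match clears
    the threshold; otherwise None, and the scan is deleted, not retained.
    """
    best_id, best_score = None, 0.0
    for person_id, signature in watchlist.items():
        score = cosine_similarity(scan, signature)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        return best_id, best_score  # match: prompts human intervention
    return None                     # no match: image discarded


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    watchlist = {
        "person-A": rng.normal(size=128),
        "person-B": rng.normal(size=128),
    }
    # A noisy re-capture of person-A, as a camera might produce.
    scan = watchlist["person-A"] + rng.normal(scale=0.1, size=128)
    print(screen_image(scan, watchlist))  # flags person-A for review
```

Even this toy version shows where the concerns above bite: everything turns on who compiles the watchlist and where the threshold is set – and, as noted earlier, matching errors are not evenly distributed across the population.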

Unquestioning faith

Additionally, it’s true that we tend to place a great deal of unquestioning faith in the results that machines give us. If our trust is not to be abused, we need to be confident that the ABI which lies underneath has been developed, and is being used, in a socially just way.

The South Wales Police case highlights that ABI could identify large numbers of people and track their movements. Few trade unionists – or others organising protest actions – will need a refresher course on what that might mean. The decision in this case recognises the need for precise legal boundaries on the use of ABI, something which the EDPS also openly acknowledges, although what those boundaries will be has yet to be defined.

Where we impose limits on the use of surveillance technology – in law enforcement and beyond – and how far our data rights and our trust are respected, are matters in which we should all be taking a keen interest.
