biometrics revisited

One day last week I stumbled across the advert above for Biometrics 2006 while trying to sort out my hopelessly disorganised backlog of digital files. In the folder alongside it were my panel discussion notes and closing keynote slides from the same conference. So what was I talking about 13 years ago – and how accurate were my predictions?

Biometrics, 2006 style

I opened with a recap of various biometric and authentication technologies and their strengths and weaknesses: false positives and negatives, false accept and reject rates, liveness testing, and attended versus unattended use. I also outlined the widening scope of biometrics, notably non-contact and behavioural biometrics – things such as gait and facial recognition, which can be done at a distance – as well as intrusive and non-intrusive deployment.
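To make the error-rate terminology concrete, here is a minimal sketch of how a false accept rate (FAR) and false reject rate (FRR) might be computed at a given decision threshold. The scores, threshold values and function names are illustrative assumptions of mine, not anything from the original talk.

```python
# Illustrative sketch: computing FAR and FRR from matcher similarity scores.
# The score lists and thresholds below are made-up examples, not real data.

def false_accept_rate(impostor_scores, threshold):
    """Fraction of impostor comparisons wrongly accepted (score >= threshold)."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

def false_reject_rate(genuine_scores, threshold):
    """Fraction of genuine comparisons wrongly rejected (score < threshold)."""
    return sum(s < threshold for s in genuine_scores) / len(genuine_scores)

genuine = [0.91, 0.87, 0.78, 0.95, 0.69]   # same-person comparison scores
impostor = [0.12, 0.35, 0.52, 0.08, 0.61]  # different-person comparison scores

for t in (0.5, 0.7, 0.9):
    print(f"threshold={t:.1f}  FAR={false_accept_rate(impostor, t):.2f}  "
          f"FRR={false_reject_rate(genuine, t):.2f}")
```

Raising the threshold trades a lower FAR for a higher FRR, which is the tension behind most deployment decisions.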

Some of the biometric approaches I cantered through – voice, fingerprints, iris, face, signature, keystrokes, hand geometry, retina, DNA, gait, brain neural wave analysis, vein – have become pretty mainstream now, while others have fallen by the wayside.

I also looked at the issues around effective deployment and how to ensure more ethical design – raw biometric capture versus templates, liveness testing, the use of tokens, and nonces to avoid replay attacks (sketched below).
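As a rough illustration of the replay-attack point, the sketch below shows a verifier issuing a one-time nonce that the client must bind into its response. The shared key, hashing choices and in-memory nonce store are assumptions I've made for illustration, not a description of any particular deployment.

```python
# Illustrative sketch of nonce-based replay protection for a biometric match response.
# The shared key, hash choices and in-memory nonce store are simplifying assumptions.
import hmac, hashlib, secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned out of band in a real system
issued_nonces = set()                  # nonces awaiting a response

def issue_nonce():
    nonce = secrets.token_bytes(16)
    issued_nonces.add(nonce)
    return nonce

def client_response(nonce, template_hash):
    # Bind the fresh nonce to the biometric template digest with an HMAC.
    return hmac.new(SHARED_KEY, nonce + template_hash, hashlib.sha256).digest()

def verify(nonce, template_hash, response):
    if nonce not in issued_nonces:      # unknown or already-used nonce: treat as replay
        return False
    issued_nonces.discard(nonce)        # one-time use
    expected = hmac.new(SHARED_KEY, nonce + template_hash, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

template_hash = hashlib.sha256(b"enrolled-template-bytes").digest()
n = issue_nonce()
r = client_response(n, template_hash)
print(verify(n, template_hash, r))   # True
print(verify(n, template_hash, r))   # False: replaying the same response fails
```

Because each nonce is consumed on first use, capturing and resending an earlier response buys an attacker nothing.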

But my wider concern – the underlying thread of my talk – was the lack of any framework within which biometrics were being developed and deployed.

In an effort to spark a debate about how we might ensure the better use of biometrics, and prevent them from doing more harm than good, I ran through Kim Cameron’s ‘laws of identity’. They provided a useful starting point for discussing a framework around identity and the creeping invasion of biometrics. I think many of the basic principles still hold true, particularly at the technical level.

I challenged the idea that such systems require centralised data, arguing that they could instead be designed around tamper-resistant edge models. I used a system from Microsoft Research (FaceCerts) to illustrate the point. FaceCerts produced cryptographically secure photo cards using a combination of public key cryptography, compression and barcode technologies. I liked the concept because it was entirely self-contained: nothing was stored on central systems, and the costs were very low (you could potentially print out a ‘card’ for yourself at home on a piece of paper).
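The sketch below captures the general shape of such a self-contained credential – signing a compressed identity payload and encoding the result for a printable 2D barcode – but it is only my reconstruction of the idea. The Ed25519 signature scheme, zlib compression and base64 barcode payload are assumptions for illustration, not details of how FaceCerts actually worked.

```python
# Illustrative sketch of a self-contained, offline-verifiable photo credential.
# The signature scheme (Ed25519), compression (zlib) and encoding (base64) are
# assumptions for illustration; they are not the actual FaceCerts design.
import base64, json, zlib
from cryptography.hazmat.primitives.asymmetric import ed25519

issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

def issue_card(name: str, face_template: bytes) -> bytes:
    """Compress and sign the holder's details; return barcode-ready bytes."""
    payload = zlib.compress(json.dumps({
        "name": name,
        "face": base64.b64encode(face_template).decode(),
    }).encode())
    signature = issuer_key.sign(payload)
    # In practice this string would be rendered as a 2D barcode on the card.
    return base64.b64encode(len(signature).to_bytes(2, "big") + signature + payload)

def verify_card(barcode_data: bytes) -> dict:
    """Verify the issuer's signature offline and recover the holder's details."""
    raw = base64.b64decode(barcode_data)
    sig_len = int.from_bytes(raw[:2], "big")
    signature, payload = raw[2:2 + sig_len], raw[2 + sig_len:]
    issuer_public.verify(signature, payload)   # raises InvalidSignature if tampered
    return json.loads(zlib.decompress(payload))

card = issue_card("A. Example", b"\x01\x02\x03 face-template-bytes")
print(verify_card(card)["name"])
```

The point of the design is that verification needs only the issuer's public key, so checking a card requires no lookup against any central database.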

“Progress” since 2006

So how well has my 2006 keynote aged? Well, possibly better than my periodic efforts to sort my digital files.

Kim Cameron’s ‘laws’ never really got adopted at either the policymaking or technical design level, but I think they played a useful role in shaping and improving thinking around more socially responsible, human-centred design. Kim himself has spoken of some of the things he got wrong with the laws.

Some of what I warned would happen has unfortunately come true – with hostile regimes and others now acquiring and holding copies of our biometrics, from facial images to fingerprints and more. This will continue to undermine their value outside of supervised environments or without sufficient countermeasures such as liveness testing.

It’s disappointing that governments didn’t heed the calls for better regulation long before we reached this stage. But that’s part of a bigger pattern of regulatory failure we see everywhere, from banking to artificial intelligence. It’s also worrying to see the intentionally discriminatory way that biometrics are being used, apparently aided by academics operating without any sense of ethics or humanity. Biometrics have become even more of a weapon than I anticipated, and correspondingly weaker as an aid. And some of the poor practices in place – such as storing raw biometrics rather than templates and then having them breached – are just plain negligent.

So looking back, what was my biggest mistake? I had assumed that governments would move more quickly to put legal frameworks around the use of biometrics, given their importance. But we still seem to be in a poorly defined landscape where the rules are often opaque or non-existent.

There has been insufficient public debate about how and where biometric technology can appropriately and proportionately be used. It’s disappointing just how far the use of biometrics has intruded into citizens’ personal lives without adequate discussion or accountability, and how their ubiquity is increasingly likely to seriously undermine law enforcement.

The net result? 13 years on, we still lack the right balance of public policy, technological aptness and citizen benefit. And the most worrying thing? I think I could probably present the exact same talk today – and get away with passing it off as something new.
