… the end of biometric security?

This piece on “Imitating people’s speech patterns precisely could bring trouble” in The Economist caught my eye. It mentions a new technology that means

any voice—including that of a stranger—can be cloned if decent recordings are available on YouTube or elsewhere

To emphasise the problems this creates, it goes on to mention that

When tested against voice-biometrics software like that used by many banks to block unauthorised access to accounts, more than 80% of the fake voices tricked the computer.

Not exactly welcome news for those organisations that assume voice recognition can be used to identify users. Back to the drawing board time. Again.

This development is not surprising. It was, in fact, entirely predictable. I anticipated this happening some 11 years ago (see below). It’s part of the well-understood, and inevitable, process of commodification and diffusion that I wrote about in King Canute, diffusion and the Investigatory Powers Bill.

My concern is not that a development such as this has happened – but that little seems to have been done to prepare for it and to mitigate its impacts. It was only a matter of time before a disruptive technology such as this entered the mainstream.

Yet there seems to be a general failure across both private and public sectors to anticipate and prepare for such inevitable developments. If we’re going to better protect and secure the future of our increasingly technologically dependent society, we also need to ensure there is much better analysis and understanding of the impact of technological innovation, including processes such as diffusion.

This need to anticipate and prepare for change is becoming particularly acute in an age where we all leave pervasive digital footprints – including potentially sensitive personal data, both biographical and biometric – in our wake, for others to mine and potentially abuse. Governments in particular need to become far better at understanding, anticipating and mitigating these inevitable processes.

This latest voice synthesis technology may not itself be the end of biometrics as part of the process of identifying users, but it is potentially the beginning of the end of the naive trust placed in biometrics as some flawless, wonderful panacea.

As I say, none of this is anything new. The Economist’s article reminded me of a blog post I wrote back in August 2006. As my old blog engine is no longer available, I’m reproducing that 2006 post below. Nearly 11 years on, it still seems to raise valid concerns.


Biometrics: enabling guilty men to go free? Further adventures from the law of unintended consequences

[Originally published August, 2006]

 

Dateline: the near future

Setting: the Old Bailey. A tense, invitation-only event. A spectacle of the kind that London has made its own since long before the days of Newgate Prison and the macabre carnivals of the public hangings at Tyburn Tree.

Outside, armed policemen, guard dogs and riot barriers prevent the curious crowds from pushing too close. On the office rooftops – police marksmen. In the Victorian drains below the courtroom – boiler-suited bomb teams, knee deep in London’s toxic wastes.

This is a trial that must not, cannot go wrong. The media have been in a full-on, Fleet Street frenzy for months. Driven by political rhetoric, media pressure and public concern, the police and intelligence services have been running faster than they have ever run, working harder than they have ever worked. Dawn raids, arrests, releases. High hopes, false hopes. Trails hot and trails turned cold.

And now – victory. The alleged perpetrators of the terrorist bombing caught. Remanded behind bars. Charged. The prosecution case a year in the making. Finally, today is their day in court. The evidence is compelling. Justice will be done.

The courtroom quietens. The judge enters, takes his seat. A quiet, insignificant-looking man. But his record tells another story. He is sharp, shrewd. Prosecutors and defending counsel alike respect him.

And so the case begins.

The prosecution case is damning. It silences the court.

Item 1: transcripts and computer disk recordings of phone calls made by the defendants. Backed up by expert forensic voiceprint analysis. The calls and their transcripts damn the defendants out of their own mouths. Chilling, deadly, inhuman, hateful spite. Clear evidence these bombing attacks were ruthless in both planning and execution.

Item 2: fingerprints found in the bombers’ flat. Fingerprints on detonators and fuses. Matched by both computer and expert human analysis.

Item 3: CCTV footage showing the bombers’ movements. Time and day recorded. The collection of the components. Their assembly at the bomb scenes. Painful, but compelling to watch.

Item 4: DNA recovered from the bomb site. DNA that matches beyond any doubt two of those who now stand accused in the dock.

The prosecution’s case is damning. Brilliantly assembled and executed. It appears, yes, invincible. After weeks of evidence, the prosecution team sit down, convinced their job is done.

But wait: what has the defence team to say? It seems a mere formality. The media have already written the headlines, news stories and analyses. The defence is a mere inconvenience, a routine to be endured, little more than a legal hesitation before a verdict already known can be delivered.

Yet what is this? The defence counsel start to ask strange questions, start to probe and find holes in the prosecution case, holes that only moments before seemed not only unlikely, but impossible.

One by one the pieces of forensic evidence are questioned and – impossible thought – undermined.

Item 1: the computer-captured and analysed voiceprints of the defendants. Doubts quickly turn from ill-formed ghosts into hard, awkward facts. The voiceprints turn out not to be such a damning indictment after all. The defence team rapidly establish that copies of the defendants’ voiceprints are held on computer systems in over twenty different call centres, from banks to budget airlines and catalogue shopping companies. They are held not only in the UK, on UK-based computer systems, but scattered around the world. Each automated call centre operates under a different legal jurisdiction. Many of their data security procedures are evidently questionable at best, laughable at worst.

It is, as the defence team soon demonstrate, a simple matter to use another person’s voiceprints in the fabrication of fake conversations, in the fabrication of evidence making them appear to say anything. The prosecution’s voiceprint evidence is undermined. Worthless. Struck from the record. Struck from the prosecution’s case.

Item 2: the fingerprints. Now surely we are on safer ground, surely these will be unassailable as evidence? But doubts now start to form and grow. Fingerprints in a pre-digital age were rarely forged: whilst occasionally they might be maliciously lifted and re-used to falsely incriminate others, the odds were against it. And experts could often detect such basic efforts to subvert evidence.

But, as the defence counsel establish, our world is now very different. Our fingerprints are now stored in tens, maybe even hundreds of different computer systems. From Disneyworld to border control systems, from school library book loans to car rental systems. And many of these systems are run and operated by regimes unlikely to win any awards for their security or human rights reputations. Fingerprints are now so ubiquitously stored on computers around the world, they can be accessed, replayed, planted and generally misused at will. Their reliability as evidence is gone.

Strike 2.

Item 3: now surely the defence counsel cannot undermine CCTV footage? Surely this is irrefutable evidence of the defendants’ guilt? Ah. Chip, chip, chip. Precision questioning from the defence team once more. There is no evidence that the date and time stamps on the digitised footage can be relied upon, nor that the defendants’ visits to the crime scenes happened when the prosecution claim. In fact, there is no proof to suggest that they were carrying anything more than an instant dinner from Tesco. The evidence wobbles, but stands – although behind it also now stands a large, worrying question mark.

Finally, item 4: DNA. The jewel in the crown of the prosecution case. DNA that matches two of the defendants. Found on fragments of the bombs, found at the crime scene. A clear, accusatory pointing finger of guilt. Unassailable.

Let’s see what the defence counsel will make of this, smile the prosecution team, confident, assured.

But wait.

The defence team establish that our DNA is hardly a secret. Like fingerprints and other biometrics, DNA itself can also easily be obtained and just as easily planted. In fact, two of the defendants – the very ones whose DNA is alleged to be on the bomb fragments found at the scene – are shown to be participants in a major medical trial that involves publication of their entire genome sequence on the Internet. And, as an expert scientific witness for the defence points out, one of the risks of the project has always been that someone would take a volunteer’s details, make synthetic DNA corresponding to the volunteer and plant it at a crime scene.

There remain legal formalities to be gone through, weeks of arguments and counter-arguments – but the outcome is clear to all but the most partial observer long before the judge pronounces “Case dismissed”.

After months of fevered speculation and sobering evidence, the headlines and analyses need to be re-written. The courtroom becomes a madness of racing reporters, of victims’ families and relatives in tears, of the defendants’ families rejoicing. The trial of the century has been thrown out. It will fill international press and TV news coverage for days, weeks to come.

But who is right? Have guilty men gone free? Or were innocent men set up? Is it possible that the vital cornerstone of our criminal justice system – the forensics of DNA and of biometrics, from fingerprints to voiceprints – could become too contaminated by the ubiquity of their acquisition and storage in computer systems to be regarded as any kind of evidence at all?

A science fiction future?

Consider this. The Personal Genome Project is no science fiction future, but an established reality at Harvard University. And the project does indeed warn participants that they run the risk of someone taking a volunteer’s details, making synthetic DNA corresponding to the volunteer and planting it at a crime scene.

More and more organisations, more and more regimes, are demanding and storing our biometrics in more and more computer systems. We know that no computer system is infallible.

In a world where our biometrics are acquired and stored by all types of regimes and organisations, we must be rigorously analytical about the risks involved and where they may lead us. If we do not do so, I believe we run the risk of losing our best evidence, our best defence against organised and serious crime: the very opposite of what was intended. These are not outcomes we should countenance lightly.

What will happen in a world where more and more computer systems capture and store our biometrics and even our DNA? What happens when, every time we travel to any regime or country, or deal with retail, entertainment and other organisations, copies of our biometrics are captured and stored on computer systems over which we have no control, no guarantees? We know from experience that no system is 100% secure, that all systems bleed and leak.

We need to think very carefully indeed about where this simplistic belief that biometrics will be a universal panacea for issues of identity could lead. We know that the law of unintended consequences will always undermine our best intentions. And, if that were to happen, there would be no way this particular genie could ever be put back into the bottle. We only have the one set of biometrics, one set of DNA.

If these important issues are not thought through clearly, if we do not have a proper discussion – including of the international dimension – about the way in which biometrics and our DNA are acquired, stored and used, our ability to investigate and prosecute criminals based on forensic evidence could be lost forever.

Could we ever let this happen? I hope not. It would be an unacceptable outcome, a violation at the very heart of the way our society works. Yet if we are to avoid it, we should already be planning and reaching consensus on the way in which biometrics and DNA samples are acquired and stored in computer-based systems.

It’s time that debate started to happen. The topic of biometrics is one I’ll be addressing from various perspectives, before, during and after my participation in and closing keynote at Biometrics 2006.

Please join me in this important debate.


[Note: I am in the process of re-hosting my older blog posts, such as the one above. More news as and when this happens.]
