Border control, day 2

Day two of the Global Border Control Technology Summit here in London. And another day of interesting insight into the state of the art around electronic documents, the use of biometrics and automated e-channels.

Joseph Atick, President and CEO, Identix, talked about the challenges of both authentication and knowledge discovery. The quality of biometric images has the single biggest impact on false positives and false negatives. This has been a major focus for NIST (the US National Institute of Standards and Technology) and the US-VISIT programme, which have been working on metrics for scoring image quality. With the best-quality images, matching accuracy of 99.4% is achieved; at the other end of the scale, just 27.8%. Image quality matters far more than the choice of algorithm, so terrorists are likely to try to ensure their captured images are poor – and manually checking everyone with a poor image would be too time consuming, since about 20% of people fall into that category. This has led to multi-finger sampling becoming the main strategy for overcoming poor image quality: it improves accuracy to over 95%, using “slap devices” that capture several fingers at once. Overall, the best combination has proved to be 10-print slaps (ie. all fingers) plus face, giving over 99% accuracy.
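
As an aside on how that kind of fusion works in practice: a common approach is to weight each finger’s match score by its measured quality, so one smudged print cannot sink the overall decision. A minimal sketch of the idea – every threshold, score and function name here is invented for illustration, not drawn from NIST or US-VISIT:

```python
# Quality-weighted fusion of per-finger match scores from a 10-print
# slap capture. All numbers below are hypothetical.

MATCH_THRESHOLD = 0.75  # assumed fused-score decision threshold

def fuse_scores(match_scores, quality_scores):
    """Quality-weighted average of per-finger match scores."""
    total_weight = sum(quality_scores)
    if total_weight == 0:
        return 0.0
    weighted = sum(m * q for m, q in zip(match_scores, quality_scores))
    return weighted / total_weight

def decide(match_scores, quality_scores):
    """Accept/reject decision across all captured fingers."""
    return fuse_scores(match_scores, quality_scores) >= MATCH_THRESHOLD

# One poor-quality finger (quality 0.2) contributes little to the outcome:
matches = [0.90, 0.85, 0.80, 0.88, 0.30, 0.90, 0.87, 0.82, 0.86, 0.90]
quality = [0.90, 0.80, 0.85, 0.90, 0.20, 0.95, 0.90, 0.80, 0.85, 0.90]
print(decide(matches, quality))  # True – the good fingers dominate
```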

Next up was Gary McDonald, Executive Director, Corporate Services, Passport Office, Canada. He started with a brief recap on ICAO, which happens to be based in Montreal. Even at this level, the same debate about standards versus specifications that we are seeing elsewhere is taking place. When is a specification a standard? And who decides what is and is not a standards body? Gary chairs the New Technologies Working Group, which includes ISO, IATA, Airports Council International and Interpol. The latest specifications (or is that standards?) can be found at http://www.icao.int/mrtd.
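
For a flavour of what those specifications contain: every field in the machine-readable zone (MRZ) defined by Doc 9303 carries a check digit, computed with the repeating weights 7, 3 and 1. A quick sketch of that algorithm in Python – the example value is the specimen document number used in the ICAO documents:

```python
def mrz_check_digit(field: str) -> int:
    """Compute the ICAO Doc 9303 check digit for an MRZ field.

    Character values: '0'-'9' -> 0-9, 'A'-'Z' -> 10-35, and the
    filler '<' -> 0. Each value is multiplied by the repeating
    weights 7, 3, 1; the sum is reduced modulo 10.
    """
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10
        elif ch == "<":
            value = 0
        else:
            raise ValueError(f"invalid MRZ character: {ch!r}")
        total += value * weights[i % 3]
    return total % 10

print(mrz_check_digit("L898902C<"))  # -> 3, for the ICAO specimen document number
```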

Frank Moss, Deputy Assistant Secretary for Passport Services, US Department of State, gave a very straight and insightful talk on what the US has learned from its early work on biometrics and more sophisticated border control systems. One of the first points he was anxious to make was that biometrics are a tool, not a solution, and that too many vendors have oversold what they can achieve. The US is aiming to have a fully biometric process in place by the summer of 2006. This has posed an interesting set of challenges, since the US has no tradition of ID cards and no desire to introduce them. In addition, with some 7,000 acceptance agents handling passport applications, taking on biometrics would be a huge undertaking.

A major problem they encountered was that the new e-Passports could be read well beyond the 10cm range of the ICAO specification. They are working on ways of blocking this and cutting the range down to just 2cm: they are very alive to the concerns that reading passports at a distance can raise. One difficult but inevitable fact is that the cost base also keeps moving, and dramatically increasing. In addition, Frank acknowledged that privacy concerns are highly valid and must be dealt with – “We need a security in depth process around privacy issues” – and taking on privacy feedback has helped produce a much better e-Passport.
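
For context, the mechanism the ICAO specifications define against exactly this kind of skimming is Basic Access Control (BAC): the reader must first optically read the MRZ from the data page and derive the chip’s access keys from it, so the chip stays silent to a reader that has never physically seen the document. A simplified sketch of the key derivation step, assuming each MRZ field already carries its check digit (the full protocol goes on to do a challenge-response and establish session keys):

```python
import hashlib

def adjust_des_parity(key: bytes) -> bytes:
    """Force odd parity on each byte, as DES key bytes require."""
    out = bytearray()
    for b in key:
        ones = bin(b >> 1).count("1")
        out.append((b & 0xFE) | (0 if ones % 2 else 1))
    return bytes(out)

def bac_keys(mrz_information: str):
    """Derive the BAC keys from the MRZ information string.

    The string is the document number, date of birth (YYMMDD) and
    date of expiry (YYMMDD), each followed by its check digit.
    Returns (K_enc, K_mac), two 16-byte two-key 3DES keys.
    """
    k_seed = hashlib.sha1(mrz_information.encode("ascii")).digest()[:16]

    def derive(counter: int) -> bytes:
        k = hashlib.sha1(k_seed + counter.to_bytes(4, "big")).digest()
        return adjust_des_parity(k[:8]) + adjust_des_parity(k[8:16])

    return derive(1), derive(2)  # counter 1 -> K_enc, counter 2 -> K_mac

# The worked example from Doc 9303: document L898902C<, born 690806, expires 940623.
k_enc, k_mac = bac_keys("L898902C<369080619406236")
print(k_enc.hex(), k_mac.hex())
```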

This is the same point we’ve been making for some time: well designed systems make a partnership of security and privacy instead of setting them at loggerheads.

NIST has done much of the real testing on the new e-Passports, and they intend to hold an open day, including privacy organisations, to show what has been done and discuss what else might be needed. This is a maturity of engagement between government, its agencies and various interest groups that appears to be lacking elsewhere. Frank also emphasised the need to tap into existing industry expertise – disagreement can be healthy and produce better outcomes. Absolutely. One key lesson to take away: don’t rush these things; it’s worth putting in the time and investment to get them right.

The next speaker was Marcel van Beek, Program Manager, Passenger Process, Amsterdam Airport, Schiphol. They now have a frequent flyer programme that uses an expedited e-channel and has some 20,000 people enrolled. It makes use of an iris scan and achieves a 12-second average passing time, with a rejection rate of 1.5%. That said, Marcel admitted the rejection rate starts out higher and then drops as people acclimatise to the system. Some 500,000 crossings were made using it last year. In addition, the airport has deployed a staff access system using contactless badges with iris verification. Verification is against the token – no raw biometrics are stored in databases or systems – and a deliberate decision was taken that only conscious capture should be possible (ie. the system should not enable remote monitoring of people’s movements and presence without their awareness).
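
Verifying against a token means the reference template is read from the badge and compared locally, with nothing retained centrally. Iris codes are conventionally compared by fractional Hamming distance; a toy sketch of that comparison, where the code length and threshold are illustrative assumptions rather than Schiphol’s actual figures:

```python
import os

def fractional_hamming(code_a: bytes, code_b: bytes) -> float:
    """Fraction of differing bits between two equal-length iris codes."""
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must be the same length")
    differing = sum(bin(a ^ b).count("1") for a, b in zip(code_a, code_b))
    return differing / (len(code_a) * 8)

def verify(live_code: bytes, token_code: bytes, threshold: float = 0.32) -> bool:
    """Match a live capture against the template read from the badge.

    0.32 is in the range commonly cited for iris matching, but the
    operational threshold here is an assumption, not Schiphol's figure.
    """
    return fractional_hamming(live_code, token_code) <= threshold

# Made-up 256-byte codes: an identical pair trivially matches,
# while two random codes differ in roughly half their bits.
template = os.urandom(256)
print(verify(template, template))         # True (distance 0.0)
print(verify(os.urandom(256), template))  # almost certainly False (~0.5)
```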

Following on was David Leppan, CEO, World-Check, who asked whether we should all be making more use of open source (public) intelligence for border control. He cited a few examples of individuals they had flagged as potential problems before anyone else, apparently including the Bali bomber. But he also mentioned some 300,000 people they are currently tracking, which does beg the question of how anyone can sensibly track, monitor and evaluate risk at that scale in any meaningful way. That said, David was one of the few speakers at the conference to step outside the immediate authentication/verification issue and point out that without accompanying intelligence about the person you have just authenticated, we are all missing a major part of the story.
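
On that tractability question: watchlist screening at this scale usually comes down to fuzzy name matching with a similarity threshold, which trades missed aliases against false alarms across all 300,000 entries. A crude sketch using only Python’s standard library – the names and threshold are invented for illustration:

```python
from difflib import SequenceMatcher

# A hypothetical watchlist; a real one would hold hundreds of
# thousands of entries plus dates of birth, aliases and context.
WATCHLIST = ["john doe", "jane q public", "richard roe"]

def screen(name: str, threshold: float = 0.85):
    """Return watchlist entries whose similarity to `name` meets the threshold.

    Lowering the threshold catches more misspellings and aliases but
    multiplies false alarms - exactly the triage problem raised above.
    """
    name = name.casefold().strip()
    scored = [(entry, SequenceMatcher(None, name, entry).ratio())
              for entry in WATCHLIST]
    return sorted(((e, round(s, 2)) for e, s in scored if s >= threshold),
                  key=lambda hit: -hit[1])

print(screen("Jon Doe"))    # close misspelling still flags "john doe"
print(screen("Ann Smith"))  # no hits
```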

Claudia Hager, Executive Director, Austrian State Printing House, stated that global interoperability is the key goal of much of what they are trying to achieve with e-Passports. The documents also, of course, need to be reliable and durable: Claudia described some interesting experiments with hammers, nails, washing machines, acids and ovens, to name a few, to test how well chips are likely to survive in the real world. Echoing Frank Moss’s point that biometrics are just an additional tool, not a solution in their own right, Claudia pointed out that the chip is just a supplement to a high-quality document. She also highlighted that the ICAO 9303 standard will shortly be published in its 6th edition.

The final speaker of the day was Barry Kefauver, Principal, Fall Hill Associates. He dealt with the topic of travel document system integrity – “the gun waiting to smoke”, as he put it. New Zealand was cited as a country that has used database linkages (which required legislative changes) to support the intelligence systems that need to accompany border control and anti-fraud measures, for example. Over 70% of identity theft/fraud is apparently attributable to stolen and lost identity documents, with the main impact felt not directly at border control but by relying parties elsewhere, such as banks. In Barry’s view, improved sharing of identity and threat information between the public and private sectors is needed to reduce risk. This again echoes the ongoing debate here in the UK about secure, appropriate data sharing across public and private organisations.


This blog post originally appeared when I hosted NTOUK on SimpleBlog. It’s one of several I’m retrieving and posting here to bring together my posts in one place. The content and date shown for this post replicates the original. Many links are, inevitably, broken: where I can, I’ll substitute ones that work, particularly where the Internet Archive Wayback Machine has captured the content originally linked to.
