digital leaders TV discussion

BBC Click’s Kate Russell hosted me, Mark Thompson and Alan Brown on a recent episode of Digital Leaders TV. Our topic? How to understand and implement new digital business models in the public sector, with questions and interventions from Kate as well as from those tuning in to the broadcast Hangout.

The discussion centred on our new best-selling book “Digitizing Government: Understanding and implementing new digital business models” (available in the UK from Amazon, Waterstones or your local independent bookshop, and in the US from Amazon and others).

This is the full video:

And this, for those of you pressed for time, is the 15-minute edited “highlights edition”:

One theme that emerged from the questions submitted was a fear that moving to digital public services means putting everything online, leaving behind those unable or unwilling to use technology. Yet putting existing services onto an electronic screen is only a minor part of what “digital” really means: it should not be about forcing citizens onto an online channel, but about improving all channels (including face-to-face services delivered in an office or in our homes) by improving the processes, systems and organisations that sit behind them.

Our discussion aimed to bring out this often overlooked perspective, mirroring much of what our book is about.

Posted in digital, future Britain, IT, IT strategy, open government, public services, social exclusion, social inclusion, taxation, technology, technology policy | 1 Comment

(continued) more thoughts on government in the digital age

On the back of the launch of our new book, Digitizing Government, I posted a few background thoughts in my previous blog — very imaginatively entitled, er, more thoughts on government in the digital age. I continue exploring a few more themes here.

Cloud computing ≠ shared services

“Shared services” dominated much of the discussion about government use of technology over the last decade or so. But for all the talk, little was achieved. In the UK there were a few shared, and ageing, API-based components collectively known as the “Government Gateway” (providing common cross-government services such as authentication, transaction handling and payments): but generally the whole debate became typified by the standoff of “I’ll share my service with you, but your service isn’t any good for us”.

The idea that a simplistic, one-size-fits-all, vertically-integrated shared services solution for functions such as HR, CMS or ERP would be a magic bullet was well-intentioned but naive, given how differently organisations operate. The approach lacked a sufficiently detailed analysis of the needs of the various organisations involved and of how mature, or bespoke, their requirements actually were. It failed to decompose requirements into those areas where processes and functions were common — and could potentially utilise shared services infrastructure — and those where they were unique to each organisation. It also lacked the accompanying management drive necessary to rationalise, simplify and standardise many of the existing processes and functions prior to using a shared technology platform.
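To make that missing decomposition step concrete, here is a minimal sketch (the catalogue entries and field names are hypothetical, invented purely for illustration rather than drawn from any real programme) of the kind of analysis that was needed: separating functions that are genuinely common across organisations, and so candidates for a shared platform, from those that are truly unique.

```python
# Illustrative sketch only: the catalogue below is hypothetical, not a real
# analysis of any department. The point is the decomposition step itself:
# separate functions that are common across organisations (candidates for a
# shared or commodity platform) from those that are genuinely unique.

from dataclasses import dataclass

@dataclass
class Function:
    name: str
    shared_across_orgs: bool   # is the process essentially the same everywhere?
    standardised: bool         # has the process already been rationalised?

catalogue = [
    Function("payroll", shared_across_orgs=True, standardised=True),
    Function("recruitment", shared_across_orgs=True, standardised=False),
    Function("benefit entitlement rules", shared_across_orgs=False, standardised=False),
]

def classify(fn: Function) -> str:
    if not fn.shared_across_orgs:
        return "keep bespoke (genuinely unique)"
    if fn.standardised:
        return "move to a shared/commodity platform"
    # common but not yet standardised: rationalise the process first,
    # rather than freezing today's variant into a "shared" service
    return "standardise first, then share"

for fn in catalogue:
    print(f"{fn.name}: {classify(fn)}")
```

The middle category is the one the shared services era tended to skip: common functions were lifted and shifted without first being rationalised, so the “shared” service simply inherited everyone’s variants.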

So is today’s poster-child — cloud computing — just more of the same, part of the current vogue around ‘smac’ (social, mobile, analytics and cloud)? Like any technology, poorly managed and poorly applied, it’s not going to magically solve complex problems of service design any better than any other option. But as part of a meaningful strategy, such as the UK’s G-Cloud initiative, the adoption of cloud computing will have far more impact than shared services. In the USA:

“The federal government has been gradually adopting shared service business models for administrative services for nearly 30 years. Today, the buzz is all about “the cloud” and its potential to transform shared services as we know them. There’s much hype and a tendency to conflate shared services and cloud computing—things that have many similarities but are not exactly the same. As tips of the spear in an all-out war on government inefficiency, shared services and cloud computing could help drive hundreds of billions of dollars in long-term savings while enabling enormous transparency and performance improvements throughout the government.” [1]

An important recognition in this transition is the eradication of the “special” or “home-baked” processes, while the accompanying cultural and organisational challenges are eerily familiar:

“The [US] government’s most significant achievement in three decades of shared services gradualism has been elimination of scores of agency-specific payroll systems and consolidation into four centralized providers that serve the entire government today. To this day, most agencies continue to self-serve for most administrative services. Redundant shadow staffs remain scattered throughout most agencies. Inefficient legacy systems continue to operate despite faster, better, and cheaper shared service or cloud computing alternatives. Most government shared services currently operating are under-used and under-performing relative to the state-of-the-art in other sectors. The government remains stuck in an obsolete, industrial age organizational model with vast redundancies and inefficiencies. It has flat-out failed to transform with the times into a lean, high performance enterprise suitable for 21st century challenges.” [2]

We also need to be ruthlessly honest about the problems that need to be tackled, and the opportunities on offer:

“Enforcing acceptance of standardized systems throughout the government would be one of the toughest, but most critical challenges determined leaders must face. Like the tax code, government administration is rife with complexity—the byproduct of over-designed, agency-unique systems. Agencies must be forced to accept plain vanilla and give up fancy flavors with marginal business value. Moving agencies onto common platforms is fundamental to the streamlining and consolidation necessary to unlock potential savings. It would also open up the government like never before to transparency and performance management improvements.” [3]

As I mentioned above, in the UK the shared services agenda historically made little headway: few departments share common services and systems. The few cross-government exceptions include the recent publishing platform GOV.UK, its underlying performance dashboard, the GOV.UK Verify identity assurance service, and the much earlier API-based platform components of the Government Gateway (currently due to reach end of life sometime in 2016).

This propagation of the “we’re special” mantra and the associated wholesale mid- and back-office bespoking and replication of processes, roles and functions arises not because of any technical constraints, but rather because of a failure to move the public sector away from its inefficient structural silos and the technology stacks that mirror them. A shared services culture and a platform-based approach won’t work in environments where the organisation is not able to re-engineer the way it operates to focus on users (in this case, citizens, business and frontline employees) and their needs rather than its own self-motivated organisational imperatives.

There remain deep-seated cultural, leadership and organisational issues in the public sector’s current configuration that need to be tackled if we’re not to continue expending precious public sector resources on internal overheads rather than on our public services. Part of the problem in the current debate is the failure to distinguish between those jobs (and their associated processes and organisations) that are much needed, and those that duplicate and repeat what is already being done multiple times over elsewhere. In this sense, as in so many other areas, government is no different from any other large-scale organisation, with the same inherent tendency towards inertia and the status quo:

“Companies will quickly recognize ideas that fit the pattern that has proved successful for them before. But they will struggle with ideas that require a very different configuration of assets, resources and positions to be successful.” [4]

Or, as Marshall McLuhan put it more succinctly back in 1967, “We look at the present through a rear-view mirror”.

Deverticalisation / Utility-Commodity

The move away from vertical structures to horizontal ones has rapidly become one of the hallmarks of the digital age. This deverticalisation has in part been enabled by open standards, commoditisation and the existence of common platforms. It’s not a new phenomenon, but part of a cycle of improvement that has moved through numerous industry segments over time. What is different now is that IT — and the essential role of open standards in technology — has brought deverticalisation to traditional “white collar” job roles, functions, processes and organisations.

In the private sector, competitive pressures drive its application. For the public sector to realise equivalent benefits, it has to generate its own “cathartic moment”: an epiphany amongst both the political and official classes that enables resources to be moved away from the proliferation of roles, functions, processes and organisations that duplicate and replicate behind the scenes, and deployed instead to the frontline. While the technology may be different, we’ve seen such processes and impacts before:

“Something happened in the first years of the 20th century that would have seemed unthinkable just a few decades earlier: Manufacturers began to shut down and dismantle their water wheels, steam engines and electric generators. Since the beginning of the Industrial Age, power generation had been a seemingly intrinsic part of doing business, and mills and factories had had no choice but to maintain private power plants to run their machinery. As the new century dawned, however, an alternative started to emerge. Dozens of fledgling electricity producers began to erect central generating stations and use a network of wires to distribute their power to distant customers. Manufacturers no longer had to run their own dynamos; they could simply buy the electricity they needed, as needed, from the new suppliers. Power generation was being transformed from a corporate function to a utility.” [5]

The same applies now to many of the duplicated processes, functions, roles and organisations of the public sector. Yet the idea of the growth of utility or commodity IT services is hardly a new kid on the block. As Rappa commented in 2004:

“The utility business model is shaped by a number of characteristics that are typical in public services: users consider the service a necessity, high reliability of service is critical, the ability to fully utilize capacity is limited, and services are scalable and benefit from economies of scale.” [6]

The more interesting question is why this transition has proved so slow. The answer lies in part in the behavioural and cultural inertia of the organisations involved — combined with significant vested interests in holding onto and maintaining the lucrative, and failed, business models of the past, such as long-term, wholesale outsourcing and new public management (NPM) (which we discuss in some detail in our book “Digitizing Government” and which has been extensively analysed by Dunleavy et al in “Digital Era Governance”).

Many industries and organisations are beginning to grasp the scale of the change required, but government remains behind the curve and needs to catch up fast — something the recent refocus on the Government as a Platform vision that the Government Digital Service set out in mid-2013 should help expedite. Its online guidance for CTOs recognises that:

“This move to platforms does not assume that government has to develop everything in-house: many of government’s needs can be met by cost-efficient utility services that already exist. Yet government can also help establish best practice in areas such as the privacy of citizen’s personal data, helping lead by example. Wherever appropriate, the government should use existing external platforms, such as, for example, payments services (ranging from third party merchant acquirer services to the UK’s national payments infrastructure). Deciding to develop platforms in-house will happen only where that makes best sense in terms of meeting users’ needs in the most flexible and cost-effective way.”

This distinction between what government needs to do itself (its genuinely unique needs) and what it can simply consume is essential. After all:

“The global economy is in the midst of a major business-process revolution as significant as the one that occurred a century ago. As a result of a substantial decline in interaction costs, the new revolution is leading to the widespread de-verticalization of corporate business structures. De-verticalization is the process of separating functions and services from a vertically integrated business. Companies are undergoing this change because they can operate more efficiently and achieve better results by relying on partners to perform certain functions, rather than by maintaining control of these processes themselves. As de-verticalization unfolds in a given industry, supply-chain partners focused on particular aspects of the value chain emerge. Frequently, these partners develop greater economies of scale and superior skill than their in-house counterparts. The development of these partners reduces redundancy of operations in an industry and lowers the barriers to entry.” [7]

The well-managed application of the benefits of deverticalisation will produce a significant reduction in costs — and supports the tight-loose approach that the UK government has been promoting. Good use of technology aligned to effective organisational design enables improved “… internal management, monitoring and control … This can facilitate both greater centralization and paradoxically greater autonomy of decision-making downward and outward.” [8]
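As a rough sketch of what consuming commodity services through a standard seam can look like in code (the interface and provider names below are hypothetical, invented for illustration, and do not describe any real government API), the service logic depends only on a narrow interface; whether an external utility or an in-house platform sits behind it then becomes a sourcing decision rather than a rewrite.

```python
# Hypothetical sketch: all names are invented for illustration and do not
# describe any real government platform. The idea is that service logic
# codes against a narrow standard interface, so a commodity provider can be
# swapped in or out without re-engineering the service itself.

from abc import ABC, abstractmethod

class PaymentProvider(ABC):
    """The 'open standard' seam between the service and the commodity."""
    @abstractmethod
    def take_payment(self, reference: str, amount_pence: int) -> bool: ...

class ExternalUtilityProvider(PaymentProvider):
    def take_payment(self, reference: str, amount_pence: int) -> bool:
        # in reality: a call out to a third-party merchant acquirer
        print(f"external utility charges {amount_pence}p for {reference}")
        return True

class InHouseProvider(PaymentProvider):
    def take_payment(self, reference: str, amount_pence: int) -> bool:
        # built in-house only where user needs genuinely demand it
        print(f"in-house platform charges {amount_pence}p for {reference}")
        return True

def renew_licence(provider: PaymentProvider, citizen_ref: str) -> None:
    # the frontline service neither knows nor cares who sits behind the seam
    if provider.take_payment(citizen_ref, 2_500):
        print(f"licence renewed for {citizen_ref}")

renew_licence(ExternalUtilityProvider(), "citizen-42")
renew_licence(InHouseProvider(), "citizen-42")
```

The design point is the seam itself: it is the open standard, not the supplier, that the organisation commits to.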

It’s not difficult to see how many of the following benefits might be productively applied within the organisation of our public services:

“Both the reductions in transactions costs and the timeliness of information flow expands the span of control of managers and results in the flattening of organisations. They also encourage “de-verticalization”, “globalization” and “out-sourcing”. The “Product Cycle” has been greatly shortened–reduction in “time to market”–the average product cycle has shortened from 5 years to 12-18 months.” [9]

Think of the “product cycle” or “time to market” above in terms of successful implementation of a new policy “product” — such as improved welfare services or corporation tax — to understand its potential impact on government. Deverticalisation can help bring about the creative destruction of inefficient and expensive ways of operating, enabling many processes and functions to be simplified and improved. It’s increasingly unsustainable to propagate the current operating model. Public services need to be of the highest quality and delivered as cost-effectively as possible: this requires a major change in the way both the organisations that provide them and the services they provide are designed and delivered.

This change is no longer a divisive political choice of “right” or “left”, but a moral, societal and economic necessity. What does remain far more political is the choice of which services are delivered in-house and which from external services and suppliers. Part of resolving this long-standing issue may lie in properly mapping the landscape — understanding those roles that are unique and specific to the public sector (most typically those on the frontline of service delivery) which can best be kept in-house, and those that are generic (such as many middle- and back-office functions, systems and processes) that can best be sourced externally. But for public services to operate efficiently, in whatever balance of public-private engagement a government and its electorate desires, they will still require the underlying operating model to deverticalise, to transition to one that uses open standards and platform-based commodity components. Some of the characteristics of such changes are summarised in the table below.

[Table: summary of the characteristics of the shift from vertically integrated to deverticalised operating models]

[10]

As deverticalisation develops:

“…more fulfillment partners emerge to seize the new business opportunity. Competition among fulfillment partners forces them to improve their skills even further; often, they become more skilled in their own domain than integrated players. Eventually, however, competition also tends to force down prices and lead to abundant capacity. Therefore, once the majority of the industry adopts a deverticalized operating model, pricing often falls to commodity-like levels.” [11]

Without taking advantage of the significant user benefits of deverticalisation — in particular reduced costs and better use of resources — the public sector will face an increasingly corrosive and existential crisis. The result will be the continued, progressive weakening of our public services over time whilst simultaneously inflating their costs and unnecessary structural overheads.

If frontline public services degrade, quality and productivity drop, staff morale declines, and costs continue to escalate, then citizens and businesses alike will not only fail to receive the services they require, but will also become increasingly disillusioned with the political and leadership processes that have isolated the public sector from renewal and improvement.

This is why the move to digital has to work: the alternative is unthinkable. Those who continue to stall or block government’s long-overdue modernisation for their own narrow self-interest are playing with the very soul of our public services.

SOURCES

[1] Marshall, J. Shared Services and the Cloud: Seize the Opportunity. The Public Manager, Fall 2010. p.62.

[2] Marshall, J. Shared Services and the Cloud: Seize the Opportunity. The Public Manager, Fall 2010. p.66.

[3] Marshall, J. Shared Services and the Cloud: Seize the Opportunity. The Public Manager, Fall 2010. p.67.

[4] Chesbrough, H. Open Business Models: How to thrive in the new innovation landscape. 2006. Harvard Business School Press. p.4.

[5] Carr, N. The End of Corporate Computing. MIT Sloan Management Review, Spring 2005. p.57.

[6] Rappa, MA. The utility business model and the future of computing services. IBM Systems Journal, Vol 43, No 1, 2004. p.32.

[7] Raskin, A; Mellquist, N. The New Industrial Revolution: De-verticalization on a Global Scale. Research on Strategic Change, August 2005. http://www.alliancebernstein.com

[8] Lau, LJ. The New and Traditional Economies. January 2001. p.4. Retrieved from http://www.stanford.edu/~ljlau/Presentations/Presentations/011201.PDF on 26.10.2012.

[9] Lau, LJ. The New and Traditional Economies. January 2001. p.6. Retrieved from http://www.stanford.edu/~ljlau/Presentations/Presentations/011201.PDF on 26.10.2012.

[10] Raskin, A; Mellquist, N. The New Industrial Revolution: De-verticalization on a Global Scale. Research on Strategic Change, August 2005. p.2. http://www.alliancebernstein.com

[11] Raskin, A; Mellquist, N. The New Industrial Revolution: De-verticalization on a Global Scale. Research on Strategic Change, August 2005. p.9. http://www.alliancebernstein.com

Posted in future Britain, IT, IT strategy, public services, technology, technology policy | 1 Comment

more thoughts on government in the digital age

The book I’ve written with Alan Brown and Mark Thompson — Digitizing Government — is out. It’s here on Amazon UK and here as a Kindle edition: although it’s also here if you’d rather order online AND support your local independent bookshop. (The US version is due out 26th December — on Amazon US here).

We had a great open launch event for the book last week, at which a variety of distinguished panellists participated — Chi Onwurah MP, Liam Maxwell (UK Government CTO), Paul Brewer (Director for Digital Resources at Adur and Worthing Councils) and Paul Shetler (Chief Digital Officer at the Ministry of Justice), along with my co-author Mark Thompson.

Our book looks at how the public sector needs to re-design itself for the digital age to help cultivate better public services. This isn’t just in terms of technology but about the behaviour, culture and re-designed services of truly digital organisations. In fact, much of what we focus on is as relevant for any large organisation struggling to make the most of “digital”.

I thought I’d set out in some occasional blogs a few background thoughts, themes and ideas that provide additional backstory to some of the critiques, observations and recommendations we make in the book. This time I’m going to kick off by looking at “Outsourcing” and “The Wrong Debate?” — with more to come in random future blogs across a wide range of topics that play into this space …

Outsourcing

The undifferentiated outsourcing that has dominated public sector thinking has been a blunt tool often inexpertly used. This isn’t to say there’s no role for outsourcing — far from it: it can play an essential role. But it needs to be intelligently applied as only one of many possible options, and people need to understand when it’s appropriate and when it isn’t.

Traditional suppliers are understandably keen to promote the role of outsourcing in helping fix some of the public sector’s many problems. In a 2011 interview, Capita called for more outsourcing of public sector roles to the private sector, stating that “90 per cent of the UK’s 500,000 civil servants were performing back and mid-office functions, which could easily be better managed by the private sector” [1].

Such a shift of clerical, support and administrative roles from public to private sector may or may not end up being more efficient and saving costs, but it doesn’t start the discussion in the right place. Undifferentiated outsourcing of what’s already within the public sector, as it’s currently configured, would risk repeating what IT outsourcing did: hand another organisation an arbitrary set of people, systems, processes and costs frozen at a single moment in time.

Outsourcing applied simplistically becomes a costly displacement activity that does little to tackle the real issue: how public sector services can best be designed and delivered to meet user needs. Instead, these frozen services, with perhaps some marginal but largely inconsequential savings, are merely re-sold by the private sector, as-is, back to the public sector. Worse, it becomes far more difficult (if not impossible) to sensibly redesign the end-to-end service given that parts of it are now under entirely separate ownership and management.

I’m not sure anyone really wins in this situation: the private sector company is often frustrated by the cumbersome and micro-managed contracts that prevent it innovating, and the public sector by the belated realisation that few of the benefits it anticipated have come to fruition. As one Dell executive complained in 2010:

“Government expects its outsourcing service provider to maintain the complexity rather than to simplify and standardise the work processes,” he said.

“Processes and people are moved to the provider in their existing state and are independently managed next to countless similar processes of other companies. Consequently, the cost and service benefits of standardisation and simplification are lost.” [2]

It’s time we moved away from starting with any “solution” — such as outsourcing — without first understanding why, how and when it might be best applied: and when it might not be appropriate at all.

The wrong debate — public v private?

We all too often seem to end up in a very binary, Christmas pantomime-like debate about the role of public and private sectors: public sector good (hurrah!), private sector bad (boo!); or, just as inane, public sector inefficient (boo!), private sector efficient (hurrah!).

In describing the transition to digital government [3], Tim O’Reilly tries to move things away from the binary, bunkered-down attitudes that so often prevent us properly discussing how we can get the best possible publicly funded services for citizens:

“…The idea that we have to choose between government providing services to citizens and leaving everything to the private sector is a false dichotomy. Tim Berners-Lee didn’t develop hundreds of millions of websites; Google didn’t develop thousands of Google Maps mashups; Apple developed only a few of the tens of thousands of applications for the iPhone.

Being a platform provider means government stripped down to the essentials. A platform provider builds essential infrastructure, creates core applications that demonstrate the power of the platform and inspire outside developers to push the platform even further, and enforces “rules of the road” that ensure that applications work well together.”  [4]

Meanwhile, in Canada, it also seems to be about far more than frontline cuts or “efficiency savings”:

“… fiscal restraint measures are driving the need to standardize, consolidate and re-engineer the way government operates and delivers services. By re-thinking how government delivers services, it will help lower the costs of services while improving the service experience.” [5]

In recent BBC coverage of the Chancellor’s Autumn Statement, the Director General of the CBI seemed to be a lone voice in raising the fundamental question of how the public sector is designed, operated and maintained:

CBI director general John Cridland said the government would have to be “much more imaginative” about how it makes further spending cuts.

“Most of what we’ve done in this parliament, frankly, has been efficiency savings, cuts in head count, controls on pay,” he told BBC Radio 4’s Today programme.

“If you’re going to make the cuts we now need to make you’ve got to be far more lateral, you’ve got to re-engineer the whole model.” [6]

Our book examines these complex issues, looking at how the digital culture and practices of modern organisations can help improve the design and operation of government itself, and hence our public services.

The meaningful reform and renaissance of our public services requires us to move beyond the narrow “operational efficiencies” lens that currently dominates the political and media domains. The real task at hand is being side-tracked by the unacceptable — and unnecessary — axing of frontline services, which hits some of the most vulnerable in our society. This “cut services” narrative misses the fundamental opportunity that the digital age provides: to rethink and radically improve government itself, stripping out the layers of duplication and redundancy, and to put an end to cutting the very services that the public sector is there to provide.

The opportunity that digital offers is about so much more than technology. It’s about enabling more resources to flow where taxpayers wanted them to go in the first place: the frontline.

[Update: this blog continues with a second post — (continued) more thoughts on government in the digital age]

[1] Gill Plimmer, Financial Times, August 23, 2011

[2] Kelly Fiveash, The Register, 9th July 2010. Retrieved from http://www.channelregister.co.uk/2010/07/09/dell_francis_maude_it_spending_cuts

[3] Kitsing, M. An Evaluation of E-Government in Estonia. Prepared for delivery at the Internet, Politics and Policy 2010: An Impact Assessment conference at Oxford University, UK, on September 16-17, 2010.

[4] Tim O’Reilly, Government as a Platform, 2010.

[5] International Council for IT in Government Administration (ICA). Canada Country Report for 2012. p.2.

[6] BBC News, 4th December 2014. Retrieved from http://www.bbc.co.uk/news/uk-politics-30323690

Posted in digital, future Britain, IT, IT strategy, public services, technology, technology policy, Uncategorized | 1 Comment

Understanding and implementing new digital business models

Our new book about understanding and implementing new digital business models, Digitizing Government, is published 1st December in the UK and 26th December in the USA.

I’ve written it with Alan Brown and Mark Thompson, bringing together a range of experiences from our work with large organisations trying to adapt to the digital age, together with some of our own academic research. All three of us are active practitioners as well as academics — Alan as Professor of Entrepreneurship and Innovation at the University of Surrey’s Business School, Mark as Senior Lecturer in Information Systems at Cambridge Judge Business School, and myself as Senior Research Fellow at Bath Spa University.

Although we’ve focused on governments — since they face some particularly complex challenges in transforming into truly digital organisations — investment in digital technologies and adaptation to digital culture have become essential for success and sustainability across a whole variety of organisations.

We’ve aimed to make our book practical rather than academic in nature, sharing experiences, insights and advice for understanding and implementing digital transformation to increase business value and improve client engagement. It’s in three broad sections — a “why”, a “what” and a “how” — that articulate and explore the major elements of digital transformation and offer clear steps for executing digital strategies.

We’ve included case studies from both private and public sectors, together with a detailed chronology of current digital change efforts here in the UK government. We relate these to government efforts in the USA and elsewhere in the world.

Ultimately we hope it provides a practical and unique set of insights into organisations in the digital economy. We don’t claim to have all the answers — merely to nudge things in a better direction and to encourage a more open and constructive debate about the future of our public services.

You can order it on Amazon UK and Amazon US, amongst other places — or better still, why not order through your local bookshop and show them a bit of support? If you do read it, please let me know what you think: the move to digital is very much a work in progress. We all need to share and learn more effectively if we’re going to help organisations successfully adapt.

Posted in digital, future Britain, IT, IT strategy, open government, public services, technology, technology policy | 1 Comment

be careful what you wish for … (part 94)


Hold on — haven’t we been here before? There’s something very familiar about the recent unveiling of new powers for the state to snoop on the UK population through a proposed new Counter-Terrorism and Security Bill.

I doubt that anyone reasonable argues with the superficial intent — to detect criminals (specifically terrorists) and bring them to justice. The much more complex issue is the question of where lines are drawn: what is the most appropriate way of achieving that outcome? How much should the state intrude into everyone’s daily lives?

We need to find a solution that is proportionate, sustainable and reasonable in our democratic state. The worst possible outcome would be to blindly erode the very values we once upheld — not because someone bombed us day after day into doing so, but because we voluntarily surrendered our own freedoms, and hence our legitimacy, out of a misplaced sense of fear.

Part of the problem is that often these issues only seem to be considered through one lens: that of counter-terrorism. When in opposition, political parties tend to listen to a wide variety of expert opinion, and at least offer us the hope of developing reasonably balanced policies. Once in power, however, governments seem to turn to single sources of truth, all of them peering through the same lens. Over time, it becomes almost impossible to distinguish a new government’s policy on these issues from that of the ones that preceded it, as its once-intended policy becomes progressively degraded.

These latest counter-terrorism proposals seem intended to work by eradicating any remaining vestiges of anonymity and privacy in our daily lives. This extinction of personal anonymity has profound implications — not just for journalists whose sources can no longer be protected, not just for MPs trying to meet in confidence with their constituents, or for NHS whistleblowers, but also for the police themselves. And indeed for the rest of us, just trying to muddle along and get on with our lives.

I do wonder which undercover criminal sources, or “double-agent” jihadists, are going to run the risk of communicating or meeting with the police or intelligence agencies when they know it’s no longer either secret or safe to do so. With that telltale, pointing-finger trail of mobile phone interactions and email exchanges left in their wake, who would risk putting their lives on the line? The risks for such essential insider informants are being multiplied by the very measures presumably intended to help.

I’m surprised we haven’t heard more from important players such as Crimestoppers — given that their assurance of anonymity for those wanting to help presumably plays a key role in encouraging people with information to come forward. Their site says they received over 100,000 pieces of useful information about crime from the public last year, and that over 6,000 criminals were arrested and charged. Yet who has evaluated the impact that the removal of anonymity will have on such essential sources of information?

Without a proper debate, and a rigorous assessment of the likely real-world impacts of a further erosion of our online and electronic device privacy, who knows whether these latest familiar proposals will actually assist — or degrade — counter-terrorism intelligence work?

Part of the debate we need includes news programmes and journalists asking these sorts of questions: “What impact will the end of anonymity have on essential intelligence-gathering sources like Crimestoppers?”, “How will the police be able to meet with informers if all the details of who met with whom, and when, are automatically gathered electronically?” and even (admittedly more self-serving) “How will journalists protect their sources?”. But ultimately these issues and their very real impacts are not going to go away merely because a properly informed debate doesn’t take place.

We need a much better public discussion about where these lines are drawn: what information is gathered, from whom, in what detail, where it is stored, how it is protected and how it is accessed. No computer system is 100% secure. There’s no such thing. Information will leak from the systems holding all this sensitive material. The odd rogue insider will occasionally — and inevitably — abuse their position: sources will be compromised, confidence undermined, sources of intelligence lost. Possibly far, far worse.

If these proposals do go ahead, the controls and democratically accountable oversight regimes put in place to counterbalance them must be robust and demonstrably independent. Those who abuse the system — and they will — must be brought promptly and publicly to trial, and those who are inadvertently exposed — police sources, journalist sources, MPs’ constituents, NHS and financial services whistleblowers — rigorously protected. Parliament needs the capability, commitment and power to ensure our (unwritten) constitution is not undermined by the drip, drip, drip of incremental responses to the fear of terrorist activities.

To answer my own opening question — yes, we have been here before. And I’m sure we’ll be here again. We’ve seen this well-meaning, but one-sided perspective in the past: that’s partly what my semi-dramatised 2006 blog when guilty men go free was all about. When proposals such as this are put in front of us, they need to be robustly assessed by a credible public challenge rooted in the wider reality of the way our country operates and our people live their lives — and not simplistically considered through a counter-terrorism lens, darkly.

Posted in future Britain, privacy, security, technology, technology policy, Uncategorized | 1 Comment

Happy 20th anniversary online government

It’s 20 years this month since the UK government first launched a website intended to provide a simplified, single point of access to information from across the public sector. I thought I’d add a little more detail — or at least, a few historic screenshots — to support my recent CIO column marking the anniversary.

The Government Information Service (GIS), hosted at open.gov.uk, launched in November 1994. It was intended that over 400 public sector organisations, including government departments, local authorities and police forces, would provide their information on the site, which received around 200,000 hits a day shortly after launch.

[Screenshot: the Government Information Service at open.gov.uk]

In July 1996, this summarised the state of play:

[Screenshot: the Government Information Service, 23 July 1996]

By mid-1997 it was approaching 2m requests a week.

[Screenshot: the Government Information Service, 1997]

In 1999, the “Portal Feasibility Study” (PDF) set out plans for a more comprehensive approach to delivering all government services online in one place. The portal element of this architecture was originally nicknamed “me.gov”: below are some early mockups from 2000 of how it was envisaged it might look.

[Mockup: “me.gov” portal, 2000]

[Mockup: a second “me.gov” portal design, 2000]

By the time of its launch, it had become “UKonline”. UKonline initially appeared as a beta site in November 2000, followed by a formal launch in February 2001.

[Screenshot: the UKonline homepage]

UKonline aimed to provide more integrated services, built around citizens’ “life episodes” (events that had meaning to them), rather than just projecting the departmentally-based silo services already in existence.

[Screenshot: UKonline “life episodes”]
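For those curious how a “life episode” differs from a departmental silo in practice, here is a tiny hypothetical sketch (the episodes, services and owners are invented for illustration, not taken from UKonline’s actual data): the citizen navigates by event, and the owning organisation becomes an internal detail.

```python
# Hypothetical sketch of the "life episode" idea: services are grouped
# around an event in a citizen's life rather than by the department that
# happens to own them. All names here are illustrative.

life_episodes = {
    "having a baby": [
        ("register the birth", "local register office"),
        ("claim child benefit", "benefits agency"),
        ("find childcare", "local authority"),
    ],
    "moving home": [
        ("update the electoral roll", "local authority"),
        ("pay stamp duty", "tax authority"),
    ],
}

def services_for(episode: str) -> None:
    # the citizen starts from the event; the owning silo is a detail
    for service, owner in life_episodes[episode]:
        print(f"{episode}: {service} (delivered by {owner})")

services_for("having a baby")
```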

The 1st March 2004 saw another rebrand and relaunch, this time as Directgov.

[Screenshot: the Directgov homepage]

In May 2011, Directgov (and its sister site, BusinessLink — dedicated to meeting the needs of UK business users) began to be superseded by GOV.UK, initially as an alpha.

[Screenshot: the GOV.UK alpha]

In October 2012, the site replaced Directgov and went fully operational as GOV.UK, celebrating its second birthday just last month.

[Screenshot: GOV.UK, October 2014]

I’ve collated below some stats on the usage of the site(s) in their various guises over the past 20 years — not helped by early stats relating to “hits” or “visits” and more recent measures relating to “unique visitors/users”. So don’t take this as the definitive or final comment on the growth of online government information and services, but as a partial snapshot at a moment in time … (and if any of you have additional interim dates and usage stats not shown, let me know and I’ll revise/improve the list).

  • 1994 — 200,000 hits a day
  • 1997 — 285,000 hits a day
  • 2004 — 172,257 unique visitors a day
  • 2012 — 1m unique visitors a day
  • 2014 — 1.4m unique visitors a day
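As a quick sanity check on these mixed measures (a sketch using only the figures quoted in this post), the 1997 daily and weekly numbers are at least consistent with each other, even though “hits” and “unique visitors” can’t be directly compared:

```python
# Quick sanity check using only figures quoted in this post. "Hits" and
# "unique visitors" measure different things and can't be compared directly,
# but the two 1997 figures should roughly agree with each other.

hits_per_day_1997 = 285_000
weekly = hits_per_day_1997 * 7
print(f"1997: {weekly:,} hits a week")  # 1,995,000, i.e. "approaching 2m a week"

# growth in unique visitors a day, 2004 to 2014
visitors_per_day = {2004: 172_257, 2012: 1_000_000, 2014: 1_400_000}
growth = visitors_per_day[2014] / visitors_per_day[2004]
print(f"2004 to 2014: unique daily visitors grew roughly {growth:.0f}x")
```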

Happy 20th anniversary!

[A more detailed narrative of the last 20 years of online government is provided in an earlier blog here]

Posted in IT, IT strategy, open government, public services, technology, technology policy | 1 Comment

random access memories

I’m often asked how I got into computing, in a generation when neither IT nor computer science was on the school curriculum. So I’ll try to fill in a few random gaps from some cobwebby parts of my memory …

It all started as a hobby. One of my earliest memories of using a computer was thanks to the North Kent Amateur Computer club and the host of hobbyists and enthusiasts that used to attend. It was the age of the kit computer, when the likes of the UK101 ruled supreme.

[Image: Compukit UK101 mainboard]

I think it was Thursday evenings after school when I’d turn up at some local venue — usually a spare school hall — to find a room full of enthusiastic, often bearded participants, all hunched over circuit boards and small TV sets acting as monitors.

From time to time one of them would pick up a soldering iron and make an adjustment to something, or unplug and restart the system or wallop a TV set to stop the picture jumping. These are the folks I’ve always thought of as “digital natives” — people who understand how to design and use digital technology — but that phrase seems to be (ab)used now merely to mean people who have grown up using digital devices as consumers.

I remember testing out some of the simple 3D green mazes and early scrolling games that tried to make it feel like I was speeding across a largely imagined landscape and into the distance. The members of the club were always unfailingly generous in letting me try out their latest hardware builds and hand-cranked software instead of merely dismissing me as another irritating spotty teenager from the local comprehensive avoiding homework with a much more interesting displacement activity.

Towards the end of my sixth form days, the first computer had turned up at the school — a Commodore PET.

[Image: Commodore PET]

Ever since, I’ve regarded this as what a “real” computer should be like — with a built-in monitor and keyboard, and its own sophisticated storage system (aka a cassette tape deck). We were able to book the PET after school to use for something like an hour at a time — just about enough time to load something from the tape, suss out how it worked and then hand it over to the next person. We seemed to spend most of our time standing around it and working out how to win at the text-based version of the Star Trek game.

I remember at home we also ended up owning one of the earliest video game consoles — a Magnavox Odyssey. My mum bought it for us secondhand (I’m not quite sure how we managed to persuade her to do that, particularly given how tight money was). It seemed great fun for its time, complete with a light rifle and a set of screen overlays that would be held in place on the TV screen courtesy of the static charge.

[Image: Magnavox Odyssey advertisement]

These imaginative static overlays turned our black and white TV into “colour” for the first time, with most of the games based on a variation of Pong. To change game, you removed one of the cards that came with the console, and inserted another one in its place. I remember using the rifle to take pot shots at the ball of light as it moved around the “haunted house” overlay, appearing momentarily at some of the “windows” cut into the static film. State of the art or what?

Like most of my early machines, sadly I’ve no idea what happened to the Odyssey over the years — lost in one of those many post-university moves between various rental properties in and around rundown parts of London.

My own first purchase was a Sinclair ZX-80, then the ZX-81, then the ZX-Spectrum.


Like many of my generation, I owe a lot to the pioneering spirit of Sir Clive Sinclair: thanks mate.

[Image: Sinclair ZX80]

At university, the Spectrum ended up in the communal room, used for gaming alongside our “artistic” pyramid built from dozens of empty beer cans assembled on the mantelpiece. The dusty two-bar electric fire that made this the only warm room in the house was at the end of a slightly melted extension cable which ran into the adjacent bedroom — the only room in the Stoke Newington house we all shared that had no electricity slot meter in it. (From what I remember, most of the house was wired via a spaghetti ball of cables back into that room and its free electricity supply …).

Initially, programming these early computers consisted either of copying in program listings from hobbyist magazines or of writing your own code. With the magazine listings there always seemed to be errors in the code, meaning it was necessary to buy the next month’s edition as well for the errata — unless I managed to work out for myself which lines had been missed or garbled in the meantime. Later came the use of cassette decks, a rather erratic and unpredictable way of loading programs — one that often failed only after 15 or 20 minutes or more, when the Play button on the cassette player would pop up and you’d realise the program hadn’t loaded.

Later I moved on to a Commodore 64, a BBC Model B and then a “proper” computer — the Apricot F10. My programming efforts, which had started with BASIC, experimented with everything from assembly language (on both Z80 and 6502 processors) to Fortran, Prolog and Pascal.

[Image: ACT Apricot]

When I started work, the IBM PC had yet to dominate — it was not unusual to find an office where almost everyone was working at a different, incompatible computer. On one desk would be an Apple II, on another a Commodore and on another an Apricot. Apricot was doing well in the UK at the time, particularly in the public sector — its machines ran faster and more effectively at a lower price than their American cousins. Not that it helped them much in fighting off the competition …

… to be continued (possibly) at some future point …

[With thanks to http://www.computinghistory.org.uk and www.rewindmuseum.com for the assortment of images used in this blog]

Posted in technology, Uncategorized | 1 Comment