I’ve written before about the need to better integrate technology and policymaking — in 360-degree policy making, policy making in the digital age, and many earlier pieces going back over several decades, such as my 2006 co-authored paper The New World of Government Work.

Governments have been keen to take advantage of digital, data and technology (DDaT). Yet compared to their original aspirations, the e-government and digital government movements have achieved little over the past three decades. Yes, technology has been used to move services online and make them slicker and more efficient, but usually within the existing policy and organisational silos of government.

Much of the focus has been on the front end, the presentation tier of government, polishing the usability of existing services and processes through incremental refinements to website design. DDaT has been little used to help inform, shape and improve policy itself, or to help redesign government to be more effective.

This failure to integrate technology and policymaking lies at the heart of the failure of DDaT to achieve its potential. DDaT’s true significance lies in helping gather and analyse data and needs, supporting the development of evidence-based programmes, modelling and measuring outcomes, and helping refine them through processes of continuous feedback and improvement. It’s essential to help us better understand how effective (not simply how efficient) policies are. It can help governments become more informed and more responsive.

Policymaking remains stuck in the past, slow and often based on political dogma and ideology, rather than evidence and efficacy. Gathering data and evidence is slow, partial and erratic, partly because of the lack of a consistent data infrastructure that would facilitate the gathering and analysis of relevant information.

Policymaking remains regimented and siloed within the narrow organisational and service structures of the pre-digital age, leaving policies on everything from welfare to education to housing to operate as if they’re somehow separate from each other, rather than interconnected. For all the welcome improvements at the front end of government, policymaking remains inefficient, incomplete, and contradictory.

Policymaking should adopt the evidence-based, continuous-improvement model of software development

The lack of the promised progress with open data and APIs (interfaces) is particularly significant to DDaT's failure to improve the policymaking process. Open data and open APIs would enable data to be drawn from across all relevant systems and processes, regardless of their owning department. Policymakers could gather evidence to tackle complex socio-economic and user needs rather than being constrained by the Victorian structure of governments' current institutions. Large-scale data sets could be navigated through the use of digital technologies such as artificial intelligence, helping policymakers to analyse, make sense of, and respond to complex policy challenges.
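To make this concrete, here is a minimal sketch of the idea: two departments publish open data through standardised APIs, and because both datasets use a shared area code, the data can be joined across departmental silos without either department knowing about the other. The endpoint names and figures are entirely hypothetical, and the API responses are mocked in-memory; a real implementation would fetch them over HTTP.

```python
# Hypothetical open-data responses from two separate departments.
# In practice these would come from standardised HTTP endpoints,
# e.g. a housing API and a health API (names invented for illustration).
housing = [
    {"area": "E09000001", "overcrowded_pct": 9.1},
    {"area": "E09000002", "overcrowded_pct": 14.7},
]
health = [
    {"area": "E09000001", "asthma_admissions_per_1000": 2.3},
    {"area": "E09000002", "asthma_admissions_per_1000": 4.8},
]

# A shared, standard area code is the join key: it lets evidence be
# assembled around a place or a need, not around a department.
by_area = {row["area"]: dict(row) for row in housing}
for row in health:
    by_area.setdefault(row["area"], {"area": row["area"]}).update(row)

# Policymakers can now see housing and health indicators side by side.
for area in sorted(by_area):
    stats = by_area[area]
    print(area, stats["overcrowded_pct"], stats["asthma_admissions_per_1000"])
```

The point of the sketch is the join key: once departments agree on standard identifiers and open interfaces, combining evidence across silos becomes a routine data operation rather than a cross-departmental negotiation.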

While designing policy, policymakers could work in a transparent way. They could use digital tools to improve participation and co-creation, trying ideas on a small scale, gathering continuous feedback to drive improvement, and designing truly integrated, and more effective, services. Over time, the existing organisational silos of government could be reformed and redesigned around these integrated services, providing a migration path towards more effective and efficient digital organisations and practices.

The e-government and digital government movement started back in the early 1990s. Years of trying to modernise government by starting at the front end, improving the presentation tier, have failed to bring the expected benefits. Governments need to return to earlier ideas: they should mandate the use and exposure of open data from their systems and make it available through open, standardised APIs. This is nothing new: governments have repeatedly recognised the value of data and the need for standard formats and interfaces. Where they have failed is in ensuring their delivery and implementation.

It’s time for governments to re-energise the delivery of open data and interfaces as the first tangible step towards more effective policymaking. They should prioritise the public data that’s likely to deliver the quickest benefits, establishing pilot projects, building the capabilities and skills both within government and in its supplier ecosystem, and doing so in the open to build momentum and trust. Starting small, showing practical and worthwhile results, and then scaling the work more widely and deeply across government will help build and sustain a delivery model that’s long been lacking.

As open data and interfaces become more ubiquitous across the public sector, policymaking will finally be positioned where it needs to be: centred on needs and users, rather than working within the narrow silo structures of government. Government will be able to identify how it can play a more effective role, delivering policies and outcomes that break down the current narrow domains and focus instead on the needs of those they are there to serve. Policy can become more joined-up and integrated across areas such as health, housing, education, and welfare.

It’s time to unshackle DDaT and bring it into the heart of policymaking. Time to realise its potential to radically improve the effectiveness of policymaking, the outcomes it achieves, and the quality of life for citizens.


Updated 11.05.2022 with the diagram of ‘policymaking should adopt the evidence-based, continuous-improvement model of software development’, taken from an interim illustration intended for my new book.