"Data is the most valuable asset on Earth," says Britanny Kaiser, the former director of Cambridge Analytica, in The Great Hack, a documentary on the work of the controversial and now-defunct political consultancy that specialised in the use of data. If this is true, why are planning departments, with all the data they hold and produce, in such financial difficulties? Because planners have not learnt to value and nurture their data.
The biggest single source of data that planning departments own is the evidence base for local plans. A typical local plan might have 50 supporting documents tapping into 400 different datasets, covering everything from heritage to viability and demographics. But unless you work in the planning policy team, are an inspector, or a sufficiently determined community activist, it’s unlikely that you will ever see the reams of data, pages of tables, and an atlas’s worth of maps contained in these documents.
To find them, you have to navigate through hundreds of broken web links, poorly named files, and reports segmented into multiple files. Not only is the original data hard to find, but it’s locked up inside PDFs that are impossible to search or analyse, making it incredibly hard for either human or machine to use.
Every local planning authority in the UK spends hundreds of thousands of pounds on their local plan evidence base. Not only is this a sunk cost to planning departments, which have to pay to update reports time after time, but the format this data is stored in means it can’t easily be shared with other departments, such as housing, health, education or transport.
For example, many of the datasets collected as part of a housing market assessment are the same as those which inform a community infrastructure levy, a strategic housing land availability assessment, or an infrastructure capacity assessment. But, bewilderingly, the information for these four studies is all procured separately. And any synergies or interdependencies between them are managed by hand, making the process slow and prone to errors and loss of fidelity.
What’s needed is for planning departments to hold all their spatial data in robust digital registers, which can grow over time, be easily shared, and used not just to support local plans but across departments. This would not just provide efficiency savings by reducing the cost of local plans, but also ensure everyone is working with the same figures and assumptions, and make it easier to build tools to access, interpret, and analyse the data.
These registers will allow local authorities to maximise the value of the data they generate, reduce the time it takes to produce local plans, and make them more transparent and understandable to citizens and developers. The data is there to be used. We just need to modernise a little to make use of it.
So next time you commission a study or report to support your local policies, make sure you ask for the raw data behind the research. Make sure you get this in a machine-readable format, such as a .csv file. You might not have the tools and infrastructure to do much with it today, but as is often said in technology circles: software ages like fish, but data ages like wine.
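To illustrate why the format matters: a table that is trapped in a PDF can only be read by eye, but the same table supplied as a .csv file can be queried in a few lines of code. The sketch below uses Python's standard-library csv module on a hypothetical extract from a land availability study; the column names, site references, and figures are invented for illustration, not drawn from any real dataset.

```python
import csv
import io

# Hypothetical extract from a housing land availability study,
# as it might look once supplied as raw CSV rather than a PDF table.
RAW_CSV = """site_ref,ward,area_ha,capacity_dwellings,status
SHLAA-001,Riverside,2.4,85,deliverable
SHLAA-002,Northgate,0.9,30,developable
SHLAA-003,Riverside,4.1,140,deliverable
"""

def total_capacity_by_ward(raw: str) -> dict:
    """Sum estimated dwelling capacity per ward - a one-line query
    on CSV data, but hours of retyping if the table is in a PDF."""
    totals = {}
    for row in csv.DictReader(io.StringIO(raw)):
        ward = row["ward"]
        totals[ward] = totals.get(ward, 0) + int(row["capacity_dwellings"])
    return totals

print(total_capacity_by_ward(RAW_CSV))
# {'Riverside': 225, 'Northgate': 30}
```

The same raw file could feed a capacity assessment, an infrastructure study, and a public-facing map without anyone re-entering a single number, which is the whole point of asking for the data rather than just the report.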