https://civilservice.blog.gov.uk/2016/06/27/5-ways-we-are-putting-data-in-the-driving-seat/

5 ways we are putting data in the driving seat

Richard Heaton, MoJ Permanent Secretary

We want the Ministry of Justice to be a data-driven department. It’s part of a plan to make the department smaller, simpler and smarter.

But nowadays everyone’s talking about data. So, what do I actually mean by being ‘data-driven’? Here are five things.

  1. Open data and performance

First, we want data and evidence to be the engine for reforms to prisons and courts. The Secretary of State for Justice speaks of the need to create a self-improving and accountable system with sophisticated performance measurement. That would mean visible impact indicators – generating real-time, automated information for all to see.

And that in turn requires a positive attitude to transparency and openness, and a sensible relationship with risk. Neither our performance nor our data will ever be perfect or complete – what is? But the more we publish, and the more we listen to users and to critics, the better we will become. So, expect to see the MoJ releasing more data (to open data standards) and publishing our own performance tools, too.

  2. Unblocking data flows in the justice system

Data sharing has long been a contested subject. Rightly so; it’s important that the rules properly balance privacy and public good. But in a complex network like criminal justice, the whole system could be quicker, more accurate, fairer for victims, and certainly more efficient, if information flowed more easily. Basic stuff like charging decisions, convictions, sentences. Specialised information like offender risk assessments. Data files like video from body-worn cameras.

Sometimes, this data flows smoothly between the many different bodies involved. Frequently, it doesn’t. The same piece of information often gets recorded more than once, on different systems, by different organisations, in different ways. Or it’s passed on manually, mistranslated or corrupted. Most of it is personal data – so scrupulous adherence to data protection law is essential. But across the criminal justice system, we need a fresh approach; we need to recognise the importance of free-flowing data and act on that insight.

So, we’ve been talking – with police forces, government departments, CPS – about what the principles of data stewardship should be. We are testing the idea that there should be a duty on the creator of data to look after it on behalf of the entire system: a principle that it must be of high quality, that it must be accessible to others, and that no charge should be made for access to it. The starting point is not a prescription of technical standards (though they are important), but a simple agreement on terms of trade. Then we can build new platforms with confidence; and we can work out which current data flows are blocked and need attention, and for what reason.

  3. Data good, evidence better

There’s lots of research, population and performance data, and longitudinal studies in the justice field. Ideally, all that knowledge ought to be available to practitioners and decision-makers. But the activity tends to be disconnected, with no central focus and no single collection of evidence on themes such as race, or domestic violence. Some parts of the sector are well served; some issues are well evidenced; others are not. We are determined to bring the ‘what works’ approach to the whole system, so that we can generate scale, enhance existing activity, fill gaps, and allow more decisions to be better informed.

And as an occasional lawyer, I wonder whether the legal profession also needs to understand better where to find and how to use these new mines of information. Courtroom and legal training are geared towards assessing the relative strengths of competing accounts; the law does not naturally turn to population data or system-wide evidence to determine what intervention is likely to be most suitable and just. And when it does, the data is heavily mediated by expert witnesses; it is their opinion that carries the day. Would justice be better served if courts had better access to evidence of what works – in family cases, for example?

  4. Predictive analytics: a no-go for government?

Insurance companies harness data from activity trackers and wearable tech, and can offer reduced premiums based on your exercise routine. Genetic testing can, at least in theory, determine your life insurance premiums. And car insurance can be influenced not just by the car you drive and where you live, but also by your driving style, as recorded by a black box on your dashboard. Advertisers rely on analytics from tech companies to create directly targeted campaigns. Predictive analytics is not just big data – it’s increasingly big business.

I have no doubt that the Civil Service needs to catch up. These techniques will help us find better ways of delivering public services, and of shaping policy advice. Algorithms applied to large datasets are going to be useful for operations – to plan flood defences, for example. Comparing individual data against population data will help managers predict and prevent patterns of infection in hospitals, or incidents of violence or self-harm in prisons. You will all be able to think of similar examples.
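To make the idea of comparing individual data against population data concrete, here is a minimal sketch using invented, hypothetical numbers (weekly incident counts for a prison wing – not real MoJ data or any real MoJ system). It simply flags a unit whose recent count sits unusually far above the population baseline, using a z-score threshold:

```python
import statistics

# Hypothetical weekly incident counts across comparable prison wings
# (invented data for illustration only).
population_counts = [2, 3, 1, 4, 2, 3, 2, 5, 3, 2]

baseline_mean = statistics.mean(population_counts)
baseline_stdev = statistics.stdev(population_counts)

def flag_anomaly(count, threshold=2.0):
    """Return True if `count` is more than `threshold` standard
    deviations above the population baseline."""
    z_score = (count - baseline_mean) / baseline_stdev
    return z_score > threshold

print(flag_anomaly(9))  # well above the baseline -> True
print(flag_anomaly(3))  # within the normal range -> False
```

A real system would of course need far more care – seasonality, small-sample effects, and the ethical questions the post raises – but the underlying comparison is often this simple.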

Could we go further, and replace human decisions about people’s lives with machine learning and predictive analysis? Perhaps the better question is, how far should we go? Getting the ethical framework right is as important as the technical capability; and nowhere is that ethical framework more important than in justice.

But even where we continue to rely on human judgements, we need to be confident enough to know when it is safe to allow data to inform those decisions. We need to know when a predicted pattern is good enough to rely upon, and for what purposes. We need to be clear, ethically, about when predictions are and are not a legitimate basis for action. And we need to be aware that artificial intelligence is capable of eliminating many errors, but might equally reinforce bias.

  5. Being open to open data

At the end of last year I asked Sir Michael Barber to chair a new Data, Evidence and Science Board for the MoJ. It has a range of external experts – people like Ben Goldacre, Gavin Starks, Betsy Stanko, Peter Neyroud and Jenn Rubin – alongside data colleagues from MoJ and GDS. They look at how we go about things and challenge us to go further, or faster, or in a different direction. The board’s meetings are stimulating and challenging sessions, and if colleagues elsewhere in government would like to try something similar, I’d be happy to chat about what it’s taught us.

3 comments

  1. Comment by Graham Parsons posted on

    There is some really good stuff here, but point 2 misses something. "We’ve been talking – with police forces, government departments, CPS – about what the principles of data stewardship should be" – what about the bar and solicitors, who play a hugely important part in the CJS? Without their involvement and agreement, this initiative will stall as previous attempts have. It will not be acceptable behaviour to enforce standards of stewardship or, worse still, to ignore these parties altogether.

  2. Comment by Graham Meaden posted on

    I am surprised at the point: Predictive analytics: a no-go for government?

    It seems government departments still do not share their knowledge and expertise.

    When I worked at HMRC ten years ago, the approach to driving improvements in compliance was to establish a data-driven risk and response management capability. The ability to use retrospective analysis to determine risk profiles, and to apply those in a predictive way to determine the best (ROI) response to a known or emerging risk, was fundamental to a) efficiency savings, b) the effectiveness of compliance, and c) reducing the tax gap.

  3. Comment by Mike Smith posted on

    We should have been doing this years ago but it is typical of the Civil Service that we still don't have the tools to get anywhere near meeting these ambitions. My job is analysing large amounts of data - though much of the data cannot be extracted from ancient databases or can only be extracted at the cost of a great deal of staff time or by paying an external 'service provider' a fortune to do so. The tools at my disposal? An obsolete PC, Windows XP, IE7 and Office 2003 - though bizarrely I'm not allowed MS Access - and hopelessly inadequate server capacity. What should take minutes takes hours, what should take hours takes days.
    My own Department is introducing a common IT platform and guess what? There isn't enough money to ensure adequate data extraction and analysis and anyway it's not regarded as high priority.
    So what you have to say Richard is welcome - but it isn't going to happen to any great degree without investing in the tools to do it.

    P.S. Great to see Ben Goldacre involved though. 🙂