This week Bob Neill MP, Chair of the Justice Committee, launched the committee’s report on the government’s £1.1 billion court reform programme. “We understand that courts and tribunals are strained to breaking, with systems that ever more people are having to try to navigate for themselves. Court staff and the judiciary are trying hard to improve services in the face of underfunding and cuts. But we are concerned that a vulnerable person – a victim of crime, a woman seeking an order to protect her children, a person with learning difficulties – could be left trying to negotiate enough time at a library to file papers or tune in to an evidence hearing where they are trying to get justice”. The report is highly critical of aspects of the programme, including the lack of evidence of how the reforms affect access to justice and justice outcomes.
The way the government is testing these new services – such as online pleas and all-video remand hearings – has long troubled me. In this long read (originally written for the Public Law Project), I’ve tried to articulate why user testing does not test access to justice:
“Access to justice… means something more than being able to complete an online form and feel comfortable with the process. It requires the ability to engage, to participate, to be dealt with by fair procedures and to receive a substantively just outcome”.
The words of Dame Hazel Genn are as relevant to the court reform programme now as they were two years ago, when she gave the Birkenhead lecture. So how does the programme affect access to justice? My focus in this essay is on the online aspects of the programme – the use of online forms, online processes and video – and on the HMCTS development method: service design and agile design. The court reform programme aims to build a system around user needs and experiences. This ‘service design’ approach has been around for fifteen years and was adopted by the Government Digital Service four years ago.
The process should start with identifying a service or product which could be improved. Ethnographic and other research is then used to find out how users experience that service and what their views are. Next, the designers use their understanding of the users’ experience to change the service. They then test the adapted service with users and keep making “iterative” changes until the product is ready to launch. Conventional research is not part of the process.
Service design is fantastic for consumer services, but is the court reform programme adhering to its core principles, and should HMCTS be so reliant on it? The service designer should start with a totally open mind as to how a consumer problem might be resolved. But in the case of court reform, the solutions to the problem – how do we maintain or improve access to justice while improving efficiency in the court system? – had already been decided, at least in outline. HMCTS had settled on online processes, video hearings and court closures before the service design process started. This means the whole process works backwards.
I’ll give an example. HMCTS had a problem: they wanted to reduce the number of hearings in criminal courts, to reduce usage of court buildings and to save on the transport and supervision of defendants. All those detained by the police overnight have to be “produced” in court the next working day, and most detainees are transported in vans from the police station to the nearest magistrates’ court. Already in 2015, a few police stations had a video hearing room in their custody suite from which defendants would appear in the local magistrates’ court. HMCTS decided to use these as a starting point for testing all-video (virtual) police remand hearings, i.e. court hearings where no one is physically in the courtroom and all parties appear on separate video screens.
A good question to ask at the outset would have been: how do remand hearings work now, both when everyone is in the courtroom and when the defendant appears on video while everyone else is in the courtroom? What is the defendant’s experience, and what factors affect the outcome? There was no good research on this. Instead, the user testing process was designed around one possible solution – video remand hearings – and how to make them work. Early in 2018 I found out that HMCTS had started the service design process, and I put in a freedom of information request for the user experience research it had generated. I obtained a document about the “discovery phase”, in which the researchers had done interviews and focus groups with police, lawyers and court staff, but not with any defendants, the main users of the system.
No ethnographic research was done to follow a defendant’s journey from custody suite to video or real court, and then on to prison or home; and justice outcomes were not mentioned. Even the researchers seemed to acknowledge that “all video” remand hearings were not necessarily the right solution to the chaotic remand process they had witnessed: “This initial study has identified a number of substantial challenges and issues within remand hearings, when initial focus has been scoping for the implementation of a technology-based process…. In-house enabling projects such as Virtual Hearings are developing products that may not be suitable for remand hearings due to the nature of the domain. Therefore, process and business change or alternative solutions are required which may prove to create additional challenges to meet (e.g. [government digital service] assessments)”.
In fact the government are still trying to make all-video remand hearings work, and in this case they have switched from user testing to traditional academic research. We await the results of their evaluation of video hearings at Medway Court, which, incidentally, are not all-video. This research will monitor the effect on justice outcomes of putting defendants on video, an issue which the user research did not touch. The problem with user testing is that it has no code of rules. It does not abide by any of the principles of government social research, including the principle that all research should be published. It is not actually research. All the HMCTS user testing is done behind closed doors, without anyone knowing which users are recruited, what they are asked, what they test or how their testing is evaluated. We also don’t know whether any testing is done ethnographically – in a real home, with a real case, rather than in a lab.
An example of another criminal process which is almost impossible to find out about, but which was presumably subject to user testing, is the single justice procedure (SJP). This process actually pre-dates the reform programme, but it has been embraced by the programme and digitised. The single justice procedure enables a low-level criminal case (e.g. non-payment of a TV licence) in which the defendant pleads guilty to be dealt with on the papers, in a closed court, by one magistrate assisted by a legal adviser. More than half of all criminal cases go through this procedure. Defendants are sent a charge in the post and plead guilty either by filling in a paper form or by going online.
I have seen the online process only once, at an HMCTS stakeholder event. I was concerned that there was no explanation on the website of how the judgment on the level of fine (the standard sanction) would be reached, nor any information about the criminal record the defendant would gain. I was told the digital SJP had been successfully tested. I suppose people did go through the form in a lab, and HMCTS refined it until most users completed it within a certain timeframe. But were they effectively participating if they did not understand the full implications of the process? 73% of people say they were satisfied with the online SJP service. But what does this mean? That they felt the form was easy to fill in?
As a whole, the single justice procedure doesn’t seem to be meeting the needs of the majority of its “users”: 80% of those sent a charge in the post do not respond to it, either by snail mail or online. They are then judged guilty by default and given the maximum penalty available. There is no data on how many of those convicted pay the fine, or how many even know they have a criminal conviction. The user is supposed to be at the centre of the court reform process, but if most people don’t engage with the single justice procedure, users are not even in the process.
The service design process doesn’t monitor justice outcomes, since success is equated with usage and satisfaction. It is designed for consumer services, not justice. But without monitoring the impact of reform on justice outcomes, we have no means of judging the impact of digital processes on access to justice. When I read the blogs of HMCTS staff, it appears that two completely separate worlds are operating side by side: HMCTS are focussed on making the court process as easy and efficient as possible, while lawyers, defendants and plaintiffs value other principles too, some more highly than convenience.
A recent blog was on “applying principles of performance management to the justice system”. It started by referring to the most popular measurement – the time taken for cases to be processed – and went on to suggest other performance measures to monitor. These are really important, but surely there are even more important aspects of performance to measure, such as effective participation and the impact on fair trial rights. These measures are taken from the excellent Legal Education Foundation report on how to evaluate the court reform programme.
Of course there are some overlaps between the two sets of performance measures, but the HMCTS user research is definitely not measuring effective participation and impact on fair trial rights. The HMCTS blog points out that “adjourning hearings (moving the date of a hearing back) can be really disruptive for people” without acknowledging that adjournment can also lead to defendants spending longer in prison or to trials collapsing.
HMCTS are putting nearly all their eggs in the service design/user research basket. But they are subverting that method by starting with the solution and ignoring the tricky issues which can’t be user tested. Only by starting with a blank sheet, using open policy-making processes and commissioning both social research and user testing could the programme have truly answered the challenge: how do we modernise the justice system and improve access to justice while spending less on it?