Digital forensics in the UK is in need of reform, says one expert, as the deadline to advise the government on computer evidence rules arrives.
According to Peter Sommer, professor of digital forensics at Birmingham City University, various issues threaten the reliability of digital evidence, from the software used to gather it to the manual methods that are deployed when platforms don't provide it themselves.
Sommer published his thoughts in a public response to the Ministry of Justice's (MoJ) call for views about how the admissibility of computer evidence is working in practice.
Issuing the call, Sarah Sackman KC MP, minister for courts and legal services, specifically cited the Post Office's infamous Horizon IT scandal as one of the main reasons for the review.
Up until the landmark conclusion of the long-running scandal, which was later dramatized on TV, the legal presumption was that computer evidence was reliable, as per a Law Commission paper published in 1997.
Sackman also highlighted that it has been over 20 years since the current principles of computer evidence-gathering were established, and that this alone is reason enough to revisit them.
For those among our readership who are interested in the full breadth and context of what digital evidence looks like in the UK at present, Sommer does a great job outlining it in his response [PDF], although the main forms are:
Digital communications, such as SMS
Communications within online platforms such as social media sites and email
Mobile phone extraction reports
All three are subject to the same two problematic processes: the methods used to extract data from a device, and the manual processing technicians must then carry out to make that extracted data presentable as an exhibit.
Because both of these depend on software Sommer described as "questionable," the overall reliability of computer evidence cannot be guaranteed.
Sommer argues that evidence is never binary – reliable or unreliable – but the reliability of software-extracted evidence is a concern primarily because nearly all software incorporates third-party libraries, which the developer doesn't scrutinize for their suitability for producing digital evidence.
Plus, some software allows investigators to run their own scripts or automations, which could lead to skewed or otherwise biased results if those scripts aren't written and applied in line with set standards.
In cases where investigators pull all the files from a suspect's smartphone, for example, the massive volume of raw data can't be presented to a lay jury because it would be too technical and too voluminous. Prosecutors take that data and convert it into a human-readable form.
So even if the data itself can be wholly trusted, the human intervention that takes place between the raw data being extracted and its delivery in court can influence how the evidence is portrayed to a jury, potentially weakening its reliability.
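To make the point concrete, here is a minimal, purely illustrative sketch of that processing step (a hypothetical Python script, not any real forensic tool) in which an investigator-chosen keyword list, rather than the raw extraction itself, determines which messages end up in the exhibit a jury sees:

```python
# Illustrative only: a hypothetical sketch of the kind of processing step
# described above, not any real forensic product. It takes raw message
# records (a hard-coded list stands in for an extraction report) and renders
# only those matching an investigator-chosen keyword list -- the keyword
# choice, not the raw data, decides what reaches the jury.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Message:
    timestamp: int   # Unix epoch seconds, as extraction tools commonly store it
    sender: str
    body: str

# Stand-in for thousands of rows pulled from a handset database.
RAW_EXTRACTION = [
    Message(1717960000, "+447700900001", "Meet at the usual place at 8"),
    Message(1717963600, "+447700900002", "Running late, traffic on the M6"),
    Message(1717967200, "+447700900001", "Bring the paperwork"),
]

def build_exhibit(messages, keywords):
    """Render messages containing any keyword as human-readable exhibit lines."""
    lines = []
    for m in messages:
        if any(k.lower() in m.body.lower() for k in keywords):
            when = datetime.fromtimestamp(m.timestamp, tz=timezone.utc)
            lines.append(f"{when:%Y-%m-%d %H:%M} UTC  {m.sender}: {m.body}")
    return "\n".join(lines)

# Two analysts briefed differently could reasonably pick different keyword
# lists and hand the court two very different pictures of the same raw data.
print(build_exhibit(RAW_EXTRACTION, keywords=["meet", "paperwork"]))
```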
Research from 2021 concluded that digital forensics practitioners would find more or less evidence on devices depending on the way in which they were briefed about the case. The study highlighted how unintentional bias can slip into an evidence-gathering process that is legally presumed to be substantially reliable.
There are additional complications to consider beyond the software argument, which forms the backbone of Sommer's submission. When evidence derives from an online platform, for example, which may or may not be helpful in handing over data for a case, police investigators often resort to downloading data manually and screengrabbing what they need – processes that are neither formalized nor governed by principles to ensure their reliability.
This, coupled with cases of significant faults in single systems such as Horizon, poses a credible threat to the reliability of evidence, but the lack of standards governing the software used to collect evidence from various devices is the professor's biggest concern.
Sommer told The Register: "My main concern was that the MoJ shouldn't just be looking at the 'big computer produces misleading results' scenario à la Post Office Horizon but at all the other sources and forms of digital evidence such as smartphones, vehicular forensics, IoT forensics etc. – and the extent to which software is used by law enforcement to analyze data and then produce vivid exhibits for trial use."
He said in the submission that the software being used to extract data from less-mainstream devices such as internet-connected cars and other IoT devices is still being developed. Similarly, the software used for smartphones is more refined but undergoes frequent revisions, so the output could be different with each version.
Proposed solution
Whenever concerns are raised about the integrity of a process as important as digital evidence-gathering, many people's first instinct is to look to government for legislative answers.
After all, it was the Law Commission's 1997 recommendation that led to the repeal of section 69 of the Police and Criminal Evidence Act 1984 (PACE), effectively replacing the law's stipulation that prosecutors must prove the computer source of the evidence was working properly before it could be admitted with a legal presumption of machine reliability.
The decision was, in retrospect, a misguided one – arguably due to a misinterpretation of the advice the Law Commission received.
Sommer said a legislative approach could be taken, but would likely lead to disputes and other difficulties.
He said: "The problem with a statutory approach is that some material will then become inadmissible and others admissible and which will depend on definitions embedded in the law. The inevitable result will be disputes as to whether particular items are included or excluded. There may also be attempts at circumventing any operationally inconvenient definitions as we saw during this section 69 regime and the 'real evidence' exceptions.
"A further problem will be deciding who would have the competence to issue such a certificate. Not the least of the difficulties in locating such a person is the extent to which computer output may be the product of multiple data inputs from multiple external computer systems and software that has been compiled from third-party libraries."
The problem remains, though: if the processes involved in evidence gathering lack standardization, standards should ideally be introduced and enforced in some way.
Sommer's recommended approach, which received support from forensic investigators and a King's Counsel, involves the creation of a code of practice or set of procedural guidelines for the relevant existing legislation, such as PACE, which would be ultimately enforced by a judge.
He said this was "a much better approach" than a prescriptive set of definitions that allow or disallow specific types of evidence, because it places more value on the strength of the evidence than on whether the evidence itself is admissible.
Part of this could also be a questionnaire handed to prosecutors who would provide assurances as to the quality of the evidence they present in court. Sommer suggested some suitable questions, the general themes of which are summarized below:
Provenance: how the data was gathered and preserved without contamination or alteration, which tools were used to turn raw data into a presentable format, and the justification for using them (a sketch of one way to evidence this follows the list)
Where processes are standardized, highlight them
Where processes are non-standard, highlight and justify them. Provide assessments of their accuracy
Details of technicians appearing as expert witnesses
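One way the provenance answers above could be supported in practice (purely a sketch under assumed file layouts, not anything prescribed by Sommer's submission or by any code of practice) is a simple manifest that records a cryptographic hash of every extracted file along with the name and version of the tool that produced it, so that any later alteration of the material becomes detectable:

```python
# A minimal, hypothetical provenance manifest: hash every extracted file and
# record which tool/version produced it, so later alteration is detectable.
# An illustration of the idea only, not an official procedure.
import hashlib
import json
import pathlib
import sys
from datetime import datetime, timezone

def sha256_of(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(extraction_dir: str, tool: str, tool_version: str) -> dict:
    root = pathlib.Path(extraction_dir)
    return {
        "created_utc": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "tool_version": tool_version,  # output can differ between tool versions
        "files": [
            {"path": str(p.relative_to(root)), "sha256": sha256_of(p)}
            for p in sorted(root.rglob("*")) if p.is_file()
        ],
    }

if __name__ == "__main__":
    # Usage: python manifest.py /path/to/extraction "ExampleExtractor" "1.2.3"
    print(json.dumps(build_manifest(sys.argv[1], sys.argv[2], sys.argv[3]), indent=2))
```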
Beyond new rules and regulations, the MoJ could explore other avenues for improving the state of digital evidence in the UK. The Association of Chief Police Officers (ACPO) Good Practice Guide to Digital Evidence, for example, hasn't been updated since 2011, so that could be a good place to start.
Sommer also pointed out that putting the onus on judges to evaluate the reliability of digital evidence is an issue. Some may be up to it, he said, but others won't, and the Judicial Studies Board does not currently offer training courses on the matter.
Likewise, the police may need more training in this area, and the professor posited that there could be a need for a scheme to certify digital forensic experts before they appear as expert witnesses.
For organizations, the Forensic Readiness Program is an initiative that aims to ensure systems are set up so that digital forensics practitioners can easily extract the data they need should an incident occur, but awareness of it remains relatively low. ®