Algorithmic Opacity as a Challenge to the Rights of the Defense

Algorithms are increasingly used in criminal proceedings for evidentiary purposes and to support decision-making. In a worrying trend, these tools remain shrouded in secrecy and opacity, making it impossible to understand how their specific output has been generated. Does opacity represent a threat to fair trial rights? Exploring the case of Exodus, a malware used for criminal investigations, provides a first glimpse of the deeper implications of the problem.

An article by Francesca Palmiotto

Exodus from legality

In March 2019, a group of researchers from Security without Borders and Motherboard discovered Exodus, a malware that had infected hundreds of people in Italy. A malware, simply put, is malicious software installed covertly on a device, where it can then conduct a wide range of invasive activities. Exodus was disguised within seemingly harmless apps available on Google Play and the Apple App Store. When a user installed one of these apps, Exodus could hack the phone and steal data from it. Once downloaded, the Android version was able to retrieve the browsing history, media exchanged through WhatsApp, and SMS messages; record the surroundings and phone calls through the microphone; take pictures with the camera; and extract the GPS position, calendar events, call logs, the Facebook contact list, and more. Exodus relied on highly invasive techniques that significantly reduced the overall security of the device, making it vulnerable to other hacking attacks. Furthermore, the software was never remotely disinfected by its operator. Italian authorities launched an investigation into eSurv, the company suspected of having developed the malware.

An illegal investigative tool

Exodus is a powerful malware that infects smartphones indiscriminately. This story is not new. But what if it was designed as a means of obtaining evidence for criminal investigations? That would make it particularly interesting from a legal perspective. According to a document published online by the Italian Police, eSurv won a tender for the development of a ‘passive and active interception system’ in 2017 (see Progressivo n° 615). However, it is still unclear whether this malware has actually been used by Italian law enforcement authorities. If confirmed, such use would amount to a severe violation of individual rights. Hacking practices for law enforcement are allowed, subject to different restrictions, in several Member States of the European Union, such as France, Germany, Poland and the UK. In 2017, Italy introduced a law regulating malware for criminal investigations and the technical requirements that must be respected during its development and use. Due to the invasive nature of this tool, specific provisions have been adopted. Firstly, judicial authorization is required to hack a specific device, and the request must meet certain ex ante conditions (e.g. restricting the use of hacking tools to investigations into crimes of a certain gravity; limiting the duration of hacking activities; preserving the security of the device). Accordingly, the malware must include a ‘target validation procedure’ that checks whether the infected device is legally authorized for interception. Moreover, only microphone activation through remote control is allowed. Once the operation is over, the malware must be removed from the device. Therefore, if a malware is designed for law enforcement purposes, it must meet these requirements. The relevant question is thus: does Exodus comply with the Italian provisions regulating hacking practices? Undoubtedly, it does not. This malware, in fact:

  • had no procedure to ensure it would infect only valid targets;
  • exposed the device to third party attackers;
  • allowed many prohibited functionalities;
  • was never removed from the device.
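To make the statutory requirement concrete, the ‘target validation procedure’ described above can be illustrated with a minimal sketch. All names, identifiers and warrant fields here are hypothetical, for illustration only; no real interception system or its interface is being described. The point is simply that a compliant tool must check the infected device against a judicial authorization, and abort when that check fails:

```python
# Hypothetical sketch of a 'target validation procedure': interception may
# proceed only on a device covered by a judicial authorization that has not
# expired. All identifiers below are invented for illustration.

AUTHORIZED_TARGETS = {
    # device identifier -> warrant details (illustrative values)
    "IMEI-356938035643809": {"warrant_id": "W-2017-0615", "expires_day": 30},
}

def validate_target(device_imei: str, current_day: int) -> bool:
    """Return True only if the device is a judicially authorized target
    and the authorization's duration limit has not been exceeded."""
    warrant = AUTHORIZED_TARGETS.get(device_imei)
    if warrant is None:
        return False  # unknown device: must not be intercepted
    return current_day <= warrant["expires_day"]  # duration limit

def begin_interception(device_imei: str, current_day: int) -> str:
    """Gate every operation behind the validation check."""
    if not validate_target(device_imei, current_day):
        return "abort-and-self-remove"  # invalid target: do nothing, uninstall
    return "intercept-under-warrant"
```

Exodus, by contrast, performed no such check: it infected any device on which one of the carrier apps was installed, regardless of any authorization.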

The lack of supervision and oversight

Since the discovery of Exodus, the focus has been on the culpability of the company that developed the software, as well as on the responsibility of Google and Apple for the lack of effective filters in their app stores. Even though the role of tech companies remains essential, this event also raises other crucial questions: who is responsible for controlling the technical elements of software used for law enforcement purposes? What information is needed for the purpose of supervision? What are the mechanisms for providing transparency and accountability?

The right to challenge algorithm-based evidence

Narrowing down these issues to the context of criminal proceedings, it is necessary to ensure the rights of the defense. In particular, according to the case-law of the European Court of Human Rights on Article 6§3, the right to confrontation requires not only that defendants be in a position to challenge the probity and credibility of evidence against them, but also that they be able to test its truthfulness and reliability (Al-Khawaja and Tahery v UK [GC], ECHR 2011). In criminal investigations, malware is used as a means of obtaining digital evidence (i.e. algorithm-based evidence). Consequently, the quality of the evidence presented – in terms of accuracy, reliability and legality – depends strictly on the quality of the software and techniques used to acquire it. When evidence is gathered through technological means, the defense has the right to verify that the investigative operations were carried out in compliance with the relevant provisions and that no further manipulation of the collected data has occurred. In the case illustrated above, the defense should be in a position to confront the evidence collected through Exodus and to show that it was illegally obtained. Nevertheless, to do so, it is necessary to comprehend how the software works and how it was used.

The ‘knowledge problem’ of algorithmic opacity

In this field, however, legal professionals are confronted with a ‘knowledge problem’. The technological tools at issue are not only concealed in secrecy and opacity, but also lack specific procedural safeguards, which hinders any understanding of how their output has been generated. In this sense, opacity can stem from barriers both to accessing and to understanding the information required to comprehend how the output (i.e. the evidence) was generated. Access to information can be impeded by proprietary protection, state or corporate secrecy, or privacy rights. Understanding, in turn, can be hampered by a lack of technical literacy and by the lack of interpretability of the system (see Burrell 2016). In an expert interview conducted for the study “Legal Framework for Hacking by Law Enforcement”, Giovanni Ziccardi raised concerns about legal professionals’ de facto inability to challenge these instruments, owing to their lack of adequate knowledge. Without knowing the object of scrutiny, it is not possible to challenge its reliability, accuracy and legality. It is therefore the role of procedural justice to provide legal remedies that compensate for this knowledge gap (e.g. access to the source code, written answers about how the machine works, cross-examination of those who used the software, access to the documentation describing how the machine functions, testimony from the programmers of machines performing forensic analyses). In the absence of specific procedural safeguards for ex post oversight and supervision, the opacity surrounding this type of software represents an obstacle to, and a possible violation of, the rights of the defense.

Published under licence CC BY-NC-ND.

Written by Francesca Palmiotto

Francesca Palmiotto is a PhD researcher at the Law Department of the European University Institute in Florence. Her research focuses on the impact of technological change on criminal procedural law.