(This article presents the results of research by the SURVEILLE [Surveillance: Ethical Issues, Legal Limitations, and Efficiency] consortium, a European Union-funded multidisciplinary research effort examining surveillance technologies.)

Electronic mass surveillance – including the mass trawling of both metadata and content by the US National Security Agency – drastically fails to strike the balance between security and privacy that American officials and other proponents of surveillance insist is being maintained.

We arrived at this conclusion by subjecting a wide range of surveillance technologies to three separate assessments by three parallel expert teams of engineers, ethicists, and lawyers. Each team assessed the surveillance technologies from its own perspective: the ethical issues they raise; the legal constraints on their use – or those that should exist – on the basis of privacy and other fundamental rights; and, finally, their technical usability and cost-efficiency. This work was presented to and commented upon by two end-user panels, one consisting of law enforcement officials and the other of representatives of cities and municipalities. The various surveillance techniques and technologies were assessed within a scenario simulating real-life situations in which surveillance has been used. The results of this research were published in the SURVEILLE paper, Assessing Surveillance in the Context of Preventing a Terrorist Act, available here.

Our research found that electronic mass surveillance performed poorly in terms of practical usability, ethical acceptability, and the protection of privacy rights, whereas traditional (non-technological) surveillance or strictly targeted electronic surveillance might have a chance of striking a “balance.” The novelty of our work lies in demonstrating this through semi-quantification and numerical scores.

Electronic mass surveillance technologies were assessed in a terrorism prevention scenario in which six different surveillance methods were used to try to detect a terrorist act possibly in preparation.

Several of the six surveillance techniques assessed were closely modeled on what we know about NSA surveillance from Edward Snowden’s revelations and other sources. The scenario started with the splitting of a submarine fiber-optic communications cable arriving in a country, in order to collect all data passing through it, with separate retention rules for content and metadata. This mass of data was then searched with Phantom Viewer software. Through that process, a select group of targets was identified and their communications subjected to social network analysis, to identify their contacts and to allow human analysts to assess which individuals might be actual suspects in an evolving terrorist plot. At this point, our scenario narrowed its focus to one suspected individual. The authorities then used non-technological surveillance methods against him: opening his luggage in search of components of explosives and observing him directly with a surveillance team. They also installed Finspy technology on Internet café computers the suspect was likely to use, in order to conduct real-time targeted surveillance of all his online activities.

Our three expert teams then ranked the technologies and techniques according to their usability, the ethical risks they posed, and their intrusiveness upon citizens’ fundamental privacy rights.

The three assessments by the expert teams produced the following matrix where their findings are ranked (this matrix can also be found on page 11 of SURVEILLE’s research paper):

[Assessment matrix of the three expert teams’ findings. Image: SURVEILLE]

Ethical risk ranking: green = low ethical risk, amber = intermediate ethical risk, red = severe ethical risk.

Usability ranking: 0 = least technically usable, 10 = most technically usable.

Fundamental rights intrusion ranking: 0 = no rights intrusion, 16 = most serious rights intrusion.

(Click here for a brief explanation of how the scoring system was developed.)
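
To make the three scales concrete, here is a minimal sketch of how one row of the matrix could be represented in code. This is our own illustration, not tooling from the SURVEILLE project; the class and field names are assumptions, and the example record uses the scores reported below for social network analysis (usability eight, intrusion eight, amber ethical rating).

```python
from dataclasses import dataclass
from enum import Enum


class EthicalRisk(Enum):
    """Traffic-light ranking used by the ethics team."""
    GREEN = "low ethical risk"
    AMBER = "intermediate ethical risk"
    RED = "severe ethical risk"


@dataclass
class Assessment:
    """One row of the matrix: a surveillance method scored by the three teams."""
    method: str
    ethical_risk: EthicalRisk
    usability: int         # 0 = least technically usable .. 10 = most
    rights_intrusion: int  # 0 = no rights intrusion .. 16 = most serious

    def __post_init__(self) -> None:
        # Reject scores that fall outside the scales defined above.
        if not 0 <= self.usability <= 10:
            raise ValueError("usability must be between 0 and 10")
        if not 0 <= self.rights_intrusion <= 16:
            raise ValueError("rights_intrusion must be between 0 and 16")


# Example: the scores this article reports for social network analysis.
sna = Assessment("social network analysis", EthicalRisk.AMBER, 8, 8)
```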

Our technology assessment expert team gave the highest usability scores to the traditional non-technological methods of surveillance and to social network analysis when focused on a narrowed-down group of persons. Cable splitting and Phantom Viewer, representing NSA methods of electronic mass surveillance, produced mediocre usability scores (five out of 10 in both cases). Finspy, the remaining targeted method of electronic surveillance, also produced a low usability score due to various weaknesses in the technology, including dependency on a private provider.

The assessments of the technology experts coincided with those of the ethicists: what worked best in terms of usability also raised the fewest ethical concerns. The gravest ethical concerns arose with the same three methods of electronic surveillance that received the lowest usability scores.

The legal team’s assessments largely coincided with the consensus of the ethicists and technologists. The lawyers found that the three methods of electronic surveillance that received low usability scores and “red” ethical alerts also produced the maximum score of 16 for privacy intrusion. This was due to multiple factors, including the high degree of privacy intrusion against bystanders not suspected of any wrongdoing that we found in all three cases. (We refer to this as “third-party intrusion.”) Meanwhile, the two traditional surveillance techniques received very low intrusion scores (three and four), and the one form of electronic surveillance found best in terms of usability (social network analysis) received a high, but perhaps not intolerable, intrusion score of eight.

The scoring approach applied in SURVEILLE primarily represents an effort to combine different disciplines and their specific expertise in the assessment of surveillance technologies. The resulting numerical scores can, however, be used to illustrate how the respective areas of expertise can be brought together into a common discussion. A high usability score (e.g., eight in our example) signifies that a technology is “good” in the sense that it is capable of producing better security in a cost-efficient manner. A privacy intrusion score above 10 indicates that there is no justification for the use of a particular technology, as the negative human rights impact would be too high. Privacy intrusion scores that are high but nevertheless below 10 represent the “hard cases” in a discussion of a possible balance between privacy and security.
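
As a purely illustrative sketch (not part of the SURVEILLE methodology), the threshold logic just described can be expressed in a few lines of code. The function name, the band labels, and the assumption that “high” intrusion begins at eight are our own.

```python
def intrusion_band(intrusion: int) -> str:
    """Classify a privacy intrusion score (0-16) using the thresholds
    described above. The cut-off of 8 for 'high but below 10' is an
    assumption made for illustration; the paper does not fix it."""
    if not 0 <= intrusion <= 16:
        raise ValueError("intrusion score must be between 0 and 16")
    if intrusion > 10:
        return "unjustifiable: negative human rights impact too high"
    if intrusion >= 8:
        return "hard case: a privacy/security balance must be argued"
    return "candidate for justified use, subject to the ethical risk rating"


# Applied to the scores reported in this article:
print(intrusion_band(16))  # the three mass surveillance methods -> unjustifiable
print(intrusion_band(8))   # social network analysis -> hard case
print(intrusion_band(7))   # the hypothesized privacy-by-design improvement below
```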

For example, targeted social network analysis produced the identical score of eight for both usability and privacy intrusion. In such a situation, it would be natural to seek ways to mitigate the privacy intrusion by improving the technology or its use, including by reducing any third-party intrusion. In our model, in-built privacy-by-design features are rewarded twice, as they result in both a higher usability score and a lower privacy intrusion score. It could very well be imagined that, through small modifications to this surveillance method, the usability score would go up to nine and the privacy intrusion score down to seven, demonstrating a better case for a proper “balance” being struck. The amber warning light from our ethicists would nevertheless suggest that we should proceed with caution.

Our legal team conducted separate intrusion assessments in relation to the right to privacy and the right to the protection of personal data, which is often seen as just one dimension of privacy. In the exercise reported here, this distinction merely affected the scoring of the traditional (non-technological) methods of surveillance, but in our earlier analysis of surveillance in a different context, there was much more differentiation between the two sets of scores. Our analysis suggests that, when conscientiously extended to cover data protection issues, privacy intrusion also provides a good proxy for assessing surveillance’s impact upon other human rights, such as freedom of expression, freedom of movement, and freedom of association. Those impacts are real but, in our view, will be captured by a proper assessment focusing on privacy rights and third-party intrusion.

An earlier exercise looked into a wide range of surveillance technologies used to investigate cross-border organized crime. Its results were reported in SURVEILLE’s Matrix of Surveillance Technologies.

A third round of assessments relating to the use of closed circuit TV and other surveillance technologies in an urban security context is underway and will be reported in due course.

SURVEILLE’s take on the facts of NSA mass surveillance, in light of Edward Snowden’s revelations and other sources, is presented in another SURVEILLE paper: Mass Surveillance by the National Security Agency (NSA) of the United States of America.