Last month, we wrote a post raising the possibility that recent changes to the Wassenaar Arrangement could apply to “zero-days” and other computer exploits.

As we noted in our original post, support for these recent changes comes from privacy advocates concerned with keeping exploits and network surveillance tools out of the hands of repressive regimes, economic spies, and other bad actors.

In recent weeks, experts involved in the negotiations that led to these changes have stressed to us that the changes are intended to curb trade in the software systems used to disseminate and implement intrusion software, including large-scale commercial surveillance programs, and not to control the dissemination of malware, rootkits, or exploits that security researchers often use for legitimate purposes. For example, the drafters intended to curb export of surveillance software such as FinFisher. FinFisher’s parent company markets the tool to secret police agencies, telling them that the product takes advantage of security lapses in computer software update procedures to covertly install spyware on targets’ machines.
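To make the update-procedure lapse concrete, here is a minimal sketch of our own (the URL and file names are hypothetical): an updater that fetches code over plain HTTP and stores it without any signature check, so anyone who can intercept the traffic can substitute spyware for the genuine update.

    import urllib.request

    UPDATE_URL = "http://updates.example.com/latest_patch.bin"  # hypothetical, unencrypted endpoint

    def insecure_update():
        # Fetch the "update" over unauthenticated, unencrypted HTTP.
        payload = urllib.request.urlopen(UPDATE_URL).read()
        # No publisher signature is verified before the bytes are trusted, so a
        # network intermediary can substitute an arbitrary payload.
        with open("pending_update.bin", "wb") as f:
            f.write(payload)
        # The application would next load pending_update.bin as trusted code.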

The Wassenaar changes attempt to distinguish between exploits and surveillance software, the latter of which requires relatively large investments of human time and fairly extensive systems of supporting software to implement. The language of the changes tries to reflect this intent. The document provides a fairly broad definition of “Intrusion Software”:

“Software” specially designed or modified to avoid detection by ‘monitoring tools’, or to defeat ‘protective countermeasures’, of a computer or network-capable device, and performing any of the following:

a.  The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or
b.  The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
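To see what clause b. describes in practice, here is a toy Python sketch of our own (not drawn from the Arrangement): the program’s standard execution path, an ordinary function call, is rebound so that externally provided instructions run instead.

    def standard_behavior():
        print("normal execution path")

    # 'Externally provided instructions' as they might arrive from outside the
    # program (over a socket, from a file); shown as a string for brevity.
    external_code = 'print("externally provided instructions executed")'

    def hooked_behavior():
        exec(external_code)  # run the external instructions

    # Modification of the standard execution path: the original function is
    # replaced, so existing callers of standard_behavior() now execute
    # externally provided code.
    standard_behavior = hooked_behavior
    standard_behavior()

Note that this toy does nothing to avoid detection or defeat countermeasures, so it would not meet the definition’s opening requirement; the gap between everyday techniques like this and the defined term is part of what makes the boundary hard to locate.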

However, the agreement does not purport to control intrusion software but rather other software specially built to interoperate with intrusion software:

  • 4. A. 5. Systems, equipment, and components therefor, specially designed or modified for the generation, operation or delivery of, or communication with, “intrusion software”.  [Emphasis added]
  • 4. D. 4. “Software” specially designed or modified for the generation, operation or delivery of, or communication with, “intrusion software”.
  • 4. E. 1. c. “Technology” for the “development” of “intrusion software”.
  • “Software” specially designed or modified for the “development” or “production” of equipment or “software” specified by 4.A. or 4.D.
  • “Technology” according to the General Technology Note, for the “development”, “production” or “use” of equipment or “software” specified by 4.A. or 4.D.

In sum, the language emphasizes control of components for the generation, operation, or delivery of, or communication with, intrusion software, not the components of such software itself.
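What “communication with” intrusion software looks like in code is instructive. In this minimal sketch of our own (hypothetical host, port, and line-based protocol), nothing marks the client as specially designed for intrusion software rather than for any remote agent:

    import socket

    def send_command(host: str, port: int, command: str) -> bytes:
        # Send one command to remote software and return its reply. The
        # identical client code could be talking to an IT help-desk agent,
        # a deployed network sensor, or a covert implant.
        with socket.create_connection((host, port)) as conn:
            conn.sendall(command.encode() + b"\n")
            return conn.recv(4096)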

We credit the experts’ intent behind the changes, but our objections remain. The suggested language does not draw a clear line between things that should be regulated and things that should not. We question whether there is really a difference between a security auditor’s rootkit and her tool that deploys or communicates with the rootkit. Is the code that communicates with or installs the software part and parcel of the rootkit itself? The attempt to draw this line raises unanswered technical questions. When are lines of code “specially designed” for, say, installing intrusion software, as opposed to installing any software package? Is the controlled item the line of code that says “install rootkit.exe” or “install finfisher.exe”? If so, how useful is that regulation in achieving the desired goal of keeping surveillance software out of the hands of repressive regimes?
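The installer question is just as stark when written out. In the hypothetical deployment helper below (ours, with made-up file and directory names), the only thing distinguishing a security auditor’s test deployment from delivery of a surveillance tool is a file name passed in as data:

    import shutil
    import subprocess

    def deploy(package: str, target_dir: str) -> None:
        # Copy a package to the target directory and launch it. The code is
        # entirely generic: it "delivers" whatever payload it is handed.
        installed = shutil.copy(package, target_dir)
        subprocess.run([installed], check=True)

    deploy("rootkit.exe", r"C:\Tools")    # a security auditor's test payload
    deploy("finfisher.exe", r"C:\Tools")  # the tool the new controls target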

Understanding the intent behind the changes is important, but the new language does not clearly accomplish what its proponents intended. Adopting countries face a real challenge in crafting regulations that achieve the desired goal of regulating tools like FinFisher but not rootkits, vulnerability information, or auto-updaters. Without greater regulatory and technological clarity, and consensus about the purported distinction between the peripheral software needed to run intrusion software and the components of intrusion software themselves, security providers may be chilled, may incur added legal costs, and may face unintended liability.