(A version of this article was also published at The Conversation.)

You wouldn’t think that academic computer science courses could be classified as an export of military technology.

But unfortunately, under recently passed laws, there is a real possibility that innocuous educational and research activities could fall foul of Australian defence export control laws.

Under these laws, despite recent amendments, such “supplies of technology” — and possibly a wide range of other benign activities — come under a censorship regime involving criminal penalties of up to 10 years’ imprisonment.

The Defence and Strategic Goods List

How could this be?

The story begins with the Australian government’s list of things it considers important to national defence and security. It’s called the Defence and Strategic Goods List (DSGL). Goods on this list are tightly controlled.

Regulation of military weapons is not a particularly controversial idea. But the DSGL covers much more than munitions. It includes many “dual use” goods – goods with both military and civilian uses – including for instance substantial sections on chemicals, electronics, and telecommunications.

Disturbingly, the DSGL veers wildly in the direction of over-classification, covering activities that are completely unrelated to military or intelligence applications. To illustrate, I will focus on the university sector, and one area of interest to mathematicians like myself — encryption — which raises these issues particularly acutely. But similar considerations apply to a wide range of subject material, and to commerce, industry and government.

Encryption: An essential tool for privacy

Encryption is the process of encoding a message, so that it can be sent privately; decryption is the process of decoding it, so that it can be read. Encryption and decryption are two aspects of cryptography, the study of secure communication.
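To make the two operations concrete, here is a toy sketch of my own (in Python): a one-time XOR keystream, under which decryption is literally the same operation as encryption. This is an illustration of the concept only, not a real cipher; genuine systems use vetted algorithms such as AES.

```python
# Toy illustration only: a one-time XOR keystream. Real systems use
# vetted algorithms such as AES; this sketch just shows that
# encryption and decryption are inverse operations.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return encrypt(ciphertext, key)

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # a random key as long as the message
ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message  # the round trip recovers the message
```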

As with many technologies subject to “dual use” regulation, the first question is whether encryption should be covered at all.

Once the preserve of spies and governments, encryption algorithms have now become an essential part of modern life. We use them almost every time we go online. Encryption is used routinely by consumers to guard against identity theft; by businesses to ensure the security of transactions; by hospitals to ensure the privacy of medical records; and much more. Given that email has about as much security as a postcard, encryption is the electronic equivalent of an envelope.

Encryption is perhaps “dual use” in the narrow sense that it is useful to military/intelligence agencies and civilians alike; but so are other “dual use” technologies like cars.

Moreover, while States certainly spy on each other, essentially everyone with an internet connection is known to be spied on. Since the Snowden revelations — and much earlier for those who were paying attention — we know about mass surveillance by the NSA, along with its Five Eyes partners, which include Australia.

While States have no right to privacy — this is the whole point of Freedom of Information laws — an individual’s right to privacy is a fundamental human right. And in today’s world, encryption is essential for citizens to safeguard this human right. Strict control of encryption as dual-use technology, then, would not only be a misuse of State power, but the curtailment of a fundamental freedom.

How the DSGL covers encryption

Nonetheless, let’s assume for the purposes of argument that there is a justification for regarding at least some aspects of cryptography as “dual use”. (Let’s also put aside the efforts of government, stretching back over decades now, to weaken cryptographic standards and harass researchers.)

The DSGL contains detailed technical specifications covering encryption. Very roughly, it covers encryption above a certain “strength” level, as measured by technical parameters such as “key length” or “field size”.

The practical question is how high the bar is set: how powerful must encryption be, in order to be classified as “dual use”?

The bar is set low. For instance, software engineers debate whether they should use 2048 or 4096 bits for the RSA algorithm, but the DSGL classifies anything over 512 bits as “dual-use”. It’s probably more accurate to say that the only cryptography not covered by the DSGL is cryptography so weak that it would be foolish to use.
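For a sense of scale, here is a hedged sketch (the constant and function names are mine; the actual DSGL entry has more detailed per-algorithm rules) of what that threshold means for RSA, whose key “length” is the bit length of its modulus:

```python
# Hedged sketch: what the 512-bit figure means for RSA. An RSA key's
# "length" is the bit length of its modulus n. The threshold constant
# reflects the figure quoted above; the real DSGL entry has more
# detailed per-algorithm rules.
DSGL_THRESHOLD_BITS = 512

def rsa_key_bits(n: int) -> int:
    # bit_length() gives the number of bits needed to represent n.
    return n.bit_length()

# A stand-in 2048-bit modulus (not a real key, just a number of that size):
n = 2 ** 2047 + 1
assert rsa_key_bits(n) == 2048
assert rsa_key_bits(n) > DSGL_THRESHOLD_BITS  # comfortably "dual use"

# Even a 513-bit modulus would already be over the line:
assert (2 ** 512).bit_length() > DSGL_THRESHOLD_BITS
```

The everyday key sizes engineers actually debate sit four to eight times above the listed figure.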

Moreover, the DSGL doesn’t just cover encryption software: it also covers systems, electronics and equipment used to implement, develop, produce or test it.

In short, the DSGL casts an extremely wide net, potentially catching open source privacy software, information security research and education, and the entire computer security industry. This is typical of its approach.

Most ridiculous, however, are some badly flawed technicalities. As I have argued elsewhere, the specifications are so poorly written that they potentially include a little algorithm you learned at primary school called division. If so, then division has become a weapon, and your calculator (or smartphone, or computer, or any electronic device) is a delivery system for it.
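To make the point concrete, here is that primary-school algorithm itself, sketched in Python: schoolbook long division, one quotient digit at a time, using nothing fancier than repeated subtraction. (Whether this really falls within the DSGL’s wording is, as noted, my argument elsewhere, not settled law.)

```python
# The "little algorithm you learned at primary school": schoolbook
# long division, one quotient digit at a time, using only repeated
# subtraction.
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    quotient = 0
    remainder = 0
    for digit in str(dividend):
        # "Bring down" the next digit of the dividend.
        remainder = remainder * 10 + int(digit)
        # Count how many times the divisor fits, by repeated subtraction.
        q_digit = 0
        while remainder >= divisor:
            remainder -= divisor
            q_digit += 1
        quotient = quotient * 10 + q_digit
    return quotient, remainder

assert long_division(1234, 7) == (176, 2)  # since 7 * 176 + 2 == 1234
```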

These issues are not unique to Australia: the DSGL encryption provisions are copied almost verbatim from the Wassenaar Arrangement, an international arms control agreement. What is unique to Australia is the harshness of the law relating to the list.

Criminal offences for research and teaching?

The Australian Defence Trade Controls Act (DTCA) regulates the list, and enacts a censorship regime with severe criminal penalties.

The DTCA prohibits the “supply” of DSGL technology to anyone outside Australia without a permit. The “supply” need not involve money, and can consist of merely providing access to technology. The Act also prohibits the “publication” of DSGL technology, but after recent amendments this prohibition applies to only half the DSGL: munitions, not dual-use technologies.

What is a “supply”? The law does not define the word precisely, but the Department of Defence seems to think that merely explaining an algorithm would be an “intangible supply”. If so, then surely teaching DSGL material, or collaborating on research about it, would be covered.

University education is a thoroughly international and online affair — not to mention research — so any such supply, on any DSGL topic, is likely to end up overseas on a regular basis.

Outside of academia, what about programmers working on international projects like Tor, providing free software so citizens can enjoy their privacy rights online? Network security professionals working with overseas counterparts? Indeed, the entire computer security industry?

Examples of innocuous, or even admirable, activities potentially criminalised by this law are easily multiplied. Those undertaking such activities must seek government approval or face criminal charges — an outrageous attack on academic freedom, chilling legitimate enquiry, to say the least.

To be sure, there are exceptions in the law, which have been expanded under recent amendments. But they are patchy, uncertain and dangerously limited.

For instance, public domain material and “basic scientific research” are not regarded as DSGL technology. However, researchers by definition create new material not in the public domain; and “basic scientific research” is a narrow term which excludes research with practical objectives. Lecturers, admirably, often include new research in teaching material. In such circumstances none of these provisions will be of assistance.

Another exemption covers supplies of dual-use technology made “preparatory to publication”, apparently to protect researchers. But this exemption will provide little comfort to researchers aiming for applications or commercialisation; and none at all to educators or industry. A further exemption is made for oral supplies of DSGL technology, so if computer science lecturers can teach without writing (giving a whole new meaning to “off the books”!) they might be safe.

Unlike in the US, there is no exception for education; none for public interest material; and indeed, the Explanatory Memorandum makes clear that the government envisions universities seeking permits to teach students DSGL material – and, by implication, criminal charges if they do not.

On a rather different note, the DTCA specifically enables the Australian and US militaries to freely share technology.

Thus, an Australian professor emailing an international collaborator or international postgraduate student about a new applied cryptography idea, or explaining a new variant on a cryptographic algorithm on a blackboard in a recorded lecture viewed overseas — despite having nothing to do with military or intelligence applications — may expose herself to criminal liability. At the same time, munitions flow freely across the Pacific. Such is Australia’s military export control regime.

Now, there is nothing wrong in principle with government regulation of military technology. But when the net is cast as broadly as the DSGL — especially as with encryption — and the regulatory approach is censorship with criminal penalties — as with the DTCA’s permit regime — then the result is a vast overreach. Even if the Department of Defence did not exercise its censorship powers, the mere possibility is enough for a chilling effect stifling the free flow of ideas and progress.

The DTCA was passed in 2012, with the criminal offences scheduled to come into effect in May 2015. Thankfully, emergency amendments in April 2015 have provided some reprieve.

Despite those amendments, the laws remain paranoid. The DSGL vastly over-classifies technologies as dual-use, including essentially all sensible uses of encryption. The DTCA potentially criminalises an enormous range of legitimate research and development activity as a supply of dual-use technology, dangerously attacking academic freedom — and freedom in general — in the process.

This story illustrates just one of many ways in which basic freedoms are being eroded in the name of national security.

Unless further changes are made, criminal penalties of up to 10 years’ imprisonment will come into effect on 2 April 2016.

The day after April Fools’ Day. Jokes should be over by then.
