
Export of cryptography from the United States

Export-restricted RSA encryption source code printed on a T-shirt made the T-shirt itself an export-restricted munition, as a freedom-of-speech protest against U.S. encryption export restrictions (back side).[1] Changes in the export law mean that it is no longer illegal to export this T-shirt from the U.S., or for U.S. citizens to show it to foreigners.

The export of cryptographic technology and devices from the United States was severely restricted by U.S. law until 1992, but was gradually eased until 2000; some restrictions still remain.

Since World War II, many governments, including the U.S. and its NATO allies, have regulated the export of cryptography for national security reasons, and, as late as 1992, cryptography was on the U.S. Munitions List as auxiliary military equipment.[2]

Due to the enormous impact of cryptanalysis in World War II, these governments saw the military value in denying current and potential enemies access to cryptographic systems. Since the U.S. and U.K. believed they had better cryptographic capabilities than others, their intelligence agencies tried to control all dissemination of the more effective crypto techniques. They also wished to monitor the diplomatic communications of other nations, including those emerging in the post-colonial period and whose position on Cold War issues was vital.[3]

The First Amendment made outlawing all use of cryptography inside the U.S. constitutionally untenable, but controlling access to U.S. developments by others was more practical: there were no constitutional impediments.

Accordingly, regulations were introduced as part of munitions controls which required licenses to export cryptographic methods (and even their description); the regulations established that cryptography beyond a certain strength (defined by algorithm and length of key) would not be licensed for export except on a case-by-case basis. This policy was also adopted elsewhere for various reasons.

The development and public release of the Data Encryption Standard (DES) and asymmetric-key techniques in the 1970s, the rise of the Internet, and the willingness of some to risk and resist prosecution eventually made this policy impossible to enforce, and by the late 1990s it was being relaxed in the U.S. and, to some extent (e.g., France), elsewhere. As late as 1997, NSA officials were concerned that the widespread use of strong encryption would frustrate their ability to provide SIGINT regarding foreign entities, including terrorist groups operating internationally. NSA officials anticipated that American encryption software, backed by an extensive infrastructure, was likely to become a standard for international communications once marketed.[4] In 1997, Louis Freeh, then Director of the FBI, said:

For law enforcement, framing the issue is simple. In this time of dazzling telecommunications and computer technology where information can have extraordinary value, the ready availability of robust encryption is essential. No one in law enforcement disputes that. Clearly, in today's world and more so in the future, the ability to encrypt both contemporaneous communications and stored data is a vital component of information security.

As is so often the case, however, there is another aspect to the encryption issue that if left unaddressed will have severe public safety and national security ramifications. Law enforcement is in unanimous agreement that the widespread use of robust non-key recovery encryption ultimately will devastate our ability to fight crime and prevent terrorism. Uncrackable encryption will allow drug lords, spies, terrorists and even violent gangs to communicate about their crimes and their conspiracies with impunity. We will lose one of the few remaining vulnerabilities of the worst criminals and terrorists upon which law enforcement depends to successfully investigate and often prevent the worst crimes.

For this reason, the law enforcement community is unanimous in calling for a balanced solution to this problem.[5]



Cold War era

In the early days of the Cold War, the U.S. and its allies developed an elaborate series of export control regulations designed to prevent a wide range of Western technology from falling into the hands of others, particularly the Eastern bloc. All export of technology classed as 'critical' required a license. CoCom was organized to coordinate Western export controls.

Two types of technology were protected: technology associated only with weapons of war ("munitions") and dual-use technology, which also had commercial applications. In the U.S., dual-use technology export was controlled by the Department of Commerce, while munitions were controlled by the State Department. Since in the immediate post-WWII period the market for cryptography was almost entirely military, encryption technology (techniques as well as equipment and, after computers became important, crypto software) was included as a Category XIII item in the United States Munitions List. The multinational control of the export of cryptography on the Western side of the Cold War divide was done via the mechanisms of CoCom.

By the 1960s, however, financial organizations were beginning to require strong commercial encryption in the rapidly growing field of wire money transfer. The U.S. government's introduction of the Data Encryption Standard in 1975 meant that commercial uses of high-quality encryption would become common, and serious problems of export control began to arise. Generally these were dealt with through case-by-case export license request proceedings brought by computer manufacturers, such as IBM, and by their large corporate customers.

PC era

Encryption export controls became a matter of public concern with the introduction of the personal computer. Phil Zimmermann's PGP cryptosystem and its distribution on the Internet in 1991 was the first major 'individual level' challenge to controls on export of cryptography. The growth of electronic commerce in the 1990s created additional pressure for reduced restrictions.

In 1992, a deal between NSA and the SPA made 40-bit RC2 and RC4 encryption easily exportable using a Commodity Jurisdiction (which transferred control from the State Department to the Commerce Department). At this stage Western governments had, in practice, a split personality when it came to encryption; policy was made by the military cryptanalysts, who were solely concerned with preventing their 'enemies' acquiring secrets, but that policy was then communicated to commerce by officials whose job was to support industry.

Shortly afterward, Netscape's SSL technology was widely adopted as a method for protecting credit card transactions using public key cryptography. Netscape developed two versions of its web browser. The "U.S. edition" supported full-size (typically 1024-bit or larger) RSA public keys in combination with full-size symmetric (secret) keys: 128-bit RC4 or 3DES in SSL 3.0 and TLS 1.0. The "International Edition" had its effective key lengths reduced to 512 bits and 40 bits respectively (RSA_EXPORT with 40-bit RC2 or RC4 in SSL 3.0 and TLS 1.0). Acquiring the "U.S. edition" turned out to be enough of a hassle that most computer users, even in the U.S., ended up with the "International" version,[6] whose weak 40-bit encryption could be broken in a matter of days using a single personal computer. A similar situation occurred with Lotus Notes for the same reasons.
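The gap between the two editions can be illustrated with back-of-the-envelope arithmetic. This is only a sketch: the trial-decryption rate below is an assumed round number for a late-1990s PC, not a measured benchmark.

```python
# Rough estimate of why the export-grade 40-bit keyspace was breakable
# "in a matter of days" on a single PC, while 128-bit keys were not.

def brute_force_days(key_bits: int, keys_per_second: float) -> float:
    """Worst-case time, in days, to try every key of the given length."""
    keyspace = 2 ** key_bits
    return keyspace / keys_per_second / 86_400  # 86,400 seconds per day

# Assumed rate: one million trial decryptions per second (an assumption
# for illustration, not a benchmark of any particular cipher or CPU).
rate = 1e6

print(f"40-bit:  {brute_force_days(40, rate):,.1f} days")   # ~12.7 days worst case
print(f"128-bit: {brute_force_days(128, rate):.1e} days")   # astronomically large
```

Halving the worst-case figure gives the expected search time, so at this assumed rate a 40-bit key falls in under a week on average, consistent with the "matter of days" claim above.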

Legal challenges by Peter Junger and other civil libertarians and privacy advocates, the widespread availability of encryption software outside the U.S., and the perception by many companies that adverse publicity about weak encryption was limiting their sales and the growth of e-commerce led to a series of relaxations in U.S. export controls, culminating in 1996 in President Bill Clinton signing Executive Order 13026,[7] which transferred commercial encryption from the Munitions List to the Commerce Control List. Furthermore, the order stated that "the software shall not be considered or treated as 'technology'" in the sense of the Export Administration Regulations. The Commodity Jurisdiction process was replaced with a Commodity Classification process, and a provision was added to allow export of 56-bit encryption if the exporter promised to add "key recovery" backdoors by the end of 1998. In 1999, the EAR was changed to allow 56-bit encryption and 1024-bit RSA to be exported without any backdoors, and new SSL cipher suites were introduced to support this (RSA_EXPORT1024 with 56-bit RC4 or DES). In 2000, the Department of Commerce implemented rules that greatly simplified the export of commercial and open-source software containing cryptography, including allowing the key-length restrictions to be removed after going through the Commodity Classification process.[8]

Current status

As of 2009, non-military cryptography exports from the U.S. are controlled by the Department of Commerce's Bureau of Industry and Security.[9] Some restrictions still exist, even for mass market products, particularly with regard to export to "rogue states" and terrorist organizations. Militarized encryption equipment, TEMPEST-approved electronics, custom cryptographic software, and even cryptographic consulting services still require an export license[9](pp. 6–7). Furthermore, encryption registration with the BIS is required for the export of "mass market encryption commodities, software and components with encryption exceeding 64 bits" (75 FR 36494). In addition, other items require a one-time review by, or notification to, BIS prior to export to most countries.[9] For instance, the BIS must be notified before open-source cryptographic software is made publicly available on the Internet, though no review is required.[10] Export regulations have been relaxed from pre-1996 standards, but are still complex.[9] Other countries, notably those participating in the Wassenaar Arrangement,[11] have similar restrictions.[12]

U.S. export rules

U.S. non-military exports are controlled by Export Administration Regulations (EAR), a short name for the U.S. Code of Federal Regulations (CFR) Title 15 chapter VII, subchapter C.

Encryption items specifically designed, developed, configured, adapted or modified for military applications (including command, control and intelligence applications) are controlled by the Department of State on the United States Munitions List.


Encryption export terminology is defined in EAR part 772.1.[13] In particular:

  • Encryption Component is an encryption commodity or software (but not the source code), including encryption chips, integrated circuits etc.
  • Encryption items include non-military encryption commodities, software, and technology.
  • Open cryptographic interface is a mechanism which is designed to allow a customer or other party to insert cryptographic functionality without the intervention, help or assistance of the manufacturer or its agents.
  • Ancillary cryptography items are those used primarily not for computing and communications but for digital rights management; games; household appliances; printing, photo and video recording (but not videoconferencing); business process automation; industrial or manufacturing systems (including robotics, fire alarms and HVAC); and automotive, aviation and other transportation systems.

Export destinations are classified by the EAR Supplement No. 1 to Part 740 into four country groups (A, B, D, E) with further subdivisions;[14] a country can belong to more than one group. For the purposes of encryption, groups B, D:1, and E:1 are important:

  • B is a large list of countries that are subject to relaxed encryption export rules
  • D:1 is a short list of countries that are subject to stricter export control. Notable countries on this list include China and Russia
  • E:1 is a very short list of "terrorist-supporting" countries (as of 2009, includes five countries; previously contained six countries and was also called "terrorist 6" or T-6)

The EAR Supplement No. 1 to Part 738 (Commerce Country Chart) contains the table of country restrictions.[15] If the table row corresponding to a country contains an X in a reason-for-control column, the export of a controlled item requires a license unless an exception can be applied. For the purposes of encryption, the following three reasons for control are important:

  • NS1 National Security Column 1
  • AT1 Anti-Terrorism Column 1
  • EI Encryption Items — currently the same as NS1
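The license-determination rule described above reduces to a simple set intersection. The sketch below is illustrative only: the country rows are hypothetical placeholders, the authoritative data is the Commerce Country Chart itself, and license exceptions are not modeled.

```python
# Sketch of the Commerce Country Chart lookup described above.
# The rows here are invented for illustration; real determinations
# must use EAR Supplement No. 1 to Part 738 and consider license
# exceptions, which this sketch does not model.

# Hypothetical rows: country -> reason-for-control columns marked "X".
COUNTRY_CHART = {
    "Country A": set(),             # illustrative: no X in these columns
    "Country B": {"NS1"},           # illustrative
    "Country C": {"NS1", "AT1"},    # illustrative stricter destination
}

def license_required(country: str, item_controls: set[str]) -> bool:
    """A license is needed if any of the item's reasons for control
    (NS1, AT1, EI) is marked with an X for the destination country."""
    # Unknown destination: conservatively assume both columns are marked.
    marked = COUNTRY_CHART.get(country, {"NS1", "AT1"})
    # EI is currently controlled the same as NS1 (see the list above).
    controls = {("NS1" if c == "EI" else c) for c in item_controls}
    return bool(controls & marked)

print(license_required("Country A", {"NS1", "AT1"}))  # False (no X marked)
print(license_required("Country C", {"EI"}))          # True (EI maps to NS1)
```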


For export purposes each item is classified with the Export Control Classification Number (ECCN) with the help of the Commerce Control List (CCL, Supplement No. 1 to the EAR part 774). In particular:[9]

  • 5A002 Systems, equipment, electronic assemblies, and integrated circuits for "information security". Reasons for Control: NS1, AT1.
  • 5A992 "Mass market" encryption commodities and other equipment not controlled by 5A002. Reason for Control: AT1.
  • 5B002 Equipment for development or production of items classified as 5A002, 5B002, 5D002 or 5E002. Reasons for Control: NS1, AT1.
  • 5D002 Encryption software. Reasons for control: NS1, AT1.
    • used to develop, produce, or use items classified as 5A002, 5B002, 5D002
    • supporting technology controlled by 5E002
    • modeling the functions of equipment controlled by 5A002 or 5B002
    • used to certify software controlled by 5D002
  • 5D992 Encryption software not controlled by 5D002. Reasons for control: AT1.
  • 5E002 Technology for the development, production or use of equipment controlled by 5A002 or 5B002 or software controlled by 5D002. Reasons for control: NS1, AT1.
  • 5E992 Technology for the 5x992 items. Reasons for control: AT1.

An item can be either self-classified or submitted to the BIS for classification ("review"). For typical items, a BIS review is required to obtain the 5A992 or 5D992 classification.
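The ECCN entries listed above can be encoded as a small table, which makes the pattern visible: the x992 ("mass market") entries carry only AT1, while the x002 entries also carry NS1. This is a reading aid that mirrors the source text, not a substitute for the Commerce Control List itself.

```python
# The ECCN entries from the list above, with their reasons for control.
ECCN_REASONS = {
    "5A002": {"NS1", "AT1"},  # information-security systems/equipment
    "5A992": {"AT1"},         # "mass market" encryption commodities
    "5B002": {"NS1", "AT1"},  # equipment for developing/producing 5x002 items
    "5D002": {"NS1", "AT1"},  # encryption software
    "5D992": {"AT1"},         # encryption software not controlled by 5D002
    "5E002": {"NS1", "AT1"},  # technology for 5x002 items
    "5E992": {"AT1"},         # technology for 5x992 items
}

def is_mass_market(eccn: str) -> bool:
    """The x992 entries are the less-controlled ones: AT1 only, no NS1."""
    return ECCN_REASONS[eccn] == {"AT1"}

print(sorted(e for e in ECCN_REASONS if is_mass_market(e)))
# ['5A992', '5D992', '5E992']
```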

In culture

In the xkcd cartoon "Legal Hacks", a hacker argues that they should have let the United States government keep classifying cryptographic algorithms as munitions, and then claimed the right to possess them under the Second Amendment.[16]

References


  1. ^
  2. ^ Department of State -- International Traffic in Arms Regulations, April 1, 1992, Sec 121.1
  3. ^ Kahn, The Codebreakers, Ch. 19
  4. ^ The encryption debate: Intelligence aspects. See reference below, p. 4
  5. ^ Statement of Louis J. Freeh, Director, Federal Bureau of Investigation before the Senate Judiciary Committee. July 9, 1997
  6. ^ "January 25, 1999 archive of the Netscape Communicator 4.61 download page showing a more difficult path to download 128-bit version". Archived from the original on September 16, 1999. Retrieved 2017-03-26. 
  7. ^ US Executive order 13026
  8. ^ "Revised U.S. Encryption Export Control Regulations". EPIC copy of document from U.S. Department of Commerce. January 2000. Retrieved 2014-01-06. 
  9. ^ Commerce Control List Supplement No. 1 to Part 774, Category 5 Part 2 - Info. Security
  10. ^ "U. S. Bureau of Industry and Security - Notification Requirements for "Publicly Available" Encryption Source Code". 2004-12-09. Archived from the original on 2002-09-21. Retrieved 2009-11-08. 
  11. ^ Participating States Archived 2012-05-27 at The Wassenaar Arrangement
  12. ^ Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies: Guidelines & Procedures, including the Initial Elements[permanent dead link] The Wassenaar Arrangement, December 2009
  13. ^ EAR Part 772
  14. ^ EAR Supplement No. 1 to Part 740
  15. ^ EAR Supplement No. 1 to Part 738
  16. ^ Randall Munroe. "XKCD 504:Legal hacks". 
