At the heart of the issue is how the law should be updated to account for changes in technology and the political environment. Electronic commerce and the security of electronic messages rely on encryption. Traditionally, encryption was used by spies and governments during the Cold War to keep secret plans to sink submarines and blow up embassies and the like. With this in mind, encryption hardware and software, certain technical data, and discussions of the higher math that forms the basis of cryptography have been treated by the U.S. as munitions.
Nowadays, however, much stronger forms of encryption than those which were used during the last few world wars are used to protect the $5 smart-card you may use to buy a Slurpee at the local 7-11. Nonetheless, the law has not changed to match the evolving role of the technology, or the environment in which that technology is used.
The Clinton administration has been broadly criticized over its policies concerning data security and privacy. Its initial "Clipper Chip" proposal would have required, in some situations, the use of a secret encryption standard designed to give the government access to the content of your encrypted messages when certain conditions were met. The Clipper Chip plan started to unravel not so much because of popular outcry against a government-dictated standard, but because scientists demonstrated that the standard was flawed and would not work as promised.
The initial Clipper Chip proposal was followed by "Clipper II" and "Clipper III." All the while, the Clinton administration has maintained its need for access to encrypted communications in order to thwart the four horsemen of the Internet apocalypse-- the money launderer, the drug dealer, the child pornographer and the international terrorist. Some legislators decided that this was entirely the wrong idea, and proposed legislation to liberalize controls on cryptography. The legislation was never passed.
In the meantime, lawsuits were filed in two federal courts over export controls on encryption, and a third suit was filed over restrictions on teaching about encryption-- in the U.S.-- in classes with foreign students enrolled.
Finally, at the end of last year, the Clinton administration changed its policy again, loosening export controls for those who adopt the government's "key recovery" plan, which would allow government decryption of any messages using the stronger encryption that would now be permitted for export.
Needless to say, this plan was also not well received. Not only did the plan go against the findings of various security experts, who have recommended substantially more secure forms of encryption and who oppose any "key-recovery" or "key-escrow" plan, but the latest Clinton administration encryption plan also ignored studies commissioned by the same administration which recommended loosening restrictions on cryptography and its export.
Furthermore, one manufacturer of security software offered a reward to the first person who could crack the strongest level of encryption which would be readily allowed for export under the administration's liberalized policy-- it took a college student only three and a half hours to collect.
While U.S. residents may use more robust encryption schemes (which use longer and therefore more secure encryption keys) these schemes may not be exported, as has been mentioned. What this means is that software companies must either use the weaker forms of encryption in products intended for international distribution, or they must create domestic and international versions of their software using different strengths of encryption in each version.
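The arithmetic behind key length is straightforward: each additional bit doubles the number of possible keys an attacker must try, so the cost of a brute-force search grows exponentially. A minimal sketch of that arithmetic (the 3.5-hour figure comes from the challenge described above; everything else is simple calculation, not a claim about any particular product):

```python
# Each extra bit of key length doubles the keyspace, so brute-force
# search cost grows exponentially with key size.

def keyspace(bits: int) -> int:
    """Number of possible keys for a given key length in bits."""
    return 2 ** bits

# The 40-bit limit on readily exportable encryption:
print(keyspace(40))  # 1099511627776 possible keys

# If a 40-bit key fell to brute force in 3.5 hours, the same search
# scaled to a 56-bit key takes 2**16 (65,536) times as long:
hours_for_40 = 3.5
hours_for_56 = hours_for_40 * 2 ** (56 - 40)
print(hours_for_56 / 24 / 365, "years, roughly")
```

The point is not the exact numbers but the shape of the curve: adding sixteen bits turns an afternoon's work into decades.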
At the same time, their foreign competitors face no such restrictions-- they may create a single version of their software for domestic or foreign use, and they may use strong encryption. U.S. software companies argue that this has the effect of putting U.S. companies at a competitive disadvantage-- in essence, U.S. cryptographic policy amounts to 'export jobs, not cryptography.'
All of these concerns bring us to the current proposed federal legislation. On February 12, Congressman Bob Goodlatte, R-Va., re-introduced H.R. 695, the Security and Freedom Through Encryption (SAFE) Act, which is identical to legislation he proposed last year. The broad bipartisan support for the bill (55 initial co-sponsors) includes Representatives Tom Ewing and Don Manzullo, both Republicans from Illinois.
The SAFE bill begins by spelling out that any U.S. citizen shall have the right to use encryption, of any type, and of any strength or "key length" and in any medium. It also prohibits federal and state governments from requiring that users of encryption products turn over their keys to an escrow agent. The bill does, however, provide additional penalties for anyone who uses encryption in the furtherance of the commission of a criminal offense. Furthermore, the legislation eases export restrictions on any "generally available" or public domain software with a cryptographic component unless there is "substantial evidence that such software will be (A) diverted to a military end-use or an end use supporting international terrorism; (B) modified for military or terrorist end-use; or (C) reexported [without any required authorization]."
The second bill is S. 376, the Encrypted Communications Privacy Act of 1997, proposed on February 27, 1997 by Senators Patrick Leahy, Conrad Burns, Patty Murray, and Ron Wyden. This bill also guarantees the right to use encryption of any strength or form domestically, and it likewise prohibits federal or state governments from mandating any form of key recovery or escrow of people's secret keys. Like the SAFE bill, it would loosen export controls on readily accessible encryption software. The bill would also provide additional penalties for persons caught using cryptography to impede law enforcement investigation of a felony.
The Encrypted Communications Privacy Act goes further than the SAFE bill in that it includes sections addressing the responsibility of an escrow agent who is voluntarily entrusted with holding the secret key to an individual's encrypted data. These provisions establish civil and possibly criminal liability for unauthorized disclosure, and they set out a procedure for law enforcement access to an escrowed key in certain circumstances.
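For illustration, one way an escrow arrangement can avoid trusting any single agent is to split the key between two escrow agents, so that neither share alone reveals anything about the key and both must cooperate to reconstruct it. The original Clipper design used a split of this general kind. Below is a minimal sketch of a two-share XOR split; it is purely illustrative and not a scheme that any of the bills discussed here mandates:

```python
import secrets

# Split a secret key into two shares: one share is random, the other is
# the key XORed with that randomness. Either share alone is just noise.

def split_key(key: bytes) -> tuple[bytes, bytes]:
    share1 = secrets.token_bytes(len(key))
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    # XORing the shares back together cancels the randomness.
    return bytes(a ^ b for a, b in zip(share1, share2))

key = b"supersecretkey"
s1, s2 = split_key(key)
assert recover_key(s1, s2) == key
```

Under such a split, an unauthorized disclosure by one agent compromises nothing by itself-- which is exactly why liability rules for the agents, like those in this bill, matter most at the moment the shares are combined.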
The final bill, the Promotion of Commerce Online in the Digital Era (or Pro-CODE) Act-- S. 377-- was also introduced on February 27, 1997 by Senators Burns and Leahy (and 16 other co-sponsors). This bill is similar to legislation with the same name that the senators introduced last year. While this bill shares many traits with the other two, there are some noticeable differences. The Pro-CODE bill would prohibit the Secretary of Commerce from establishing an encryption standard or policy for any group other than the government itself-- in other words, it prohibits a Clipper IV.
Another difference that has resulted in this bill being more popular among civil liberties groups than some of the others is that it does not contain additional sanctions or punishments for those who use encryption in the course of committing acts that are already illegal and subject to punishment. The Pro-CODE bill also calls for the creation of an information security board to foster coordination between government and industry, and to collect and disseminate non-proprietary information about cryptography.
The Pro-CODE bill has created some controversy, however, as a result of one of the exceptions it contains. In addition to exceptions like those in the SAFE and Encrypted Communications Privacy Act bills, which bar export of cryptographic products likely to be diverted to military or terrorist use, this bill would prohibit export of certain software or hardware to an individual, organization, or country if the Secretary of Commerce determines that there is substantial evidence that the software or hardware will be used intentionally "to evade enforcement of United States law or taxation by the United States or by any State or local government." Such a prohibition would actually constitute an extension of current law, and critics claim the bill could be used to outlaw untraceable electronic cash or anonymous remailers (which strip identifying information off of e-mail messages before passing them on to their destinations).
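The mechanical core of what a remailer does is simple: drop the headers that identify the sender before forwarding the message. The sketch below is hypothetical and minimal-- real remailers also use layered encryption, batching, and reordering to resist traffic analysis-- but it shows why such software is hard to distinguish, in code, from ordinary mail handling:

```python
# Minimal sketch of the header-stripping step of an anonymous remailer.
# Header names listed here are common identifying fields; a real remailer
# would handle many more, plus encryption and delivery.

IDENTIFYING_HEADERS = {"from", "reply-to", "sender", "received", "message-id"}

def strip_identifying_headers(raw_message: str) -> str:
    headers, _, body = raw_message.partition("\n\n")
    kept = [line for line in headers.splitlines()
            if line.split(":", 1)[0].strip().lower() not in IDENTIFYING_HEADERS]
    return "\n".join(kept) + "\n\n" + body

msg = "From: alice@example.com\nSubject: hello\n\nmeet at noon"
print(strip_identifying_headers(msg))  # "From:" is gone; subject and body survive
```

A statute aimed at "evading enforcement" would have to decide whether code this generic-- a loop that filters lines-- falls on the wrong side of the line, which is precisely the critics' concern.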
While all of these bills have certain strengths and weaknesses, what is important is what they represent-- an awareness that encryption is a foundational technology for commerce and business conducted in a networked environment. The currently exportable standard of encryption does not provide adequate protection for particularly sensitive data. Harder math is more secure. An important element of a good cryptographic system is not needing to trust others to preserve your privacy-- which is one reason why any sort of key escrow or key recovery is so antithetical to many users and designers of cryptographic products. Trust math, not the government.
On the other hand, the government does have a legitimate concern in its desire to allow law enforcement to do its job. The Constitution, however, provides for a right to privacy-- not a right for the government to be able to read my mail. Thus, I believe these bills offer a promise to do more good than evil.