
    Privacy and Security Policy in the Digital Age

    This may be in sharp contrast with the conventional wisdom, but I have come to conclude that privacy is better protected now by technical means than it ever has been in human history. How could this be so?

    If you think about how messages have traditionally been sent—by messenger, by carrier pigeon, by phone or fax, all before encryption—they were easily intercepted. These messages could be read by a casual observer, a postmaster, and, of course, the police. Strong encryption, by contrast, when used in day-to-day communication, prevents routine violations of privacy. The capacity of strong encryption technologies to protect communication represents a significant factor of change—increasing the individual’s power to protect privacy by a factor of a thousand, perhaps a million. The same principle applies to databases. Take health records, for example: databases often full of intimate details. Until rather recently, most patient medical records were kept on paper in file folders protected only by the lock on the file room door—a level of security easily overcome with a little effort. Now, with the transfer to digital records, when encryption is engaged and the data is protected by access audit trails, the information is much more secure. The electronic audit trail provides an important layer of security by identifying who entered the data as well as who accessed it and when. This is not a matter of technological determinism, of course; it represents a technical affordance. To take advantage of the protective shield of encryption, one has to take the initiative to encrypt one’s communication, whether routine or otherwise.
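    The access audit trail described above can be made tamper evident with standard cryptographic tools. The following is a minimal sketch, not a description of any actual medical records system: each log entry carries a keyed MAC that also covers the previous entry's MAC, so altering or deleting an earlier entry breaks the chain. The key, field names, and record identifiers are all illustrative assumptions.

```python
import hmac
import hashlib
import json
from datetime import datetime, timezone

SECRET_KEY = b"replace-with-a-real-key"  # illustrative only

def append_entry(log, user, action, record_id):
    """Append a tamper-evident entry: each MAC covers the previous MAC,
    so altering or deleting an earlier entry breaks the chain."""
    prev_mac = log[-1]["mac"] if log else ""
    entry = {
        "user": user,
        "action": action,            # e.g. "read" or "write"
        "record": record_id,
        "time": datetime.now(timezone.utc).isoformat(),
        "prev": prev_mac,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["mac"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Recompute every MAC in order; return False if any entry was altered."""
    prev_mac = ""
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "mac"}
        if body["prev"] != prev_mac:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["mac"]):
            return False
        prev_mac = entry["mac"]
    return True

log = []
append_entry(log, "dr_smith", "read", "patient-42")
append_entry(log, "nurse_lee", "write", "patient-42")
assert verify(log)
log[0]["user"] = "someone_else"   # tampering with any entry is detected
assert not verify(log)
```

    Such a chained log records who touched a record and when, and it cannot be quietly rewritten after the fact—exactly the property that a paper file folder behind a locked door lacks.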


    In the wake of 9/11, however, some of this new potential for privacy has been diluted. The reasons for this are complex and require us to carefully examine the trade-offs between privacy, clearly an important and fundamental right, and other important values and rights. I have argued elsewhere that the implicit assumption that personal privacy trumps all other claims needs to be carefully examined. It may require a carefully crafted balance among core values. Perhaps it is best to understand this as a dynamic process akin to an arms race between technologies. Every year new technical means for invading privacy are invented, as are new countermeasures. Government agencies such as the National Security Agency may become more intrusive, but private citizens have new and sophisticated security techniques available to them as well—it’s a tug of war. The war has not been won or lost by either side; it is ongoing, and that is the focus of this chapter. I will review the technical developments related to personal privacy in five domains—three that enhance personal privacy (I refer to them as liberalizing technologies): cellular phones, Internet communications, and strong encryption; and two that enhance surveillance (I label them public protective): technologies for intercepting digital communication and technologies for capturing actual computer keystrokes.

    I argue that both individual rights and public safety must be protected. Given that on many occasions advancing one requires some curtailment of the other, the key question is what the proper balance between these two cardinal values should be. The concept of balance is found in the Fourth Amendment. It refers to the right not to be subjected to unreasonable search and seizure. Thus, it recognizes a category of searches that are fully compatible with the Constitution—those that are reasonable. Historically, courts have found searches to be reasonable when they serve a compelling public interest, such as public safety or public health.

    The counterclaims of advocates on both sides are best understood within a historical context. Societies tend to lean excessively toward the public interest or toward liberty. Corrections to such imbalances then tend to lead to overcorrections. For example, following the civil rights abuses that occurred during the years J. Edgar Hoover was the director of the FBI, the attorney general imposed severe limitations on the agency in the 1970s. These limitations excessively curbed the agency’s work in the following decades. The public safety measures enacted after September 11 removed many of these restrictions and granted law enforcement agencies and the military new powers. These changes arguably tilted excessively in the other direction. This overcorrection was soon followed by an attempt to correct it (for example, by limiting the conditions under which military tribunals can be used and spelling out procedures not included in their preliminary authorization). Historical conditions also change the point at which we find a proper balance. The 2001 assault on America and the threat of additional attacks have brought about such a change. This chapter argues that we should strive to achieve a balance by focusing on accountability.

    Liberalizing Technologies

    In 1980, communication surveillance could be carried out easily by attaching simple devices to a suspect’s landline telephone. In the following decades, millions of people acquired several alternative modes of convenient, instantaneous communication, most significantly cellular telephones and e-mail. According to CTIA Wireless estimates, by 2007 there were over 250 million cellular phone subscribers in the United States, a penetration of about 83 percent. E-mail and Internet usage are similarly pervasive.[1]

    By 2007, according to Nielsen//NetRatings estimates, 216 million people in the United States were online, representing a penetration of about 72 percent of the population.[2] These technological developments greatly limited the ability of public authorities to conduct communications surveillance using traditional methods.

    Attempts were made to apply the old laws to new technologies, but the old laws did not fit the new technologies well. The law governing full intercepts, contained in Title III of the Omnibus Crime Control and Safe Streets Act of 1968, originally required that court orders for intercepts specify the location of the communications device to be tapped and establish probable cause that evidence of criminal conduct could be collected by tapping that particular device. Hence, under this law, if a suspect shifted from one phone to another or used multiple phones, the government could not legally tap phones other than the one originally specified without obtaining a separate court order for each. Once criminals were able to obtain and dispose of multiple cellular phones like “used tissues,” investigations were greatly hindered by the lengthy process of obtaining numerous full intercept authorizations from the courts.[3]

    The rise of Internet-based communications further limited the ability of public authorities to conduct communications surveillance under the old laws. Title III did not originally mention electronic communications. Similarly, the language of the Electronic Communications Privacy Act of 1986 (ECPA) that governed pen/trap orders (recording the telephone numbers of the caller and called party but not the content of the call) was not clearly applicable to e-mail. To determine how to deal with this new technology, courts often attempted to draw analogies between e-mail and older forms of communication. Because electronic communication used to travel largely over phone lines, courts extended laws governing intercepts or traces for telephones to electronic messages as well. However, reliance by the police on such interpretations was risky because there was a possibility that a court would rule that e-mail did not fall under a pen/trap order.

    Extending laws that were written with telephones in mind to e-mail was an imperfect solution because e-mail messages differ from phone conversations in important ways. Unlike phone conversations, e-mails do not travel in discrete units that can be plucked out. Each e-mail is broken up into digital packets, and the packets are mixed together with those of other users. This makes it difficult to intercept individual e-mails. Law enforcement agents attempting to intercept or trace the e-mail of just one user may violate the privacy of other users.

    The decentralized nature of the Internet created additional complications in carrying out wiretap orders. When the old legislation was enacted, a unified phone network made it easy to identify the source of a call. E-mail, by contrast, may pass through multiple Internet service providers (ISPs) in different locations throughout the nation on its way from sender to recipient. As a result, public authorities would have to compel information from a chain of service providers. Thus, until recently, if a message went through four providers, four court orders in four different jurisdictions would be needed to find out the origin of that message.[4]

    Similarly, agents faced jurisdictional barriers when they tried to obtain search warrants for saved e-mail. Under old laws, a warrant had to be obtained from a judge in the jurisdiction where the search would take place. E-mail, however, is not always stored on a personal computer but often is stored remotely on an ISP’s server. This means that if a suspect in New Jersey had e-mail stored on a server located in Silicon Valley, an agent would have to travel across the country to get a warrant to seize the e-mail.

    In short, the introduction of both cellular phones and e-mail made it much more difficult to conduct communications surveillance, even in cases in which the court authorized such surveillance. The old laws and enforcement tools were not suited to deal with these new technologies.


    Public authorities were also set back by the development of strong encryption. Although ciphers have existed for thousands of years, programmers have only recently developed encryption with 128-bit and longer keys that is extraordinarily difficult to break, even by the National Security Agency (NSA). Moreover, software that uses strong encryption is readily available to private parties at low cost. Today, manufacturers routinely prepackage these programs on computers. Thus, encrypted messages are more private than any messages historically sent by mail, phone, messenger, or other means. Similarly, data stored on one’s own computer is now protected much better than analogous data stored under lock and key. Despite court orders, strong encryption has frustrated the efforts of law enforcement in a growing number of cases.
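    The difficulty of breaking such keys by brute force is a matter of simple arithmetic. The sketch below assumes a purely brute-force attacker and a deliberately generous guessing rate; the figures are illustrative, not a model of any real adversary.

```python
# Why a 128-bit key is "extraordinarily difficult to break" by exhaustive
# search: there are 2**128 possible keys, and even an implausibly fast
# attacker cannot enumerate a meaningful fraction of them.

keyspace = 2 ** 128                  # possible 128-bit keys
guesses_per_second = 10 ** 12        # one trillion guesses/sec (generous)
seconds_per_year = 60 * 60 * 24 * 365

# Expected work: on average, half the keyspace must be searched.
years = keyspace / 2 / guesses_per_second / seconds_per_year
print(f"~{years:.1e} years to find the key on average")
# On the order of 10**18 years—hundreds of millions of times the
# age of the universe.
```

    No court order changes this arithmetic, which is why, unlike the legal gaps around cell phones and e-mail, strong encryption could not be answered simply by updating statutes.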

    The impact of the development of strong encryption is qualitatively different from the impact of the other privacy-enhancing technologies. The main factor that constrained public authorities in the area of new modes of communication was the obsolescence of laws. In the case of strong encryption, on the other hand, the technology imposes its own barrier. Updating the law was sufficient to enable law enforcement to handle the challenges posed by the other new technologies. By contrast, no court order can enable strong encryption to be broken (Russell and Gangemi 1995, 11; Denning and Baugh 1997).

    These technological developments have provided all people—law-abiding citizens and criminals, nonterrorists and terrorists—greater freedom to do as they choose. In this sense, these technologies are “liberalizing.” At the same time, they have significantly hampered the ability of public authorities to conduct investigations. Some cyberspace enthusiasts welcomed these developments, hoping that cyberspace would be a self-regulating, government-free space. In contrast, public authorities clamored for the laws to be changed in order to enable officials to police the new “territory” as they do in the world of old-fashioned, landline telephones. Such pressures led to some modifications in the law before the 2001 attack on America, but the most relevant changes in the law have occurred since.

    One provision of ECPA attempted to make the laws governing communications intercepts more effective by providing for “roving wiretaps” in criminal investigations. Roving wiretaps are full intercept orders that apply to a particular person rather than to a specific communications device. They allow law enforcement to intercept communications from any phone or computer used by a suspect without specifying in advance which facilities will be tapped.


    The process for obtaining a roving intercept order is more rigorous than the process for obtaining a traditional phone-specific order. The Office of the United States Attorney General must approve the application before it is even brought before a judge. Originally, the applicant had to show that the suspect named in the application was changing phones or modems frequently with the purpose of thwarting interception. After the Intelligence Authorization Act for Fiscal Year 1999 changed the requirement, the applicant merely had to show that the suspect was changing phones or modems frequently and that this practice “could have the effect of thwarting” the investigation. Although roving intercepts have not yet been tested in the Supreme Court, several federal courts have found them to be constitutional.

    Prior to September 11, the FBI could not gain authorization to use roving intercepts in gathering foreign intelligence or in investigations of terrorism. The Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001 (USA PATRIOT Act) amended the Foreign Intelligence Surveillance Act of 1978 (FISA) to allow roving intercept orders. FISA provides the guidelines under which a federal agent can obtain authorization to conduct surveillance for foreign intelligence purposes. Agents who wish to conduct surveillance under FISA submit an application first to the attorney general’s office, which must approve all requests (as with roving intercepts under ECPA). If the attorney general’s office finds the application valid, the application will be taken to one of seven federally appointed judges, who together make up the Foreign Intelligence Surveillance Court (FISC), for approval. The FISC allows no spectators, keeps most proceedings secret, and hears only the government’s side of a case.[5]

    There has been some debate in the courts and among legal scholars about the application of the Fourth Amendment to the new technologies and to the new legislation governing these technologies. Before 1967, the Supreme Court interpreted the Fourth Amendment in a literal way to apply only to physical searches. In Olmstead v. United States, the Court ruled that telephone wiretaps did not constitute a search unless public authorities entered a home to install the device. The Court held that the Fourth Amendment does not protect a person unless “there has been an official search and seizure of his person, or such a seizure of his papers or his tangible material effects, or an actual physical invasion of his house.”[6]

    In 1967, the Court replaced this interpretation of the Fourth Amendment with the view that the amendment “protects people, not places.” In Katz v. United States, the Court established that an individual’s “reasonable expectation of privacy” would determine the scope of his or her Fourth Amendment protection. Justice Harlan, in his concurring opinion, set out a two-part test: the individual must have shown a subjective expectation of privacy, and society must recognize that expectation as reasonable.

    Although legal scholars have criticized this test, Katz still represents the state of the law. However, the emergence of new technologies requires a reexamination of what constitutes a reasonable expectation of privacy. In United States v. Maxwell, the court determined that there was a reasonable expectation of privacy for e-mail stored on America Online’s “centralized and privately-owned computer bank.” However, the court in United States v. Charbonneau, relying on Maxwell, held that an individual does not have a reasonable expectation of privacy in statements made in an Internet chat room.[7]

    Additionally, there is some question as to whether roving intercepts are constitutional. The Fourth Amendment states, “No warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” Because roving intercepts cannot name the location to be tapped, they may violate the particularity requirement of the Fourth Amendment.

    The argument in favor of their constitutionality is that the particularity of the person to be searched is substituted for the particularity of the place to be searched. In United States v. Petti, the Ninth Circuit Court of Appeals upheld the use of roving intercepts. It explained that the purpose of the “particularity requirement was to prevent general searches.” As long as a warrant or court order provides “sufficient particularity to enable the executing officer to locate and identify the premises with reasonable effort,” and there is no “reasonable probability that another premise might be mistakenly searched,” it does not violate the Fourth Amendment. In other words, a court order to tap all phones used by a specific person does describe particular places but in an unconventional way. Public authorities cannot use the order to tap any location they wish. They can only tap a set of specific locations, namely those used by a specific person.

    Additional questions may arise regarding differential application of the laws to various classes of people. Should noncitizens be treated the same as citizens? Terrorists the same as other criminals? International terrorists the same as domestic terrorists? These are significant issues that go to the heart of the debate about the rights of noncitizens. These issues raise potential problems, such as how to define terrorism and whether that definition should extend to citizens, as well as the danger that a loose definition might allow ordinary criminals to be encompassed by terrorism laws.

    Public Protective Technologies

    The liberalizing technologies already addressed enhance individuals’ liberties but hinder public authorities. The following technologies are public protective technologies, which enhance the capabilities of government authorities and accordingly may curtail individual rights.

    In July 2000 the FBI unveiled a new resource awkwardly labeled “Carnivore” to signal the breadth of its power to capture online communication. It was designed to capture a suspect’s e-mail messages or trace messages sent to and from a suspect’s account. To do so, it sorts through a stream of many millions of messages, including those of many other users. Carnivore has a filter that can be set to scan various digital packets for specific text strings or to target messages from a specific computer or e-mail address. The program can operate in two different modes: “pen” or “full.” In pen mode, it captures only the addressing information, which includes the e-mail addresses of the sender and recipient as well as the subject line. In full mode, it captures the entire content of a message. Carnivore was designed to copy and store only information caught by the filter, thus keeping agents from looking at any information not covered by the court order. In response to ongoing negative press coverage, the technology was renamed the more innocuous “DCS1000,” and the special-purpose Carnivore package was shelved in 2002 in favor of commercial off-the-shelf products that function similarly and are known as “packet sniffers.” The use of this digital traffic-monitoring software generally requires the cooperation of a suspect’s Internet service provider, which may be voluntary or compelled by court order.[8]
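    The pen/full distinction can be illustrated with a toy filter. This is a sketch only: the message format, field names, and target address are invented for illustration, and the real system filtered raw network packets rather than neatly parsed messages.

```python
# Toy sketch of pen-mode vs. full-mode capture, as described above.

TARGET = "suspect@example.com"

def capture(message, mode):
    """Return only what the court order allows: addressing data in 'pen'
    mode, the whole message in 'full' mode, and nothing at all if the
    target is neither sender nor recipient."""
    if TARGET not in (message["from"], message["to"]):
        return None                        # other users' traffic is dropped
    if mode == "pen":
        return {k: message[k] for k in ("from", "to", "subject")}
    elif mode == "full":
        return dict(message)
    raise ValueError("mode must be 'pen' or 'full'")

stream = [
    {"from": "alice@example.com", "to": "bob@example.com",
     "subject": "lunch", "body": "noon?"},
    {"from": "suspect@example.com", "to": "carol@example.com",
     "subject": "plans", "body": "meet at the dock"},
]

pen_capture = [capture(m, "pen") for m in stream]
assert pen_capture[0] is None              # non-target message not captured
assert "body" not in pen_capture[1]        # addressing data only, no content
```

    The privacy debate described below turns precisely on whether such a filter behaves as claimed: the filter necessarily scans every message in the stream, even though it is supposed to retain only the target's.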

    Because packet sniffers still cannot overcome the protective power of strong encryption, security authorities sought means to track actual physical computer keystrokes to capture passwords and text before encryption. The FBI has developed two technologies, the Key Logger System (KLS), which requires physical installation on a computer, and the software-based Magic Lantern that can be surreptitiously downloaded and installed on a computer.


    Once agents discover that they have seized encrypted information, they can seek a warrant to install and retrieve KLS. In the case of Nicodemo Scarfo, a suspected racketeer, agents had to show both probable cause that Scarfo was involved in crime and probable cause that evidence of criminal activity was encrypted on his computer before installing KLS. As in other warrants, the FBI had to specify the exact location of the computer on which KLS would be installed.

    Once installed, KLS uses a “keystroke capture” device to record keystrokes as they are entered into a computer. It is not capable of searching or recording fixed data stored on the computer. Moreover, KLS is designed so that it is unable to record keystrokes while a computer’s modem is in operation because intercepting electronic communications would require an intercept order that is more difficult to get than a warrant.

    Because KLS must be manually installed on a suspect’s computer, it requires breaking and entering into a suspect’s home. In contrast, Magic Lantern allows the FBI to put software on a computer to record keystrokes without installing any physical device. Like KLS, Magic Lantern cannot decrypt e-mail by itself but can retrieve the suspect’s password. The details of how it does this have not been released. It is said to install itself on the suspect’s computer in a way similar to a Trojan horse computer virus. It disguises itself as an ordinary, harmless message, then inserts itself onto a computer. For example, when someone connects to the Internet, a pop-up box could appear, stating “Click here to win!” When the user clicks on the box, the virus will enter the computer.[9]

    Groups like the Electronic Privacy Information Center (EPIC) and the Center for Democracy and Technology (CDT) have raised multiple arguments for why packet sniffers should not be used at all. They are skeptical that these programs operate as the FBI claims and are troubled by the degree of secrecy the FBI maintains about the way they work. Furthermore, they argue that separating addressing information from content is more difficult for Internet communications than for phone calls. Therefore, Carnivore, they say, will not allow the FBI to do a pen/trap without seizing more information than authorized. Privacy advocates also worry that packet sniffers violate the Fourth Amendment because they scan through “tens of millions of e-mails and other communications from innocent Internet users as well as the targeted suspect.” The ACLU compares a Carnivore search to the FBI sending agents into a post office to “rip open each and every mail bag and search for one person’s letters.”


    Officials at the FBI respond that when used properly, packet sniffers will capture only the targeted e-mails. Additionally, Carnivore’s use is subject to strict internal review and requires the cooperation of technical specialists and ISP personnel, thus limiting the opportunities an unscrupulous agent might have to abuse it.

    A review of the original Carnivore program conducted by the Illinois Institute of Technology concluded that although it does not completely eliminate the risk of capturing unauthorized information, Carnivore is better than any existing alternatives because it can be configured to comply with the limitations of a court order. However, the report also determined that failure to include audit trails makes the FBI’s internal review process deficient. Specifically, the operator implementing a Carnivore search selects either pen or full mode by clicking a box on a computer screen, and the program does not keep track of what kind of search has been run. Therefore, it is difficult, if not impossible, to determine if an operator has used the program only as specified in the court order. Furthermore, it is impossible to trace actions to specific individuals because everyone uses the same user ID. The head of the review panel commented, “Even if you conclude that the software is flawless and it will do what you set it to do and nothing more, you still have to make sure that the legal, human, and organizational controls are adequate.”[10] This focus on accountability will be explored below.

    The oversight of these surveillance activities has traditionally been conducted under the ground rules of the Foreign Intelligence Surveillance Act of 1978. The act set up a special court to review wiretap or corresponding digital surveillance activities. The court routinely approved about 500 warrants a year, rising to 1,758 in 2004. The court also oversees efforts to minimize the collection of information about American citizens, given the focus on foreign agents. Since 2001 there has been an escalating controversy between the administration, the courts, and Congress over the minimization procedures and the practice of issuing “national security letters” as an alternative to court-approved warrants. One estimate reports that the FBI may issue over 30,000 national security letters a year, indicating a high level of warrantless surveillance. Further, the Bush administration pressed for legislation that provides immunity from litigation for telephone and Internet companies that cooperate with authorities, a proposal not well received by the Democratic Congress. As of mid-2008 the controversy continues without clear resolution and is likely to remain an issue for the next administration.[11]


    Accountability—a Question of Balance

    When homeland protection is discussed, it is often framed in terms of finding a legitimate balance between two competing public goods—safety and liberty. As Senator Ron Wyden (D-OR) put it in a December 2007 Washington Post op-ed considering the reauthorization and updating of FISA: “For nearly 30 years, the Foreign Intelligence Surveillance Act of 1978 (FISA) has represented the ultimate balance between our needs to fight terrorism ferociously and to protect the constitutional rights of Americans.”[12] Senator Russ Feingold (D-WI), speaking on the Senate floor to explain his “no” vote on the PATRIOT Act in 2001, framed his decision in similar terms: “I have concluded that this bill still does not strike the right balance between empowering law enforcement and protecting civil liberties.”[13] Back in 2004, the Economist framed the debate between Democrats and Republicans on homeland protection in terms of this same balance:

    Since the terrorist attacks of September 11th 2001, the Bush administration has brought in a slew of law-enforcement and surveillance powers that critics fear is turning America into an Orwellian nightmare. The worriers are thinking of the all-seeing Big Brother of “1984,” though so far the chaos of “Animal Farm” may be closer to it. Others—and they are still in a clear majority—feel that a few limits on their freedoms are a small price to pay for fewer terrorist attacks. But even they agree that a balance has to be struck between civil liberties and security. The argument—between Republicans and Democrats, George Bush and John Kerry—is over where exactly this balance should lie.[14]

    Moreover, courts regularly use the terminology of balance, weighing the public interest against individual rights and allowing the latter to be curtailed when they undermine a “compelling public interest”: for instance, allowing the violation of the privacy of sex offenders in order to protect children from sex abuse, or authorizing wiretaps of suspected killers in order to enhance security.

    The next step is to recognize that the point of balance changes throughout history, as domestic and international conditions change. Thus, in the wake of Prohibition more power was given to national police forces—to the FBI, after it was revealed that local law enforcement authorities were riddled with corruption. At that time, J. Edgar Hoover was a major positive force, bringing professionalism and integrity to police work. Over the decades that followed, the FBI accumulated more and more power and eventually itself became a major violator of individual rights and civil liberties, leading to the Church Committee reforms in the 1970s, which greatly curbed the bureau’s powers—tilting the balance back toward stronger protections of individual rights. Following 9/11, the USA PATRIOT Act was introduced, followed by numerous other security-enhancing measures put forward by President Bush, which, as noted above, jerked the balance heavily in the opposite direction.

    The question is, given current conditions, in which direction does the balance need to be pulled? Critics often argue that new security measures are excessive and demand that they be rolled back. But it is necessary to proceed with some caution here. Although there have been no successful terrorist attacks on the U.S. homeland since 9/11, there is good reason to assume that continued attempts will be made to inflict harm on the United States. Moreover, old and new security measures are best treated not as one bundle but reviewed one at a time. One should avoid both holistic positions: the one that claims that we are at war and hence must pull out all the stops, and the one that holds that all new security measures are suspect. An unbundled review finds the following:

    1. Some measures are fully justified, indeed overdue. These often entail a mere adaptation of the law to technical developments. For example, FISA provided guidelines under which a federal agent could obtain authorization to conduct surveillance for “foreign intelligence purposes.” Prior to 9/11, wiretap warrants were limited to a given phone. Because of the increasing use of multiple cell phones and e-mail accounts over the last decades, federal officials engaged in surveillance under FISA found it much more difficult to conduct surveillance, as they could not follow suspects as they changed the instruments they were using unless they got a new court order for each communication device. The USA PATRIOT Act, enacted in October 2001, overcame this difficulty by amending the existing FISA law to allow what is called “roving surveillance authority”—making it legal for agents to follow one suspect, once a warrant is granted, whatever instrument he or she uses. Unless one holds that terrorists are entitled to benefit from new technologies but law enforcement is not entitled to catch up, this is an overdue and reasonable measure. Similarly, before 9/11, the regulations that allowed public authorities to record or trace e-mails were interpreted by Department of Justice lawyers as requiring court orders from each of the jurisdictions through which e-mail messages travel.[15] This was the case because in the old days phone lines were local and hence, to tap a phone, local authorization sufficed. In contrast, e-mail messages travel by a variety of routes. As of 2001, the USA PATRIOT Act permits national tracing and recording. A third example of an overdue measure stems from another technological development. FISA warrants are not required for surveillance of foreign-to-foreign communications. Currently, however, many foreign-to-foreign communications (say, from Latin America to Europe) are routed through the United States. Still, the law is interpreted as requiring a warrant for tapping these communications, as if they were between U.S. persons. A progressive should not oppose updating this interpretation of the law to adapt to new technological realities.
    2. Some new security measures are reasonable. One should note that although the PATRIOT Act has become a sort of symbol for great excesses in the hasty pursuit of security, only a small fraction of its measures, about 15 of more than 150, have been seriously contested. (Indeed, one of them reduces the penalty on hackers!) That is, most measures encompassed in the act are considered reasonable even by civil libertarians. Another example of a new security measure that seems reasonable is a system for tracking those who come to study, visit, or do business in the United States. Before 9/11, the United States did not check whether those who came into the country for a defined period of time, say on a student visa, left at the end of that period. Many did not leave, but there was no way of knowing how many there were, who they were, and above all what they were doing. The new Internet-based student tracking system requires colleges to alert authorities if a newly enrolled foreign student fails to show up for school or is otherwise unaccounted for. The system was initially plagued by a variety of problems (not the least of which was opposition by some college administrators, students, and others). One can argue whether or not such a measure is beneficial, but it is hard to see why it would be declared prima facie unreasonable, given that similar systems are in place in practically all free societies.
    3. Some measures such as torture and mass detention of people based on their social status are beyond the pale.
    4. Many measures are neither inherently justified because enhanced security requires them nor inappropriate because they wantonly violate rights. Instead, their status is conditioned on their being subject to proper oversight. In other words, the legitimacy of such measures depends on their place in what I call the second balance.

    The Second Balance

    Homeland protection requires drawing greatly on the second balance rather than being limited to attempts to find the first one. The idea that underlies the second balance is that a measure that seems tilted toward excessive attention to security may be tolerated if closely supervised by second-balance organs (discussed below), while a measure that is judged as tilting toward excessive attention to individual rights may be tolerated if sufficient exceptions are provided that are backed up by second-balance organs. That is, new measures can be either excessively privileged (undermining either security or the regime of rights) or excessively discriminated against (leading to inaction on behalf of either element of a sound balance).

    The second balance sought here is not between the public interest and rights, but between the supervised and the supervisors. Deficient accountability opens the door to government abuses of power, and excessively tight controls make agents reluctant to act or incapable of doing so.

    Although the two forms of balance have some similarities and at some points overlap, they are quite distinct. For instance, the argument that the government should not be able to decrypt encoded messages is different from recognizing that such powers are justified—as long as they are properly circumscribed and their use is duly supervised.

    A simple example of the idea at hand may serve to introduce this key point. On many highways drivers now have the option of using computerized toll-collection systems, such as the E-Z pass, whereby an electronic device deducts the toll from credit posted on a chip inside the person’s car. The information gained by the computers of the toll booth—that a car owned by a given person passed a given point at a given time—can be treated in a variety of ways. At one extreme, it can be erased immediately after the computer deducts the proper amount from the credit stored in the car’s chip. At the opposite extreme, such data can be kept on file for years, added to that person’s dossier kept by a government agency or even private company, and made available for law enforcement, divorce lawyers, and even the media (a far from hypothetical situation). One extreme maximizes individual rights, especially privacy, while the second excessively privileges security and arguably other common goods. And one can readily conceive of a variety of intermediary positions.

    If one approaches this device only within a first-balance frame of mind, one will ask how long and for what usages one should allow the said information to be stored. One then judges the use of E-Z passes—or any other such measure—as proper or as illegitimate per se. The second balance adds another major consideration: It asks how the arrangements worked out in terms of the first balance are reviewed and enforced. Different answers to this second question will lead one either to tolerate or reject a measure, whatever its standing according to the first balance. Thus, for instance, if we know that said information will be used only for curbing terrorism, and be available only if a proper search warrant has been issued by a court, one may find such storing of information more acceptable than if one learns that on many occasions the employees of toll agencies released the information to the likes of private investigators and the media. The same holds for all security measures except those that are tabooed.

    The term balance is chosen because one can tilt excessively in either direction. Although most consideration is currently given to lack of adequate oversight, supervision, or accountability (from here on, to save space, I use the term oversight to refer to all such second-order processes: those that examine, review, and correct first-order processes), the opposite can also take place. For instance, FBI and CIA agents may again become reluctant to act if they believe that the acts they were authorized—indeed ordered—to carry out in the past can be retroactively defined as illegal and they can be jailed for having performed their jobs, or at least be forced to assume large personal debts to pay for legal representation. (True, as the famous Eichmann case illustrated, from a moral standpoint there are some acts that “everyone” should know are beyond the pale regardless of what their orders are, and should hence refuse orders to carry them out. However, one cannot run a security system based on the notion that people will rebel routinely. Instead, one should seek to ensure that as a rule those involved will be able to assume that orders are legitimate, in part because they are subject to proper oversight.)

    Oversight is already in place in several forms and modes, and it is not without the desired effect. However, a progressive approach recognizes that in the current circumstances it is essential to make oversight much stronger—in order to allow enhanced security.

    Page  176


    Determining whether a specific public policy measure is legitimate entails more than establishing whether it significantly enhances public safety and minimally intrudes on individual rights. It also requires assessing whether those granted new powers are sufficiently accountable to the various overseers—ultimately to the citizenry. Some powers are inappropriate no matter what oversight is provided. However, others are appropriate given sufficient accountability. If accountability is deficient, the remedy is to adjust accountability, not to deny the measure altogether.

    Whether the specific powers given to the government sustain or undermine the balance between rights and safety depends on how strong each layer of accountability is, whether higher layers enforce lower ones, and whether there are enough layers of accountability. I suggest that we should ignore both public authorities’ claims that no strengthening of accountability is needed and the shrillest civil libertarian outcries that no one is to be trusted. Instead, we should promote reforms that will enhance accountability rather than deny public authorities the tools they need to do their work. This does not necessarily mean granting them all the powers they request, but in a world where new technologies have made the government’s duties more difficult and in which the threat to public safety has vastly increased, we should focus more on accountability before denying powers to law enforcement.


    This chapter draws significantly on four sources: the author’s videotaped remarks at the Media, Technology and Society conference at the University of Michigan, March 2006; and Etzioni 1999, 2002, and 2008.

    1. [www.ctiawireless.com].

    2. [www.nielsen-netratings.com].

    3. 18 U.S.C. §§ 3122–23 (2000); United States v. Giordano, 416 U.S. 505, 549 n. 1 (1974); 18 U.S.C. § 2518 (2000); Smith v. Maryland, 442 U.S. 735 (1979); Swire 2001.

    4. Schultz 2001, 1221–23; Berg 2000; Dempsey 1997; Dhillon and Smith 2001; Freiwald 1996; Taylor 2001; Department of Justice, Field Guide on the New Authorities (Redacted) Enacted in the 2001 Anti-Terrorism Legislation, § 216A.

    5. ECPA, Pub. L. 99-508, § 106(d)(3), 100 Stat. 1848, 1857 (1986) (codified as amended at 18 U.S.C. § 2518(11) (2000)); Intelligence Authorization Act for Fiscal Year 1999, Pub. L. 105-272, § 604, 112 Stat. 2396, 2413 (1998) (codified as amended at 18 U.S.C. § 2518(11)(b) (2000)); United States v. Petti, 973 F.2d 1441, 1444–45 (9th Cir. 1992); see also Faller 1999; USA PATRIOT Act (2001), Pub. L. 107-56, 115 Stat. 272 (codified in scattered sections of U.S.C.); Foreign Intelligence Surveillance Act of 1978, Pub. L. 95-511, 92 Stat. 1783 (codified as amended at 18 U.S.C. §§ 2511, 2518–19 (2000), 47 U.S.C. § 605 (2000), 50 U.S.C. §§ 1801–11 (2000)).

    6. Olmstead v. United States, 277 U.S. 438, 466 (1928).

    7. Katz v. United States, 389 U.S. 347, 351 (1967); State v. Reeves, 427 So. 2d 403, 425 (La. 1982); Amsterdam 1974, 384–85; Laba 1996, 1470–75; Sundby 1994; Julie 2000, 131–33.

    8. The “Carnivore” Controversy: Electronic Surveillance and Privacy in the Digital Age: Hearing Before the Senate Comm. on the Judiciary, 106th Cong. (statement of Donald M. Kerr, Assistant Director, Laboratory Division, FBI).

    9. Affidavit of Randall S. Murch at 3–4, United States v. Scarfo, 180 F. Supp. 2d 572 (D.N.J. 2001) (No. 00-404); “Judge Orders Government to Explain” 2001, 3; United States v. Scarfo, 180 F. Supp. 2d 572, 577 (D.N.J. 2001).

    10. Ted Bridis, “Congressional Panel Debates Carnivore as FBI Moves to Mollify Privacy Worries,” Wall Street Journal, July 25, 2000, A24; Carnivore’s Challenge to Privacy and Security Online: Hearing Before the Subcomm. on the Constitution of the House Comm. on the Judiciary, 107th Cong. (2001) (statement of Alan Davidson, Staff Counsel, Center for Democracy and Technology); ACLU, “Urge Congress to Stop the FBI’s Use of Privacy-Invading Software.”

    11. Ron Wyden, “Rights That Travel,” Washington Post, December 10, 2007, A19.

    12. Wyden, “Rights That Travel,” A19.

    13. Russ Feingold, October 25, 2001, floor statement.

    14. “The Enemy within; Liberty and Security,” The Economist (London), October 9, 2004, 1.

    15. Department of Justice, Field Guide on the New Authorities (Redacted) Enacted in the 2001 Anti-Terrorism Legislation, § 216.


    Amsterdam, Anthony G. 1974. “Perspectives on the Fourth Amendment.” Minnesota Law Review 58:349–477.

    Berg, Terrence. 2000. “www.wildwest.gov: The Impact of the Internet on State Power to Enforce the Law.” BYU Law Review 2000:1305–62.

    Dempsey, James X. 1997. “Communications Privacy in the Digital Age: Revitalizing the Federal Wiretap Laws to Enhance Privacy.” Albany Law Journal of Science and Technology 8:65–120.

    Denning, Dorothy E., and William E. Baugh Jr. 1997. “Encryption and Evolving Technologies: Tools of Organized Crime and Terrorism.” U.S. Working Group on Organized Crime, National Strategy Information Center.

    Dhillon, Joginder S., and Robert I. Smith. 2001. “Defensive Information Operations and Domestic Law: Limitations on Government Investigative Techniques.” Air Force Law Review 50:135–74.

    Etzioni, Amitai. 1999. The Limits of Privacy. New York: Basic Books.

    Etzioni, Amitai. 2002. “Implications of Select New Technologies for Individual Rights and Public Safety.” Harvard Journal of Law and Technology 15:258–90.

    Etzioni, Amitai. 2008. “Toward a Progressive Approach to Homeland Protection.” Democracy and Security 4 (2): 170–89.

    Faller, Bryan R. 1999. “The 1998 Amendment to the Roving Wiretap Statute: Congress ‘Could Have’ Done Better.” Ohio State Law Journal 60:2093–2121.

    Page  178

    Freiwald, Susan. 1996. “Uncertain Privacy: Communication Attributes after the Digital Telephony Act.” Southern California Law Review 69:949–1020.

    “Judge Orders Government to Explain How ‘Key Logger System’ Works.” 2001. Andrews Computer and Online Industry Litigation Reporter, August 14.

    Julie, Richard S. 2000. “High-Tech Surveillance Tools and the Fourth Amendment: Reasonable Expectations of Privacy in the Technological Age.” American Criminal Law Review 37:127–43.

    Laba, Jonathan Todd. 1996. “If You Can’t Stand the Heat, Get Out of the Drug Business: Thermal Imagers, Emerging Technologies, and the Fourth Amendment.” California Law Review 84:1437–86.

    Russell, Deborah, and G. T. Gangemi Sr. 1995. “Encryption.” In Building in Big Brother: The Cryptographic Policy Debate, ed. Lance Hoffman, 10–14. New York: Springer-Verlag.

    Schultz, Christian D. H. 2001. “Unrestricted Federal Agent: ‘Carnivore’ and the Need to Revise the Pen Register Statute.” Notre Dame Law Review 76:1215–59.

    Sundby, Scott E. 1994. “‘Everyman’s Fourth Amendment: Privacy or Mutual Trust between Government and Citizen?” Columbia Law Review 94:1751–1812.

    Swire, Peter P. 2001. “Administration Wiretap Proposal Hits the Right Issues but Goes Too Far.” October 3. Retrieved April 6, 2009, from [http://www.brookings.edu/papers/2001/1003terrorism_swire.aspx].

    Taylor, Paul. 2001. “Issues Raised by the Application of the Pen Register Statutes to Authorize Government Collection of Information on Packet-Switched Networks.” Virginia Journal of Law and Technology 6:4.

    Theoharis, Athan G., ed. 1999. The FBI: A Comprehensive Reference Guide. Phoenix, AZ: Oryx Press.