Online Privacy

Charles R. Nesson

Last Updated: June 13, 2001

Table of Contents

Introduction
Case Studies
Discussion Topics
Privacy Toolbox
Resources


Introduction

Along with its many benefits, the march of technology makes an encompassing surveillance network seem almost inevitable. We owe much of the privacy we have enjoyed in the past to a combination of immature technology and insufficient manpower to monitor us. But these protective inefficiencies are giving way to efficient technologies of data processing and digital surveillance that threaten to eliminate our privacy. Already we are tracked by our credit-card transactions, our passes through the fast-lanes at toll booths, our cell phone calls.

Each year brings more sensitive and widespread sensing devices, including cameras, microphones, and, potentially, biological sensors, all of which are being connected through increasingly efficient networks to ever more powerful data processing and storage. Cameras are proliferating in toll plazas, on public streets, and in public parks. We welcome them as crime-fighters, even as they eliminate our ability to move through the world untracked. Face and voice recognition software may soon permit image data from surveillance cameras to be cross-referenced against stored profiles of each person observed. To get a hint of the future, enter your street address at globexplorer.com. You will see a satellite picture nearly good enough to show a car parked in your driveway, or in mine. Better resolution is coming soon. We are moving toward a transparent society in which our actions and transactions are followed, our lives tracked and documented, by folks we neither know nor trust; each of us a star in our own Truman Show.

Two privacy writers, Simson Garfinkel and David Brin, each in their own way, suggest the breadth and immediacy of technology's threats to privacy. Garfinkel, the author of "Database Nation,"(1) describes a thwarted effort in the late sixties to establish a massive central database of citizen information, a national data center to be administered and controlled by the United States government. Political opposition, based on fears of Big Brother, killed this proposal. Garfinkel, who fears Big Brother as much as anyone, surprisingly regrets the defeat of the proposal. Why? Because the alternative that emerged in its place seems to him much more threatening. Instead of a single known giant database in government hands, which might have been subject to some privacy controls, we have instead many databases in corporate hands which are not subject to privacy controls, and which are difficult even to inventory.

David Brin, another brilliant author, suggests the scale of the threatened loss of privacy in a different way, through the question he poses in his book "The Transparent Society."(2) He starts from the assumption that ubiquitous surveillance is coming and asks us to imagine two cities of the future. They look very much the same. Each is clean, orderly, utterly without crime, with surveillance cameras on every building and street corner. But they are very different underneath. In one city, all of the surveillance cameras connect to police stations, where they are monitored by government authorities. In the other, the cameras connect to a net that allows anyone to watch. In which city, Brin asks, would you rather live? He argues that we will actually be better off if all of us can watch each other than if we entrust the monitoring function to an all-powerful agency.

But must we give in to the idea of ubiquitous surveillance? It is true that there are extraordinarily strong forces pushing in that direction. Business and government, the two strongest forces shaping technological development of the net, seem aligned in their hunger for large databases containing detailed information about us: business wants such information to aid in marketing, government to assist in surveillance and control. The question is, are those of us who would resist the evaporation of our privacy capable of doing so, or must we accept its loss as inevitable?

Scott McNealy, CEO of Sun Microsystems, asserts that we already have "zero privacy." His advice: "Get over it." Recent surveys, however, suggest that Americans are not prepared simply to "get over it." Concern about the loss of privacy appears widespread and growing. Surveys by the Pew Foundation(3) and Harris Polls(4) show that Americans want a presumption of privacy in their daily lives. Eighty-six percent support "opt-in" requirements for Internet companies. Fifty-four percent feel that the tracking of users by web sites is harmful. Sixty-one percent of online users are concerned that their e-mail will be read by someone else without their knowledge or consent. Seventy-eight percent of online shoppers are concerned that the personal information they provide when buying something online will result in their being sent unwanted information. Eighty-nine percent would be uncomfortable with a web site using their browsing and shopping practices to create a profile linked to their names. Ninety-two percent would feel the same about a web site selling their information to others.

Yet widespread and growing concern for privacy does not easily translate to action.

Self-help

One might expect that those of us who are concerned about our privacy would look to ourselves and take advantage of whatever means are available to protect it. A few moments of reflection about our daily routines yields a number of privacy-enhancing adjustments that we could make. When using the Internet, for example, we could employ different identities. We could disable cookies, use multiple e-mail addresses, and fake the personal data sought by web sites. We could stay off the Internet altogether. When in public, we could take steps to avoid being "captured" by surveillance cameras: refrain from using ATMs, avoid those businesses and public places where cameras are installed, or just stay home. We could take advantage of direct-marketing opt-out lists by sending a letter or making a phone call. We could avoid credit cards in favor of cash, telephone from pay phones, and avoid automatic toll lanes. We could encrypt our electronic messages and our files, and route our communications through anonymizers. With enough planning and effort, we might avoid the surveillance net.
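To make the flavor of such self-help concrete, here is a minimal sketch, in modern Python, of one of the measures listed above: browsing with all cookies refused. It uses only the standard library; the URL is a stand-in for any site one might visit.

    import urllib.request
    from http.cookiejar import CookieJar, DefaultCookiePolicy

    # An empty allowed-domains list means cookies are accepted from no site.
    jar = CookieJar(policy=DefaultCookiePolicy(allowed_domains=[]))
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

    response = opener.open("https://example.com/")
    print(response.getcode())   # the page loads normally...
    print(len(jar))             # ...but zero cookies were stored

The point is not that such measures are hard to implement; it is that, as the next paragraphs suggest, almost no one bothers.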

Yet despite widespread and growing concern for privacy, we seem unable or unwilling to take advantage of privacy-enhancing strategies that are open to us. Like most visitors to our national parks who idealize the beauty and isolation of the great outdoors without ever straying from the well-trodden trails and asphalt roads, most of us express concern for our loss of privacy but do little to protect it.

That there are costs to protecting privacy does not, of course, in and of itself, preclude the use of privacy-enhancing techniques. We do, after all, incur costs all the time for things we desire. We buy and install blinds for our bedroom windows. We build fences in our yards. The problem is that when it comes to privacy from electronic intrusion, the costs of protection are often higher than the perceived benefits.

Professor Michael Froomkin explains this in terms of what he calls "privacy myopia."(5) Individuals tend to undervalue the bits of information about themselves that allow aggregators to build databases and profiles. The problem, Froomkin says, is that no single bit of information about ourselves seems very valuable, so it hardly seems worth the trouble of protecting it. Yet to the aggregator, the person on the other end assembling the profile that will be sold and used as a marketing asset, that bit of information has value. From the individual's point of view, it is hard to see what the aggregator sees, and hard to know the uses to which the information will be put. This leaves the individual willing to give the information away or sell it cheap. The effect of this privacy myopia is that in any given transaction, the cost of withholding information in the interest of privacy too often appears higher than both the value of the information withheld and the value of the privacy gained by withholding it. Such seemingly rational cost avoidance is even more apparent when deciding whether or not to purchase privacy-enhancing products. As the technology journalist John Markoff, understandably disillusioned, put it: "Privacy never seems to sell. Those who are interested in privacy don't want to pay for it."

Judicial Protection

Courts have traditionally protected privacy primarily as an adjunct to real property, enforcing the law of trespass to give us privacy behind fences and closed doors, protecting privacy of place but not privacy of presence. The Supreme Court made a great theoretical advance in privacy law when it announced, in Katz v. United States,(6) that the Constitution protects people, not places. The law will protect our reasonable expectations of privacy, in addition to protecting the physical bounds of our real property.

Grounding privacy in reasonable expectation might seem to offer great promise for privacy protection in the face of advancing technology, but such is not always the case. Expectation can be an insecure footing for privacy, like sand slipping out from underfoot as one climbs a dune. A case decided by the Supreme Court only days ago provides both hope and caution. In Kyllo v. United States,(7) authorities used a thermal imager to look through the walls of a house. Thermal imagers detect infrared radiation, the heat that radiates through and from a building's walls. During the oral argument, Justice Scalia asked the petitioner a cogent question: "Why don't your reasonable expectations of privacy include technology?" Inasmuch as there are thermal imagers in the world, why not expect people to guard against them, just as "you pull your curtains if you want privacy because you know people have binoculars?" Then, to the surprise of many, Justice Scalia wrote the majority opinion for a five-to-four Court holding the thermal-image scan to be an unlawful search. He emphasized the expectation of privacy in the home from sense-enhancing technologies, at least where the technology is not "in general public use." What this means for online privacy remains to be seen. Connections through the Internet may originate from one's home, but they are not confined to it. People may expect privacy in their online activities, but that expectation has been misconception rather than reality.

Where new technology allows capture of information that was previously not subject to capture, expectations of privacy are likely to be considered naïve and uninformed, grounded in immature technology rather than in law. Claims of privacy intrusion resulting from aerial surveillance of one's house, or thermal imaging of heat emanating from one's walls, or radio pickup of cordless telephone conversations have all failed because the expectation of privacy was not deemed to be objectively reasonable. Effectively, the courts seem to say that, as far as the Constitution is concerned, people's expectations of privacy must change to adjust to the capabilities of new technologies. Like a victim of sexual assault who can prevent being raped by consenting, an individual can guard against violation of privacy expectation by learning not to expect it.

Legislative Protection

The ultimate legislative goal of many privacy advocates is to have privacy seen as a human right of overarching importance, and not merely a customer relations problem. The basic elements of such privacy protection would include requirements of notice, informed consent, access to profiles, availability of process to correct errors, limitation of the secondary uses to which gathered information may be put, and a well-funded agency charged with promoting the values of privacy and enforcing the laws that protect it, something approaching an environmental protection agency for privacy.

So far, industry has successfully resisted legislative movement toward a human-rights approach to privacy. Industry objects to such privacy regulation as antagonistic to the First Amendment, almost un-American, and offers self-regulation as the alternative.(8) The Wall Street Journal reports that new privacy regulation could cost businesses as much as $36 billion.(9) Privacy advocates see the problems with self-regulation as fundamental: industry's objective in self-regulating is not primarily to protect privacy, but rather to allay customer fears sufficiently to do business, and to hold government regulation at bay. In the long run, industry self-regulation is as shifting a ground for privacy protection as the expectation of privacy to which courts respond.

Notes


(1) Simson Garfinkel, Database Nation: The Death of Privacy in the 21st Century, O'Reilly (2000).

(2) David Brin, The Transparent Society, Addison-Wesley (1998).

(3) Pew Internet & American Life Project, Trust and Privacy Online: Why Americans Want to Rewrite the Rules (2000).

(4) BusinessWeek/Harris Poll, A Growing Threat (March 2000).

(5) A. Michael Froomkin, The Death of Privacy?, 52 Stan. L. Rev. 1461, 1471 (2000).

(6) Katz v. United States, 389 U.S. 347 (1967).

(7) Kyllo v. United States, 2001 U.S. LEXIS 4487 (June 11, 2001).

(8) Online Privacy Alliance.

(9) Internet Privacy Rules Could Cost Businesses as Much as $36 Billion, The Wall Street Journal Tech Center (May 2001).



Case Study 1: The Perfect Search

Excerpted from: Michael Adler, "Cyberspace, General Searches, and Digital Contraband: The Fourth Amendment and the Net-Wide Search," Yale Law Journal (January 1996).

Just as possessors of digital contraband may use the Internet to transfer files back to their hard drives, law enforcement agencies might use the fact that such hard drives are connected to the Internet to seek out evidence of illegalities. The interest that a law enforcement officer might have in examining the contents of a hard drive is obvious; the trove of information there may yield important insights into crimes that the owner may have committed. At the same time, the privacy interest that an individual may have in the hard drive is also obvious; regardless of whether or not the officer finds evidence of a crime, he may well learn much about the owner's private life in the process of looking through the drive. A number of commentators have written recently about the need for a warrant to ensure limits to the range of the examination -- and, consequently, the potential for violation of privacy -- possible in a hard drive search.

All of these commentators have assumed that a human investigator will be examining the hard drive to evaluate its contents. Nevertheless, there are certain types of investigations -- particularly those focused on digital contraband -- in which no human is needed to determine the presence or absence of relevant evidence. A computer program can be designed, for instance, to search through a hard drive and report only the presence or absence of an exact copy of a certain piece of illegally modified software. Such an object-targeted search program would ignore any legitimate copy of the commercial software, as well as any copy that was cracked in even a trivially different way. The program would naturally also ignore everything else on the disk, no matter how blatantly illegal -- or sensationally intriguing -- a human investigator might find that information.

Since such a search program would require an exact copy of the target digital contraband when seeking matching files, the search would be of limited use in targeting particular individuals under suspicion. Say an officer suspects an individual of trafficking in child pornography. The officer could not simply turn the search program loose with orders to find any sexually explicit material involving underage participants. Instead, the officer would need a copy of a particular digital video clip that he believed the suspect possessed, and the search program would tell the officer nothing more than whether that particular clip was on the system. If the suspect had a slightly different video, or was clever enough to keep the video encrypted or stored on a removable cartridge that was not accessible from the Internet, the search would fail.
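In programming terms, the "exact copy" requirement Adler describes is naturally implemented as a fingerprint comparison: hash every file and report only whether any hash matches that of the known contraband. The Python sketch below is purely illustrative; the target hash is a placeholder, and a real system would raise all the access questions discussed here.

    import hashlib
    from pathlib import Path

    # Placeholder fingerprint of the one specific file being sought.
    TARGET_SHA256 = "00" * 32

    def drive_contains_target(root: str) -> bool:
        """Scan a directory tree; report only a yes/no answer."""
        for path in Path(root).rglob("*"):
            try:
                if not path.is_file():
                    continue
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
            except OSError:
                continue                  # unreadable file: skip it
            if digest == TARGET_SHA256:
                return True               # an exact copy exists
        return False                      # nothing else is examined or revealed

Because a cryptographic hash changes completely if even one bit of the file differs, the program exhibits exactly the property Adler notes: a trivially different copy of the video, or an encrypted one, would not be found.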

However, let us imagine for the moment that the government had acquired technical access sufficient to run such a search program on a large number of networked hard drives simultaneously. Let us further posit that the running of the search program would have a negligible impact on each of the individual systems, and that the search program would report nothing more than the presence or absence of a given piece of digital contraband. Under this scenario, a law enforcement officer who through ordinary means discovered one copy of a piece of digital contraband -- a child porn video or a copy of WordPerfect cracked by "Captain Blood" -- might infer that since one computer owner has this file, others may as well. The officer might then run a Net-wide search for that contraband. He certainly would not capture every single person who possessed it, but he might nevertheless identify dozens, hundreds, or even thousands of individuals who did have a copy on their computers and for whom he would then have probable cause to request a search warrant.

The search just described presents a novel set of characteristics: As part of a dragnet search, individuals' hard drives are searched without their permission and without any particularized cause to believe them guilty, and the search scans through a vast amount of very personal information located within people's offices and homes. At the same time, however, the search has a minimal impact on property, produces no false positives, need not be noticeable, and reveals nothing to officials beyond the identity of some individuals who possess this particular piece of digital contraband.



Case Study 2: SpyWare - How Private Is the Web?

Recent revenue pressures on both traditional e-commerce companies and peer-to-peer software providers have created a new dimension to web surfing. More and more often, individuals who download software receive more than they expected: unbeknownst to them, spyware or adware has been bundled with the product. This "spyware" allows companies to trace the user's online movements and report the information back, all without the user's knowledge or consent.

Users rarely realize spyware is bundled with what they download, because the programs are loaded automatically during installation. Although some companies do alert users to what they are installing, either in the licensing agreement or in the privacy policy, few users actually read the notice, and even those who do may find that it does not fully explain what the spyware will do.

Spyware works by embedding a unique identifier on your computer. The identifier remains even after the application it came with is removed, and the spyware runs constantly in the background. It enables companies to track all of the user's online movements and may report that information back to a central server.
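The mechanism is simple. The following is a deliberately skeletal Python illustration of the idea just described, not any actual product's code; the identifier file and the reporting URL are invented.

    import json
    import uuid
    import urllib.request
    from pathlib import Path

    ID_FILE = Path.home() / ".tracker_id"   # persists after the host app is removed

    def get_identifier() -> str:
        # Reuse the identifier if one exists; otherwise mint and store one.
        if ID_FILE.exists():
            return ID_FILE.read_text()
        new_id = str(uuid.uuid4())
        ID_FILE.write_text(new_id)
        return new_id

    def report(visited_url: str) -> None:
        # Each page visit is tied to the same identifier and sent upstream.
        payload = json.dumps({"id": get_identifier(), "url": visited_url}).encode()
        req = urllib.request.Request("https://tracker.example/collect", data=payload,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

Everything of consequence happens in the lines that persist the identifier: once the same ID accompanies every report, a central server can assemble a complete browsing history.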

Spyware has been identified with file-sharing clients like the Gnutella-based BearShare and with Audiogalaxy Satellite, but also with RealNetworks (attached to RealJukebox), many browser extensions, and CometCursor. It has even been found attached to cyber-nanny programs like SurfMonkey and on Learning Company CD-ROMs.

These programs have understandably created controversy, but some justify them as the price of freeware. In a recent CNET article, Vinnie Falco, the CTO of FreePeers, candidly recognized that "One of the issues around free software is the need to make money somehow. [Spyware] is a great compromise between protecting user privacy and the ability to support free software." Some companies have been very successful at this: one ad company, Radiate, has succeeded in placing spyware on 30 million PCs. That is a huge market base for any retailer wanting an easy marketing solution, but can it be justified?

As one author noted, the basic desire to increase revenues through targeted advertising can shade into something more problematic: "Theoretically speaking, advertising-supported software's capabilities are not just limited to collecting demographic data or reading your browser history. They can explore your computer, randomly scan files and ferret out sensitive information like your credit card number." (Subha Vivek, "Are You Being Watched," Frost.com, 5/18/01) Even if companies don't go that far, the software still consumes system resources and can degrade your computer's performance.

Recent focus on the issue has led some companies to change their practices. Although Mattel has claimed that it does not gather personal information through the software, it has added a program to its website that will uninstall the unwanted programs from people's hard disks. Similarly, SurfMonkey has indicated that it too will stop sending personal information back to its servers. Given the obvious success and usefulness of spyware, however, it is not likely to disappear.

If you're interested in finding out if your computer is running spyware, use the resources below:



Case Study 3: Convenience Versus Control


Excerpted from The Atlantic Monthly, March 2001, "The Reinvention of Privacy" (regarding U.S. Patent 5,629,678, application filed on January 10, 1995, for a "personal tracking and recovery system").


The patent is summed up in an abstract that begins, "Apparatus for tracking and recovering humans utilizes an implantable transceiver incorporating a power supply and actuation system allowing the unit to remain implanted and functional for years without maintenance. The implanted transmitter may be remotely actuated, or actuated by the implantee. Power for the remote-activated receiver is generated electromechanically through the movement of body muscle. The device is small enough to be implanted in a child."

Until recently such an idea might have seemed better suited to science fiction or political allegory than to real life. But in December of 1999 the patent was acquired by a Florida-based company named Applied Digital Solutions, and it is now the basis of an identity-verification and remote-monitoring system that ADS calls "Digital Angel."

Imagine that you are the proud parent of a newborn son or daughter. Would you consider having a location chip implanted (assume it is tiny, painless, and completely safe) in your child? Before you say "no" too fast, consider: would you put LoJack in your car? And if a location device makes sense for your car, then certainly for your child, no? What if your child is kidnapped, or gets lost in the mall?

What do you see as the implications for society?




Discussion Topics

1. Do you care about your privacy online? What do we have to worry about? Is the threat worth the trouble of worrying about? Of doing something about? What should we do?

2. How do you explain the apparent anomaly between the high measure of expressed public concern for online privacy and the low willingness of people to use self-help tools to protect it?

3. What, if any, protection should we expect from courts for our privacy online? Are courts capable of providing it?

4. Would you favor the establishment of a privacy commission?



Privacy Toolbox

Cryptography

The foundation of online privacy tools is the science of cryptography. TechWeb Encyclopedia defines cryptography as "the conversion of data into a secret code for transmission over a public network. The original text or 'plaintext' is converted into a coded equivalent called 'ciphertext' via an encryption element. The ciphertext is decoded (decrypted) at the receiving end and turned back into 'plaintext.'" Privacy tools rely on encryption for securing data and data transactions, sending email, and browsing.
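The cycle in that definition can be seen in a few lines of Python using the third-party "cryptography" package, just one of many encryption libraries one might choose:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # secret shared by sender and receiver
    cipher = Fernet(key)

    plaintext = b"Meet me at noon."
    ciphertext = cipher.encrypt(plaintext)   # gibberish to anyone without the key
    recovered = cipher.decrypt(ciphertext)   # "decoded at the receiving end"

    assert recovered == plaintext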

Adapted from The Atlantic Monthly, March 2001, "Open Secrets"

The computer has its origins in the World War II effort to crack enemy codes. The scrambling of messages so that only the intended recipient can read them is called cryptography. After the war, governments -- especially the United States -- took control of cryptography and kept it closely secret, claiming to protect citizens from terrorists, hackers, and other criminals. In the 1990s, however, "technological crusaders" troubled by government control of private information managed to put high-powered cryptography in the hands of the general public. Since then, cryptography has become part of our everyday lives, in ATMs, electronic banking, cell phones, and web browsers. Its impact is tremendous: Lawrence Lessig and others believe that without it, people will not be able to protect their private, personal information online.

The invasion and protection of privacy is now a pressing issue. Cryptography, as Lessig observes in Code (p. 36), will create new crimes even as it stops others: "[Cryptography] prevents eavesdroppers from listening to our conversations and reading our e-mails, and from reading information stored on hard disks connected to the Internet." Privacy-protection companies such as Zero-Knowledge Systems and Lumeria base their business models on cryptography. To use these tools effectively, however, the customer must be able to trust the cryptosystem; only then can users have confidence that their private data will be well protected.


Cookies

TechEncyclopedia defines a cookie as "Data created by a Web server that is stored on a user's computer. It provides a way for the Web site to keep track of a user's patterns and preferences and, with the cooperation of the Web browser, to store them on the user's own hard disk." The purpose of a cookie is twofold, with benefits to both consumers and sellers: it enables a consumer to have her information remembered by a site, and it allows sellers and advertisers to track consumers' habits and target advertisements to users with particular buying and surfing patterns. For a review of data collection technology, see Rita Lin, Information Collection, in E-Commerce: An Introduction (2001) and Internet Cookies, Information Bulletin I-034, U.S. Department of Energy Computer Incident Advisory Center (1998).
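The exchange is easy to see with Python's standard http.cookies module; the cookie name and value here are invented:

    from http.cookies import SimpleCookie

    # Server side: mint a cookie identifying this visitor.
    server_cookie = SimpleCookie()
    server_cookie["visitor_id"] = "abc123"
    server_cookie["visitor_id"]["path"] = "/"
    print(server_cookie.output())
    # Set-Cookie: visitor_id=abc123; Path=/

    # Browser side, on a later visit: the stored value is sent back,
    # so the site "remembers" the visitor.
    returned = SimpleCookie("visitor_id=abc123")
    print(returned["visitor_id"].value)   # abc123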

Try out Brian Wells' Cookie Demonstration

For more information on cookies, see Electronic Privacy Information Center, The Cookies Page.
For a great demo of cookies, see Privacy.net, Bake your own Internet Cookie.
To find out who's tracking you with web bugs (invisible, pixel-sized tracking images), use the Privacy Foundation's Bugnosis.
For a demo of how advertising companies like DoubleClick operate cookies with banners, see Privacy.net, How Companies Can Track Your Movements on the Internet.


Anonymizer

Anonymizer is an anonymous browsing service. It blocks cookies, Java, and JavaScript; encrypts the cookies, email, and Web addresses (URLs) in the user's browser history list (though not the page titles); and allows the user to chat and browse the Internet while concealing the user's identity from their ISP. The tool rewrites the links on each Web page the user visits, so no browser reconfiguration is required. With its "Safe Cookies" encryption feature, Anonymizer enables cookies from first-party sites that require them. It allows third-party cookies for sites where the user wants to use them, then disables each such cookie and removes it from the user's browser at the end of the surfing session.
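The link-rewriting trick is the heart of such a service: every hyperlink in the fetched page is prefixed with the proxy's own address, so the next click also flows through the proxy. A bare-bones Python sketch of the idea, with a hypothetical proxy address:

    from html.parser import HTMLParser

    PROXY = "https://anonymizer.example/browse?url="

    class LinkRewriter(HTMLParser):
        """Echo HTML, prefixing every <a href> with the proxy address."""
        def __init__(self):
            super().__init__()
            self.out = []

        def handle_starttag(self, tag, attrs):
            parts = []
            for name, value in attrs:
                if tag == "a" and name == "href" and value:
                    value = PROXY + value   # next click goes via the proxy too
                parts.append(f'{name}="{value}"')
            self.out.append(f"<{tag} {' '.join(parts)}>" if parts else f"<{tag}>")

        def handle_endtag(self, tag):
            self.out.append(f"</{tag}>")

        def handle_data(self, data):
            self.out.append(data)

    rw = LinkRewriter()
    rw.feed('<p>See <a href="https://example.com/page">this page</a>.</p>')
    print("".join(rw.out))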


Lumeria

Lumeria is an infomediary that provides SuperProfile software with which users create, store, and distribute identity data to marketers of the user's choice. An infomediary is "an information provider that gathers content from several sources and functions as a data aggregator for a target audience." Through its SuperProxy software platform, Lumeria's technology provides anonymous browsing and access to the software from any computing device, with no download required. The SuperProfile also includes an Ad Network feature that gives the user the option of replacing a website's ads with ads from the local geographic area, or with ads that match the user's profile preferences. The user is paid for accepting these ads.

Microsoft Internet Explorer 6

Microsoft Internet Explorer 6 (IE6) includes a cookie-filtering device embedded in the browser. It allows the user to choose among six privacy preference settings (from "Block All Cookies" to "Accept All Cookies"). The user may also opt to import privacy settings from other sites. IE6 previews a site's privacy policy with regard to cookies, provides the user with a one-line description of the policy (the "compact policy"), and informs the user whether the site collects personally identifiable information and whether its compact policy states that this information will be used for secondary purposes. The browser also scans the policy for an opt-in or opt-out provision [see Demonstrations]. The browser reads only policies written in the P3P (Platform for Privacy Preferences Project) format developed by the World Wide Web Consortium. IE6 has not yet been released, but a beta version may be downloaded for preview.
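A compact policy is just a short string of standardized tokens sent in an HTTP header, which is what makes machine checks of the kind IE6 performs feasible. A simplified Python sketch follows; the header value is made up, and token meanings follow the W3C P3P specification:

    # e.g. P3P header: CP="NOI DSP COR CURa OUR STP"
    def compact_policy_tokens(header_value: str) -> set:
        return set(header_value.split('"')[1].split())

    tokens = compact_policy_tokens('CP="NOI DSP COR CURa OUR STP"')

    # NOI: the site says it collects no identified data.
    # Recipient tokens other than OUR (e.g. DEL, SAM, OTR, UNR, PUB)
    # signal that data may flow to parties beyond the site itself.
    print("no identified data collected:", "NOI" in tokens)
    print("shared beyond the site:", bool(tokens & {"DEL", "SAM", "OTR", "UNR", "PUB"}))

The a/i/o suffixes on tokens such as CURa encode "always," "opt-in," and "opt-out," which is how a browser can scan a policy for the opt-in or opt-out provisions mentioned above.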


Zero-Knowledge

Zero-Knowledge Freedom 2.0 is an anonymizing tool that provides firewall protection, filters cookies, automatically fills out online forms at the user's discretion, manages and blocks online ads, monitors outgoing text and alerts the user before sending personally identifying or sensitive information, encrypts email, and allows the user to browse and chat on the Internet anonymously. Freedom 2.0 lets the user choose among up to five user-created actual or pseudonymous digital identities (called "nyms") for filling out forms, browsing, chatting, and sending email. A note on firewalls: a firewall is "a method for keeping a network secure." The Privacy Foundation glossary describes it as "a software or hardware device to control access to computers on a local area network (LAN) from outside computers on the Internet."


Opt-In/Opt-Out

Opt-in and opt-out are the two choices a user may be offered when deciding whether to receive, send, or accept information, communications, or a service. Some web sites include an opt-in or opt-out provision in their privacy policies. To opt in is "(t)o purposely accept some situation or condition ahead of time." To opt out is "(t)o cancel some situation or condition." Under an opt-in provision, for example, a web site may share a user's information with other businesses only if the user requests or gives the web site permission to do so. Under an opt-out provision in the same situation, the web site may share the user's information with other businesses unless and until the user asks it to stop. These provisions may be exercised by sending an email to the web site, or by clicking or checking a box on one of its pages.
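The practical difference between the two regimes reduces to a default. A minimal Python illustration, with an invented preference record and field names:

    def may_share(prefs: dict, regime: str) -> bool:
        if regime == "opt-in":
            # Sharing is forbidden unless the user affirmatively said yes.
            return prefs.get("consented", False)
        if regime == "opt-out":
            # Sharing is allowed unless the user affirmatively said no.
            return not prefs.get("declined", False)
        raise ValueError(regime)

    new_user = {}   # a user who never touched any checkbox
    print(may_share(new_user, "opt-in"))    # False: silence means no
    print(may_share(new_user, "opt-out"))   # True: silence means yes

Since most users never touch the checkbox, whichever answer is the default tends to be the answer in practice, which is why the choice between the two regimes is so hotly contested.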



Additional Resources

FindLaw Online Privacy Law

Washington Post, Scott McNealy, “The Case Against Absolute Privacy,” May 29, 2001. Discusses consumer benefits of sharing information online. Advocates self-regulatory approach of businesses to online privacy.

“The Emergence of Website Privacy Norms,” Steven A. Hetcher, 7 Mich. Telecomm. Tech. L. Rev. 97 (2000/2001). Analyzes history and current trends in website/consumer relationships in the exchange and use of personal data.

“The Privatization of Big Brother: Protecting Personal Sensitive Information From Commercial Interests in the 21st Century,” Mike Hatch, 27 Wm. Mitchell L. Rev. 1457 (2001). Argues for an opt-in policy in all legislation concerning the sharing of personal data with secondary parties.

“E-commerce Privacy and the Black Hole of Cyberspace,” Stephen R. Bergenson, 27 Wm. Mitchell L. Rev. 1527 (2001).

“The Privacy Paradox,” Eric Jorstad, 27 Wm. Mitchell L. Rev. 1503 (2001).

“The Criminalization of True Anonymity in Cyberspace,” George F. du Pont, 7 Mich. Telecomm. Tech. L. Rev. 97 (2000/2001).

“Internet Privacy: Does the Use of “Cookies” Give Rise to a Private Cause of Action for Invasion of Privacy in Minnesota?,” Gregg M. Fishbein and Susan E. Ellingstad, 27 Wm. Mitchell L. Rev. 1609 (2001).

“Window Peeping in the Workplace: A Look Into Employee Privacy in a Technological Era,” Donald H. Nichols, 27 Wm. Mitchell L. Rev. 1587 (2001).

“Fair Information Practices and the Architecture of Privacy (What Larry Doesn’t Get),” Marc Rotenberg, 2001 Stan. Tech. L. Rev. 1 (2001). Critiques Larry Lessig’s book Code and Other Laws of Cyberspace and its approach to privacy protection. Working paper version available.

International/European Union Perspective:

“Internet Regulation—Heavy Handed or Light Touch Approach? A View From a European Union Perspective,” Robert T. J. Bond, 27 Wm. Mitchell L. Rev. 1557 (2001).

Jessica Litman, Information Privacy/Information Property, 52 Stan. L. Rev. 1283 (2000).

Joel R. Reidenberg, Restoring Americans' Privacy in E-commerce, 14 Berkeley Tech. L.J. 771 (1999).

Lessig, Code, Chapter 11, "Privacy"

Zittrain, "What the Publisher Can Teach the Patient", Stanford Law Review, Vol. 52

Randy Barrett, "Stealth Chatter," ZDNet (5/10/01)

Communications Decency Act: 47 USCA § 230

Electronic Communications Privacy Act: 18 USCA § 2702

P3P and Privacy on the Web FAQ - a project aimed at creating a technological solution to privacy problems. Maybe we will never need to read the privacy policy again.

EPIC, "Pretty Poor Privacy: An Assessment of P3P and Internet Privacy" David Chaum, "Achieving Electronic Privacy," Science (August 1992): 96-101. Scientific American, "Computer Security and the Internet," October 1998 (hard copy only)

The following links offer some interesting perspectives and a few examples of experiments gone awry.

Privacy Foundation, "Work Place Surveillance is the Top Privacy Story for 2000" (12/28/00). Statistics on workplace monitoring and links to related articles.

eCompany, "You're Being Followed (This Is Not News)," December 1, 2000

Solveig Singleton, Privacy as Censorship: A Skeptical View of Proposals to Regulate Privacy in the Private Sector, Cato Policy Analysis No. 295, January 22, 1998.

U.S. News & World Report, "More Web Users Wage Guerilla War on Nosy Sites" (8/28/00)

U.S. News & World Report, "Internet Privacy" (10/2/00)

FTC, "Online Profiling: A Report to Congress," June, 2000

Industry Standard, "Toysmart Settles with FTC," July 21, 2000

CNET, "Thirty-nine States Object to Sale of Toysmart's Customer List," July 21, 2000

Jean Camp, Trust and Risk in Internet Commerce, Chapter 3.

Brian Sullivan, "Sen. Kerry: On-Line Taxes and Privacy Changes Coming," Computer World (5/18/01)

Keeping "private e-mail" private: A proposal to modify the Electronic Communications Privacy Act. Robert S. Steere, 33 Val. U.L.Rev. 231 (1998).

Carnivore Eats Your Privacy, Wired News, 7/11/00

Critics Blast FBI's First Release of Carnivore, CNET, 10/2/00

EPIC's Carnivore Archive, Electronic Privacy Information Center

NYTimes, "Judge Sets F.B.I. E-Mail Scanning Disclosure," August 3, 2000

Newsbytes.com, "Administration Bias Alleged in Carnivore Review Team," October 4, 2000

ZDNet, Mary Behr, "Privacy At What Cost?" (5/18/01). Can companies even afford to protect consumers in the way the government wants?

Privacy Foundation, "E-mail Wiretapping," 2/5/01

CNET, "Beware: E-signatures Can Easily Be Forged," July 14, 2000

Reuters, "Clinton Relaxes Crypto Export Rules," July 17, 2000

1999 FTC Report

Testimony of Joel R. Reidenberg, Professor of Law and Director of the Graduate Program, Fordham University School of Law, before the House Subcommittee on Commerce, Trade and Consumer Protection, Committee on Energy and Commerce: Hearing on the EU Data Protection Directive: Implications for the U.S. Privacy Debate (March 8, 2001)

Justin Boyan, "The Anonymizer: Protecting User Privacy on the Web," CMC Magazine (December 1997).

Toby Lester, "The Reinvention of Privacy," Atlantic Monthly (March 2001) (includes overview of Zero-Knowledge and Lumeria).

Massachusetts Institute of Technology, Ethics and Law on the Electronic Frontier, "Group Project: Protecting privacy through anonymity tools," (Spring 2001) (listing resources on privacy tools and online anonymity).

World Wide Web Consortium, Platform for Privacy Preferences (P3P) Project.

"As Congress Mulls New Web Privacy Laws, Microsoft Pushes System Tied to Its Browser," Glenn R. Simpson, Wall Street Journal, March 21, 2001.

"A Pseudonymous Communications for the Internet," Ian Avrum Goldberg, Dissertation for PhD in Computer Science at University of California at Berkeley, Fall 2000. Dr. Goldberg is Chief Scientist for Zero-Knowledge Systems.

"Industry Wants to Opt Out of Opt-In," Yair Galil, Internet Law Journal, April 16, 2001. Critiques opt-in provisions and provides several links to studies, cases and other articles.

"Network Advertising Initiative: Principles not Privacy," Electronic Privacy Information Center, July 2000. Critiques opt-out provisions.



Berkman Center for Internet & Society