
 

Excerpted from: Jerry Kang, Information Privacy In Cyberspace Transactions, 50 Stanford Law Review 1193, 1212-20 (April 1998)

 

* * * *

Now that we know what information privacy is, we should probe what purpose it serves. [FN61]

 

  Avoiding embarrassment.  In any given culture, disclosures of certain behaviors, actions, or fates will embarrass the individual‑‑even when the behavior, action, or fate is neither blameworthy nor stigmatized.  Take urination for example.  There is nothing wrong with urination; all humans do it.  The fact that someone urinates is not going to be used against her. However, a visual disclosure of that behavior‑‑for instance, being caught on videotape through a hidden camera‑‑would cause intense embarrassment for most Americans.  Another example is minor hemorrhoids.  Assume that this fact will not be used against the person in any way.  The individual will not pay more for health insurance, will not drop in social standing, and will not lose her job or friends.  Nevertheless, the broad disclosure of this fact would embarrass many, perhaps most, people.

 

  That these examples are culturally contingent makes them no less real. [FN62]  In other words, the fact that different cultures may react differently to such disclosures does not deny that, for each culture, there are some zones of behavior, actions, or fates the disclosure of which‑‑in and of itself‑‑will cause discomfort or embarrassment. [FN63]  One value of information privacy, then, is to avoid the simple pain of embarrassment.

 

  Constructing intimacy.  An individual's capacity to disclose personal information selectively also supports her ability to modulate intimacy. Charles Fried has argued this case most prominently. [FN64]  By virtue of information privacy, one can selectively regulate the outflow of personal information to others.  By reducing this flow to a trickle, one can construct "aloofness, removal, and reserve," [FN65] and maintain substantial social distance.  Conversely, one can release a more telling flow of personal information, [FN66] which invites and affirms intimacy. [FN67]

 

  According to Fried, information privacy is necessary to create social relationships that go beyond the basic respect due all human beings. [FN68] Something in addition to basic human respect must exist between two individuals to transform their relationship into one of trust, friendship, or love.  That additional something is intimacy, which is partly created by the release of secrets‑‑the selective disclosure of personal information. [FN69]  Without information privacy, we would be less able to disclose on a case‑by‑case basis the nonpublic facets of our personality.  Thus, we would lack the "moral capital" [FN70] needed to construct intimacy. [FN71]

 

  I concur with Jeffrey Reiman's critique of Fried that intimacy is more related to the sharing of experiences than the sharing of secrets. [FN72] This does not mean that information privacy has nothing to do with modulating intimate relationships.  I believe that intimacy, at least for adults in current American culture, involves the display of certain behaviors unseen in public areas, such as playfulness, childlikeness, and certain types of physical touching‑‑which take root and flower best in an information preserve, away from the harsh light of publicity. [FN73]  If we were under observation, we would not be able to display caring to other individuals as freely, spontaneously, or completely as we might otherwise. [FN74]  This, in turn, would hinder the construction of deep social relationships.

 

  Averting misuse.  Yet another value of privacy is that it protects against improper uses of personal information.  Personal information can be misused in two ways.  First, it can derail an otherwise fair process that distributes benefits and burdens.  Many social goods‑‑such as jobs, offices, remuneration, and respect‑‑as well as social bads‑‑such as unfriendliness, disrespect, and imprisonment‑‑are granted or denied on the basis of data about ourselves.  If these social goods and bads are allocated based on personal data of poor quality, unfairness may result: Garbage in, garbage out. [FN75]  Further, high quality information in one context may be low quality information in another because, as Kenneth Karst explains, "the evaluator and the recipient of his statement may not share the same standards for reducing a complex set of facts to evaluative inferences or even the same language." [FN76]  Worse, such decisions may be difficult to discover and correct, [FN77] especially when they are generated through automated processes.  Computers, with their air of objectivity and infallibility, resist dispute. [FN78]  One way to check against such information misuse is to give the individual greater control over the flow of personal information.  An individual with such control will take preventative measures, for instance, by keeping irrelevant personal data away from the decisionmaker. [FN79]

 

  Second, information can be misused by making us vulnerable to unlawful acts and ungenerous practices.  After all, personal information is what the spying business calls "intelligence," and such "intelligence" helps shift the balance of power [FN80] in favor of the party who wields it. [FN81]  To take a simple example, knowledge of our home phone number and address makes us more vulnerable to harassers [FN82] and stalkers. [FN83]  Personal information can also make us vulnerable, for instance, to identity theft. [FN84]  Besides outright illegal acts, another's control of our personal information can make us susceptible to a whole range of ungenerous practices.  It could subject us to influence that crosses the line between persuasion and undue influence.  Sophisticated advertisers, for example, do not merely track consumer demand; they manufacture it outright. [FN85]  Detailed knowledge of who we are and what we consume makes the job of preference fabrication that much easier. [FN86]  More disturbingly, personal information can be misused by making us vulnerable to prejudice or unwarranted disesteem.  An example is the information that one is gay, which could be evidenced by accessing certain Internet discussion groups or making certain cyberspace purchases. [FN87]  For those not generally "out," the inability to control this information creates tremendous social and psychological vulnerability.

 

  Individual vulnerability has social consequences.  It chills individuals from engaging in unpopular or out‑of‑the‑mainstream behavior.  While uniform obedience to criminal and tort laws may deserve praise, not criticism, excessive inhibition‑‑not only of illegal activity but also of legal, but unpopular, activity [FN88]‑‑can corrode private experimentation, deliberation, and reflection. [FN89]  The end result may be bland, unoriginal thinking [FN90] or excessive conformity to unwarranted social norms. [FN91]  Worse, the self‑repression of activity and communication could undermine the self‑critical capacities of a polity. [FN92]  This is why totalitarian regimes have maligned a desire for privacy as deviant, in part to sap an individual's ability to question the status quo and to experiment with alternate conceptions of the good life. [FN93]

 

2. Countervalues.

 

  It would be one‑sided to discuss only the values supporting information privacy when prominent countervalues‑‑values against individual control over personal information‑‑also exist.

 

  Commerce.  By requiring the individual's consent before personal data are processed, privacy applies friction to the flow of information.  This friction, the argument goes, hurts commerce; better information leads to better markets.  When this argument is made, two stories are often told‑‑one about junk mail, the other about consumer credit.  The junk mail story starts by explaining that junk mail is only "junk" because it was sent to the wrong person.  If the direct marketing industry had better intelligence about personal interests and preferences‑‑for example, by being able to examine an individual's history of consumption‑‑people would receive less "junk."  Because information privacy makes this more difficult, it increases the search costs of matching interested buyers with interested sellers.  In short, more privacy means more junk.  The consumer credit story starts by noting that a freer flow of personal information can decrease the costs of consumer credit by helping creditors avoid bad credit risks.  Additional personal information allows greater discrimination among individuals according to whatever characteristic is relevant to a particular transaction. [FN94]  This, in turn, decreases the cost of such transactions either generally, or, at the least, for those individuals who possess a favorable set of characteristics. [FN95]

 

  The commerce argument, as thus stated, presumes that privacy necessarily entails information blockage.  But this is not so.  If individuals will truly benefit by releasing their personal data, e.g., by getting less junk or cheaper credit, they will rationally choose to do so. [FN96]  Information privacy does not mandate informational quarantine; it merely requires that the individual exercise control within reasonable constraints over whether, and what type of, quarantine should exist.  Accordingly, these arguments do not demonstrate that the individual should be deprived of information privacy.  At most, they suggest that individuals should be open to information processing in exchange for commercial benefit and that society should make such exchanges feasible. [FN97]

 

  Truthfulness.  Information privacy allows one to have thoughts, beliefs, conditions, and behaviors without the knowledge of others, thereby making it easier to have public personae distinct from private ones.  This differentiation between public and private visages need not be used for good, such as self‑determination and deliberative politics.  Instead, the argument goes, it will be used to deceive and defraud.  Individuals will not only keep poor quality information away from decisionmakers; they will also conceal high quality, but legitimately detrimental, information.  The cover of privacy might encourage individuals to engage not only in activity unjustifiably stigmatized but also in activity justifiably stigmatized.  Worse, they may be hypocrites, publicly espousing norms they privately abandon. [FN98]  The parade‑of‑horribles conjures easily: the unrehabilitated child molester volunteering for day care; the domestically violent tyrant passing as winsome celebrity; the sexually promiscuous person, infected with herpes, claiming to be disease free; the reckless driver swearing falsely to be accident free.  Perhaps Richard Posner was right to recast invasions of privacy as self‑defense against deception. [FN99]

 

  It would be facile to deny that information privacy can cloak our darker sides and aid misrepresentation.  Equally facile, however, is the inference that information privacy is thus inexorably the handmaiden of deception.  Privacy is not valuable only to those with something discreditable to hide.  Individuals do not always seek to conceal or control personal information to exploit others in some acquisitive, tortious, or immoral way. [FN100]  Put in other terms, secrecy‑‑the intentional concealment of personal information‑‑does not always amount to lying. [FN101]  The hallowed example is the secret ballot. [FN102]

 

  Moreover, it is not inherently wrong for individuals to have differing private and public masks. [FN103]  Consider how differently we act, and rightly so, between work and home.  Only an unsophisticated psychology assumes one true, essential personality, with all other personae spurned as deceitful masks.  In fact, all our masks, all our roles, constitute integral facets of our personalities, none of which is necessarily privileged, true, or authentic. [FN104]  This is not to say that no core personality exists.  But this core personality is a weighted composite of the multiple personalities we experience and cultivate. [FN105]  The ability to maintain divergent public and private personae creates the elbowroom necessary to resist social and political homogeneity. [FN106]

 

  In sum, information privacy does not necessarily promote deception and fraud.  It can do so only if both the nature of the relationship between the individual and the information user, and the ethical or legal duties of disclosure inherent to that relationship, command an openness that information privacy prevents.  What is important is that in most cyberspace transactions, which I describe below, far more information is collected than any self‑defense "need to know" principle could justify.

* * *

 


_______________________________________________________________

Footnotes accompanying excerpts:

 

[FN61]. The analysis here of the relevant values supporting privacy is not complete.  I discuss the value of dignity separately in Part III.A.2 below. Other values on which I do not dwell are listed in Kim Lane Scheppele, Legal Secrets: Equality and Efficiency in the Common Law 181‑83 (1988) (arguing that privacy is also necessary for sanity and role maintenance).

 

[FN62]. See Irwin Altman, Privacy Regulations: Culturally Universal or Culturally Specific?, 33 J. Soc. Issues 66, 67 (1977).  For interesting discussions of how different cultures maintain privacy, see, for example, Altman, supra note 23, at 14 (discussing the Tuareg veil worn almost continually over the mouth), and Robert F. Murphy, Social Distance and the Veil, in Philosophical Dimensions of Privacy, supra note 22, at 34, 42‑44 (same).

 

[FN63]. See Altman, supra note 23, at 42 ("[I]t might be said that mechanisms for separating the self and non‑self‑‑that is, for regulating interpersonal boundaries to achieve a desired level of privacy‑‑are universal and present in all societies.").

 

[FN64]. See Fried, supra note 41.  See generally James Rachels, Why Privacy Is Important, in Philosophical Dimensions of Privacy, supra note 22, at 290 (offering similar arguments).

 

[FN65]. Murphy, supra note 62, at 34.

 

[FN66]. Take, for example, the choice to reveal selectively that one is gay or lesbian.  See Susan J. Becker, The Immorality of Publicly Outing Private People, 73 Or. L. Rev. 160, 206 (1994) ("Many gay people find terrifying the option of taking one giant leap to universally disclose this intimate detail of their lives, while the possibility of taking a series of small steps towards that goal is palatable.").

 

[FN67]. See Murphy, supra note 62, at 36 ("This imposition of distance on the parameters of the role set does more than make other roles possible, for it promotes the solidarity of the relationship itself.  In this sense, many role sets are effective secret societies.").

 

[FN68]. See Fried, supra note 41, at 477.

 

[FN69]. See id. at 484‑85.

 

[FN70]. See id. at 484.

 

[FN71]. Information privacy may have a more complicated relationship with intimacy depending upon where two people are in their relationship.  In the beginning of a relationship, a lack of information privacy might actually promote the creation of intimacy.  To take a common example, a person is often more inclined to go on a first date with someone if she knows something about him.  I think Fried's response would be that, once that relationship starts, information privacy is instrumental in furthering intimacy.

 

[FN72]. See Jeffrey H. Reiman, Privacy, Intimacy, and Personhood, in Philosophical Dimensions of Privacy, supra note 22, at 300, 305.

 

[FN73]. Cf. Robert S. Gerstein, Intimacy and Privacy, in Philosophical Dimensions of Privacy, supra note 22, at 265, 268 (noting that observation kills the spontaneity necessary for intimacy); see also text accompanying notes 274‑294 infra  (discussing the relationship between surveillance and dignity).

 

[FN74]. See Richard A. Wasserstrom, Privacy: Some Arguments and Assumptions, in Philosophical Dimensions of Privacy, supra note 22, at 317, 324 (noting that the lack of privacy is harmful because "the kind of spontaneity and openness that is essential to [people] disappears with the presence of an observer").

 

[FN75]. Poor quality includes: inaccurate information; technically accurate but misleading information, because it is incomplete or stale; and irrelevant information, because it is accurate and not‑misleading but inappropriately considered.  See IITF Principles, supra note 19, at 6 (stating that quality of personal information depends on accuracy, timeliness, completeness, and relevance).  Errors in databases are not exceptional.  See, e.g., Kenneth C. Laudon, Dossier Society: Value Choices in the Design of National Information Systems 139 (1986) (stating that, over a one year period, 74.3% of records disseminated by the Federal Bureau of Investigation ("FBI") Identification Division had "some significant quality problems"); id. at 140‑42 (noting that 11.2% of the warrants for persons listed on the FBI Wanted Persons list were no longer valid, 6.6% were inaccurate, and 7.0% dealt with offenses sufficiently trivial that extradition and prosecution were unlikely).

 

[FN76]. Kenneth L. Karst, "The Files": Legal Controls over the Accuracy and Accessibility of Stored Personal Data, 31 Law & Contemp. Probs. 342, 356 (1966); see also id. at 357 (arguing that the risk of inaccuracy is greatest when the file is read by an outsider unfamiliar with the system and unaware that the language or the standards of the evaluator differ from his own); Spiros Simitis, Reviewing Privacy in an Information Society, 135 U. Pa. L. Rev. 707, 718 (1987) (arguing that the loss of data's context distorts their content).

 

[FN77]. See Karst, supra note 76, at 358.

 

[FN78]. See Burnham, supra note 55, at 151  ("[E]ven highly educated people are prepared to grant the computer far more power than it actually possesses."); HEW Report, supra note 20, at xx ("[T]he net effect of computerization is that it is becoming much easier for recordkeeping systems to affect people than for people to affect recordkeeping systems."); Laudon, supra note 75, at 4 (contending that decisions about us are made less on "personal face‑to‑face contact" and more on information about us, our "data image"); Simitis, supra note 76, at 718 (noting that once a decision has been made by the computer, the burden of proof is shifted onto the individual to prove that the computer is wrong).

 

[FN79]. One such example may be the borrower's race in a loan application.  See Gandy, supra note 18, at 200‑01 (discussing racially discriminatory lending); Oscar H. Gandy, Jr., Legitimate Business Interest: No End in Sight? An Inquiry into the Status of Privacy in Cyberspace, 1996 U. Chi. Legal F. 77, 79 (same); see also Peter P. Swire, The Persistent Problem of Lending Discrimination: A Law and Economics Analysis, 73 Tex. L. Rev. 787, 814‑30 (1995) (explaining why markets may not stop racial discrimination in lending).

 

[FN80]. I use "power" here to mean nothing more complicated than "an actual capacity to do or prevent something."  Stephen R. Munzer, A Theory of Property 178 (1990).

 

[FN81]. For further discussion of how conflicts over information flow are conflicts over power, see Bok, supra note 34, at 19.  Bok states:

  Conflicts over secrecy‑‑between state and citizen ... or parent and child, or in journalism or business or law‑‑are conflicts over power: the power that comes through controlling the flow of information.  To be able to hold back some information about oneself or to channel it and thus influence how one is seen by others gives power; so does the capacity to penetrate similar defenses and strategies when used by others.

Id. (citation omitted).

 

[FN82]. For a disturbing story of privacy and harassment, see Nina Bernstein, Personal Files via Computer Offer Money and Pose Threat, N.Y. Times, June 12, 1997, at A1 (describing a prisoner who processed a woman's consumer survey on behalf of Metromail Corporation and later sent her a sexually threatening letter).

 

[FN83]. Actress Rebecca Schaeffer was murdered by a crazed fan who had located her home through Department of Motor Vehicles records.  See Regan, supra note 18, at 102‑03 (discussing the types of problems that led to the introduction of the Driver's Privacy Protection Act of 1994 ("DPPA"), 18 U.S.C. §§ 2721‑2725, which was later incorporated into the Violent Crime Control and Law Enforcement Act of 1994, Pub. L. No. 103‑322, 108 Stat. 1796 (codified as amended in scattered sections of 18 U.S.C.)).

 

[FN84]. In identity theft, an impostor obtains enough personal information to impersonate his victim in financial transactions.  Typically, the impostor applies for a credit card under the victim's name and then charges up the card, leaving the victim to deal with the impostor's debts.  Often, the victim's credit is ruined and may take years to repair.  See Board of Governors of the Fed. Reserve Sys., Report to the Congress Concerning the Availability of Consumer Identifying Information and Financial Fraud 18‑20 (1997) [hereinafter Federal Reserve Report] (on file with the Stanford Law Review) (noting the financial effects of fraud on society as a whole); Privacy Rights Clearinghouse, Second Annual Report 28‑32 (1995) (suggesting that government agencies are providing inadequate protection of individual privacy rights).

 

[FN85]. The argument that advertising, in its multifarious forms, can alter demand is uncontroversial.  See Daniel Hays Lowenstein, Commercial Speech and the First Amendment: "Too Much Puff": Persuasion, Paternalism, and Commercial Speech, 56 U. Cin. L. Rev. 1205, 1215‑17 (1988) (making a qualified case that advertising increases smoking); cf. Glickman v. Wileman Bros. & Elliott, Inc., 117 S. Ct. 2130, 2141 (1997) ("Generic advertising is intended to stimulate consumer demand for an agricultural product in a regulated market.  That purpose is legitimate and consistent with the regulatory goals of the overall statutory scheme.").

 

[FN86]. See Miller, supra note 17, at 43 (expressing concern over cybernetic manipulation of consumers and voters).

 

[FN87]. See, e.g., Philip Shenon, Navy Case Combines Gay Rights and On‑Line Privacy, N.Y. Times, Jan. 17, 1998, at A6 (describing how the Navy accessed America Online subscription information to obtain the true identity of an on‑line personality named "Tim" who had claimed to be both gay and a Navy employee).

 

[FN88]. Consider the chilling effect caused by military surveillance of domestic political groups, including the American Civil Liberties Union, the Southern Christian Leadership Conference, and the National Association for the Advancement of Colored People.  See Miller, supra note 17, at 40 (explaining that Army intelligence maintained files on these and other activist political organizations).  Although the state enjoys a virtual monopoly on lawful coercive force, I believe that a substantially similar effect can be achieved through private sector surveillance.  As John Stuart Mill warned:

  [The] means of tyrannizing are not restricted to the acts which [society] may do by the hands of its political functionaries.  Society can and does execute its own mandates; and if it issues wrong mandates instead of right, or any mandates at all in things with which it ought not to meddle, it practices a social tyranny more formidable than many kinds of political oppression, since, though not usually upheld by such extreme penalties, it leaves fewer means of escape, penetrating much more deeply into the details of life, and enslaving the soul itself.  Protection, therefore, against the tyranny of the magistrate is not enough; there needs protection also against the tyranny of the prevailing opinion and feeling, against the tendency of society to impose, by other means than civil penalties, its own ideas and practices as rules of conduct on those who dissent from them ....

John Stuart Mill, On Liberty 4‑5 (Hackett Publishing 1978).

 

[FN89]. See Stanley I. Benn, Privacy, Freedom, and Respect for Persons, in Philosophical Dimensions of Privacy, supra note 22, at 223, 241 ("We act differently if we believe we are being observed.  If we can never be sure whether or not we are being watched and listened to, all our actions will be altered and our very character will change." (quoting Hubert Humphrey)); Paul M. Schwartz, Privacy and Participation: Personal Information and Public Sector Regulation in the United States, 80 Iowa L. Rev. 553, 560 (1995) (noting how data processing "creates a potential for suppressing a capacity for free choice: the more that is known about an individual, the easier it is to force his obedience").

 

[FN90]. Oliver Wendell Holmes lamented that "the very minute a thought is threatened with publicity it seems to shrink toward mediocrity."  Bloustein, supra note 54, at 255 (quoting O.W. Holmes, The Poet at the Breakfast‑Table 344 (1872)).

 

[FN91]. Norms are nonlegal obligations obeyed out of a combination of internalized duty and fear of externally imposed sanction.  Obviously, not all social norms are warranted.  Specifically, Richard McAdams has demonstrated that if social norms arise from individuals' desire for esteem, many social norms will be economically inefficient.  See Richard H. McAdams, The Origin, Development, and Regulation of Norms, 96 Mich. L. Rev. 338, 412‑16 (1997). Information privacy can resist such norms in two ways.  First, it can make norm violations harder to detect, thereby making norms harder to enforce.  Second, it may prevent the initial construction of the norm by interfering with the public recognition of group consensus, which is a prerequisite for norm construction.  See id. at 425‑26.  McAdams notes two qualifications.  First, because privacy resists both efficient and inefficient norms, any judgment on whether privacy produces a net increase in efficiency depends on, among other things, the relative proportion of efficient versus inefficient norms.  Second, in certain cases, privacy may perpetuate, not resist, a norm by decreasing communicative exchange about a consensus in the past that has since disappeared.  See id. at 426‑27.

 

[FN92]. See Gavison, supra note 34, at 455 (arguing that privacy fosters moral autonomy, which is necessary for democracy).  The Supreme Court recognized as much in NAACP v. Alabama: "This Court has recognized the vital relationship between freedom to associate and privacy in one's associations.... Inviolability of privacy in group association may in many circumstances be indispensable to preservation of freedom of association, particularly where a group espouses dissident beliefs."  357 U.S. 449, 462 (1958).

 

[FN93]. See Westin, supra note 41, at 23 (observing that totalitarian regimes tarnish privacy as immoral and antisocial); Bloustein, supra note 54, at 226‑27 ("Unlike the totalitarian ideology, democratic political philosophy favors autonomous or private associations because they constitute independent sources of power and initiative which act to forestall undue accumulation of state power.").

 

[FN94]. In credit transactions, these characteristics include the individual's previous repayment history.  In other transactions, such as health insurance, such characteristics might include the individual's genetic makeup, medical history, and lifestyle risks.  For a thoughtful analysis of medical privacy, see generally Paul M. Schwartz, Privacy and the Economics of Personal Health Care Information, 76 Tex. L. Rev. 1 (1997).

 

[FN95]. See George J. Stigler, An Introduction to Privacy in Economics and Politics, 9 J. Legal Stud. 623, 628‑33 (1980) (arguing that the ability to classify more accurately will lead to greater economic efficiency).

 

[FN96]. See Mary J. Culnan, Self‑Regulation on the Electronic Frontier: Implications for Public Policy, in NTIA Report, supra note 11, at text accompanying note 7 (no pagination in electronic copy).

 

[FN97]. This requires attention not only to what is legally permissible, but also to what is economically feasible.  Here I am concerned about transaction costs preventing efficient processing of personal information.  For a sustained analysis of privacy in economic terms, see text accompanying notes 229‑304 infra.

 

[FN98]. Consider the ingenious reporter who investigated Supreme Court nominee Robert Bork's video rental history.  The investigation revealed nothing juicy.  But members of Congress‑‑painfully aware of their own vulnerability‑‑ immediately passed the VPPA, which proscribes the release of such information. See Video Privacy Protection Act of 1988, 18 U.S.C. §§ 2710‑2711 (1994); see also S. Rep. No. 100‑599, at 5 (1988) (citing the Bork incident as impetus for the legislation); Joe Domanick, Maybe There Is a God: Six Lessons in the Pitfalls of Public Hypocrisy, Playboy, Aug. 1990, at 110 (discussing the hypocrisy of, inter alia, Robert Bauman, Jimmy Swaggert, and Jim Bakker).

 

[FN99]. Posner, supra note 56, at 395.

 

[FN100]. See Edward J. Bloustein, Privacy Is Dear at Any Price: A Response to Professor Posner's Economic Theory, 12 Ga. L. Rev. 429, 445 (1978) (discussing a woman giving birth and a man writing love letters to his wife).

 

[FN101]. See Bok, supra note 34, at xv (explaining that lying is prima facie bad, but that secrets are not).

 

[FN102]. See Sweezy v. New Hampshire, 354 U.S. 234, 250‑51  (1957) (discussing the importance of freedom of political expression).  Just slightly less hallowed are a jury's secret deliberations.  See Clark v. United States, 289 U.S. 1, 13 (1933) ("Freedom of debate might be stifled and independence of thought checked if jurors were made to feel that their arguments and ballots were to be freely published to the world.").

 

[FN103]. See Ferdinand Schoeman, Privacy and Intimate Information, in Philosophical Dimensions of Privacy, supra note 22, at 403, 408‑09 (arguing that it is important that people maintain different dimensions of themselves in different contexts).

 

[FN104]. See, e.g., Margaret Chon, Multidimensional Lawyering and Professional Responsibility, 43 Syracuse L. Rev. 1137, 1138‑39 (1992) (agreeing in the context of legal ethics).

 

[FN105]. See Rachels, supra note 64, at 294 (suggesting that the variances in behavior define the relationships, not the individual).  Posner has come around on this point.  See Richard A. Posner, Overcoming Law 534‑35 (1995) (arguing that one's public self is no less real than one's private self).

 

[FN106]. I take no position on privacy's connection to rehabilitation.  Some commentators argue, however, that information privacy is necessary for an individual to "remold her identity or reform her character."  C. Edwin Baker, Posner's Privacy Mystery and the Failure of Economic Analysis of Law, 12 Ga. L. Rev. 475, 479 (1978).  Without such ability, the past would always catch up to the present, never allowing us to "overcome the mistakes of the past."  Id. at 480.  But see Richard A. Epstein, Privacy, Property Rights, and Misrepresentations, 12 Ga. L. Rev. 455, 472 (1978) (criticizing rehabilitation as a justification for privacy).

  This conflict appears starkly in those laws, such as Megan's Law, which require notification of neighbors when a convicted sex offender moves into a neighborhood.  See, e.g., N.J. Stat. Ann. §§ 2c:7‑2 to ‑11 (West 1995).  Currently 29 states have such community notification laws.  The convicted felon's privacy rights might have to be sacrificed where the safety of the community, especially its children, is in jeopardy.  For those who think this notification requirement is excessively harsh or double punishment, I am persuaded by my colleague Eugene Volokh's pithy retort: "These are not punishments, they're consequences."  Lori Basheda, "Megan's Law" Challenged by Molester, Orange County Reg., June 23, 1997, at B1.  For the opposite twist on the trade‑off between privacy and protection of children, see Largest Database Marketing Firm Sends Phone Numbers, Addresses of 5,000 Families with Kids to TV Reporter Using Name of Child Killer, Bus. Wire, May 13, 1996, available in LEXIS, News Library, Bwire File (reporting the sale of phone numbers, addresses, and ages of 5000 children by Metromail to a reporter using the name "Richard Allen Davis," the person convicted of murdering 12‑year‑old Polly Klaas).

 
