Privacy

First published Tue May 14, 2002; substantive revision Thu Oct 19, 2023

Early debates on privacy began at the end of the nineteenth century, when the potential intrusion of photography and the (tabloid) press was first recognized. When contrasted with the concerns that we face today due to the smart devices surrounding us, collecting data, and influencing our opinions and behavior, the old worries look quite innocent. Recent technology has led to previously unimagined ways of obtaining information about people, of observing them, listening in on their conversations, monitoring their activities, and locating their whereabouts. These are not simply “new technologies”: they fundamentally change the social practices in which they are embedded. Furthermore, the problem is not simply that having a smartphone enables companies to collect huge amounts of personal data, but that this data is used to create profiles of users that can be exploited for political and commercial purposes.

Yet there are also social changes of an entirely different sort that have, in various ways, produced constant shifts in the boundaries separating the private and the public realms. These changes include, for example, the fact that women can no longer simply be assigned to the realm of domestic and family labor but are increasingly playing—and wanting to play—an equal role in gainful employment and the public sphere. Another social change is that, since the 1960s, intimacy and sexuality are no longer banished to the private domain but are now openly portrayed and displayed in a whole range of (social) media.

An analysis of these changes in the societal meanings of the private and the public shows that interest in re-conceptualizing privacy—which is embedded in a broader political and legal endeavor of finding and creating appropriate privacy protections—is due to three distinct social-historical processes. Firstly, recent developments in information technology threaten the protection of personal privacy in fundamentally new ways; secondly, there have been radical changes in the relation between the sexes, prompting a concomitant reconfiguration of the private sphere; and thirdly, there has been an intrusion of intimacy into the public realm through previously private themes that have turned public, accompanied by shifts in notions of individuality, subjectivity, and authenticity. These developments suggest that there is not one history of the concept of privacy, but that the rules that protect privacy today (and the reflection that has gone along with those rules) have been driven by developments and concerns in different political and social areas. The history of privacy may therefore include more than what counts as “private” at any particular time (Westin 1967; Elshtain 1981; Benn & Gaus 1983; B. Moore 1984; Ariès & Duby 1987; Weintraub & Kumar 1997; McKeon 2007; Vincent 2016; Igo 2018). Finally, these developments bring to light the thoroughly conventional nature of the separation between public and private life.

Against this background, we can see that there is no single definition, analysis or meaning of the term “privacy”, either in ordinary language or in philosophical, political and legal discourse. The concept of privacy has broad historical roots in legal, ethical, sociological and anthropological discourses. In this article, we will first focus on the histories of privacy in various discourses and spheres of life. We will also discuss the history of legislating privacy protections in different times and (legal) cultures. In the second part, we will consider a range of critiques of privacy—both domestic privacy and the right to privacy—and all the relevant arguments and counterarguments forming those debates.

The third part of this article is devoted to substantial discussions of privacy in the literature. This literature distinguishes between descriptive accounts of the meaning of privacy (which describe what is in fact protected as private), and normative accounts of privacy. We will also review discussions that treat privacy as an interest with moral value, and those that treat it as a moral or legal right that ought to be protected by social conventions alone or also by the law. As a starting point, we will look more precisely at the various meanings of “privacy”. We will present the semantics of the concept, which some authors claim cannot be unified, possessing only a Wittgensteinian family resemblance (Solove 2008). The description of what is in fact protected as private is followed by normative accounts of privacy defending its value, and the extent to which it should be protected. The question of whether privacy has to be protected as a conventional interest, a moral right, or a legal right, has been contested for a long time.

The final section of this article is the longest and most extensive. There, contemporary debates on privacy in public discourse will be considered, as well as a range of philosophical, legal, and anthropological theories, from privacy and health to group privacy, the social dimensions of privacy, and the relationship between privacy and power. In the end, it will be concluded that the many debates regarding privacy, across very different fields of society, show that the problem of privacy determines to a large degree the personal, social and political lives of citizens. A free, autonomous life of well-lived relations—intimate, social, and political—is becoming increasingly endangered. Debates about privacy are therefore becoming more and more important, not just in academia, but in the societal public sphere as well.

1. The History of Privacy

To understand the history of privacy, one must first consider

  1. the history of the distinction between the private and the public sphere, in both ancient and modern liberal thought;
  2. conceptualizations of informational privacy; and
  3. the history of a legal right to privacy.

These notions are all connected. Separating them here is done principally for heuristic reasons.

1.1 The History of Conceptualizing the Private Sphere

Aristotle’s distinction between the public sphere of politics and political activity, the polis, and the private or domestic sphere of the family, the oikos, is the starting point for philosophical discussions of privacy (Politics 1253b, 1259b). Aristotle’s classic articulation of the private domain is one of necessity, restriction, confinement, and subjection to the laws of nature and reproduction. For Aristotle (and for a modern Aristotelian such as Hannah Arendt), there is a clear social ontology that makes it seem natural for certain things, persons and activities to be regarded as private, and others as public. The private domain is the domain of the household,

the sphere where the necessities of life, of individual survival as well as of continuity of the species, [are] taken care of and guaranteed. (Arendt 1958 [1998: 45])

Although there has been persistent concern for domestic privacy throughout history and across many cultures (in cultural theory and [art] history, for instance; see Ariès 1960 [1962]; Ariès & Duby 1985 & 1987; Vincent 2016), in philosophical theory there remains a research gap between Aristotle’s theory of privacy and the classical liberal theory, starting with Hobbes and Locke. This is in contrast to (art-)historical analyses, which comprehensively consider domestic privacy from the early Middle Ages until the early twentieth century (see especially the very informative Vincent 2016).

In liberal theory, the public/private distinction is taken to refer to the appropriate realm of governmental authority as opposed to the realm reserved for self-regulation, along the lines initially analyzed by Locke in his Second Treatise of Government (Locke 1690), and later by John Stuart Mill in his essay On Liberty (Mill 1859). The distinction arises again in Locke’s discussion of property, which can also be found in his Second Treatise. In the state of nature, all the world’s bounty is held in common and is in that sense public. But one possesses oneself and one’s own body, and one can also acquire property by combining one’s labor with it. These are then considered one’s private property. In the liberal tradition, Rawls also distinguishes between the private (which includes the domestic sphere) and the public.

As will be discussed in §2, classical liberal theory from Hobbes and Locke to Rawls, together with the naturalistic distinction between the private-as-domestic and the public, has been criticized by feminist and contemporary liberal thinkers. That the division between the private and the public is always conventional (and not natural) in character has been maintained by almost all theories of privacy dating from the last five or so decades. New approaches to the theory of privacy call for a redescription of the private, and a reformulation of the idea of equal rights to privacy and freedom that is no longer inconsistent with the principles of a liberal democracy based on equal rights (Allen 1988, 1989; Jean Cohen 1992, 2002; see also §3 below).

1.2 The History of Informational Privacy

The recent history of the moral right to informational privacy is linked to the liberal discourse on privacy and freedom. It found its start with a well-known essay by Samuel Warren and Louis Brandeis, “The Right to Privacy” (1890; see also Gordon 1960; Prosser 1960; Glancy 1979; Keulen & Kroeze 2018: 32 and 44–45; Sax 2018: 147–150). Citing “political, social, and economic changes” and a recognition of “the right to be let alone”, which counts as the first definition of informational privacy, Warren and Brandeis argued that existing law afforded a way to protect the privacy of the individual, and they sought to explain the nature and extent of that protection (1890: 193). Focusing in large part on increasing levels of publicity enabled by the burgeoning newspaper industry and recent inventions such as photography, they emphasized the invasion of privacy brought about by the public dissemination of details relating to a person’s private life. Warren and Brandeis felt that a variety of existing cases could be protected under a more general right to privacy, which would safeguard the extent to which one’s thoughts, sentiments, and emotions could be shared publicly by others. Urging that they were not attempting to protect any items or intellectual property produced, but rather the peace of mind attained by such protection, they claimed that the right to privacy was based on a principle of “inviolate personality”, which was part of a general right to the immunity of the person, “the right to one’s personality” (1890: 195 and 215).

Warren and Brandeis believed that the privacy principle was already part of common law dealing with the protection of one’s home, but new technology made it important to explicitly and separately recognize this protection under the name of a right to privacy. They suggested that limitations on this right could be determined by analogy with the law of defamation and slander, and that it would not prevent publication of information about public officials running for office, for example. Warren and Brandeis thus laid the foundation for a concept of a right to privacy that has become known as the right to control over information about oneself; their central and most influential concept remains the right to be let alone (1890: 193).

Although the first legal privacy cases after the publication of their paper did not recognize a right to privacy, it was not long before public discourse, as well as both state and federal courts in the US, began endorsing and expanding that right. In an attempt to systematize and more clearly describe and define the new right of privacy being upheld in tort law, William Prosser wrote that what had emerged were four different interests in privacy:

  1. intrusion into a person’s seclusion or solitude, or into his private affairs;
  2. public disclosure of embarrassing private facts about an individual;
  3. publicity placing one in a false light in the public eye; and
  4. appropriation of one’s likeness for the advantage of another (1960: 389).

One could say that Prosser’s approach is in fact a legal one, since he is examining the right to privacy in tort law.

The history of informational privacy is rather short, as we saw, and although many developments have taken place since the 1960s, these are better discussed in the next section, as well as in the systematic sections later in this article (but see Igo [2018] for a general overview of the modern American history of privacy).

1.3 History of Legal Protection

During the twentieth century, the right to privacy was advanced to the status of a human right. It was a completely novel entry into the catalog contained in the Universal Declaration of Human Rights, and it is unusual in lacking any predecessors in state constitutions or basic laws. As Diggelmann and Cleis (2014: 441) have pointed out, “the right’s potential was dramatically underestimated at the time of its creation”. Having originated in a somewhat accidental manner, the right to privacy has since become one of the most important human rights (Weil 1963; Volio 1981; Michael 1994; Feldman 1997; Richardson 2017). Article 12 of the 1948 Universal Declaration of Human Rights reads:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

Eighteen years later, the same right was enshrined in Article 17, Paragraph 1 of the International Covenant on Civil and Political Rights (1966), albeit in slightly different terms.

Turning our focus to the history of a right to privacy in Europe in particular, Article 8 of the European Convention on Human Rights, drafted in 1950, reads somewhat differently while expressing the same idea:

Everyone has the right to respect for his private and family life, his home and his correspondence.

In 1981, the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108) provided specific rules for the protection of personal data (Dalla Corte 2020: 125). In 2000, the Charter of Fundamental Rights of the European Union differentiated formally—for the first time in an international declaration of rights—between the two rights: a right to privacy on the one hand, and a right to the protection of personal data on the other. Article 7 of the Charter provides a right to “respect for private and family life”, and Article 8 provides a right to the “protection of personal data” (see also González Fuster 2014). The latter plays an important role in securing the right to informational privacy.

Further examination of the history of the rights to privacy and data protection in Europe reveals several important developments. The first of these was the step taken in 1983 by the Federal Constitutional Court of Germany. Its landmark judgment on the constitutionality of the 1983 Census Act (BVerfG 65, 1 (43)) was influential in designating the right to privacy as a right to informational self-determination. The ruling stated that

[if] individuals cannot, with sufficient certainty, determine what kind of personal information is known to certain parts of their social environment, and if it is difficult to ascertain what kind of information potential communication partners are privy to, this could greatly impede their freedom to make self-determined plans or decisions. A societal order, and its underlying legal order, would not be compatible with the right to informational self-determination if citizens were no longer able to tell who knows what kind of personal information about them, at what time and on which occasion.

The next important step was taken in 1995, with the adoption by the European Union of the Data Protection Directive (Directive 95/46/EC), a legally binding directive that member states had to implement. The purpose of this directive was twofold: to harmonize data protection laws, so as to facilitate cross-border trade by companies, and to protect people’s rights and freedoms when their personal data are used. Thus, the directive was largely based on the rationale of market integration. In 2016, the Data Protection Directive was followed up in Europe by the General Data Protection Regulation (GDPR). The GDPR entered into force in 2016, but its provisions became enforceable only in 2018. The GDPR is the most consequential regulatory development in information policy in decades (see Lindroos-Hovinheimo 2021), imposing significant fines on companies that fail to comply. As Hoofnagle et al. (2019) explain, the GDPR brings personal data into a complex and protective regulatory regime. Many data protection principles (for instance about data security) have been incorporated into US law as well, albeit mostly in Federal Trade Commission settlements with companies (Hoofnagle et al. 2019).

Juridification and jurisprudence in the US, especially concerning a constitutional right to privacy, have taken a very different course. On the one hand, there have been many advances with respect to laws protecting informational privacy. While US data protection law is fragmented, there have been recent developments that may result in closer alignment with EU law (e.g., the California Consumer Privacy Act of 2018 and the American Data Privacy and Protection Act—the latter being a bill introduced in the US House of Representatives in 2022).

On the other hand, the most significant focus in US law is on decisional privacy, the right to make decisions (here with respect to one’s body) without interference from others, i.e., on same-sex marriage and abortion. In 1965, a right to privacy, independent of informational privacy and the Fourth Amendment, was recognized explicitly by the Supreme Court. It has commonly been referred to as the constitutional right to privacy. The right was first recognized in Griswold v. Connecticut (381 U.S. 479), which overturned the convictions of the Director of Planned Parenthood and a doctor at Yale Medical School for dispensing contraceptive-related information, instruction, and medical advice to married persons. Justice Brennan’s explanation contained the now famous sentence:

The right to privacy gives an individual, married or single, the right to be free from unwarranted governmental intrusion into matters so fundamentally affecting a person as the decision whether to bear or beget a child.

The constitutional right to privacy was described by Justice William O. Douglas as protecting a zone of privacy covering the social institution of marriage and the sexual relations of married persons. (For further commentary, see Allen 1988; Jean Cohen 1992; Inness 1992; Tribe 1990; DeCew 1997; Turkington & Allen 1999.)

Despite controversy over Douglas’s opinion, the constitutional privacy right was soon relied on to overturn a prohibition on interracial marriage, to allow individuals to possess obscene matter in their own homes, and to allow the distribution of contraceptive devices to individuals, both married and single. However, the most famous application of this right to privacy was as one justification for a woman’s right to have an abortion, a right defended in the 1973 judgment of Roe v. Wade (410 U.S. 113). It has since been used in subsequent legal decisions on abortion. As the explanation formulated by Justice Blackmun famously put it,

this right of privacy… is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy. (Roe v. Wade, 410 U.S. 113 [1973]: 153; see Dworkin 1994)

Bork (1990), however, views the Griswold v. Connecticut decision as an attempt by the Supreme Court to take a side on a social and cultural issue, and as an example of bad constitutional law (for criticism of Bork’s view, see Inness 1992; Schoeman 1992; Johnson 1994; DeCew 1997).

Regardless of anything else it might be, the right to privacy was seen as a right that incorporates reproductive liberties. Precisely which personal decisions regarding reproductive liberties have been protected by this right to privacy has varied depending on the makeup of the Court at any given time. For example, in the 1986 Bowers v. Hardwick judgment (478 U.S. 186; subsequently overturned in Lawrence v. Texas 2003), it was decided that this right did not render anti-sodomy laws in Georgia unconstitutional, despite the intimate sexual relations involved.

In June 2022, in Dobbs v. Jackson Women’s Health Organization (No. 19–1392, 597 U.S.), the Supreme Court (then with a majority of conservative justices) overruled its previous decisions in Roe v. Wade and Planned Parenthood v. Casey (505 U.S. 833 [1992]). The majority argued that abortion cannot be counted as a constitutional right, since the Constitution does not mention it, and abortion was not “deeply rooted” in American history. More specifically, the court argued that the right to privacy implied by the Fourteenth Amendment does not include a woman’s right to abortion. As a consequence of Dobbs v. Jackson Women’s Health Organization, individual states have the power to regulate access to abortion (see Tribe 2022).

Note that the debate in the US is mostly focused on aspects of decisional privacy. In Europe, the rights to same-sex marriage and to abortion are conceived of as rights to personal freedom, and have therefore not been discussed in the context of theories of privacy. (For further commentary on this difference, see §3.)

2. Critiques of Privacy

2.1 Thomson’s Reductionism

Reductionists are named for their view that privacy concerns are analyzable or reducible to claims of other sorts, such as infliction of emotional distress or property interests. They deny that there is anything useful in considering privacy as a separate concept. Reductionists conclude, then, that there is nothing coherent, distinctive or illuminating about privacy interests. Probably the most famous fundamental critique of a right to privacy stems from the work of Judith Jarvis Thomson (1975). Having noted that there is little agreement on exactly what a right to privacy entails, Thomson examines a number of cases that have been thought to constitute violations of the right to privacy. On closer inspection, however, Thomson argues that all those cases can be adequately and equally well explained in terms of violations of property rights or rights over the person, such as a right not to be listened to. Ultimately, the right to privacy, on Thomson’s view, is merely a cluster of rights, consisting of several irreducible “grand rights” (for instance, the right to property), and a number of reducible “ungrand rights” (such as the right not to be subjected to eavesdropping, the right to keep one’s belongings out of view, and so on). The “right over one’s own person” consists of several ungrand rights, such as (amongst others) the right not to be looked at—but is itself not a grand right (Thomson 1975: 304–306). The different rights in the cluster can be fully explained by, for instance, property rights or rights to bodily security. Therefore, the right to privacy, in Thomson’s view, is “derivative” in the sense that there is no need to find what is common in the cluster of privacy rights, and it is also derivative in its importance and justification. Any privacy violation is better understood as the violation of a more basic right, e.g., the right not to be listened to.

This reductionist view was promptly challenged by Thomas Scanlon, who, in a direct reply to Thomson, denies that we have any such general rights as, for instance, a general right not to be looked at:

[a]s far as I can see I have no such general rights to begin with. I have an interest not to be looked at when I wish not to be (…). But rights directly corresponding to these interests would be too broad to be part of a workable system. (1975: 320)

Rather, the right to privacy is grounded in “the special interests that we have in being able to be free from certain kinds of intrusions” (Scanlon 1975: 315). Scanlon’s reply to Thomson can be read as an attempt to find a common ground for different aspects of privacy (DeCew 1997; Rachels 1975; Reiman 1976; Gavison 1980; see also §3 and §4).

Raymond Geuss (2001), on the other hand, seconds Thomson’s reductionist line of criticism but adds to it a more fundamental consideration:

Judith Jarvis Thomson has argued very persuasively that this right does not exist in the sense that it fails to designate any kind of coherent single property or single interest. That does not mean that none of the various things that have come to be grouped under “privacy” are goods—far from it, many of them are extremely important and valuable—only that they are disparate goods, and the perfectly adequate grounds we have for trying to promote them have little to do with one another. (2001: 145)

The very distinction between private and public, he argues, already relies on the assumption that there exists a unified liberal distinction that is set in stone politically, and uncontested. But this assumption displays not only a mistaken conception of the distinction between public and private, but also a mistaken conception of politics. In “real politics”, all distinctions and values are contested. As a result, the liberal distinction is illusory and ideological. According to Geuss, this becomes apparent only when one recognizes the deep heterogeneity of privacy, its reducibility to other interests, and the plurality of very different values attached to its various meanings (Geuss 2001).

2.2 Posner’s Economic Critique

Richard Posner (1978) also presents a critical account of privacy. He argues that from an economic perspective, we should not assign a property right to the personal information of individuals, when doing so leads to economic inefficiencies. This theoretical claim is linked to an empirical claim that common law follows a similar economic logic. Posner does not appear to deny that there is something that we can call “privacy” and a “right to privacy”. Instead, he argues that the notion of privacy should be attributed in a different way, following an economic analysis of the distribution of property rights to personal information. Strictly speaking, then, Posner does not present a fundamental critique of privacy, but rather an account of privacy which is based on considerations of economic efficiency, and he argues that privacy is protected in ways that are economically inefficient. With respect to information, in Posner’s view privacy should only be protected when access to personal information would reduce its value (for example, allowing students access to their letters of recommendation makes those letters less reliable and thus less valuable, and hence they should remain confidential or private). Focusing on privacy as control over information about oneself, Posner argues that concealment or the selective disclosure of information is often used to mislead or manipulate others, or for private economic gain. The protection of individual privacy is therefore less defensible than others suppose, because it does not maximize wealth.

2.3 The Communitarian Critique

Communitarian approaches find it suspicious that many recent theories of privacy rely on the concept of individual (negative) freedom as the raison d’être for privacy, and it is this connection between privacy and freedom or autonomy that is called into question (see Roessler 2008: 699). Privacy in communitarian thought is instead conceived of as a realm or dimension of life concerned with specific practices, also (or even primarily) relevant to the community at large. Accordingly, these practices must be understood not as a realm to which the individual has a claim qua autonomous being, but as one conceded to the individual as a member of the community (Sandel 1982; Elshtain 1995; Etzioni 1999 & 2004).

The idea underlying the communitarian point of view, particularly that of Sandel (1982), is that liberal theories of privacy necessarily conceive the self as disembodied and egocentric in nature. This is not only inconsistent in epistemological terms but also normatively undesirable from a political perspective, because communities and communal practices already take priority over the formation of individual identity. Communitarians therefore claim that privacy should not primarily be understood as an individual right to (physical or sexual) self-determination, but rather as protection given to practices that depend on being sheltered from the view of others (Etzioni 1999: 183; 2004: 30). Etzioni’s concept of privacy comprises its decisional, informational, and local (the privacy of the home) aspects (Etzioni 1999). The communitarian viewpoint has been criticized, however (e.g., Jean Cohen 2002: 42). Critics argue that it is incorrect to hold that a theory of privacy based on the idea of individual freedom and autonomy cannot at the same time conceive of the self as relational in nature, and as constituted and contextualized in a variety of respects. Feminist theories of privacy insist that individual rights come before communal duties, because it is otherwise impossible to guarantee equal freedom to make decisions pertaining to one’s life and one’s body. In particular, communal practices and traditions may prove repressive and discriminatory, making an individual right to privacy indispensable (Allen 1988: 700; Fraser 1992; Morris 2000).

2.4 The Feminist Critique

The feminist critique of the theorization of the private sphere starts by questioning the idea of the realm of privacy as that of the natural, of women, feelings, hearth and home, and of emotional care for the male members of society, as well as the raising of children. The “natural” coding of the separation between private and public, therefore, is one which follows precisely the dividing line between the sexes (Okin 1989; Pateman 1989; Phillips 1991; Jean Cohen 1992; Fraser 1992; Landes 1998; DeCew 2015). The target of this feminist critique is classical liberalism, and the critique has been influential. Rawls—the most influential liberal thinker of the twentieth century—accepted at least part of this criticism as justified, and he revised his own theory as a result (Rawls 2001: 166).

However, in classical liberal theory there is a double interpretation of the private domestic realm. On the one hand, the private domain is valued positively as the sheltered domestic sphere; on the other hand, the private sphere is the sphere of women and therefore inferior to the public sphere, according to the coding of a patriarchal society. The domestic sphere (including the family) is thus valued and prized as the realm that is sheltered from the demands of a hostile world, yet it is associated with “women”, while the public sphere is associated with “men”. The private is therefore characterized as inferior to the public, just as nature is considered inferior in relation to culture (Okin 1991).

In the history of privacy, we are confronted with yet another double reading: even though liberal theory since Hobbes and Locke has advocated equal liberties for all citizens, it has clung to a natural conception of privacy that patently contradicts the notion of equal rights. This is because it grants those rights to men only, and not to women (Locke 1690; Hobbes 1651). As feminist theory has argued, this seems to have little to do with nature and more to do with power and culture. Seen in purely normative terms, nature provides us with no argument as to why certain activities (or persons) should be considered “private”, and others “public” (Pateman 1989; Phillips 1991; Jean Cohen 1992; Fraser 1992; Ortner 1974 [1998]). The classical liberal moral (and later also legal) right to privacy and private freedom has to be separated from the natural interpretation, which still looms large in the background of everyday culture.

It is necessary to examine the feminist critique from yet another angle. In principle, early radical egalitarian feminist approaches are skeptical with respect to any possible conceptualization of privacy. The best-known of these skeptical approaches is the one developed by MacKinnon (1987, 1989, 1991; see also Olsen 1991). For MacKinnon, the appeal to legal or moral rights to privacy is but a further manifestation of the attempt to push women back into an ideologically constituted realm of privacy defined as the non-political or pre-political, and to concede them rights only insofar as they are seen as different or deviant. Privacy can be dangerous for women when it is used to cover up repression and physical harm inflicted on them, perpetuating the subjection of women in the domestic sphere and encouraging non-intervention by the state. Such a concept of privacy, according to MacKinnon, fails to call the sexual hierarchy into question. Instead, it simply preserves the social power structures that find expression in the correlation of women with the private, and men with the public.

In response to MacKinnon’s argument, one objection is that it fails to make a clear enough distinction between a natural, pre-political concept of privacy on the one hand (which is rejected not only by MacKinnon herself, but also by other theories of privacy) and a legal-conventional concept of privacy on the other (Allen 1988; Jean Cohen 1992). A more reasonable view, according to Anita Allen (1988), is to recognize that while privacy can be used as a shield for abuse, it is unacceptable to reject privacy completely based on harm done in private (see also DeCew 1997: 86). A complete rejection of privacy would result in everything being made public, leaving the domestic sphere open to complete scrutiny and intrusion by the state. Allen and other theorists (such as Pateman 1989: 118–136; see especially Jean Cohen 1992) suggest that societies in general, as well as traditional conceptual divisions between the private and the public, can be criticized and revised (or, as Cohen [1992] puts it, “redescribed”). Therefore, feminists should adopt a concept of privacy that is not in the gender-specific, natural tradition, but is instead oriented towards the notion of freedom (see Allen 1988). The challenge is to find a way for the state to take seriously the domestic abuse that used to be allowed in the name of privacy, while also preventing the state from imposing itself onto the most intimate parts of women’s lives. This means drawing new boundaries for justified state intervention, and thus understanding the public/private distinction in new ways (see §3 and §4 below).

Another feminist perspective on privacy is related to the critique of liberalism in a different way. The approaches to privacy that are linked to freedom and autonomy have been criticized from the perspective of a theory of power (Brown 1995; Cornell 1995). Skepticism towards such approaches arises because they follow from, and are consonant with, other (liberal) dichotomies (such as subject versus object, or having rights versus having none) that are thought to be essentially exclusionary and discriminatory. It is further argued that such conceptions fail to take into account and criticize the power structures inherent in society, which are therefore also inherent in the structures protecting privacy.

Feminist approaches are far from homogeneous. They range from examples that appear to reject any conceptualization of privacy whatsoever (e.g., Brown 1995; see also the different but equally critical perspective on liberal conceptualizations in Geuss 2001), to those that propose alternative ways of determining privacy, as is the case with Morris, who, discussing Pitkin (1981), argues that

privacy should be reconstructed rather than abandoned, for otherwise it is impossible to think critically about central problems in democratic theory—among them the very possibility of citizens’ representing, or translating into a common language, what is most singular, secret, ineffable, internal, that is, private, about themselves. (2000: 323)

She defends a “positive political theory of privacy” which she understands as being part of a democratic theory (2000: 323).

3. Meaning and Value

When considering the concept of the private, it is sometimes difficult to separate the descriptive element of meaning from the normative determination. The determination of the meaning of privacy often contains clear normative elements, such as when Fried (1968) claims that the meaning of privacy consists in protecting trust and intimacy, or when Nissenbaum (2010) defines privacy as the adequate flow of information (which requires protection). In this article, an attempt will be made to separate the descriptive and normative aspects as clearly as possible. In the following section, an overview of the relation between the concept of privacy and other concepts will first be given, followed by a descriptive overview of the meaning of privacy. Finally, we will discuss the various normative determinations that have been given to privacy.

3.1 Semantics

One initial approach to determining the meaning of “privacy” is to examine how it is related to similar words. This is what is called the “semantics” of privacy. Firstly, “private” should be distinguished from “intimate” (Gerstein 1978; Benn & Gaus 1983; Bok 1982; Allen 1988; Inness 1992; Dworkin 1994; see also the threefold differentiation of the meaning of “private” in Weinstein 1971; for the discussion that follows, see Roessler 2001 [2005: 9–12]). What is intimate can also be private, but is not necessarily so: we also speak, for instance, of forms of “public intimacy” in aesthetic contexts (think of John Lennon and Yoko Ono at the Hilton in Amsterdam). “Intimacy” has erotic or sexual connotations, as well as connotations of proximity and vulnerability that also—but not only—have to do with the exposure of one’s body (Inness 1992). Secondly, “private” must be distinguished from “secret”. What is private can be secret, but it is not necessarily so. For instance, the question of where I practice my religious beliefs is private, but not necessarily secret. Another example is medical data: these data are informationally private but not secret; they are known to many people (in the health system) and we would not generally call them “secret”. What is secret can be private, but is also not necessarily so—for example, when one speaks of state secrets (Bok 1982: 10–14; Mokrosinska 2020). Semantic overlaps occur when privacy is dependent on something being completely hidden or concealed, in other words on a secret, as with secret diaries or secret ballots. Of relevance here is Wasserstrom (1984), who interprets privacy above all as the realm of what is secret, hidden or concealed, and thus seeks to bring to light the connotations of deception and deceit.

Another important semantic relation is that of the predicate “private” to the predicate “public”; the latter is often defined in opposition to the former. In everyday language, there are two distinct semantic models underlying the various uses of “private” and “public” (Benn & Gaus 1983: 7–10). The first is an “onion” model, which allows one to distinguish between different layers of privacy. The center of the onion is the realm of personal or bodily intimacy and privacy, including not only one’s body, but also one’s private diary, as opposed to which everything else is regarded as “public”. The second layer of the onion comprises the classic realm of privacy, viz. the family and other intimate relationships. In opposition to the family, the outside world of society and the state constitutes the public realm. The outer layer of the onion is society at large—the realm of economic structures or public civil society—that counts as “private” with respect to intervention by the state. It therefore forms yet another realm of privacy in the face of the public realm of the state and its possible interference (Okin 1991).

In a metaphorical sense, the second model in everyday usage lies perpendicular to the first. For this second semantic model, the term “private” is predicated of actions we carry out, or decisions that we make, no matter where we happen to be. Going to church is thus a private matter. In this second sense, the concept of privacy describes a protected sphere or dimension of action and responsibility, where individuals can act in a way that is independent of decisions and influences from the public realm of state institutions and society at large. This second model also comprises informational privacy, since information about myself which I want to keep private (medical data, etc.) is not left at home, in a layer of the onion. I carry it with me wherever I go; privacy therefore has to be applicable wherever we are.

3.2 Definitions and Meanings

Both of the semantic models mentioned above play a role in the following definitions. From early on, the difficulties of developing a systematic and general definition of privacy have been recognized. We start with the most influential definition (at least since the twentieth century): the definition of the concept given by Warren and Brandeis (see §1.2). Warren and Brandeis famously summarized the right to privacy as the right “to be let alone” (1890: 214; for the history of Warren and Brandeis’ interest in privacy see Prosser 1960). They underscore that

the intensity and complexity of life, attendant upon advancing civilization, have rendered necessary some retreat from the world, and man, under the refining influence of culture, has become more sensitive to publicity, so that solitude and privacy have become more essential to the individual; but modern enterprise and invention have, through invasions upon his privacy, subjected him to mental pain and distress, far greater than could be inflicted by mere bodily injury. (Warren & Brandeis 1890: 196)

As the foundation of their conception of the right to privacy, Warren and Brandeis refer to “the dignity […] of the individual” or, as pointed out by Sax in his discussion of Warren and Brandeis,

the idea of the “inviolate personality” (Warren & Brandeis 1890: 205), or, put differently, “the more general right to the immunity of the person,—the right to one’s personality” (Warren & Brandeis 1890: 207). It is ultimately up to the individual to decide how she wants to be, think, and act. (Sax 2018: 149)

It should be mentioned that the aim of their article was not to explicitly define privacy, but rather to answer the question of “whether our law will recognize and protect the right to privacy” (Warren & Brandeis 1890: 196; “our law” refers to the US legal framework). This general definition of the concept of privacy, made in terms of respect for personality, dignity, and “being left alone”, nevertheless prepared the field not only for detailed legal discussions, but also for efforts to define the term from a philosophical perspective.

The differing determinations of the meaning of privacy have been categorized in various ways. The two most prominent categories are reductionism and coherentism, as has been mentioned above. Reductionists are generally critical of the effort to carve out a special category of harms or interests best referred to as unique privacy harms or interests. Coherentists, meanwhile, defend the coherent fundamental value of privacy interests. In contrast, Ferdinand Schoeman introduced a somewhat different terminology. According to Schoeman, a number of authors believe that “there is something common to most of the privacy claims” (1984b: 5). Schoeman refers to this approach as the “coherence thesis”. Positions which

deny both the coherence thesis and the distinctiveness thesis argue that in each category of privacy claims there are diverse values at stake of the sort common to many other social issues and that these values exhaust privacy claims. […] The thrust of this complex position is that we could do quite well if we eliminated all talk of privacy and simply defended our concerns in terms of standard moral and legal categories. (Schoeman 1984b: 5)

These latter theorists are referred to as reductionists. Judith Thomson puts into question the idea that privacy is a distinct concept altogether, insofar as she finds that there is no clear idea of what the right to privacy is. Instead, she suggests that “the right to privacy is itself a cluster of rights” (1975: 306), such as the right to property or the right over the person (see §2.1). As has been mentioned in earlier sections, Thomson’s reductionist view was dismissed by Thomas Scanlon. For Scanlon, realms of privacy are always “conventionally defined”, irreducible, and obtain in their own right; that is, they cannot be marked out by means of other rights or claims. “Our zone of privacy,” writes Scanlon,

could be defined in many different ways; what matters most is that some system of limits to observation should be generally understood and observed. (1975: 317–318)

The fact that such limits exist—however varied they may turn out to be—is for Scanlon an indication of the irreducibility of privacy.

Nonetheless, the diversity of possible definitions and characterizations, as well as the many possible fields of application, have continued to pose a challenge. While Scanlon’s reply to Thomson can be read as an attempt to find common ground for different aspects of privacy (Scanlon 1975), Judith DeCew (1997) proposes to systematize the concept of privacy by putting forward a “cluster account” that highlights connections between the different interests covered by the concept, without reducing privacy to these different interests:

I argue that privacy is best understood as a cluster concept covering multiple privacy interests, including those enhancing control over information and our need for independence as well as those enhancing our ability to be self-expressible and to form social relationships. (DeCew 1997: 73)

There are other authors who have questioned the possibility of developing a general, comprehensive definition of privacy. One of the most innovative recent approaches to the meaning (and value) of privacy is that of Helen Nissenbaum. She takes note of “the conceptual quagmire to claim a definition—its definition—of privacy” (2010: 3) and proposes a different approach that renounces the attempt to provide a single, unifying definition. For Nissenbaum, the right to privacy is best understood as a “right to appropriate flow of personal information” (2010: 127). Generally, the appropriate flow of personal information is governed by context-relative informational norms. These are characterized by four parameters: the specific contexts, the actors, the information types, and (importantly) the transmission principles (see Nissenbaum 2010: 140–141). A transmission principle is a “constraint on the flow of information from party to party in a context” (Nissenbaum 2010: 145). What counts as private information depends on the different norms imposed on the flow of information governing different social contexts, such as the contexts of education, religion, security, or politics (the “contextual integrity” of various contexts; Nissenbaum 2010: 127–231). Depending on the informational norm, the appropriate transmission principle can in some contexts (for instance, intimate relations) be understood as control over access by the persons involved. In that sense, Nissenbaum is not strictly arguing against control- or access-based approaches, although she argues for their limited use in the general framework of informational privacy. Note that Nissenbaum writes about informational privacy and does not discuss other dimensions of the concept.

Finally, one of the most widely discussed privacy theorists is Daniel Solove (2004, 2008, 2011). Solove famously observed that privacy is a “concept in disarray” (2008: 1) and argues that it is futile to look for a unified definition. Instead, he appeals to Wittgenstein’s idea of “family resemblances”, and proposes understanding privacy as consisting of “many different yet related things” (2008: 9).

It can be concluded from the discussion presented here that no clear definition of privacy has emerged in the literature that everyone can agree on. Neither is there a clear scope for privacy: in the US, the conceptual and philosophical discussions regarding the meaning and definition of “privacy” are mostly framed in terms of legal discussions of privacy in the context of US constitutional law. Following this reasoning, a number of theorists defend the view that privacy has broad scope, inclusive of the multiple varieties of privacy issues described by the US Supreme Court, even though there is no simple definition of privacy (see Schoeman 1992; Parent 1983; Henkin 1974; Thomson 1975; Gavison 1980; Bork 1990).

3.3 Normative Approaches

Let us now have a closer look at the normative side of the concept of privacy: its value or function. As will be seen, when determining the supposedly descriptive meaning of privacy, attempts are usually made to describe privacy at the same time in normative terms. Let us emphasize again that it is particularly difficult to separate the descriptive and normative (value-laden) aspects of the concept of privacy.

3.3.1 Intrinsic vs Instrumental

With the above in mind, we can begin by separating instrumental and intrinsic approaches to the value of privacy. We speak of the instrumental value of privacy when it is valued for the sake of something else (e.g., intimate relations, the body, freedom, autonomy, or dignity; Stigler 1980; Posner 1981). Privacy has intrinsic value when it is valued for its own sake, without reference to any other objects, concepts of value, or dimensions in life (Warren & Brandeis 1890; Bloustein 1964; Gerstein 1978; Gavison 1980; Parent 1983).

This idea of an intrinsic value to privacy has, however, been criticized. An example is Fried’s criticism:

It is just because this instrumental analysis makes privacy so vulnerable that we feel impelled to assign to privacy some intrinsic significance. But to translate privacy to the level of an intrinsic value might seem more a way of cutting off analysis than of carrying it forward. (1970: 140)

Fried thus claims that even when we say we value something no matter what, we can still ask the question, “why should this be so?” However, the distinction between intrinsic and extrinsic value is generally a widely debated philosophical topic—not only in relation to privacy. One should therefore not expect it to be settled in the domain of the philosophy of privacy (see the entry on intrinsic vs. extrinsic value).

3.3.2 Access Simpliciter

Access-based approaches have been put forward to answer the question of the meaning and value of privacy. A variety of formulations can be found in the literature (e.g., Thomson 1975; Gavison 1980; Reiman 1995; Allen 2000; Tavani & Moor 2001). Reiman, for instance, defines privacy as “the condition in which others are deprived of access to you” (1995: 30). Similarly, Allen suggests that

privacy refers to a degree of inaccessibility of a person or information about her to others’ five senses and surveillance devices. (2000: 867)

A classic formulation has been offered by Ruth Gavison: “An individual enjoys perfect privacy when he is completely inaccessible to others” (1980: 428). All these suggestions assume that we value privacy and therefore need some restriction of access to ourselves (see especially Reiman 1995).

Sissela Bok defines privacy as

the condition of being protected from unwanted access by others—either physical access, personal information, or attention. Claims to privacy are claims to control access. (1982: 10)

Privacy is here defined as a condition in which one is protected in various respects from undesired intrusions by other people. Such a broadly based definition still seems likely to cover the whole range of meanings of the concept of privacy. Similar approaches are also found in Benn (1988) and Schoeman (1992).

The access-based approach has been criticized, however. For instance, it is clear that one does not enjoy privacy after falling into a crevasse, even though such a situation does satisfy the condition of “inaccessibility” (see Schoeman 1984b: 3). If a state of isolation, seclusion or secrecy is enforced and not freely chosen (in other words, when the person in question has no control over access), then one would not describe it as “private”. Westin (1967: 40) describes the solitary confinement of prisoners as an example of “too much” privacy. Furthermore, this criticism shows that we think of privacy as a distinctly interpersonal phenomenon. If I am stranded on a desert island, it makes no sense to say I enjoy complete privacy because there are no other people present that I could be inaccessible to (Fried 1968).

3.3.3 Controlling Access

Other approaches to the meaning and value of privacy take the idea of control as their very starting point, i.e., control over specific areas of privacy. The so-called control-based approach has been adopted in many writings on privacy in academic literature and beyond (Westin 1967; Fried 1968; Scanlon 1975; Parent 1983; Inness 1992; Bok 1982; Boyd 2010). Most control-based approaches justify this interpretation of the value of privacy by appeal to the freedom it enables: individual freedom and autonomy are not possible without the protection of private life (see Allen 1988; Jean Cohen 2002). It should be pointed out, however, that access-based conceptions of privacy could also imply that we value privacy because it enables freedom. This is in fact Reiman’s position (1995), and so this should not be seen as a defining line of separation between the approaches. Finally, Bloustein argues that “the intrusion [of privacy] is demeaning to individuality, is an affront to personal dignity” (1964: 962).

The classic and very influential approach advocated by Westin defines privacy as control over information:

Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others. (1967: 7; see also Gross 1971)

In a similar vein to Westin, Fried defines privacy as “the control we have over information about ourselves” (1968: 482). Fried also asserts that the reason we value privacy is that relationships with other individuals (characterized by love, friendship and trust, for example) essentially depend on the ability to share certain information which we would not share with those not in the relationship. Privacy thus provides the “means for modulating […] degrees of friendship” (Fried 1968: 485), because we share more information with very close friends than we would with others. Hence, what we really care about is the ability to control the information that we share with others. The right to privacy would protect the ability of an individual to shape meaningful relations with others and would be justified by the importance of relationships such as friendship, love and trust in human life (Fried 1968: 484).

This control can be understood not only as concerning informational privacy (as with Westin), but also in much broader terms. Iris Young writes: “The Private [is] what the individual chooses to withdraw from public view” (1990: 119–120). Control is here conceived as a retreat from visibility in the public eye, as is echoed by Julie Inness:

Privacy is the state of the agent having control over a realm of intimacy, which contains her decisions about intimate access to herself (including intimate informational access) and her decisions about her own intimate actions. (1992: 56)

Since the values ascribed to privacy can differ, access-based approaches are compatible with this justification of the value of privacy in terms of intimacy and closeness. Most control-based approaches, however, justify the value of privacy by citing the enabling of freedom: individual freedom and autonomy are not possible without the protection of a private life (Jean Cohen 2002). Jean Cohen (2002) gives a theoretical defense of a freedom-based view of the right to privacy. She defends a constructivist approach to privacy rights and intimacy, arguing that privacy rights protect personal autonomy, and that a right to privacy is indispensable for a free and autonomous life. In the control-based interpretation, freedom and autonomy are inherently linked with privacy: if I want to control access to myself (or to information about myself), then it is an expression of my freedom to do so. Privacy has such a high value in these approaches because it protects my freedom to do what I want to do at home (within the limits of the law), to control what people know about me (also within limits), and to decide freely about myself and my body (Jean Cohen 2002; Roessler 2004: 71ff). One aspect of the freedom or autonomy of persons is their freedom from unwanted observation and scrutiny. Benn (1984) emphasizes how respect for privacy effectively expresses respect for persons and their personhood. He explains that “[a] man’s view of what he does may be radically altered by having to see it, as it were, through another man’s eyes” (1984: 242). The necessity of adopting a different perspective on one’s own behavior can be seen as a restriction of freedom in the sense of the freedom of self-presentation. The basic idea of these approaches is that the freedom to play different roles, to present oneself in different ways, presupposes that a person, in a given relationship, can hide (keep private) those aspects of herself which she does not want to be seen or shared. The other person would see her in a different way if he knew this other side of her. Thus, the freedom to present oneself in this way, in this relationship, would be thwarted. The social freedom to play different roles depends on being able to present oneself differently to neighbors, students, or one’s spouse; it is thus dependent on the protection of privacy (Allen 1988; see Goffman 1959 for further discussion; Goffman is an oft-quoted reference in the privacy literature). We need privacy precisely to afford us spaces free from observation and scrutiny, in order to achieve the liberal ideals of freedom—such as the ideal of personal relations, the ideal of “the politically free man”, and the ideal of “the morally autonomous man” (Benn 1984: 234).

Adam Moore (2010) adopts yet another approach, which nonetheless should be mentioned under the heading of control-based approaches. In the explicit tradition of Aristotelian teleology, Moore starts from an account of human nature to explain the value of privacy. Human nature, he argues, allows humans to flourish in a particularly human way. In order to flourish, humans need to develop their rational faculties. This allows them, among other things, to live an autonomous life. Among the necessary favorable external conditions which human beings need in order to flourish are the rights to and norms of privacy (A. Moore 2003).

3.3.4 Controlling Access: Three Dimensions

The control-based approach has led to the suggestion that different dimensions of control can be distinguished, and that these have a direct correlation with different areas of privacy. From a normative standpoint, these dimensions—not spaces—of privacy serve to protect, facilitate, and effectuate control over access to what is conceived of as private, and to identify a range of different privacy norms that are supposed to protect and enable personal autonomy.

Other authors suggest that the control-access definition of privacy is best understood as a cluster concept covering interests in (a) control over information about oneself; (b) control over access to oneself, both physically and mentally; and (c) control over one’s ability to make important decisions about family and lifestyle, so as to be self-expressive and to develop varied relationships (DeCew 1997).

Roessler (2001 [2005]) suggests that three dimensions of privacy should be identified, namely decisional privacy, informational privacy, and local privacy, meaning the traditional “private sphere”, mostly the home (the place, as opposed to information or actions). The justification for these different dimensions lies, she argues, in their role in protecting personal autonomy. Without the protection of privacy in these three dimensions, an autonomous and well-lived life in a liberal democracy would not be possible. She explains:

The dimension of decisional privacy serves to secure the scope for a subject to make decisions and take action in all his social relations. The dimension of informational privacy serves to secure a horizon of expectations regarding what others know about him that is necessary for his autonomy. The dimension of local privacy serves to protect the possibilities for spatial withdrawal upon which a subject is dependent for the sake of his autonomy. [The] aim is to show what the value of privacy inheres in and in what way the violation of these dimensions of privacy also entails a violation of the individual autonomy of the subject. (Roessler 2001 [2005: 16])

Protection of personal privacy with respect to these three dimensions is also constitutive of social life (Fried 1968), as well as crucial for democratic decision-making procedures (Stahl 2020; A. Roberts 2022).

3.3.5 Decisional Privacy

The three dimensions are anchored in different traditions of the conception of privacy, and have been discussed for many years. It is in recent years that decisional privacy, or the privacy of actions, has become a specialist term in the literature. Norms of decisional privacy allow for the control of access to one’s decisional sphere. This concerns certain forms of behavior in public, questions of lifestyle, and more fundamental decisions and actions where we may with good reason tell other people that such-and-such a matter is none of their business (see Lanzing 2016; Sax 2018).

A decisive factor in coining the concept of decisional privacy was the ruling of the US Supreme Court in the Roe v. Wade case. As a result of this landmark case, feminist theory has treated sexual freedom of action, the privacy of intimate and sexual acts, and the woman’s right of sexual self-determination as central elements in the theory of privacy (Allen 1988). In the literature on privacy, decisive significance is also given to the privacy of the body (Gatens 1996, 2004). This includes the woman’s newly-won right to conceive of her body as private to the extent that she can decide for herself whether or not to bear a child, and thus enjoy the right of reproductive freedom.

Sexual harassment and sexual orientation are two further central aspects of decisional privacy, both of which concern the link between sexuality, the body, and identity, and are decisive for the societal coding and meaning of privacy. Protection from sexual harassment and the respect for diverse sexual orientations form dimensions of decisional privacy precisely because it is the privacy of the body that is vulnerable to infringement (see Jean Cohen 2002 for a comprehensive discussion). For more on issues linked to power, see §4.

We can distinguish between different aspects of decisional privacy according to their social context, but the argument underlying the claim to protection of such privacy remains structurally the same. If one understands a person’s self-determination and autonomy to consist in the right to be the (part-) author of her own biography, this must mean that within different social contexts she can demand that her decisions and actions are respected (in the sense that they are “none of your business”) by both social convention and state law. The limits to this form of privacy are regulated by convention and are of course subject to constant renegotiation. Yet this sort of respect for a person’s privacy—applicable also to public contexts—is especially relevant for women. (For relevant examples, see Nagel 1998a & 1998b; Allen 1988; Fraser 1996; Gatens 2004). The spectrum of decisional privacy thus extends from reproductive rights to freedom of conduct in public space.

3.3.6 Informational Privacy

Norms of informational privacy allow people to control who knows what about them. The knowledge other people have about us shapes the ways in which we can present ourselves, and act around others. Informational privacy is thus essentially linked to individual freedom and autonomy since it enables different forms of self-presentation, as well as enabling different forms of social relationship (see Roessler 2001 [2005: 111–141] for more detail).

Debates about informational privacy hearken back to the interpretation of the US Constitution, beginning with the essay written by Samuel Warren and Louis Brandeis. (That essay was written after what they felt was an invasion of privacy by intrusive paparazzi in 1890.) It was in that essay that, for the first time, the right to be let alone was described as a constitutional right to privacy, in the sense that information about a person is worthy of protection even when it involves something that occurs in public (see §1.2).

This form of privacy is relevant primarily in friendships and love relationships, and serves both as protection of relationships and as protection within relationships. In some theories of privacy, this actually constitutes the very heart of privacy in the form of “relational privacy”, which guarantees opportunities for withdrawal that are constitutive for an authentic life (Fried 1968; Rachels 1975). For further details, see the entry on privacy and information technology.

3.3.7 Local Privacy

The dimension of local privacy refers to the classic, traditional place of privacy, thought of in terms of its most genuine locus: one’s own home. As we have already seen, this form of local privacy is not derived from a “natural” separation of spheres, but rather from the value of being able to withdraw to within one’s own four walls (see §1 and §2 above).

Traditionally, two different aspects of privacy are of relevance here: solitude and “being-for-oneself” on the one hand, and the protection of family communities and relationships on the other. Firstly, people seek the solitude and isolation provided by the protection of their private dwelling, in order to avoid confrontation with others. This aspect of privacy also comes to the fore in the work of Virginia Woolf and George Orwell, for both of whom the privacy of the room—the privacy to write or think—is a precondition for the possibility of self-discovery and an authentic life (Orwell 1949; Woolf 1929).

Local privacy also offers protection for family relationships. The privacy of the household provides the opportunity for people to deal with one another in a different manner, and to take a break from roles in a way that is not possible when dealing with one another in public. This dimension or sphere of privacy, however, is especially prone to generate the potential for conflict. As has been made clear from previous discussions, this has been a particularly important starting point for feminist criticism. A conflict arises here between traditional conceptions of privacy as constitutive of a loving family haven, which has nothing to do with demands for justice or equal rights (Honneth 2004; contrast with Rawls 1997), and the critical feminist approach (see Okin 1989 & 1991; Young 2004).

4. Contemporary Debates

Contemporary debates on privacy are manifold, lively, and sometimes heated. They turn on a multitude of issues that are not limited to informational privacy but also include other dimensions of privacy.

4.1 Recent Debates on the Value or Function of Privacy

Conceptual and normative debates are still pervasive and persistent in the literature, with the role of autonomy and the control-access approach being typical of contemporary discussion. A general discussion of privacy, not focused on one particular aspect but rather presenting the different threats to privacy, and its role and associated tensions in discussions of advancing technology, can be found in several recent comprehensive monographs, for instance in Rule (2007), in Rotenberg, Scott, and Horwitz (2015), in Citron (2022), as well as in Francis and Francis (2017). Focusing on more particular aspects of the normative debates, Koops et al. (2017) have given a very helpful and informative overview of the different possibilities of defining the meaning and value of privacy. They aim at a typology of eight concepts of privacy:

Our analysis led us to structure types of privacy in a two-dimensional mode, consisting of eight basic types of privacy (bodily, intellectual, spatial, decisional, communicational, associational, proprietary, and behavioral privacy), with an overlay of a ninth type (informational privacy) that overlaps, but does not coincide, with the eight basic types. (2017: 483)

They also very helpfully explain the differences between access-based, control-based and other approaches to the function and value of privacy. Marmor (2015) argues that it has become increasingly difficult to argue for a general interest which privacy is to protect, especially because of the reductionist critique of Thomson (1975; see §2.1). Marmor goes on to claim, however,

that there is a general right to privacy grounded in people’s interest in having a reasonable measure of control over the ways in which they can present themselves (and what is theirs) to others. (2015: 3–4)

Marmor presents an interesting version of a control-based theory of privacy in defending the view that we have a basic interest in having (reasonable) control over evaluations of ourselves, which also means that we have an interest in a predictable environment and a predictable flow of information (see 2015: 25). He is thereby in fact connecting to other control-based approaches (see Jean Cohen 1992, also Goffman 1959) as well as theories of privacy as contextual integrity (see Nissenbaum 2010). Mainz and Uhrenfeldt (2021) also support the control-access approach to privacy, claiming that

there is at least a pro tanto reason to favor the control account of the right to privacy over the access account of the right to privacy. (2021: 287)

In a slightly different manner, Gaukroger (2020: 416) claims that privacy not only protects us when we act in morally right ways, but also when we make use of the “freedom to be bad”—which he defends as a general good. Lundgren, however, disputes the control-access approach because of its fundamental difficulties. Following Parent, Lundgren argues that if we adopt a control-based account, we lose privacy every time we give someone access to whatever it is we want to keep private, which in fact means that we lose more and more of our privacy. Lundgren therefore argues for a “limited access” (Lundgren 2020: 173) conception, criticizing many different forms of control-access accounts (2020: 167 fn 7). By implicitly following Bloustein’s (1964) plea for the centrality of the concept of human dignity, Floridi (2016) contends that the concept of dignity should be the foundation of interpreting informational privacy, as understood by the GDPR.

A further development concerns the theory of contextual integrity, as developed by Helen Nissenbaum. Nissenbaum’s framework of Contextual Integrity has won enormous attention as the standard for evaluating flows of personal information, their legitimacy and acceptability, but it has not remained undisputed. On the one hand, Nissenbaum’s paradigm has led to a great number of articles applying it to different societal fields and technology developments. Shvartzshnaider et al. (2019) present “a method for analyzing privacy policies using the framework of contextual integrity (CI)”. The authors claim that this

method allows for the systematized detection of issues with privacy policy statements that hinder readers’ ability to understand and evaluate company data collection practices. (Shvartzshnaider et al. 2019: 162)

Nissenbaum (2019) herself applies the approach of contextual integrity to a complex data network and argues that the contextual integrity approach in particular is able to identify those sources of disruption in novel information practices which should be criticized (2019: 221; see also Nissenbaum 2015). Another interesting example is Shaffer (2021), who applies the Contextual Integrity Framework to new technologies for smart cities. Winter and Davidson (2019) apply the approach of contextual integrity to the problem of data governance in health information, since advances in collecting data

also pose significant data governance challenges for ensuring value for individual, organizational, and societal stakeholders as well as individual privacy and autonomy.

They go on to

investigate how forms of data governance were adapted, as PHI [Personal Health Information] data flowed into new use contexts, to address concerns of contextual integrity, which is violated when personal information collected in one use context moves to another use context with different norms of appropriateness. (2019: 36)

On the other hand, some criticism has been raised which argues that Nissenbaum does not actually provide normative standards to judge which information flows should be seen as ethically justified and which as detrimental (Rule 2019). Rule argues that

notions that norms underlying any given domain of human conduct are unambiguous or uncontested simply do not withstand close examination. Indeed, most social norms, particularly in rapidly changing domains of human conduct like privacy practices, are volatile and highly contested. (2019: 260)

In a similar vein, DeCew (2015) questions whether norms that govern a given context but fall short of ethical standards can still be defended on the basis of the integrity of that context and the norms governing it, as in the case of the norms governing the traditional family (DeCew 2015: 215).

Finally, the theory that private data should be seen as private property should be mentioned. This idea originated with Lessig (2002) and has since been discussed by a number of authors. Arguments exist for and against the idea that the agents generating datapoints—the source of the data—have a right to claim those datapoints as private property (Schneider 2021; see also Tufekci 2015; Wu 2015).

4.2 The Conflict between Privacy and Other Values or Rights

Newell et al. (2015) offer a cogent articulation and defense of privacy even when privacy seems to conflict with other important values. Recent technological developments are concerning, with implications for the moral, legal and social foundations of, and the interrelationships between, privacy, security and accountability. Katell and Moore (2016: 3) make use of a control-based definition of privacy, stating that a right to privacy is “a right to control access to, and uses of, places, bodies, and personal information”. They also write that

the ability to control access to our bodies, capacities, and powers, and to sensitive personal information, is an essential part of human flourishing or well-being. (Katell & Moore 2016: 5)

Kenneth Himma

argues that security is a more important right that always “trumps” privacy, which on his view is not an absolute or fundamental right, but merely “instrumental” to other rights. (Katell & Moore 2016: 12 and Himma 2016; both in A. Moore 2016)

Himma’s defense is based on his view that security is fundamental to survival—our most valuable duty and obligation. In contrast, responding to this view, Adam Moore defends privacy over security with multiple arguments, perhaps the most powerful of which is demonstrating “the importance of privacy as a bulwark against the tyrannical excesses of an unchecked security state” (Katell & Moore 2016: 13 and Moore 2016b). As the authors in this volume note, there is good reason to conclude that privacy, security and accountability are all morally valuable.

Over the course of the last few years (and especially between 2020 and 2023), the conflict between privacy and health, or privacy and safety, has been the topic of much discussion—particularly from the perspective of the COVID-19 pandemic. In an early attempt to develop suitable guidelines, Morley et al. (2020) have postulated that not only should privacy be protected, but also equality and fairness observed in digital contact-tracing. However, such contact-tracing apps clearly bring about a conflict between privacy and the health of people. Throughout the course of the pandemic, such contact-tracing apps were contested (see Bengio et al. 2020; Fahey & Hino 2020).

4.3 The Cultural Relativity of the Value of Privacy

Schoeman (1984b) has pointed out that the question of whether or not privacy is culturally relative can be interpreted in different ways. One question is whether privacy is deemed valuable to all peoples, or whether its value is relative and subject to cultural differences (see Westin 1967; Rachels 1975; Allen 1988; A. Moore 2003). Another question is whether there are any aspects of life that are inherently private and not just conventionally so. There is also some literature on the different cultures of privacy in different regions of the world. A growing number of articles are concerned with privacy conceptions in China, not only in connection with the technological developments of recent years (as detailed in Ma 2019 for a relational, Western-Eastern conception of privacy; H. Roberts 2022), but also with regard to earlier periods in Chinese culture, for instance Confucian and Taoist thinking (Whitman 1985; for attitudes towards healthcare, see Chen et al. 2007). Basu (2012) writes about the Indian perception of privacy, based on India’s cultural values, and offers an explanation for why the Indian concept of privacy seems to extend beyond the often-dominating public-private dichotomy. Capurro (2005) deals with intercultural aspects of privacy, particularly with regard to differences between Japanese and Western conceptions (see also Nakada & Tamura 2005). For a perspective on the general Buddhist theory of privacy, see Hongladarom (2016). The ubuntu perspective on privacy is discussed by Reviglio and Alunge (2020).

4.4 The Democratic Value of Privacy

Theorists have repeatedly pointed to the connection between the protection of privacy, understood as the protection of individual autonomy, and the protection of democracy (see Goold 2010: 38–48). Hughes, for example, speaks of privacy as a “bulwark against totalitarianism” (2015: 228), while Spiros Simitis describes the right to privacy as “a constitutive element of a democratic society” (1987: 732). Probably the best-known advocate of privacy with a view to democracy is Ruth Gavison. She argues that the protection of privacy both supports and encourages the moral autonomy of citizens, which is an essential precondition of democratic societies (Gavison 1980: 455; see also Hughes 2015: 228; Simitis 1987: 732; Solove 2008: 98–100; Schwartz 1999). There can be no democratic self-determination without the protection of privacy. Hence, government intervention for the security of citizens becomes an ideology that threatens democracy when its goal is no longer the freedom of individuals—that is, when citizens are no longer treated by the state as democratic subjects, but as objects.

Mokrosinska emphasizes privacy as a democratic value, thus strengthening that value when it comes into competition with freedom of expression and other political interests. Privacy can facilitate setting aside deep disagreements in order for political engagement in a democracy to proceed. Thus, Mokrosinska proposes a strategy for mediating between privacy and free speech when they collide (Mokrosinska 2015). Lever, in a monograph on privacy and democracy, develops a comprehensive theory and argues that privacy is in different ways essential for democracies and for the freedom, equality, and solidarity of democratic subjects (Lever 2013). From a different perspective, Stahl (2020) discusses ways in which “surveillance of intentionally public activities” should be criticized. Drawing on the theories of contextual integrity (Nissenbaum) and of the public sphere (Habermas), Stahl argues that

strategic surveillance of the public sphere can undermine the capacity of citizens to freely deliberate in public and therefore conflicts with democratic self-determination. (2020: 1)

The democratic value of privacy also plays a central role in republicanism, since threats to privacy are always also threats to democracy. Republicans bring out the value of privacy by contrasting their position with liberal theories, which cannot explain how the mere possibility of a threat to privacy—and not just actual interference—can count as an encroachment on the freedom of subjects. Andrew Roberts (2015: 320) writes that for republicans, because privacy

is a pre-requisite for effective participation in political life, and republicans consider such participation to be the essence of self-government and the means through which a polity can secure conditions of freedom, in a republican democracy individual privacy will be seen as a collective good.

(For a far more elaborate account along similar lines, see A. Roberts 2022, as well as Schwartz 1999.)

4.5 Group Privacy

Group privacy has entered philosophical discussion at a rather late stage, since it was only with the new forms of information and communication technology that groups could be targeted and surveilled efficiently. Research on group privacy addresses the fact that it is not individuals that are targeted by data collection and profiling practices, but rather groups of individuals who share certain relevant features. An important and influential collection of essays is that of Taylor, Floridi, and van der Sloot (2017a). Taylor et al. point out that

profiling and machine learning technologies are directed at the group level and are used to formulate types, not tokens—they work to scale, and enable their users to target the collective as much as the individual. (2017b: 1)

The book discusses divergent perspectives on what a group is, how groups should be addressed with regard to privacy, which elements of the problem can be addressed using current legal and conceptual tools, and which will require new approaches.

Especially interesting is Floridi’s argument that

groups are neither discovered nor invented but designed by the level of abstraction (LoA) at which a specific analysis of a social system is developed. Their design is therefore justified insofar as the purpose, guiding the choice of the LoA, is justified. (Floridi 2017: 83)

Floridi states that the claims that groups have rights and that groups have privacy are taken together in the argument that groups can have rights to privacy, and that indeed sometimes it is only the group which has privacy and not its members (2017: 83; see also Taylor 2017: 13–36; van der Sloot 2017: 197–224). Loi and Christen (2020) agree with Floridi’s claim that groups can have rights to privacy. However, they argue, against Floridi, for the distinction between two different concepts of privacy for groups:

The first (…) deals with confidential information shared with the member of a group and inaccessible to (all or a specific group of) outsiders. The second (…) deals with the inferences that can be made about a group of people defined by a feature, or combination thereof, shared by all individuals in the group. (Loi & Christen 2020: 207)

Loi and Christen claim that it is the latter, inferential notion of privacy which explains group privacy. An absolute right to this privacy, they conclude, is implausible.

Puri (2021) criticizes the conventional liberal conception of privacy, which focuses excessively on the identification of the individual, as inadequate for safeguarding the individual’s identity and autonomy. Puri therefore develops

a theoretical framework in the form of a triumvirate model of the group right to privacy (GRP), which is based on privacy as a social value… [he also formulates] the concept of mutual or companion privacy, which counterintuitively states that in the age of Big Data Analytics, we have more privacy together rather than individually. (Puri 2021: 477)

4.6 Social Dimensions of Privacy

Interestingly, Puri (2021) connects the question of group privacy with that of the social dimensions of privacy. However, although both discourses share the critique of individualist conceptions of privacy and are interested in more than the individual’s protection of privacy, the latter tackles a different problem. The upshot of the debates around the social dimensions of privacy is the claim that privacy is not (only) a right or need for individuals, but protects, and ought to protect, relations as well. Not all relations make up a group, and privacy can, in some cases, constitute the very relation it protects, as Roessler and Mokrosinska (2013) argue. James Rachels and Charles Fried both recognized that privacy has a social value: relationships can only be protected when certain norms of privacy apply both within relationships and to relationships (Fried 1968; Rachels 1975). The various norms of informational privacy do not merely regulate the social relationships and roles that we have in life, they actually make them possible. The perspective of the individual seems insufficient to account for many of the concerns raised in debates around privacy-invasive technologies. With ever-greater frequency, privacy-invasive technologies have been argued not only to endanger individual interests, but also to affect society and social life more generally. Therefore, not only the safeguarding of individual freedoms, but also the constitution and regulation of social relationships, are essential to privacy norms (see Roessler & Mokrosinska 2015).

Following Fried (1968) and his account of the relational character of the protection of individual privacy, one can argue that norms of informational privacy, while protecting individual privacy, at the same time play an essential role in social relations. Therefore, the protection of privacy is not always and not necessarily in conflict with societal interests (Roessler & Mokrosinska 2013: 771).

In recent years, several scholars have taken important steps towards developing a social approach to privacy. Arguing that an important aspect of the significance of informational privacy is that it goes beyond the interests of the individuals it protects, these scholars have emphasized the way in which privacy enables social and professional relationships, democratic decision-making processes, and political participation. They have also stressed the necessary role of privacy for cooperation and trust within various associations, such as economic partnerships. Regan, Solove, and Nissenbaum also follow this line of thought. Famously, Regan argues that privacy is not only of value to the individual, but also to society in general. For Regan, privacy is a common value, a public value, and also a collective value (1995: 213; see Regan 2015: 50; Hughes 2015). Solove claims that

[by] understanding privacy as shaped by the norms of society, we can better see why privacy should not be understood solely as an individual right… the value of privacy should be understood in terms of its contribution to society. (2008: 98, 171fn)

Solove believes privacy fosters and encourages the moral autonomy of citizens, a central requirement of governance in a democracy. Regan (1995), Solove (2008) and Nissenbaum (2010) took the first steps in analyzing the social dimensions and value of privacy in a democratic society and are now focusing on the role of privacy in political and social practice, law, media and communication, healthcare, and the marketplace.

The social dimensions of privacy also play a crucial role in the social analysis of the family. DeCew and Moore assess the public/private boundary in the family, given that family conventions are among the most crucial for this primary human socialization setting, and also given that structures in the family often are oppressive, as DeCew points out (see MacKinnon 1989).

4.7 Privacy and the Datafication of Daily Life

The issue of the datafication of daily life and its relation to privacy has been discussed from many different angles. Among the first areas of focus were “ambient intelligence” (initially analyzed by Brey 2005) and the “internet of things”, which is still very widely discussed. Wachter, for instance, writes that the

Internet of Things (IoT) requires pervasive collection and linkage of user data to provide personalised experiences based on potentially invasive inferences. Consistent identification of users and devices is necessary for this functionality, which poses risks to user privacy. (2018: 266)

She suggests guidelines for IoT developers which guarantee the protection of privacy in accordance with the GDPR, in cases such as smart homes and data gathered by Siri and Alexa (see also the entries on privacy and information technology and ethics of artificial intelligence and robotics).

4.7.1 The Datafication of Life: Work

Most of us spend a lot of time at work every day, and in recent years more and more employers have been observing what their employees are doing in the workplace. This raises normative problems, such as the consequences of this surveillance for the autonomy of employees and for the relationships between employees, as well as, of course, the question of how much privacy one can claim in the workplace. A good overview of this topic is provided by Bhave, Teo, and Dalal (2020). Chamorro-Premuzic and Buchband (2020) demonstrate the importance of clear communications with employees explaining the reasons for, and existence of, corporate monitoring programs.

4.7.2 The Datafication of Life: Commercial Health Apps

From the perspective of the datafied life, the development of commercial health apps—which have become increasingly commonplace in recent years—is particularly important. They became especially relevant given COVID-19 lockdown restrictions, when people became more interested not only in sporting activities, but also in measuring them and finding motivation through interpreting the related data. Commercial health app developers have exploited this, and are often not sufficiently clear about the dangers of data collection for the protection of privacy (Sax 2021). Huckvale, Torous, and Larsen (2019) point to these dangers and explain their nature, as does Mulder (2019), who examines app providers and their marketing statements with respect to the extent to which they actually meet the requirements of the GDPR. A different, but equally important point concerns women’s health data and how the recent decision of the US Supreme Court to overturn Roe v. Wade affects the privacy of women’s health data (see Cox 2022; the Dobbs v. Jackson Women’s Health Organization decision is also relevant at this point).

4.7.3 The Datafication of Life: The Privacy Paradox

It is often reported that people who freely share details of their private lives on social media still claim that they regard informational privacy as a very valuable asset, and that the state has an obligation to guarantee their informational privacy protection. Observations about the online behavior of young people in particular have led theorists to use the term “privacy paradox” in recent years (Hargittai & Marwick 2016: 3737). Martin and Nissenbaum (2016), however, argue against a paradox: they present a survey with

questions that insert ranges of values for the respective parameters contextual integrity asserts as fundamental to informational standards. (Martin & Nissenbaum 2016: 217)

The argument from contextual integrity demonstrates that whether people see any given data flow as violating privacy depends on the values for five parameters: sender, recipient, data subject, transmission principle, and type of information. Martin and Nissenbaum argue that if one describes a situation in which information or data flows without specifying values for all the parameters, one is thereby offering an ambiguous description. Their study on “sensitive” information demonstrates that although people consistently rank certain types of information as more or less sensitive, their reactions vary quite significantly when the recipient is specified, e.g., health information to an advertiser versus to a physician. Thus, when people post information on social media, we can draw no conclusions about whether this means they care or do not care about privacy because they may consider it appropriate to share the information in question with friends, though not with advertisers etc. (2016: 217).

Hargittai and Marwick, on the other hand, interpret the privacy paradox differently and argue that while young adults are aware of the dangers of sharing information online, they feel that they have to “acknowledge that individuals exist in social contexts where others can and do violate their privacy” (2016: 3737). Hoffman, Lutz, and Ranzini (2016), in contrast, explain the endless sharing of personal data, while simultaneously defending the value of informational privacy, as a kind of “privacy cynicism”. Draper and Turow, for their part, analyze this attitude as “resignation” and have recently defended a theoretical framework which conceptualizes “digital resignation as a rational response to consumer surveillance” (Draper & Turow 2019: 1824).

4.7.4 The Datafication of Life: The Quantified Self and the Duty to Protect One’s Own Privacy

One aspect of mass data collection concerns the possibility of self-observation: one’s own behavior can be measured comprehensively, and every quantifiable detail can be noted. The first such “lifeloggers” were Gordon Bell and Jim Gemmell. They had a grand vision of recording life as a whole, similar to the protagonists in Dave Eggers’ dystopian novel The Circle (Bell & Gemmell 2009 [2010]; see Eggers 2013 and Eggers 2021; Lanzing 2016). As a rule, self-trackers use self-observation techniques more selectively. They are mostly concerned with data that is relevant to their health and well-being, but not with all-round self-observation.

Allen points to one of the consequences of living with a life log: she argues that life logs make memory superfluous and calls this activity a “freezing of the past” (Allen 2008: 48). This makes it impossible to change oneself. She further argues that this complete abandonment of privacy—also characteristic of social media—leads to people, especially young adults, becoming less autonomous. Allen therefore pleads for a duty to protect our own privacy (Allen 2011) and makes the case that protecting one’s own privacy should be required:

We need to restrain choice—if not by law, then somehow. Respect for privacy rights and the ascription of privacy duties must both be a part of a society’s formative project for shaping citizens… some privacy should not be optional, waivable, or alienable. (Allen 2011: 172)

4.8 The Privacy of Animals

Alan Westin (1967) surveyed several studies of animals, demonstrating that a desire for privacy is not restricted to humans. One of the more recent contemporary debates on privacy concerns the question of whether animals should have (a right to) privacy. Pepper (2020) defends such a right, since nonhuman animals also have an

interest in shaping different kinds of relationships with one another by giving us control over how we present ourselves to others… which grounds a right against us. (2020: 628; see also Paci et al. 2022)

4.9 Epistemological Issues

Research on the relationship between privacy and knowledge—that is, on epistemological issues—is a relative newcomer to the arena of privacy debates. Fallis (2013) offers one of the first discussions of the issue, and criticizes the position that holds that “protecting privacy typically leads to less knowledge being acquired”, as well as the position that “lack of knowledge is definitive of having privacy”. He goes on to argue that

contra the defenders of the knowledge account of privacy, that someone knowing something is not necessary for someone else losing privacy about that thing. (Fallis 2013: 153)

For Kappel, it

seems obvious that informational privacy has an epistemological component; privacy or lack of privacy concerns certain kinds of epistemic relations between a cogniser and sensitive pieces of information. (2013: 179)

In his paper, Kappel aims at

[shedding] some light on the epistemological component of informational privacy [since] the nature of this epistemological component of privacy is only sparsely discussed. (2013: 179; see also the entry on privacy and information technology)

4.10 Privacy, Surveillance, and Power

Since the publication of the essay by Warren and Brandeis, there have been enormous technological advances that have radically transformed not only the possibilities for keeping people under surveillance, but also our concepts of privacy, as well as freedom and autonomy. Opportunities for monitoring people are now available in private households, in public spaces, and when surfing the Internet. The spaces of freedom in the public sphere could not exist under permanent (potential) surveillance and social control, and certainly not if we can no longer be sure about what data is being collected and into whose hands it might fall (see Julie Cohen 2008 for more on the relationship between privacy and visibility in the networked information age). If we can no longer be certain that we are not being surveilled and controlled, then we cannot even openly and self-determinedly debate potentially critical positions with others (Stahl 2016: 33–39; Rosen 2000: 207). Control and standardization (or normalization) are two sides of the same coin. In the face of increasingly pervasive data collection and digital surveillance, some authors have sought to move away from understandings of privacy that build on the notions of control over information or secrecy. Brunton and Nissenbaum, for instance, have observed a variety of practices that seek privacy through “obfuscation”, i.e., the “deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection” (2015: 1). The German sociologist Carsten Ochs suggests that these obfuscation practices correspond to a new type of informational privacy, which emerges in response to the specific challenges of a digitalized society and which he characterizes as “concealment” (“Verschleierung”) (2022: 496).

Brunton and Nissenbaum argue that obfuscation is a

tool particularly suited to the “weak”—the situationally disadvantaged, those at the wrong end of asymmetrical power relationships. (2015: 9)

Obfuscation is useful for addressing the information asymmetry between those who collect data and can use it to monitor, predict and influence behaviors, and those whose data is collected and who may not even be aware of this data collection and usage (Brunton & Nissenbaum 2015: 48–51). These observations point to the relation between privacy and power, which has received significant attention in the recent literature. The erosion of privacy through data collection and processing increases the power of big technology companies and governments to influence the people whose data is collected (Véliz 2020). Privacy is also a matter of power, insofar as access to privacy is unequally distributed and marginalized groups are more vulnerable than others to privacy invasions. The legal theorist Skinner-Thompson (2021) has documented how minority communities, such as religious, ethnic or sexual minorities, are disproportionately exposed to surveillance due to a lack of legal protections of privacy.

Here, we would also like to draw attention to the computer intelligence consultant Edward Snowden, who in 2013 published highly classified data from the NSA, where he was working. These data revealed global surveillance programs run by different states, and the disclosure prompted many heated cultural debates in different countries, as well as international academic debates. From the beginning, one of the questions was whether Snowden had actually violated the privacy of individuals, or whether the publication of state secrets should be judged differently than the publication of individual secrets; see Mokrosinska on the difference between secrecy and privacy (Mokrosinska 2020). On the general privacy issues involved in this affair, see Greenwald 2014; Rubel 2015; MacNish 2018; Lucas 2014. For the general societal debates, the Snowden revelations were significant in drawing new attention to structural forms of surveillance and possible violations of privacy.

In analyses of the new “surveillance state”, many authors discuss the different social contexts in which violations of informational privacy may in various ways coincide with restrictions of freedom. Zuboff (2019) analyses and criticizes surveillance capitalism and the “instrumentarian power” of this surveillance state, where privacy is only seen as data and therefore as a tradable good. At the same time, she criticizes the process of transforming personal life into behavioral data and commercial products. For Zuboff, personal data is not a commodity that should be traded arbitrarily, since it is essentially constituted by human experience (see also Roessler 2015; Wang 2022). Susskind’s discussion of “future power” as the power of “scrutiny” and of “perception-control” (2018) is equally critical and pessimistic about the protection of informational privacy in the surveillance state. Similar pessimism is expressed by Harcourt (2015), who conceptualizes the decline and loss of privacy as the loss of the individual. At the same time, he calls for forms of civil disobedience in the surveillance state (2015). For the political abuse of data power, see Cadwalladr and Graham-Harrison (2018) on the so-called Cambridge Analytica scandal.

Specific aspects of the surveillance state are highlighted by authors using the concept of data colonialism, which brings out the persistent exploitative power inherent in datafication (Wang 2022). The term “colonialism” is connected to cruel practices of domination and exploitation in human history. As Couldry and Mejias (2019) emphasize, “data colonialism” is not used to describe the physical violence and force that was often seen in historical colonialism, but at the same time, the frame of colonialism is not used “as a mere metaphor” either. Instead, the term is used “to refer [to] a new form of colonialism distinctive of the twenty-first century” (2019: 337), which involves a widespread dispossession and exploitation of our human lives. By using the term “data colonialism” instead of “data capitalism”, or some similar formulation, the crueler side of “today’s exposure of daily life to capitalist forces of datafication” is highlighted (2019: 337; see also Thatcher, O’Sullivan, & Mahmoudi 2016: 994).

4.11 Privacy, Colonialism, and Racism

More specifically, racist structures and colonial heritage play a constitutive role in the conceptualization of privacy. Fraser (1992) analyzed the simultaneous constitution of privacy and the public sphere in a cogent interpretation of the hearings for the confirmation of Clarence Thomas to the US Supreme Court. Race, sex and class were the determinants of the construction of privacy:

The way the struggle unfolded, moreover, depended at every point on who had the power to successfully and authoritatively define where the line between the public and the private would be drawn. It depended as well on who had the power to police and defend that boundary. (Fraser 1992: 595)

Nagel (1998a and b), meanwhile, is critical of this theory of the interdependence of the concepts of privacy and the public sphere, but the idea that personal privacy is deeply connected with societal and political power relations plays an important role in the literature from the last decade. There has been intensified research on the racist and colonial constructions of privacy. Couldry and Mejias (2019) generally criticize the relation between big data and the contemporary subject, while Arora (2019) argues more specifically that many assumptions about internet use in developing countries are wrong, since they presuppose a Western conception of internet use and of privacy when talking about and dealing with the Global South. In her work, she analyses the real-life patterns of internet use in different countries (among them India and China) and seeks to answer questions such as why citizens of states with strict surveillance policies seem to care so little about their digital privacy (Arora 2019: 717).

Translating human life into commodified data relations is not new, but rather a fact that has been critically examined by many scholars over the years. For example, Shoshana Zuboff (2019) has commented on the drive toward continually increasing data extraction as the essence of surveillance capitalism. From a different angle, Simone Browne interprets the “condition of blackness” as the key to understanding how state surveillance can precisely record and regulate the lives of subjects. Her book shows exactly how state surveillance emerged from, and learned from, practices of colonial exploitation and slavery. Surveillance, Browne asserts,

is both a discursive and material practice that reifies boundaries, borders, and bodies around racial lines, so much so that the surveillance of blackness has long been, and continues to be, a social and political norm. (2015: 1)

Again, from the societal perspective of class and race, Bridges (2017) demonstrates the ways in which the protection of personal privacy is deeply connected to class and race, and how it can be used to exacerbate the structures of social inequality. She points out that the poor are subject to invasions of privacy that can be perceived as gross demonstrations of governmental power without limits.

4.12 The Future of Privacy

In recent years, discussions have begun regarding the question of what lies “beyond privacy”. Since most big technology companies have been forced to conform to privacy laws, and even advertise their compliance, a significant question is whether a sort of “privacy-washing” is occurring, where the very focus on privacy undermines the interests which originally initiated those privacy concerns. Projects like “The Privacy Sandbox”, in which Google, together with the advertising-technology industry, developed seemingly privacy-friendly alternatives to third-party cookies, may be seen as cases of “privacy-washing”. Behind these supposedly good intentions lie fundamental questions about the future of privacy, as well as the future of the internet. For example, do we even want an internet that runs on personalized advertisements and massive data collection? Brunton and Nissenbaum are concerned that

information collection takes place in asymmetrical power relationships: we rarely have choice as to whether or not we are monitored, what is done with any information that is gathered, or what is done to us on the basis of conclusions drawn from that information. (2015: 49)

Sharon demonstrates that

in an interesting twist, the tech giants came to be portrayed as greater champions of privacy than some democratic governments. (2021: 45)

Thus, if we look back to the beginnings of informational privacy in the late nineteenth century, we can see that the apparent changes (the corporations themselves now seem to care about privacy) in fact express the continuity of the threat to privacy. The academic and societal debates about these threats, however, make it clear that privacy has by no means lost its value and importance.

Bibliography

  • Allen, Anita L., 1988, Uneasy Access: Privacy for Women in a Free Society, (New Feminist Perspectives Series), Totowa, NJ: Rowman & Littlefield.
  • –––, 1989, “Equality and Private Choice: Reproductive Laws for the 1990s, edited by Nadine Taub and Sherrill Cohen”, Nova Law Review, 13(2): 625–648. [Allen 1989 available online]
  • –––, 2000, “Privacy-as-Data Control: Conceptual, Practical, and Moral Limits of the Paradigm Commentary”, Connecticut Law Review, 32(3): 861–876.
  • –––, 2008, “Dredging up the Past: Lifelogging, Memory, and Surveillance”, University of Chicago Law Review, 75(1): 47–74 (article 4).
  • –––, 2011, Unpopular Privacy: What Must We Hide?, (Studies in Feminist Philosophy), Oxford/New York: Oxford University Press. doi:10.1093/acprof:oso/9780195141375.001.0001
  • Arendt, Hannah, 1958 [1998], The Human Condition, (Charles R. Walgreen Foundation Lectures), Chicago: University of Chicago Press. Second edition, 1998.
  • Ariès, Philippe, 1960 [1962], L’enfant et la vie familiale sous l’Ancien Régime, (Civilisations d’hier et d’aujourd’hui), Paris: Plon. Translated as Centuries of Childhood: A Social History of Family Life, Robert Baldick (trans.), New York: Vintage Books, 1962.
  • Ariès, Philippe and Georges Duby, 1985–1987, Histoire de la vie privée, 5 tomes, (Univers historique), Paris: Seuil. Translated as A History of Private Life, 5 volumes, Arthur Goldhammer (trans.), Cambridge, MA: Belknap Press, 1992–1998.
    • De l’Empire romain à l’an mil, Paul Veyne (ed.), translated as From pagan Rome to Byzantium, 1992
    • De l’Europe féodale à la Renaissance, Georges Duby (ed.), translated as Revelations of the Medieval World, 1993
    • De la Renaissance aux lumières, Roger Chartier (ed.), translated as Passions of the Renaissance, 1993
    • De la Révolution à la Grande Guerre, Michelle Perrot (ed.), translated as From the fires of the Revolution to the Great War, 1994
    • De la Première Guerre mondiale à nos jours, Antoine Prost and Gérard Vincent (eds), translated as Riddles of Identity in Modern Times, 1998
  • Arora, Payal, 2019, “General Data Protection Regulation—A Global Standard? Privacy Futures, Digital Activism, and Surveillance Cultures in the Global South”, Surveillance & Society, 17(5): 717–725. doi:10.24908/ss.v17i5.13307
  • Basu, Subhajit, 2012, “Privacy Protection: A Tale of Two Cultures”, Masaryk University Journal of Law and Technology, 6(1): 1–34.
  • Bell, C. Gordon and Jim Gemmell, 2009 [2010], Total Recall: How the E-Memory Revolution Will Change Everything, New York: Dutton. Reprinted as Your Life, Uploaded: The Digital Way to Better Memory, Health, and Productivity, New York: Plume, 2010.
  • Bengio, Yoshua, Richard Janda, Yun William Yu, Daphne Ippolito, Max Jarvie, Dan Pilat, Brooke Struck, Sekoul Krastev, and Abhinav Sharma, 2020, “The Need for Privacy with Public Digital Contact Tracing during the COVID-19 Pandemic”, The Lancet Digital Health, 2(7): e342–e344. doi:10.1016/S2589-7500(20)30133-3
  • Benn, Stanley I., 1984, “Privacy, Freedom, and Respect for Persons”, in Schoeman 1984a: 223–244 (ch. 8). doi:10.1017/CBO9780511625138.009
  • –––, 1988, A Theory of Freedom, Cambridge/New York: Cambridge University Press. doi:10.1017/CBO9780511609114
  • Benn, S. I. and Gerald F. Gaus (eds.), 1983, Public and Private in Social Life, London/New York: Croom Helm/St. Martin’s Press.
  • Bhave, Devasheesh P., Laurel H. Teo, and Reeshad S. Dalal, 2020, “Privacy at Work: A Review and a Research Agenda for a Contested Terrain”, Journal of Management, 46(1): 127–164. doi:10.1177/0149206319878254
  • Bloustein, Edward J., 1964, “Privacy as an Aspect of Human Dignity: An Answer to Dean Prosser”, New York University Law Review, 39(6): 962–1007.
  • Bok, Sissela, 1982, Secrets: On the Ethics of Concealment and Revelation, New York: Pantheon Books.
  • Bork, Robert H., 1990, The Tempting of America: The Political Seduction of the Law, New York: Free Press.
  • Boyd, Danah, 2010, “Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications”, in Zizi Papacharissi (ed.), Networked Self: Identity, Community, and Culture on Social Network Sites, New York: Routledge, pp. 39–58.
  • Brey, Philip, 2005, “Freedom and Privacy in Ambient Intelligence”, Ethics and Information Technology, 7(3): 157–166. doi:10.1007/s10676-006-0005-3
  • Bridges, Khiara M., 2017, The Poverty of Privacy Rights, Stanford, CA: Stanford University Press.
  • Brown, Wendy, 1995, States of Injury: Power and Freedom in Late Modernity, Princeton, NJ: Princeton University Press.
  • Brunton, Finn and Helen Nissenbaum, 2015, Obfuscation: A User’s Guide for Privacy and Protest, Cambridge, MA: MIT Press.
  • Cadwalladr, Carole and Emma Graham-Harrison, 2018, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach”, The Guardian, 17 March 2018, News Section. [Cadwalladr and Graham-Harrison 2018 available online]
  • Capurro, Rafael, 2005, “Privacy. An Intercultural Perspective”, Ethics and Information Technology, 7(1): 37–47. doi:10.1007/s10676-005-4407-4
  • Chamorro-Premuzic, Tomas and Richard Buchband, 2020, “If You’re Tracking Employee Behavior, Be Transparent About It”, Harvard Business Review, 23 December 2020.
  • Chen, Wei-Ti, Helene Starks, Cheng-Shi Shiu, Karen Fredriksen-Goldsen, Jane Simoni, Fujie Zhang, Cynthia Pearson, and Hongxin Zhao, 2007, “Chinese HIV-Positive Patients and Their Healthcare Providers: Contrasting Confucian Versus Western Notions of Secrecy and Support”, Advances in Nursing Science, 30(4): 329–342. doi:10.1097/01.ANS.0000300182.48854.65
  • Citron, Danielle Keats, 2022, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, New York: W.W. Norton & Company.
  • Cohen, Jean L., 1992, “Redescribing Privacy: Identity, Difference, and the Abortion Controversy”, Columbia Journal of Gender and Law, 3(1). doi:10.7916/CJGL.V3I1.2354
  • –––, 2002, Regulating Intimacy: A New Legal Paradigm, Princeton, NJ: Princeton Univ. Press.
  • Cohen, Julie E., 2008, “Privacy, Visibility, Transparency, and Exposure”, University of Chicago Law Review, 75(1): 181–201 (article 4).
  • –––, 2019, “Turning Privacy Inside Out”, Theoretical Inquiries in Law, 20(1): 1–31. doi:10.1515/til-2019-0002
  • Cornell, Drucilla, 1995, The Imaginary Domain: Abortion, Pornography and Sexual Harassment, New York: Routledge.
  • Couldry, Nick and Ulises A. Mejias, 2019, “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject”, Television & New Media, 20(4): 336–349. doi:10.1177/1527476418796632
  • Cox, David, 2022, “How Overturning Roe v Wade Has Eroded Privacy of Personal Data”, BMJ, 378: o2075. doi:10.1136/bmj.o2075
  • Craglia, M. (ed.), S. de Nigris, E. Gómez-González, E. Gómez, B. Martens, M. Iglesias, M. Vespe, S. Schade, M. Micheli, A. Kotsev, I. Mitton, L. Vesnic-Alujevic, F. Pignatelli, J. Hradec, S. Nativi, I. Sanchez, R. Hamon, and H. Junklewitz, 2020, “Artificial Intelligence and Digital Transformation: Early Lessons from the COVID-19 Crisis”, EUR 30306 EN, Luxembourg: Publications Office of the European Union. doi:10.2760/166278
  • Dalla Corte, Lorenzo, 2020, “Safeguarding data protection in an open data world: On the idea of balancing open data and data protection in the development of the smart city environment”, PhD Thesis, Tilburg: Tilburg University.
  • DeCew, Judith Wagner, 1989, “Constitutional Privacy, Judicial Interpretation, and Bowers v. Hardwick”, Social Theory and Practice, 15(3): 285–303. doi:10.5840/soctheorpract198915314
  • –––, 1997, In Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Ithaca, NY: Cornell University Press.
  • –––, 2007, “The Philosophical Foundations of Privacy”, in Encyclopedia of Privacy, William G. Staples (ed.), Westport, CT: Greenwood Press, pp. 404–414.
  • –––, 2012, “Privacy”, in The Routledge Companion to Philosophy of Law, Andrei Marmor (ed.), New York: Routledge, pp. 584–598.
  • –––, 2015, “The Feminist Critique of Privacy: Past Arguments and New Social Understandings”, in Roessler and Mokrosinska 2015: 85–103. doi:10.1017/CBO9781107280557.006
  • –––, 2016, “Privacy and Its Importance with Advancing Technology”, Ohio Northern University Law Review, 42(2): 471–492 (article 4).
  • –––, 2018, “The Conceptual Coherence of Privacy As Developed in Law”, in Core Concepts and Contemporary Issues in Privacy, Ann E. Cudd and Mark C. Navin (eds.), (AMINTAPHIL: The Philosophical Foundations of Law and Justice 8), Cham: Springer International Publishing, pp. 17–30. doi:10.1007/978-3-319-74639-5_2
  • Diggelmann, Oliver and Maria Nicole Cleis, 2014, “How the Right to Privacy Became a Human Right”, Human Rights Law Review, 14(3): 441–458. doi:10.1093/hrlr/ngu014
  • Draper, Nora A. and Joseph Turow, 2019, “The Corporate Cultivation of Digital Resignation”, New Media & Society, 21(8): 1824–1839. doi:10.1177/1461444819833331
  • Dworkin, Ronald, 1994, Life’s Dominion: An Argument about Abortion, Euthanasia and Individual Freedom, New York: Vintage Books.
  • Eggers, Dave, 2013, The Circle, London: Penguin Books.
  • –––, 2021, The Every: or At Last a Sense of Order, or The Final Days of Free Will, or Limitless Choice Is Killing the World, New York: Vintage Books.
  • Elshtain, Jean Bethke, 1981, Public Man, Private Woman: Women in Social and Political Thought, Princeton, NJ: Princeton University Press.
  • –––, 1995, Democracy on Trial, New York: Basic Books.
  • Etzioni, Amitai, 1999, The Limits of Privacy, New York: Basic Books.
  • –––, 2004, The Common Good, Cambridge: Polity.
  • Fahey, Robert A. and Airo Hino, 2020, “COVID-19, Digital Privacy, and the Social Limits on Data-Focused Public Health Responses”, International Journal of Information Management, 55: article 102181. doi:10.1016/j.ijinfomgt.2020.102181
  • Fallis, Don, 2013, “Privacy and Lack of Knowledge”, Episteme, 10(2): 153–166. doi:10.1017/epi.2013.13
  • Feldman, David, 1997, “The developing scope of Article 8 of the European Convention on Human Rights”, European Human Rights Law Review, 3: 265–274.
  • Floridi, Luciano, 2016, “On Human Dignity as a Foundation for the Right to Privacy”, Philosophy & Technology, 29(4): 307–312. doi:10.1007/s13347-016-0220-8
  • –––, 2017, “Group Privacy: A Defence and an Interpretation”, in Taylor, Floridi, and van der Sloot 2017a: 83–100. doi:10.1007/978-3-319-46608-8_5
  • Francis, Leslie P. and John G. Francis, 2017, Privacy: What Everyone Needs to Know, New York: Oxford University Press.
  • Fraser, Nancy, 1992, “Sex, Lies, and the Public Sphere: Some Reflections on the Confirmation of Clarence Thomas”, Critical Inquiry, 18(3): 595–612. doi:10.1086/448646
  • –––, 1996, “Rethinking the Public Sphere”, in Habermas and the Public Sphere, Craig Calhoun (ed.), Cambridge, MA: MIT Press.
  • Fried, Charles, 1968, “Privacy”, The Yale Law Journal, 77(3): 475–493.
  • –––, 1970, An Anatomy of Values: Problems of Personal and Social Choice, Cambridge, MA: Harvard University Press.
  • Gajda, Amy, 2022, Seek and Hide: The Tangled History of the Right to Privacy, New York: Viking.
  • Gatens, Moira, 1996, Imaginary Bodies: Ethics, Power, and Corporeality, London/New York: Routledge. doi:10.4324/9780203418659
  • –––, 2004, “Privacy and the Body: The Publicity of Affect”, in Rössler 2004: 113–132.
  • Gaukroger, Cressida, 2020, “Privacy and the Importance of ‘Getting Away With It’”, Journal of Moral Philosophy, 17(4): 416–439. doi:10.1163/17455243-20202987
  • Gavison, Ruth, 1980, “Privacy and the Limits of Law”, Yale Law Journal, 89(3): 421–471.
  • Gerstein, Robert S., 1978, “Intimacy and Privacy”, Ethics, 89(1): 76–81. doi:10.1086/292105
  • Geuss, Raymond, 2001, Public Goods, Private Goods, (Princeton Monographs in Philosophy), Princeton, NJ: Princeton University Press.
  • Glancy, Dorothy J., 1979, “The Invention of the Right to Privacy”, Arizona Law Review, 21(1): 1–40.
  • Goffman, Erving, 1959, The Presentation of Self in Everyday Life, revised edition, New York: Anchor Books.
  • Goldenfein, Jake, Ben Green, and Salomé Viljoen, 2020, “Privacy Versus Health Is a False Trade-Off”, Jacobin, 17 April 2020.
  • González Fuster, Gloria, 2014, The Emergence of Personal Data Protection as a Fundamental Right of the EU, Heidelberg: Springer.
  • Goold, Benjamin J., 2010, “How Much Surveillance is Too Much? Some Thoughts on Surveillance, Democracy, and the Political Value of Privacy”, in Overvåking i en rettstat, D. W. Schartum (ed.), Bergen: Fagbokforlaget, pp. 38–48. [Goold 2010 preprint available online]
  • Gordon, Harold R., 1960, “Right of Property in Name, Likeness, Personality and History”, Northwestern University Law Review, 55(5): 553–613.
  • Greenwald, Glenn, 2014, No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State, New York: Metropolitan Books/Henry Holt and London: Hamish Hamilton.
  • Gross, Hyman, 1971, “Privacy and Autonomy”, in Privacy: Nomos XIII, Roland Pennock and John W. Chapman (eds.), New York: Atherton Press, pp. 169–81.
  • Harcourt, Bernard E., 2015, Exposed: Desire and Disobedience in the Digital Age, Cambridge, MA: Harvard University Press.
  • Hargittai, Eszter and Alice Marwick, 2016, “‘What Can I Really Do?’ Explaining the Privacy Paradox with Online Apathy”, International Journal of Communication, 10: 3737–3757 (article 21). [Hargittai and Marwick 2016 available online]
  • Henkin, Louis, 1974, “Privacy and Autonomy”, Columbia Law Review, 74(8): 1410–1433.
  • Himma, Kenneth Einar, 2016, “Why Security Trumps Privacy”, in A. Moore 2016a: 145–170 (ch. 8).
  • Hobbes, Thomas, 1651, Leviathan, London: Crooke. Reprinted as Hobbes: Leviathan: Revised Student Edition, Richard Tuck (ed.), Cambridge: Cambridge University Press, 1996.
  • Hoffmann, Christian Pieter, Christoph Lutz, and Giulia Ranzini, 2016, “Privacy Cynicism: A New Approach to the Privacy Paradox”, Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 10(4): article 7. doi:10.5817/CP2016-4-7
  • Hongladarom, Soraj, 2016, A Buddhist Theory of Privacy, (SpringerBriefs in Philosophy), Singapore: Springer Singapore. doi:10.1007/978-981-10-0317-2
  • Honneth, Axel, 2004, “Between Justice and Affection: The Family as a Field of Moral Disputes”, in Rössler 2004: 142–162.
  • Hoofnagle, Chris Jay, Bart van der Sloot, and Frederik Zuiderveen Borgesius, 2019, “The European Union General Data Protection Regulation: What It Is and What It Means”, Information & Communications Technology Law, 28(1): 65–98. doi:10.1080/13600834.2019.1573501
  • Huckvale, Kit, John Torous, and Mark E. Larsen, 2019, “Assessment of the Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation”, JAMA Network Open, 2(4): e192542. doi:10.1001/jamanetworkopen.2019.2542
  • Hughes, Kirsty, 2015, “The Social Value of Privacy, the Value of Privacy to Society and Human Rights Discourse”, in Roessler and Mokrosinska 2015: 225–243. doi:10.1017/CBO9781107280557.013
  • Igo, Sarah E., 2018, The Known Citizen: A History of Privacy in Modern America, Cambridge, MA: Harvard University Press.
  • Inness, Julie C., 1992, Privacy, Intimacy, and Isolation, New York: Oxford University Press.
  • Johnson, Jeffrey L., 1994, “Constitutional Privacy”, Law and Philosophy, 13(2): 161–193.
  • Kappel, Klemens, 2013, “Epistemological Dimensions of Informational Privacy”, Episteme, 10(2): 179–192. doi:10.1017/epi.2013.15
  • Katell, Michael and Adam D. Moore, 2016, “Introduction: The Value of Privacy, Security and Accountability”, in A. Moore 2016a: 1–18.
  • Keulen, Sjoerd and Ronald Kroeze, 2018, “Privacy from a Historical Perspective”, in van der Sloot and de Groot 2018: 21–56. doi:10.1515/9789048540136-002
  • Koops, Bert-Jaap, Bryce Clayton Newell, Tjerk Timan, Ivan Skorvanek, Tomislav Chokrevski, and Masa Galic, 2017, “A Typology of Privacy”, University of Pennsylvania Journal of International Law, 38(2): 483–576.
  • Kupfer, Joseph, 1987, “Privacy, Autonomy and Self-Concept”, American Philosophical Quarterly, 24(1): 81–89.
  • Landes, Joan B. (ed.), 1998, Feminism, the Public and the Private, (Oxford Readings in Feminism), New York: Oxford University Press.
  • Lanzing, Marjolein, 2016, “The Transparent Self”, Ethics and Information Technology, 18(1): 9–16. doi:10.1007/s10676-016-9396-y
  • Lessig, Lawrence, 2002, “Privacy as Property”, Social Research, 69(1): 247–269.
  • Lever, Annabelle, 2012, On Privacy, London: Routledge. doi:10.4324/9780203156667
  • –––, 2013, A Democratic Conception of Privacy, London: AuthorHouse
  • Lindroos-Hovinheimo, Susanna, 2021, Private Selves: Legal Personhood in European Privacy Protection, (Cambridge Studies in European Law and Policy), Cambridge/New York: Cambridge University Press. doi:10.1017/9781108781381
  • Locke, John, 1690, Two Treatises of Government, London: Awnsham Churchill. New edition, Peter Laslett (ed.), Cambridge: Cambridge University Press, 1988.
  • Loi, Michele and Markus Christen, 2020, “Two Concepts of Group Privacy”, Philosophy & Technology, 33(2): 207–224. doi:10.1007/s13347-019-00351-0
  • Lucas, George R., 2014, “NSA Management Directive #424: Secrecy and Privacy in the Aftermath of Edward Snowden”, Ethics & International Affairs, 28(1): 29–38. doi:10.1017/S0892679413000488
  • Lundgren, Björn, 2020, “A Dilemma for Privacy as Control”, The Journal of Ethics, 24(2): 165–175. doi:10.1007/s10892-019-09316-z
  • Ma, Yuanye, 2019, “Relational Privacy: Where the East and the West Could Meet”, Proceedings of the Association for Information Science and Technology, 56(1): 196–205. doi:10.1002/pra2.65
  • MacKinnon, Catharine A., 1987, Feminism Unmodified: Discourses on Life and Law, Cambridge, MA: Harvard University Press.
  • –––, 1989, Toward a Feminist Theory of the State, Cambridge, MA: Harvard University Press.
  • –––, 1991, “Reflections on Sex Equality under Law”, Yale Law Journal, 100(5): 1281–1328.
  • Macnish, Kevin, 2018, “Government Surveillance and Why Defining Privacy Matters in a Post-Snowden World”, Journal of Applied Philosophy, 35(2): 417–432. doi:10.1111/japp.12219
  • Mainz, Jakob Thrane and Rasmus Uhrenfeldt, 2021, “Too Much Info: Data Surveillance and Reasons to Favor the Control Account of the Right to Privacy”, Res Publica, 27(2): 287–302. doi:10.1007/s11158-020-09473-1
  • Marmor, Andrei, 2015, “What Is the Right to Privacy?”, Philosophy & Public Affairs, 43(1): 3–26. doi:10.1111/papa.12040
  • Martin, Kirsten and Helen Nissenbaum, 2016, “Measuring Privacy: An Empirical Test Using Context to Expose Confounding Variables”, Columbia Science and Technology Law Review, 18(1): 176–218.
  • McKeon, Michael, 2007, The Secret History of Domesticity: Public, Private, and the Division of Knowledge, Baltimore, MD: Johns Hopkins University Press.
  • Mead, Margaret, 1928, Coming of Age in Samoa: A Psychological Study of Primitive Youth for Western Civilisation, New York: Blue Ribbon Books.
  • Menges, Leonhard, 2020, “Did the NSA and GCHQ Diminish Our Privacy? What the Control Account Should Say”, Moral Philosophy and Politics, 7(1): 29–48. doi:10.1515/mopp-2019-0063
  • Michael, James R., 1994, Privacy and Human Rights: An International and Comparative Study, with Special Reference to Developments in Information Technology, Brookfield, VT: Dartmouth Publishing Company.
  • Mill, John Stuart, 1859, On Liberty, London: John W. Parker and Son.
  • Mokrosinska, Dorota, 2015, “How much privacy for public officials?”, in Roessler and Mokrosinska 2015: 181–201. doi:10.1017/CBO9781107280557.013
  • –––, 2020, “Why States Have No Right to Privacy, but May Be Entitled to Secrecy: A Non-Consequentialist Defense of State Secrecy”, Critical Review of International Social and Political Philosophy, 23(4): 415–444. doi:10.1080/13698230.2018.1482097
  • Moore, Adam D., 1998, “Intangible Property: Privacy, Power, and Information Control”, American Philosophical Quarterly 35(4): 365–378.
  • –––, 2000, “Employee Monitoring and Computer Technology: Evaluative Surveillance v. Privacy”, Business Ethics Quarterly, 10(3): 697–709. doi:10.2307/3857899
  • –––, 2003, “Privacy: Its Meaning and Value” American Philosophical Quarterly, 40(3): 215–227.
  • –––, 2010, Privacy Rights: Moral and Legal Foundations, University Park, PA: Pennsylvania State University Press.
  • –––, (ed.), 2016a, Privacy, Security and Accountability: Ethics, Law and Policy, London: Rowman & Littlefield International Ltd.
  • –––, 2016b, “Why Privacy and Accountability Trump Security”, in Moore 2016a: 171–182 (ch. 9).
  • Moore, Barrington Jr, 1984, Privacy: Studies in Social and Cultural History, Armonk, NY: M.E. Sharpe.
  • Morley, Jessica, Josh Cowls, Mariarosaria Taddeo, and Luciano Floridi, 2020, “Ethical Guidelines for COVID-19 Tracing Apps”, Nature, 582(7810): 29–31. doi:10.1038/d41586-020-01578-0
  • Morris, Debra, 2000, “Privacy, Privation, Perversity: Toward New Representations of the Personal”, Signs: Journal of Women in Culture and Society, 25(2): 323–351. doi:10.1086/495441
  • Mulder, Trix, 2019, “Health Apps, Their Privacy Policies and the GDPR”, European Journal of Law and Technology, 10(1). [Mulder 2019 available online]
  • Nagel, Thomas, 1998a, “The Shredding of Public Privacy: Reflections on Recent Events in Washington”, Times Literary Supplement, 14 August 1998.
  • –––, 1998b, “Concealment and Exposure”, Philosophy & Public Affairs, 27(1): 3–30. doi:10.1111/j.1088-4963.1998.tb00057.x
  • Nakada, Makoto and Takanori Tamura, 2005, “Japanese Conceptions of Privacy: An Intercultural Perspective”, Ethics and Information Technology, 7(1): 27–36. doi:10.1007/s10676-005-0453-1
  • Newell, Bryce Clayton, Cheryl A. Metoyer, and Adam D. Moore, 2015, “Privacy in the family”, in Roessler and Mokrosinska 2015: 104–121. doi:10.1017/CBO9781107280557.013
  • Nissenbaum, Helen, 2010, Privacy in Context: Technology, Policy, and the Integrity of Social Life, Stanford: Stanford Law Books.
  • –––, 2015, “Respect for Context as a Benchmark for Privacy Online: What It Is and Isn’t”, in Roessler and Mokrosinska 2015: 278–302. doi:10.1017/CBO9781107280557.016
  • –––, 2019, “Contextual Integrity Up and Down the Data Food Chain”, Theoretical Inquiries in Law, 20(1): 221–256. doi:10.1515/til-2019-0008
  • O’Neil, Cathy, 2016, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, London: Penguin Books.
  • Ochs, Carsten, 2022, Soziologie der Privatheit. Informationelle Teilhabebeschränkung vom Reputation Management bis zum Recht auf Unberechenbarkeit, Weilerswist: Velbrück Wissenschaft. doi:10.5771/9783748914877
  • Okin, Susan Moller, 1989, Justice, Gender, and the Family, New York: Basic Books.
  • –––, 1991, “Gender, the Public and the Private”, in Political Theory Today, David Held (ed.), Stanford, CA: Stanford University Press.
  • Olsen, Frances E., 1991, “A Finger to the Devil: Abortion, Privacy and Equality”, Dissent Magazine, Summer 1991: 377–382.
  • Ortner, Sherry B., 1974 [1998], “Is Female to Male as Nature is to Culture?”, in Woman, Culture and Society, Michelle Zimbalist Rosaldo and Louise Lamphere (eds), Stanford, CA: Stanford University Press, pp. 67–87. Reprinted in Feminism, the Public and the Private, Joan B. Landes (ed.), Oxford: Oxford University Press, pp. 21–44.
  • Orwell, George, 1949, Nineteen Eighty-Four (1984), London: Secker & Warburg.
  • Paci, Patrizia, Clara Mancini, and Bashar Nuseibeh, 2022, “The Case for Animal Privacy in the Design of Technologically Supported Environments”, Frontiers in Veterinary Science, 8: 784794. doi:10.3389/fvets.2021.784794
  • Parent, W.A., 1983, “Privacy, Morality, and the Law”, Philosophy & Public Affairs, 12(4): 269–288.
  • Pateman, Carole, 1989, The Disorder of Women: Democracy, Feminism, and Political Theory, Stanford, CA: Stanford University Press.
  • Pepper, Angie, 2020, “Glass Panels and Peepholes: Nonhuman Animals and the Right to Privacy”, Pacific Philosophical Quarterly, 101(4): 628–650. doi:10.1111/papq.12329
  • Phillips, Anne, 1991, Engendering Democracy, University Park, PA: Pennsylvania State University Press.
  • Pitkin, Hannah F., 1981, “Justice: On Relating Public and Private”, Political Theory, 9(3): 327–352.
  • Posner, Richard A., 1978, “John A. Sibley Lecture: The Right of Privacy”, Georgia Law Review, 12(3): 393–422. [Posner 1978 available online]
  • –––, 1981, “The Economics of Privacy”, American Economic Review, 71(2): 405–409.
  • Pozen, David E., 2015, “Privacy-Privacy Tradeoffs”, University of Chicago Law Review, 83(1): 221–247 (article 10).
  • Prosser, William L., 1960, “Privacy”, California Law Review, 48(3): 383–423.
  • Puri, Anuj, 2021, “A Theory of Group Privacy”, Cornell Journal of Law and Public Policy, 30(3): 477–538.
  • Rachels, James, 1975, “Why Privacy is Important”, Philosophy & Public Affairs, 4(4): 323–333.
  • Rawls, John, 1997, “The Idea of Public Reason Revisited”, The University of Chicago Law Review, 64(3): 765–807. doi:10.2307/1600311
  • –––, 2001, Justice as Fairness: A Restatement, Erin Kelly (ed.), Cambridge, MA: Belknap Press of Harvard University Press.
  • Regan, Priscilla M., 1995, Legislating Privacy: Technology, Social Values, and Public Policy, Chapel Hill, NC: University of North Carolina Press.
  • –––, 2015, “Privacy and the Common Good: Revisited”, in Roessler and Mokrosinska 2015: 50–70. doi:10.1017/CBO9781107280557.004
  • Reiman, Jeffrey H., 1976, “Privacy, Intimacy, and Personhood”, Philosophy & Public Affairs, 6(1): 26–44.
  • –––, 1995, “Driving to the Panopticon: A Philosophical Exploration of the Risks to Privacy Posed by the Highway Technology of the Future”, Santa Clara High Technology Law Journal, 11(1): article 27.
  • Reviglio, Urbano and Rogers Alunge, 2020, “‘I Am Datafied Because We Are Datafied’: An Ubuntu Perspective on (Relational) Privacy”, Philosophy & Technology, 33(4): 595–612. doi:10.1007/s13347-020-00407-6
  • Richards, Neil, 2015, Intellectual Privacy: Rethinking Civil Liberties in the Digital Age, New York: Oxford University Press.
  • Richardson, Megan, 2017, The Right to Privacy: Origins and Influence of a Nineteenth-Century Idea, (Cambridge Intellectual Property and Information Law), Cambridge/New York: Cambridge University Press. doi:10.1017/9781108303972
  • Roberts, Andrew, 2015, “A Republican Account of the Value of Privacy”, European Journal of Political Theory, 14(3): 320–344. doi:10.1177/1474885114533262
  • –––, 2022, Privacy in the Republic, London: Routledge India. doi:10.4324/9781003079804
  • Roberts, Huw, 2022, “Informational Privacy with Chinese Characteristics”, in The 2021 Yearbook of the Digital Ethics Lab, Jakob Mökander and Marta Ziosi (eds.), (Digital Ethics Lab Yearbook), Cham: Springer International Publishing, 9–23. doi:10.1007/978-3-031-09846-8_2
  • Roessler [Rössler], Beate, 2001 [2005], Der Wert des Privaten, (Suhrkamp Taschenbuch Wissenschaft 1530), Frankfurt am Main: Suhrkamp. Translated as The Value of Privacy, R. D. V. Glasgow (trans.), Cambridge: Polity, 2005.
  • ––– (ed.), 2004, Privacies: Philosophical Evaluations, (Cultural Memory in the Present), Stanford, CA: Stanford University Press.
  • –––, 2008, “New Ways of Thinking about Privacy”, in The Oxford Handbook of Political Theory, John S. Dryzek, Bonnie Honig, and Anne Phillips (eds.), Oxford: Oxford University Press, pp. 694–712. doi:10.1093/oxfordhb/9780199548439.003.0038
  • –––, 2015, “Should personal data be a tradable good? On the moral limits of markets in privacy”, in Roessler and Mokrosinska 2015: 141–161. doi:10.1017/CBO9781107280557.013
  • –––, 2017a, “Privacy as a Human Right”, Felix Koch (trans.), Proceedings of the Aristotelian Society, 117(2): 187–206. doi:10.1093/arisoc/aox008
  • –––, 2017b [2021], Autonomie: ein Versuch über das gelungene Leben, Berlin: Suhrkamp. Translated as Autonomy: An Essay on the Life Well Lived, James C. Wagner (trans.), Cambridge, UK: Polity, 2021.
  • Roessler, Beate and Dorota Mokrosinska, 2013, “Privacy and Social Interaction”, Philosophy & Social Criticism, 39(8): 771–791. doi:10.1177/0191453713494968
  • ––– (eds.), 2015, Social Dimensions of Privacy: Interdisciplinary Perspectives, New York: Cambridge University Press. doi:10.1017/CBO9781107280557
  • Rosen, Jeffrey, 2000, The Unwanted Gaze: The Destruction of Privacy in America, New York: Random House.
  • Rotenberg, Marc, Jeramie Scott, and Julia Horwitz (eds), 2015, Privacy in the Modern Age: The Search for Solutions, New York: The New Press.
  • Rubel, Alan, 2015, “Privacy, Transparency, and Accountability in the NSA’s Bulk Metadata Program”, in A. Moore 2016a: 183–202 (ch. 10).
  • Rule, James B., 2007, Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience, Oxford/New York: Oxford University Press. doi:10.1093/acprof:oso/9780195307832.001.0001
  • –––, 2019, “Contextual Integrity and Its Discontents: A Critique of Helen Nissenbaum’s Normative Arguments”, Policy & Internet, 11(3): 260–279. doi:10.1002/poi3.215
  • Sandel, Michael J., 1982, Liberalism and the Limits of Justice, Cambridge/New York: Cambridge University Press. Second edition, 1998. doi:10.1017/CBO9780511810152
  • Sax, Marijn, 2018, “Privacy from an Ethical Perspective”, in van der Sloot and de Groot 2018: 143–172. doi:10.1515/9789048540136-006
  • –––, 2021, Between Empowerment and Manipulation: The Ethics and Regulation of for-Profit Health Apps, (Information Law Series 47), Alphen aan den Rijn, The Netherlands: Kluwer Law International B. V.
  • Scanlon, Thomas, 1975, “Thomson on Privacy”, Philosophy and Public Affairs, 4(4): 315–322.
  • Schneider, Henrique, 2021, “The Institutions of Privacy: Data Protection Versus Property Rights to Data”, SATS, 22(1): 111–129. doi:10.1515/sats-2020-0004
  • Schoeman, Ferdinand David (ed.), 1984a, Philosophical Dimensions of Privacy: An Anthology, Cambridge/New York: Cambridge University Press. doi:10.1017/CBO9780511625138
  • –––, 1984b, “Privacy: Philosophical Dimensions of Privacy”, in Schoeman 1984a: 1–33 (ch. 1). doi:10.1017/CBO9780511625138.002
  • –––, 1992, Privacy and Social Freedom, (Cambridge Studies in Philosophy and Public Policy), Cambridge/New York: Cambridge University Press. doi:10.1017/CBO9780511527401
  • Schwartz, Paul M., 1999, “Privacy and Democracy in Cyberspace”, Vanderbilt Law Review, 52(6): 1607–1702.
  • Shaffer, Gwen, 2021, “Applying a Contextual Integrity Framework to Privacy Policies for Smart Technologies”, Journal of Information Policy, 11: 222–265. doi:10.5325/jinfopoli.11.2021.0222
  • Sharon, Tamar, 2021, “Blind-Sided by Privacy? Digital Contact Tracing, the Apple/Google API and Big Tech’s Newfound Role as Global Health Policy Makers”, Ethics and Information Technology, 23(S1): 45–57. doi:10.1007/s10676-020-09547-x
  • Shvartzshnaider, Yan, Noah Apthorpe, Nick Feamster, and Helen Nissenbaum, 2019, “Going against the (Appropriate) Flow: A Contextual Integrity Approach to Privacy Policy Analysis”, Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 7: 162–170. doi:10.1609/hcomp.v7i1.5266
  • Simitis, Spiros, 1987, “Reviewing Privacy In an Information Society”, University of Pennsylvania Law Review, 135(3): 707–746.
  • Simko, Lucy, Jack Chang, Maggie Jiang, Ryan Calo, Franziska Roesner, and Tadayoshi Kohno, 2022, “COVID-19 Contact Tracing and Privacy: A Longitudinal Study of Public Opinion”, Digital Threats: Research and Practice, 3(3): article 25. doi:10.1145/3480464
  • Skinner-Thompson, Scott, 2021, Privacy at the Margins, Cambridge/New York: Cambridge University Press. doi:10.1017/9781316850350
  • Solove, Daniel J., 2002, “Conceptualizing Privacy”, California Law Review, 90(4): 1087–1156.
  • –––, 2004, The Digital Person: Technology and Privacy in the Information Age, New York: New York University Press.
  • –––, 2008, Understanding Privacy, Cambridge, MA: Harvard University Press.
  • –––, 2011, Nothing to Hide: The False Tradeoff between Privacy and Security, New Haven, CT: Yale University Press.
  • Stahl, Titus, 2016, “Indiscriminate Mass Surveillance and the Public Sphere”, Ethics and Information Technology, 18(1): 33–39. doi:10.1007/s10676-016-9392-2
  • –––, 2020, “Privacy in Public: A Democratic Defense”, Moral Philosophy and Politics, 7(1): 73–96. doi:10.1515/mopp-2019-0031
  • Stanley, Jay and Jennifer Stisa Granick, 2020, “The Limits of Location Tracking in an Epidemic”, ACLU White Paper, 8 April 2020. [Stanley and Granick 2020 available online]
  • Stigler, George J., 1980, “An Introduction to Privacy in Economics and Politics”, The Journal of Legal Studies, 9(4): 623–644.
  • Susskind, Jamie, 2018, Future Politics: Living Together in a World Transformed by Tech, Oxford: Oxford University Press.
  • Tavani, Herman T. and James H. Moor, 2001, “Privacy Protection, Control of Information, and Privacy-Enhancing Technologies”, ACM SIGCAS Computers and Society, 31(1): 6–11. doi:10.1145/572277.572278
  • Taylor, Linnet, 2017, “Safety in Numbers? Group Privacy and Big Data Analytics in the Developing World”, in Taylor, Floridi, and van der Sloot 2017a: 13–36. doi:10.1007/978-3-319-46608-8_2
  • Taylor, Linnet, Luciano Floridi, and Bart van der Sloot (eds.), 2017a, Group Privacy: New Challenges of Data Technologies, Cham: Springer International Publishing. doi:10.1007/978-3-319-46608-8
  • –––, 2017b, “Introduction: A New Perspective on Privacy”, in their 2017a: 1–12. doi:10.1007/978-3-319-46608-8_1
  • Thatcher, Jim, David O’Sullivan, and Dillon Mahmoudi, 2016, “Data Colonialism through Accumulation by Dispossession: New Metaphors for Daily Data”, Environment and Planning D: Society and Space, 34(6): 990–1006. doi:10.1177/0263775816633195
  • Thomson, Judith Jarvis, 1975, “The Right to Privacy”, Philosophy & Public Affairs, 4(4): 295–314.
  • Tran, Cong Duc and Tin Trung Nguyen, 2021, “Health vs. Privacy? The Risk-Risk Tradeoff in Using COVID-19 Contact-Tracing Apps”, Technology in Society, 67: article 101755. doi:10.1016/j.techsoc.2021.101755
  • Tribe, Laurence H., 1990, Abortion: The Clash of Absolutes, New York: Norton.
  • –––, 2022, “Deconstructing Dobbs”, New York Review of Books, 22 September 2022.
  • Tufekci, Zeynep, 2015, “What ‘Free’ Really Costs”, New York Times, New York edition, 4 June 2015, page A25. Appeared in online edition as “Mark Zuckerberg, Let Me Pay for Facebook”.
  • Turkington, Richard C. and Anita L. Allen, 1999, Privacy Law: Cases and Materials, (American Casebook Series), St. Paul, MN: West Group.
  • Véliz, Carissa, 2021, Privacy Is Power: Why and How You Should Take Back Control of Your Data, London: Bantam Press.
  • van de Poel, Ibo, 2020, “Core Values and Value Conflicts in Cybersecurity: Beyond Privacy Versus Security”, in The Ethics of Cybersecurity, Markus Christen, Bert Gordijn, and Michele Loi (eds.), (The International Library of Ethics, Law and Technology 21), Cham: Springer International Publishing, pp. 45–71. doi:10.1007/978-3-030-29053-5_3
  • van der Sloot, Bart, 2017, “Do Groups Have a Right to Protect Their Group Interest in Privacy and Should They? Peeling the Onion of Rights and Interests Protected Under Article 8 ECHR”, in Taylor, Floridi, and van der Sloot 2017a: 197–224. doi:10.1007/978-3-319-46608-8_11
  • van der Sloot, Bart and Aviva De Groot (eds.), 2018, The Handbook of Privacy Studies: An Interdisciplinary Introduction, Amsterdam: Amsterdam University Press. doi:10.1515/9789048540136
  • Vincent, David, 2016, Privacy: A Short History, Cambridge/Malden, MA: Polity Press.
  • Volio, Fernando, 1981, “Legal Personality, Privacy and the Family”, in The International Bill of Rights: The Covenant on Civil and Political Rights, Louis Henkin (ed.), New York: Columbia University Press, pp. 185–208.
  • Wachter, Sandra, 2018, “The GDPR and the Internet of Things: A Three-Step Transparency Model”, Law, Innovation and Technology, 10(2): 266–294. doi:10.1080/17579961.2018.1527479
  • Wacks, Raymond, 2010, Privacy: A Very Short Introduction, (Very Short Introductions), Oxford/New York: Oxford University Press. Second edition 2015.
  • Wang, Hao, 2022, Algorithmic Colonization. Automating Love and Trust in the Age of Big Data, PhD Dissertation, University of Amsterdam.
  • Warren, Samuel D. and Louis D. Brandeis, 1890, “The Right to Privacy”, Harvard Law Review, 4(5): 193–220.
  • Wasserstrom, Richard A., 1984, “Privacy: Some Arguments and Assumptions”, in Schoeman 1984a: 317–332 (ch. 14).
  • Weil, Gordon L., 1963, “The Evolution of the European Convention on Human Rights”, American Journal of International Law, 57(4): 804–827.
  • Weinstein, W.L., 1971, “The Private and the Free: A Conceptual Inquiry”, in Privacy: Nomos XIII, Roland Pennock and John W. Chapman (eds.), New York: Atherton Press, pp. 624–692.
  • Weintraub, Jeff Alan and Krishan Kumar (eds.), 1997, Public and Private in Thought and Practice: Perspectives on a Grand Dichotomy, (Morality and Society), Chicago, IL: University of Chicago Press.
  • Westin, Alan F., 1967, Privacy and Freedom, New York: Atheneum.
  • Whitman, Christina B., 1985, “Privacy in Confucian and Taoist Thought”, in Individualism and Holism: Studies in Confucian and Taoist Values, Donald J. Munro (ed.), Ann Arbor, MI: University of Michigan, Center for Chinese Studies, pp. 85–100. [Whitman 1985 available online]
  • Winter, Jenifer Sunrise and Elizabeth Davidson, 2019, “Big Data Governance of Personal Health Information and Challenges to Contextual Integrity”, The Information Society, 35(1): 36–51. doi:10.1080/01972243.2018.1542648
  • Woolf, Virginia, 1929, A Room of One’s Own, Richmond: Hogarth Press.
  • Wu, Tim, 2015, “Facebook Should Pay All of Us”, The New Yorker web site, 14 August 2015. [Wu 2015 available online]
  • Young, Iris Marion, 1990, Justice and the Politics of Difference, Princeton, NJ: Princeton University Press.
  • –––, 2004, “A Room of One’s Own: Old Age, Extended Care and Privacy”, in Rössler 2004: 168–186.
  • Zuboff, Shoshana, 2019, The Age of Surveillance Capitalism: The Fight for the Future at the New Frontier of Power, New York: PublicAffairs.

Rights/Regulations/Decisions

Copyright © 2023 by
Beate Roessler <b.roessler@uva.nl>
Judith DeCew <JDeCew@clarku.edu>
