The purpose of the present article is to offer some lines of thought on the question of identity as it has been extensively redefined in the information age. We will approach it from a socio-philosophical perspective, by way of the questions it raises concerning that part of the secret that fuels the processes of subjectivation. In particular, we will look into the construction of identity in the context of hypermodern societies characterized by the widespread use of digital technologies. In our view, this construction takes place on two distinct levels: the first amounts to ensuring, by its very development, something like a right to secrecy, while the second is grounded in the need to incorporate a certain interplay with difference into the process that drives it. What progress has there been in respecting these two levels of subjectivation at a time when transparency is tending to become a social norm, and even an injunction? By what means, through what arguments, can we still resist that injunction? Above all, with the help of what critical tools?
The Hypermodern Individual, Between Digital Addiction and the Protection of Secrecy
Our voluntary servitudes take on aspects that are, on the face of things, far removed from what Etienne de La Boétie describes in his Discourse on Voluntary Servitude. A change in perspective of this kind is justified insofar as, firstly, the nature of domination is no longer just political but involves logics of techno-industrial power. Surveillance capitalism exists. And although forms of voluntary servitude occur, for example, in accepting technologies that allow for the increased surveillance of individuals, these forms do not simply belong to the register of domination as it has traditionally been thematized. They even involve new forms of self-construction. As Peter Burgess emphasizes, the symbolic ecosystem generated by social media depends heavily upon a new system of self-shaping: "Our affirmation of ourselves is dependent on our affirmation of all those around us – the affirmation of a necessarily incomplete, inauthentic, manufactured representation of ourselves, whose inauthenticity is increasingly the vector of our selfhood." These networked modes of existence, which help give the digital subject a certain consistency, thrive without necessarily implying an immediate awareness of the harmful, even toxic, effects of these practices of online sociability, especially when the criteria of dangerousness or normality change in a given society. We may then be criticized for the slightest comment, with the ethical and political implications that can be imagined.
Beyond the economy of affects that underlies self-presentation, the possibilities of intervening in the handling of our personal data are limited. As a result, a culture of secrecy that would seek to maintain an irreducible distance between ourselves and others becomes increasingly unlikely. Digital infrastructures are such that if we use them regularly, we are in fact submitting ourselves to something like a regime of transparency: in daily life, it becomes clear that the plethora of information that is gathered makes it impractical to obtain the user’s consent each time. What’s more, the changes in data collection techniques, as demonstrated by the advent of the "internet of things," have led to a profusion of sensors that collect data without the user’s knowledge, e.g. video surveillance combined with facial recognition in urban spaces. More generally, we may wonder about the mass of information that operators acquire on the basis of this data. It constitutes one layer of the subject’s identity, a layer whose content and whose many possible uses are completely unknown to the person from whom it derives.
In this context, one prevailing tendency among those involved – whether government or private interests – is the desire to gain possession of the sum total of an individual’s digital traces, thereby giving in to the fantasy of reducing that individual to an increasingly complete set of attributes. In this new regime of power, the visible is reduced to what can be captured as data, amounting to a conception in which the "innermost being" of individuals would be immediately available. The new regimes of "algorithmic governmentality" are based partly on this illusion. Such a reduction of humans to their traces could not be defended within the humanities and social sciences, which know how to distinguish the one from the other, but it nevertheless remains operational where industrial policy is involved.
Moreover, this state of affairs does not come into existence without raising key questions from an ethical or political perspective. What happens to the exercise of free will in environments where individuals have no access to the economy that structures itself around their traces? Is the will to maintain control over a subjective domain that would remain secret – i.e. irreducible to the gaze of others, completely sealed off from any outside gaze or instance of power – still possible, in fact? Does it still make sense?
Answering these questions first requires the recognition that our technological circumstances have created a great deal of porosity, an unlimited accessibility to technical frameworks even when they are meant, as Jacques Derrida has written, for "keeping secrets, for encoding and ensuring secrecy." Information systems generate unprecedented transparency despite all the measures of protection that can be taken. In addition to these technical and material constraints, the discursive regimes that structure themselves around surveillance are legion. In the recent history of our so-called "democratic" societies, we know that there are situations of terror that contribute to encouraging certain semantic misappropriations, as well as a form of domination that discourse helps legitimize. In the United States, for example, the USA PATRIOT Act was conceived to serve as a reminder that terrorism can crop up anywhere, that no one is really immune to this plight, and that consequently the word "exception" itself could lose all meaning. Here the exception tends to justify, among other things, the development of technologies that prove ever more intrusive, damaging the very possibility of secrecy. The effects of these discourses go well beyond the state of exception as such, and definitively install the very idea of having to accept the transformation of common law and individual liberties, by making it appear as though security must be handled through logics of surveillance. A symbolic order is expressed in the discourses accompanying the technologies of surveillance, and this symbolic order influences not only the social acceptance of a given technology, but also the way in which we relate it to ourselves.
The tendencies we experience today relating to the unification of digital identity, the widespread use of biometric technologies and the demand for absolute transparency run counter to the existential dynamics that a subject needs to draw on in order to construct themselves. Identity is defined as ipse, by the very fact of being open to other possibilities. And yet the dominant ideologies that advocate total surveillance make our individual lives increasingly transparent – on a factual level, if not a structural one – by adhering to a naive and simplified view of cybernetics as part of this process.
On this level, we may note a certain schizophrenia in the way that we confront questions relating to the protection of personal data or the respect of private life. As information technologies expand and make us ever more visible, in keeping with the values that have sustained the prevailing informational paradigm, it seems urgent to raise questions regarding the need for opacity and the right to secrecy. But these questions are, in fact, detached from the principles that have historically contributed to structuring the connected world in which we live today, with its instituted image repertoire. As a result, the task of problematizing the logics inherent in total surveillance becomes all the more complex, for if we would like, on the one hand, to regain control of the systems that capture and process our data, we are, on the other, accustomed to handing over parts of ourselves in every moment of our daily life. With the social valorization that such exposure induces most of the time, the regime of transparency pays close attention to the most ordinary situations of existence and does not limit itself to those that stand out.
In this respect, we have cause for concern in the face of the massive development of technologies that provide a nearly absolute ubiquity to some parties, saturating public and private spaces, "pushing to its limit the coextensivity of the political and the police domain." The current development of technologies is restructuring space to such a degree that one’s private life at home is always at risk of being controlled or threatened. In this way, with the development of increasingly sophisticated technologies, the police has, or can have, its "detectors […] in our internal telephones, our e-mails, and the most secret faxes of our private lives, and even of our absolutely intimate relationships with ourselves." This regime of transparency is all the more obvious today with the growing role of screens in our lives. The process that Derrida was already describing in the area of political power is just as valid today for industrial power, notably the power of the GAFAM companies that massively exploit their users’ data. In the face of the complex interplay of those involved in the logics of surveillance (whether for political or economic ends), what alternatives can still be proposed? Can a culture of secrecy, one that could contribute to an enhanced experience of sociability itself, still survive today?
If we get the feeling that a resistance of some kind needs to be organized in the hypermodern era, it must happen with a view to countering the temptation of technological determinism, a temptation we would like to escape on a philosophical level but which definitely persists on a sociological level. This temptation emanates from a long history that still largely structures our modes of thought. One possible direction for fighting against this determinism is the development of a culture of interpretation within our complex technological environments. We should give subjectivities the means to open a line of questioning regarding the meaning of coexistence in technological contexts that are now constantly changing at an incredible pace. And the task of developing such critical thinking is immediately made more complex by the fact that, in many cases, technological systems are supposed to bring us more security, comfort and seamlessness. A large part of their spread in our existences, and of our willing submission to them, is at play here (running the gamut from the often excessive use of our smartphones to the internet of things, biometrics and facial recognition).
In order to put up the most resistance to these tendencies toward the creation of a fixed, predictable identity (more a matter of the idem than the ipse), we should defend the necessary conditions for the development of the individual, who cannot be reduced to their digital traces. Essentializing subjectivity in the form of a digital identity means eliminating selfhood, which is irreducible to any typology; it means liquidating the tragic subject of politics, friendship or psychoanalysis – the subject that constructs itself within a narrative, with its fragments, its shadowy zones, its shares of secrecy – to the benefit of the unified subject of an economic agent. In this respect, the defense of the balance preserving the individual’s capacity to act, particularly through the preservation of the separation between contexts (governmental, professional, private, intimate or medical), is a cornerstone of personal autonomy.
Privacy is thus understood not as a simple public / private dichotomy imposed a priori, but as the respect of norms that inhere in each context: the flow of information must respect the contexts of use, as each relational context possesses its own norms, whether stipulated or not, which correspond to the expectations of the users as to the way in which the information will circulate. These boundaries are not established once and for all: they can be renegotiated according to the situations, the participants and the technologies. And when these dynamic norms are infringed, for example when the GPS location of an employee is sent to their employer during a weekend, contextual integrity is damaged: the individual then gets the feeling that their private life has been violated. But this vision is not limited to the mere recognition of the need for contextual integrity. It also undertakes an attempt at reevaluating the role of privacy, an attempt whose ambition is to grasp its impact on social life as a whole: "[P]rivacy as contextual integrity is a complex, delicate web of constraints on the flow of personal information that itself brings balance to multiple spheres of social and political life." And yet, for all that, do information systems actually allow for a high level of porosity between contexts, or for the cultivation of zones of opacity? Above all, can secrecy be reduced to the control of personal data?
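By way of illustration, the logic of contextual integrity can be given a deliberately schematic form: a flow of information is legitimate only if it matches a transmission norm of the context in which it occurs. The sketch below is a minimal toy model, not part of Nissenbaum's own formalism; the contexts, roles and norms it encodes are hypothetical examples chosen to mirror the employee-GPS scenario above.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Norm:
    """A transmission norm: in a given context, a given type of data
    may flow from a sender role to a recipient role."""
    context: str
    data_type: str
    sender: str
    recipient: str

# Hypothetical norms: location data may reach the employer within the
# "work" context only; a diagnosis circulates in the "medical" context.
NORMS = {
    Norm("work", "gps_location", "employee", "employer"),
    Norm("medical", "diagnosis", "patient", "doctor"),
}

def respects_contextual_integrity(context, data_type, sender, recipient):
    """A flow respects contextual integrity only if it matches a norm
    of the context in which it actually takes place."""
    return Norm(context, data_type, sender, recipient) in NORMS

# The article's example: an employee's GPS location sent to the employer
# during a weekend, i.e. within the "private" context, violates the norms.
print(respects_contextual_integrity("work", "gps_location", "employee", "employer"))
print(respects_contextual_integrity("private", "gps_location", "employee", "employer"))
```

The point of the sketch is that the same data, sender and recipient can be legitimate in one context and a violation in another: what is evaluated is the flow, not the data in isolation.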
Secrecy, Irreducible to Control
To give an overall sense of the means by which the individual can be "put in the center" of the flow of data and have some agency regarding its circulation, we shall here introduce the paradigm of consent and control. In our view, this paradigm is the most general expression of both the principles of the rights relative to digital identities and the technical means made available to individuals.
What we mean by "consent" is not only consent as such, in the limited sense in which the law understands it, but also, prior to processing, the notice that participates in the idea of an informed choice on the user’s part. This is the meaning of the expression "notice and consent." In our conception of "consent," we also include the idea of the purpose of processing as enshrined in law, insofar as what is at stake is consent to a given purpose, which the user presumably knows.
What we mean by "control" is the theoretical presupposition that actual means of controlling the flow of data, via the selective disclosure of attributes for example, help restore an individual’s agency over their own construction. As an example, the need for this control arises when the collection of personal information is carried out "at the source," as with the systems of national digital identity used by various European countries. But the paradigm of consent and control also remains largely relevant for describing the case where data that has not been "filtered" at the source (or cannot be filtered due to its mandatory nature) is processed and cross-referenced in the databases of the "data controllers." We can assume in this case that the user is potentially aware of the very existence of data concerning them, for example in the case of PNRs (passenger name records) – all of the information held by the airlines and made available to the government on request – or in that of social media in general. In this last case, control can be exerted through access to and correction of information concerning the user. With increasing frequency, this case calls for the reinforcement of the means of control made available to the user.
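The idea of selective disclosure "at the source" can itself be sketched schematically. The fragment below is a simplified illustration under assumed data (the attribute names and values are invented for the example, and real schemes rely on cryptographic attestation rather than plain dictionaries): the holder releases only the attributes a service requests, or a derived claim such as an age condition, without exposing the underlying birthdate.

```python
from datetime import date

# Hypothetical identity attributes held "at the source".
identity = {
    "name": "A. User",
    "birthdate": date(1990, 4, 1),
    "nationality": "FR",
}

def disclose(identity, requested):
    """Selective disclosure: release only the attributes a service requests."""
    return {k: v for k, v in identity.items() if k in requested}

def disclose_age_over(identity, threshold, today):
    """Derived claim: attest to an age condition without revealing the birthdate."""
    b = identity["birthdate"]
    age = today.year - b.year - ((today.month, today.day) < (b.month, b.day))
    return {f"age_over_{threshold}": age >= threshold}

# A service learns the nationality, or the age condition, and nothing more.
print(disclose(identity, {"nationality"}))
print(disclose_age_over(identity, 18, date(2021, 11, 15)))
```

Even in this toy form, the asymmetry the paragraph describes is visible: filtering is only possible where disclosure passes through the holder; once the data sits unfiltered in a controller's database, control shrinks to after-the-fact access and correction.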
This tendency has been reinforced by the implementation of the GDPR (General Data Protection Regulation) and the right to be forgotten. The purpose of these measures is to establish the necessary conditions for the exercise of consent and control. Nevertheless, a difficulty remains in the problematic interaction between the positive definition of privacy and the establishment of control measures by the individual. As paradoxical as it might seem, the overdetermination of the private sphere may also lead to the tightening of surveillance and control over the individual. We may illustrate this idea with two examples.
The first relies on the fact that giving the individual putative control over their personal data does not necessarily lead to their full autonomy. Control is not just an empowerment value that would merely grant the individual more autonomy: it is also a prescriptive value, which indicates what the individual must do. In reality, the more the realm under the subject’s control is circumscribed – whether this control is real or not – the more their actions are themselves subject to visibility and control. What is essential here is not just the permanent suspicion that marks our society, in which the individual is only considered in two aspects: as a consumer, or as a criminal. In our view, what is much more important is the very approach instituting the regulation of what must or must not be controllable by the subject, an approach that has the effect of defining the realm that the subject is capable of considering as their own, as part of their private life. Through the regulation of what the subject can control, the possible and its horizon of meaning are suggested, operated on, or even positively delimited, in keeping with Foucault’s characterization of neoliberal societies.
Paradoxically, in order to move beyond user control and allow for the existence of a space of play in which subjectivity may come into being – a space that can help counter both the "invasive" profusion of rules and the exhaustive collection of information – the seamlessness of today must be resisted by promoting what Julie E. Cohen calls "semantic discontinuity." Here, semantic discontinuity refers not just to the separation between "contexts," but more fundamentally to the need for spaces of non-determination, to the absence of overlap between existential territories. Semantic discontinuity is the opposite of seamlessness: "it is a function of interstitial complexity within the institutional and technical frameworks that define information rights and obligations and establish protocols for information collection, storage, processing, and exchange. Interstitial complexity permeates the fabric of our everyday, analog existence, where it typically goes unappreciated. Its function, however, is a vital one. It creates space for the semantic indeterminacy that is a vital and indispensable enabler of the play of everyday practice."
Here, we will mention three aspects of this seamlessness that bear the mark of the modern world’s ambiguity: the seamless interaction between humans and machines, an enduring theme in science fiction, heralded in the industrial world by Apple; the phenomenon of convergence that is acted out on the level of network architecture; and the growing body of laws on data transfer. Regarding this third aspect, we may note among other examples the inherent ambiguity of the debate on data portability in France and Europe: the very ambition to give the user more control by means of standardized formats makes it technically feasible for other parties to exchange this same data, ever more seamlessly and beyond all control. We then start to appreciate how information technologies, and their accompanying discourses, tend to erode semantic discontinuity.
The second example concerns the way that the discourse on control technologies influences their perception, and how a certain discourse on the protection of individual liberties carries the paradoxical risk of becoming complicit in the control of the subject. Here, the most telling demonstration of this is probably in biometrics and the recent developments in the field. For a long time, it was customary to distinguish, as the CNIL did until 2016 for example, between the family of trace biometrics on the one hand, and that of trace-free and intermediate biometrics on the other. According to this distinction, fingerprints are categorized as trace biometrics insofar as the finger leaves traces in the physical environment, and these traces can be collected and processed for the purposes of identification. In particular, this technique leaves the door open to misuse and "remains risky in terms of identity theft," which led the CNIL to regulate it fairly strictly. The techniques of the second family, i.e. trace-free (usually the venous network of the fingers) and intermediate (the iris, the shape of the face), are then treated with less caution, since the risk mentioned earlier is less of a concern. In this respect, this second family could be seen as more neutral regarding privacy, and more "protective" of the individual.
In our view, such an interpretation already entailed a fundamental ambiguity even before 2016. In the transition from the first family (trace biometrics) to the second, two things seemed to be at play. First, the tendency was to move further inside the body, even while leaving the impression that the process was less invasive due to the absence of contact. The motif of seamlessness and ergonomic interaction prevailed. But the interiority of the body itself now ends up under ever closer examination for the purposes of identification, which no longer takes place at the boundary between the world and my body, but within my body. With the argument of protection as an alibi, identification then in fact becomes ever more reliable and technically infallible. Although it is still possible to avoid identification through fingerprints by burning one’s fingers, a practice seen among asylum seekers, it becomes harder to imagine a similar strategy in the case of iris recognition without escalating the level of mutilation. Thus the protective discourse paradoxically adapts to this twofold movement, both into the body and toward the infallibility of identification.
The distinction between trace and trace-free biometrics was abandoned by the CNIL in 2016. Two new families now replace the former distinction. In the first (1), the biometric template is stored on a centralized server (it is then referred to as "in database"); it therefore receives a higher level of oversight from the CNIL. In the second, either (2a) access to the template "in database" is protected by a "secret" (usually a secret code associated with a PKI architecture) known only to the user, or (2b) the template is stored and verified locally on an ID card held only by the user (referred to as "match on card"). The CNIL subjects this second family to less scrutiny, and it is clearly preferable as far as the non-disclosure and non-centralization of biometric data are concerned. But in our view, the paradox of the pre-2016 distinction is still essentially present: with the argument of protection as an alibi, the consequences of the new procedures actually become increasingly problematic. For example, in case (2a), if the user loses or reveals the code of the card, who is responsible for possible access to the template stored on the server? Are the revocation of that access and the accompanying issues in the same category as losing the code for an ordinary debit card? Will there be insurance associated with it, as there is with payment methods, individualizing the risk still further? In case (2b), doesn’t the problem become even worse in a way? Now it is up to the user to carry with them at all times a digital appendage that validates the status of their body as identified – and, ultimately, to consider that card (2b) as just as "private" as their body. Failing that, they could be considered as without an identity, refused access to a particular service, or called upon to prove that they have not committed some wrong in order for their identity to be "reestablished."
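The taxonomy just described can be summarized in schematic form. The sketch below simply encodes the three architectures and the asymmetry of oversight as the text presents them; the names and the string labels are our own shorthand, not the CNIL's terminology.

```python
from enum import Enum

class BiometricArchitecture(Enum):
    """The CNIL's post-2016 families, as summarized above."""
    IN_DATABASE = "1"               # template kept on a centralized server
    IN_DATABASE_WITH_SECRET = "2a"  # server-side template unlocked by a user-held secret
    MATCH_ON_CARD = "2b"            # template stored and verified on the user's own card

def cnil_scrutiny(arch):
    # Family (1) receives a higher level of oversight from the CNIL;
    # both variants of family (2) receive less.
    return "higher" if arch is BiometricArchitecture.IN_DATABASE else "lower"

print(cnil_scrutiny(BiometricArchitecture.IN_DATABASE))
print(cnil_scrutiny(BiometricArchitecture.MATCH_ON_CARD))
```

The schema makes the paradox of the paragraph visible: what determines the level of scrutiny is where the template lives and who holds the secret, so the "protective" architectures (2a, 2b) are precisely those that transfer the burden of custody, and the risk, onto the user.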
Family (2) is described by the CNIL as "biometric systems allowing people to retain control of their biometric template." Obviously, control always goes hand in hand with greater responsibility. But mandatory "control," even when reasons of protection require it, always goes hand in hand with the overdetermination of the private sphere and a greater potential cost – and in this case, the cost is clearly borne by the user. Consequently, privacy becomes a mere injunction to use biometrics "well," for example – in other words, to use it all the same. In this way, the fundamental ambiguity of these technologies is clear: the more we progress in defining the private sphere and what must be protected, the more this sphere can be substantively delimited, and the more access it provides to strategies of power (whether political or industrial). As Gilles Deleuze and the psychoanalyst Félix Guattari wrote in this respect, "There is always a perception finer than yours, a perception of your imperceptible, of what is in your box." There is always someone to secretly perceive the secret.
Although hypermodern societies produce subjects concerned about their private lives, by encouraging them to adopt a certain culture of secrecy, these subjects often do so in what is ultimately a fairly conformist fashion. And yet, as Sami Coll explains, it may be harmful to consider surveillance only as a threat to private life. Doing so could end up strengthening surveillance itself. Paradoxically, privacy and surveillance are not antagonistic: "rather, they seem to work together in the deployment of the surveillance society. The more that is said about privacy, the more consumers focus on their individuality, […] which shapes them as the subjects of control." A situation of this sort leads fairly naturally to a kind of weakening of the meaning of secrecy, limiting it to either a protective or restrictive experience. For these reasons, and in order to move beyond such a conception of the experience of secrecy, other conceptual underpinnings must be found. In our view, the concept of "existential territory," developed by Guattari, seems well placed to support this attempt at conceiving humanity, by fully taking on the complexity as well as the ambiguities that define it. The multiplicity of existential territories through which the subject passes corresponds to a multiplicity of meanings or frames of reference. In many respects, the concept of "existential territory" enriches the interpretation of contemporary phenomena, essentially because the notion of territory involves the notions of a realm, a space of play, a journey, or an approach possibly initiated by the subject. Emphasis is thereby placed on the idea that the subject is not merely subjected to a profusion of horizons of meaning, but that it asserts itself above all in an active relationship with how those horizons are constituted. Finally, territories may overlap, having the subject participate in different territories, simultaneously or successively.
The whole question is then to know whether the subject is the source of that overlapping or whether it suffers the consequences of it. And yet although our era is structured around the watchwords that accompany new forms of subservience, the way in which we apprehend information technology should be able to create the conditions of possibility for a permanent reinvention of self, in a relationship to secrecy that can remain irreducible, constantly reinventing itself as well.
Etienne de la Boétie, Discours de la servitude volontaire (Paris: Flammarion, 1993) [Discourse on Voluntary Servitude, trans. James B. Atkinson and David Sices (Indianapolis: Hackett, 2012)].
On this subject, see Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs, 2018).
Peter Burgess, "I am not the sum of my personal data," theconversation.com, 18 June 2018, Web, 15 November 2021.
Jacques Derrida, De l’hospitalité: Anne Dufourmantelle invite Jacques Derrida à répondre (Paris: Calmann-Lévy, 1997), 61 [Of Hospitality: Anne Dufourmantelle Invites Jacques Derrida to Respond, trans. Rachel Bowlby (Stanford CA: Stanford University Press, 2000), 65].
The use of certain terms, particularly in the United States, such as "Homeland Security," seems to indicate that the darkest times of recent history may return, on a semantic level at least. See Robert Harvey, "Un monde sous contrôle: Retour sur l’USA Patriot Act," in Technologies de contrôle dans la mondialisation: enjeux politiques, éthiques et esthétiques, ed. Pierre-Antoine Chardel and Gabriel Rockhill (Paris: Kimé, 2009), 50-51.
Robert Harvey and Hélène Volât, USA Patriot Act: De l’exception à la règle (Paris: Lignes & Manifestes, 2006), 119. See also Pierre-Antoine Chardel, Robert Harvey and Hélène Volât, "Un USA PATRIOT Act à la française? ou les inquiétantes résonances d’une loi," Lignes 48 (October 2015): 105-124; Rada Ivekovic, "Terror/isme comme politique ou comme hétérogénéité: Du sens des mots et de leur traduction," Rue Descartes 62, www.cairn.info, 2008, Web, 15 November 2021, 68-77.
On this question, we refer the reader to Philippe Breton, Le Culte de l’Internet: Une menace pour le lien social? (Paris: La Découverte, 2000) [The Culture of the Internet and the Internet as Cult: Social Fears and Religious Fantasies, trans. David Bade (Duluth MN: Litwin Books, 2011)], as well as Pierre-Antoine Chardel, L’Empire du signal: De l’écrit aux écrans (Paris: CNRS Éditions, 2020).
Derrida, Force de loi: Le fondement mystique de l’autorité (Paris: Galilée, 1994), 107 ["Force of Law: The ‘Mystical Foundation of Authority’," trans. Mary Quaintance, in Acts of Religion, ed. Gil Anidjar (New York, London: Routledge, 2002), 279].
Derrida, De l’hospitalité, 65 [Of Hospitality, 69].
On this matter, see Georg Simmel, "The Secret and the Secret Society," in Sociology: Inquiries into the Construction of Social Forms, Volume 1, trans. and ed. Anthony J. Blasi, Anton K. Jacobs, and Mathew Kanjirathinkal (Leiden / Boston: Brill, 2009), 307-362.
Here and in the pages that follow, we rely on elements developed in Armen Khatchatourov, Pierre-Antoine Chardel, Andrew Fenneberg and Gabriel Périès, Digital Identities in Tension: Between Autonomy and Control (London: ISTE; Hoboken NJ: John Wiley & Sons, 2019).
"Privacy": all instances of the word are in English in the text [translator’s note].
Helen Nissenbaum, Privacy in Context: Technology, Policy and the Integrity of Social Life (Palo Alto: Stanford University Press, 2010), 128.
"Notice and consent": in English in the text [translator’s note].
In this respect, the case of Alicem in France is as emblematic as it is ambiguous on the ethico-political level. See Charlotte Jee, "France plans to use facial recognition to let citizens access government services," MIT Technology Review, www.technologyreview.com, 03 Oct 2019, Web, 15 November 2021.
"Passenger Name Record": in English in the text [translator’s note].
See "Regulation (EU) 2016/679 of the European Parliament and of the Council," Official Journal of the European Union, eur-lex.europa.eu, 27 April 2016, Web, 15 November 2021.
Louise Merzeau, "Présence numérique: les médiations de l’identité," Les Enjeux de l’information et de la communication 2009/1, www.cairn.info, 2009, Web, 15 November 2021, 79.
Julie E. Cohen, Configuring the Networked Self: Law, Code, and the Play of Everyday Practice (New Haven: Yale University Press, 2012).
"Seamlessness": in English in the text [translator’s note].
Cohen, Configuring the Networked Self, 239.
La Commission nationale de l’informatique et des libertés, the French data protection authority and national privacy watchdog [translator’s note].
Jean-Marc Manach, "Les ‘doigts brûlés’ de Calais," Le Monde Diplomatique, www.monde-diplomatique.fr, 25 September 2009, Web, 15 November 2021.
Gilles Deleuze and Félix Guattari, Mille Plateaux (Paris: Minuit, 1980), 351 [A Thousand Plateaus: Capitalism and Schizophrenia, trans. Brian Massumi (Minneapolis: University of Minnesota Press, 1987), 287].
Sami Coll, "Power, knowledge, and the subjects of privacy: understanding privacy as the ally of surveillance," Information, Communication & Society 17.10, www.tandfonline.com, 22 May 2014, Web, 15 November 2021.
Félix Guattari, Chaosmose (Paris: Galilée, 1992) [Chaosmosis: An Ethico-Aesthetic Paradigm, trans. Paul Bains and Julian Pefanis (Bloomington, Indianapolis: Indiana University Press, 1995)].