A Theory of Privacy
ABSTRACT
In the age of Big Data Analytics and COVID-19 Apps, the conventional conception of privacy, which focuses excessively on the identification of the individual, is inadequate to safeguard the identity and autonomy of the individual. An individual’s autonomy can be impaired and her control over her social identity diminished even without infringing the anonymity surrounding her personal identity. A century-old individualistic conception of privacy, designed to safeguard a person from unwarranted social interference, is incapable of protecting her autonomy and identity when she is targeted on the basis of her interdependent social and algorithmic group affiliations. To overcome these limitations, in this paper I develop a theoretical framework in the form of a triumvirate model of the group right to privacy (GRP), which is based on privacy as a social value (Pv). An individual has an interest in protecting her social identity arising out of her participation in social groups. The panoptic sorting of individuals by Big Data Analytics for behavioural targeting purposes gives rise to epistemic bubbles and echo chambers that impede the formation of an individual’s social identity.
I construct the formulation of GRP1 to protect an individual’s interest in her social identity and her socially embedded autonomous self. Thereafter, I emphasize an individual’s right to informational self-determination and against algorithmic grouping in GRP2. Lastly, I highlight instances where an organized group may be entitled to privacy in its own right as GRP3. I develop a Razian formulation to argue that the constant surveillance and monetization of human existence by Big Data Analytics is an infringement of individual autonomy. I highlight that the violation of GRP subjects an individual to behavioural targeting, including hyper-targeted political advertising, and distorts her Weltanschauung. As regards the COVID-19 Apps, I assert that the extraordinary circumstances surrounding the pandemic do not provide an everlasting justification for reducing the identity of an individual to a potential disease carrier. I argue that ambivalence regarding the existence of surveillance surrounding an individual’s social identity can leave her in a perpetual state of simulated surveillance.
I further assert that it is in the long-term best interests of the Big Tech corporations to respect privacy. In conclusion, I highlight that our privacy is not only interdependent in nature but also cumulatively interlinked: it increases in force with each successive protection. The privacy challenges posed by COVID-19 Apps have helped us realize that while limited exceptions to privacy may be carved out in grave emergencies, there is no justification for round-the-clock surveillance of an individual’s existence by Big Data Analytics. Similarly, the threat to privacy posed by Big Data Analytics has helped us realize that privacy has wrongly focused on the distinguishing aspects of the individual. It is our similarities that are truly worth protecting. In order to protect these similarities, I formulate the concept of mutual or companion privacy, which counter-intuitively states that in the age of Big Data Analytics we have more privacy together than individually.
Puri, Anuj, A Theory of Privacy (September 3, 2020). Cornell Journal of Law and Public Policy, forthcoming.