“Life After Privacy”: Thoughts on Big Data (1)

A couple of months ago, while attending a conference on social philosophy, a participant mentioned in passing that she needed to recruit panelists for an Author-Meets-Critics session for a book on the ethico-political ramifications of Big Data. The book was Firmin DeBrabander’s Life After Privacy: Reclaiming Democracy in a Surveillance Society (Cambridge, 2020); the session was to take place at the APA Central Division meeting this February.

As an (erstwhile?) philosopher who now works in Big Data, I thought it’d be interesting to give it a shot, so I volunteered. For the next couple of posts, then, I’m going to subject you to my thoughts on Big Data (privacy, etc.), thoughts I’ve been piecing together over the eighteen months or so that I’ve spent in the industry. I thought I’d begin in this post with a neutral summary of DeBrabander’s book, move in later posts to some criticisms, and maybe offer some thoughts on what one learns while working in the industry that can’t be learned as a spectator.

***********

DeBrabander begins with a well-documented fact that by now should be common knowledge: Big Data, meaning the data-mining and data-harvesting branches of the modern corporation and modern state, has within just a few decades subverted almost all of the norms of privacy that preceded the rise of the Internet, and created a surveillance regime of unprecedented scope and power. The imperative within Big Data is to maximize data collection from ostensibly private sources, whether for purposes of revenue-maximization or security, by virtually any means necessary. If they can get your data by asking, they’ll ask; if they can take it without asking, they won’t bother asking. But one way or another, they want your data, whether to monetize it or to keep tabs on your activities, and one way or another, they’ll find a way to get it, no matter how you try to hide, or how you try to resist. In the digitized world, data is knowledge, and knowledge is power. Power, of course, is an end-in-itself to be piled up indefinitely, like the data itself.

I won’t belabor the details of DeBrabander’s story here, which relies on well-known work by Shoshana Zuboff, Cathy O’Neil, and Zeynep Tufekci, among others.* The bottom line is that, through the (literal) devices of Big Data, your privacy is either a dead letter or on its way to becoming one. Every move you make, and every breath you take, leaves a digital footprint that someone, somewhere, is harvesting and monetizing. You may regard yourself as benefited by the convenience you enjoy and the opportunities for self-expression you get as a result, but it’s also likely that you have no idea how many liberties Big Data has taken with your “private life,” and how little privacy you now enjoy.

How did this happen? DeBrabander is only too happy to blame the victims. Our predicament (he thinks) might be likened to that of the Biblical Esau: we sold our privacy for the digital equivalent of a mess of pottage.** The story he tells goes something like this: We live in societies of unbridled preference-satisfaction subject to the imperatives of immediate gratification. Knowing this and capitalizing on it, Big Data gave us an iterated series of trade-offs, over decades, pitting convenience or self-expression against privacy. The pattern is by now boringly commonplace: you want something you see online; the host website asks you to click “yes” to permit the use of cookies or location tracking etc.; you click “yes,” and get what you want; Big Data harvests your data; in consequence, you lose a bit of privacy.

Iterate this process maybe a dozen times a day over a couple of decades, and you’ve likely given whole tranches of personal data away via just one data-harvesting process. There are, alas, dozens and dozens of others. According to DeBrabander, we’ve gotten so used to engaging in this Esau-type Transaction (my phrase)–give your privacy away for pottage–that whenever we see something shiny and new online that we want right now, we habitually click “yes,” and, without a second thought, relinquish yet another bit of our privacy.

Each time you click “yes” (or click at all), it looks as though you’re optimizing your time and resources to get what you want at a minor, even indiscernible cost in privacy. Wouldn’t it be stupid, after all, to stop and ask on each occasion whether or not to accept cookies? Or whether to read a 20-page Terms of Service agreement? Or whether to write away for the hard copy version of an insurance form? How much privacy could you possibly be giving away with one mouse click, anyway? Or two? Or 12*365? Or 20(12*365)? Or by turning on the location finder on your phone? Or turning it off? Or not knowing the difference?

And yet, like the Biblical Esau, the result of all that clicking is that we’re left with mere pottage masquerading as rational optimization under informational constraints.  There’s so much cool stuff out there–gadgets, information, porn.  Where’s the profit in privacy when you’re dying for stuff? The profit, of course, is Big Data’s, not yours, and lies in monetizing the data you’ve given away while selling you whatever bits of pottage you were dying to have. In short, little by little, we gave away our own privacy. Too late now: like Esau’s birthright, it’s good and gone. 

In a sober moment, it might occur to us that Big Data is a threat to us, and that we’ve given too much away to it. What to do? There are, essentially, two options: either we explicitly fight for privacy under that description, or we surrender our privacy and learn to live without it. DeBrabander makes an extended, albeit reluctant, case for the latter option.  His argument has two halves, one practical, and one more philosophical.

The practical half of DeBrabander’s argument tells us that resistance to Big Data has at this point become futile. For one thing, there’s nothing left to fight about: Big Data already has our data, and already has the means by which to acquire whatever is left. So there’s really nothing left to salvage. For another, there are no weapons left with which to fight: there’s no plausible or viable mechanism by which to hold the line against Big Data, much less to get back the privacy we’ve lost. The only available weapons are government regulation and non-governmental political activism, alone or in combination. But (on DeBrabander’s reading) Big Data can easily defeat either of them. Regulation moves too slowly to catch up to Big Data’s work-arounds, and contemporary activism, heavily reliant on the Internet, can easily be neutralized by the very Internet providers on which it relies. So neither route is likely to work.

This practical impasse ought to get us to rethink privacy from a more philosophical perspective. On reflection, DeBrabander argues, it ought to occur to us that despite its undeniable value and attractions, privacy is not where the real normative action is, or should be. Privacy is a problematic concept and ideal whose loss, however regrettable, is not the one that should induce us to the barricades.

One problem with privacy is that it’s not clear what harm is involved when “it” is “invaded.” Consider an extended example.

Suppose I see an attractive woman walking down the street. Lost in her private thoughts, she has no idea that I’m ogling her and having filthy sexual fantasies about her.*** Have I thereby violated her privacy, or harmed her? It’s not clear how or why. 

Now suppose that the same woman is lounging in her living room with the curtains open, but visible from the sidewalk. Repeat the preceding scenario. Has anything changed? Have I violated her privacy simply because this time, she was in her house? What difference does that make if the curtains were open, and she was perfectly visible from the street? And what am I doing to her, anyway, that harms her?

Now imagine that the woman puts a profile picture of herself on Facebook, which inspires the same lustful thoughts in me. Indeed, I copy/paste the photo from her profile page into a special folder on my computer desktop, and “use” it periodically for my own sexual purposes.  That may be “creepy,” but have I crossed a moral or legal boundary that requires adjudication or rectification? 

Now imagine that I see the same woman on the street, and surreptitiously take a picture of her that I put in the same folder for the same purpose. How different is that from simply seeing and remembering her? Why is the photographic image more problematic than the merely mental one?

Imagine, finally, that I encounter the same woman at the mall. She’s going up the escalator, while I’m going down. She’s wearing a short skirt that permits a momentary glimpse at her “upskirt,” and I’m the kind of guy who’s on the look-out for such things. So I not only take a look, but snap a photo. Both looking and taking a picture seem to violate her privacy. But it’s not obvious how: we’re both out in public, and the phenomenon itself was visible in public. How does a public phenomenon suddenly become private? What I do with the memory or image of her upskirt is not known to her. It doesn’t causally interact with her. It doesn’t take any physical thing from her, and doesn’t invade any physical boundary of hers. Yet we’re tempted to call it an “invasion of her privacy.” How? Where exactly is the invasion?

The point is that “invasion of privacy” is not a self-interpreting or self-justifying idea. We rely on it constantly, and yet moral philosophers have provided no adequate account of how to understand or justify it even in apparently obvious, paradigm cases. Unfortunately, the cases relevant to understanding privacy in data mining and data harvesting are even less obvious than the ogling cases above. The ogling cases are nonconsensual, but most data mining cases, at least technically, involve consent. We consent to give our personal health information away to providers, insurers, and all of the subsidiary organizations and personnel involved in processing health claims–not that we know who they are, or what they do, or why they need our data, or what they’re doing with it, or how much risk they impose by having it. But our consent is right there, in pixellated black and white. Likewise, we consent to allow the use of cookies. We consent to have our locations tracked. We consent to almost everything Big Data does to us. They make sure of that.

In neither case is it obvious what harm ensues to either party. How are you harmed when someone masturbates over an image of you? Maybe you ought to be flattered. Again, how are you harmed when someone accesses a (secure, encrypted) database containing your name, address, insurance information, provider information, date of service, Social Security number, diagnoses, procedure codes, lengths of stay, insurance denials, inpatient/outpatient status, payment history, and payment status (etc.)? Maybe they’re doing you a favor. At least they have their shit together, remembering clinically and financially important facts that you’ve probably forgotten. In my experience, the average patient calling in to inquire about their bill can’t even remember what hospital they visited for a given procedure, much less what procedure they had done or when. Well, someone has to remember all of those tedious details. And that someone would be Big Data. So what harm is done by ensuring that Big Data has all the data it needs, and then some? Yes, there’s a loss of privacy, but isn’t there a gain in convenience? Esau may have sold his birthright, but at least he got something to eat.

If we can’t account for the wrongness of ogling or upskirt photos (and it seems we can’t), we a fortiori can’t account for the wrongness of data mining. But then, the uncomfortable question arises: how can data privacy matter so much if we have no account of the wrongness of violating it? 

Beyond that theoretical lacuna, privacy is a pernicious and self-subverting idea. By its nature, the quest for privacy privatizes life. It atomizes us, separates us, scatters us, and drives us into cocoon-like enclaves of comfort designed to filter out the unpleasant, the noxious, the loud, the fractious, and the otherwise undesirable. In doing so, it systematically unfits us for political life, which requires us actively to deal with the things that privacy is designed to filter out. The more privacy we enjoy, the less capable we become of managing our own political affairs on our own initiative. The less politically capable we become, the more we outsource our political responsibilities to the state or to private or semi-private corporations designed to do what we can’t or won’t. The more we outsource to them, the more power they get over us. The more power they get, the more data they seek. The more data they seek, the more data they take. The more data they take, the greater the loss of our privacy. Given our Esau-like proclivities, it’s not as though we’re inclined to resist either their blandishments or their takings. And so, to paraphrase Marx, the ethos of privacy becomes its own gravedigger.

Finally, privacy is problematic because it presupposes an indefensible conception of the self–a private self that enjoys its privacy by retreating away from the social realm to commune with itself, by itself. DeBrabander traces this idea to the early modern, and then the modern and contemporary, periods of Western philosophy: there is, on his account, a fairly straight line to be drawn from the early prophets of privacy–Montaigne, Descartes, Leibniz, Thoreau–to the nineteenth and twentieth century apotheosis of privacy found in its most extreme form in American jurisprudence. Nineteenth century American jurisprudence gave us the “right to privacy,” which “amounts to or consists in ‘a right to be left alone’” (Life After Privacy, p. 3). That same “right to be left alone” grounds our right to use contraception, at least within the context of marriage (Griswold v. Connecticut [1965]), our right to loiter at will without having to have or reveal a public purpose (Papachristou v. Jacksonville [1972]), and a woman’s right to abortion (Roe v. Wade [1973]). Essential to this conception of privacy is the asocial atom of social contract theory (and, I suppose, existentialism): the utterly self-determining, self-forming, rigidly bordered Self that must be left absolutely alone, like some Leibnizian monad, to realize itself.

But if there is no such asocial self, as DeBrabander suggests, there is no need for the extreme sort of privacy it demands. We ought instead to replace that asocial conception of privacy with a more moderate one of the kind we find in Stoic, religious, and otherwise non-individualist conceptions of interiority–he mentions Seneca, Augustine, Luther, Ignatius of Loyola, and John Dewey as exemplars–whose less romantic conceptions of privacy found expression in political regimes radically different from those that prevail in individualist Britain or America. The point is not to oscillate from, say, wild Thoreauvian freedom to theocratic repression, but to find the mean between them. 

Given this, it’s perhaps misleading of me to have described DeBrabander as counseling “surrender” to Big Data, full stop. What he wants is surrender on the privacy front, but an opening on a different front. We should, on his view, replace our crusade for and valorization of privacy with a turn to a properly political conception of public life inspired by (a version of) Aristotle, Dewey, and Arendt. The practical models here are the American civil rights and labor movements. Neither movement aimed at privacy or relied heavily on it for the effectuation of its aims: equality, freedom, and justice are essentially public in character, and were won, not by a retreat into the private sphere, much less by privatization, but by publicity, exposure, and the values of collective action in the service of a common good. Unlike digital activism, these movements succeeded, so we would do well to follow their lead. Doing so might win back some of our privacy without explicitly aiming at it, but more importantly, would pay dividends in the restoration of our common public life, which is where the action is, or ought to be.

****

That, at any rate, is how I read DeBrabander. Apart from the first two paragraphs, I don’t agree with a word of it. I’ll explain why not in later posts. 


 *See Cathy O’Neil’s Weapons of Math Destruction on the problematic use of algorithms by Big Data; Zeynep Tufekci’s Twitter and Teargas on the problems and prospects of political activism via social media; and Shoshana Zuboff’s The Age of Surveillance Capitalism for an overview of the strategies and tactics used by Big Data. Though not mentioned in Life After Privacy (and not on privacy per se), I would recommend Edward Tenner’s The Efficiency Paradox for useful insights into the daily routine of data analytics in contemporary business.  

**On Esau, see Genesis 25:29-34.

***I prefer to think that I got this example from a passage in Nozick’s Anarchy, State, and Utopia rather than personal experience (Anarchy, p. 32). At any rate, the sexual examples are all mine, not DeBrabander’s.

I am a Junior Analyst in Support and Implementations for CorroHealth’s Aergo division in Iselin, New Jersey. I write here in a private capacity, expressing my personal views, for which I bear sole responsibility. I am not a spokesperson, official or non-official, for CorroHealth, Aergo Solutions, or any other entity. 
