"Revolutionaries often forget, or do not like to recognize, that one wants and makes revolutions out of desire, not duty."
Deleuze and Guattari, Anti-Oedipus: Capitalism and Schizophrenia, p. 366
Privacy turned rationalist
It’s been over a century since Warren and Brandeis declared the Right to Privacy in 1890.
Since then, interest in privacy has grown, with nearly everyone today touting some form of “privacy preserving technology.” The specific nature of the privacy conferred by these technologies is often complex and rarely spelled out beyond appeals to autonomy or control.
This isn’t too surprising: outside of a rationalist “privacy is control” framing, privacy lacks a unified analytical conception. We’ve seen privacy defined in different ways, e.g., with Columbia Law School professor Alan Westin posing it as control over one’s personal information
"...each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication…"
or philosopher Ferdinand David Schoeman in 1992 suggesting privacy as a constituent of dignity or the protection of “intimacies of personal identity.”
It may even be that spiritual humanist visions of privacy – safe spaces to be left alone and explore without surveillance – have culturally given way to rationalist forms of privacy
“As privacy has become commodified, many of us have begun to question its value. Many no longer believe that there is anything fundamentally essential about a more private or anonymous human existence.
Whereas privacy was previously framed in humanistic terms, it is now far more likely to be thought of as a type of property.”
wrote Columbia Law School professor Bernard Harcourt.
We’ve long understood the value of this property to social media and search platforms, which transform observations of us into lucrative ads businesses. AI is poised to find new value in its observations of us, with some (e.g., Gavin Baker at Atreides Management) saying that AI’s observations of user activity are critical to powering enduring enterprise value.
But as we’ve also seen, platforms reflect their observations of us back in the form of individualized content that shapes new cultural narratives.
If privacy is seen as individual control over information, surely then we might have control over to whom we grant this information and at what price.
Refuse what we are
Earlier this month, Jack Dorsey spoke at the 16th Annual Oslo Freedom Forum.
“I think the free speech debate is a complete distraction right now. I think the real debate should be about free will.
We are being programmed. We are being programmed based on what we say we’re interested in, and we’re told through these discovery mechanisms what is interesting—and as we engage and interact with this content, the algorithm continues to build more and more of this bias.”
We agree with Jack (and Elon, who retweeted it). Jack’s remarks remind us of Mill’s warning about how the social pressures of digital platforms can lead to an inescapable conformity
"Society can and does execute its own mandates: and if it issues wrong things with which it ought not to meddle, it practices a social tyranny more formidable than many kinds of political oppression, since, though not usually upheld by such extreme penalties, it leaves fewer means of escape, penetrating much more deeply into the details of life, and enslaving the soul itself."
It also reminds us of Foucault’s The Subject and Power
“Maybe the target nowadays is not to discover what we are but to refuse what we are."
Here, “what we are” refers not only to technology’s categorization of us
- “Wife”
- “UC Berkeley graduate”
- “ex-New Yorker”
- “Gen Z”
but now also to algorithms telling us, as Jack explains, what to think. With the direct link of our data to platform power, we are increasingly platform subjects.
Foucault continues
“We have to imagine and to build up what we could be to get rid of this kind of political "double bind," which is the simultaneous individualization and totalization of modern power structures.
The conclusion would be that the political, ethical, social, philosophical problem of our days is not to try to liberate the individual from the state and from the state's institutions but to liberate us both from the state and from the type of individualization which is linked to the state.”
In the modern internet, we find an instantiation of Foucault's totalizing double bind: a space where individual freedom of expression is celebrated, yet where that very expression is continually shaped and molded by the invisible hand of algorithmic systems.
While we’re free to individually post and engage however we want, we do so under the Algorithm, which shapes the narrative and context powering our next post. And with information and content increasingly mediated by AI, finding ways to subvert this double bind becomes ever more vital.
This idea of 'refusing what we are' takes on particular significance in the context of today's digital platforms, which function as modern forms of Foucault’s Technologies of the Self.
Technologies of the self
Technologies today are broadly confessional.
- Twitter: “What is happening?!”
- Facebook: “What is on your mind?”
- Instagram: “Share everyday moments”
- Snap: “Express yourself and Live in the moment”
- TikTok: “Inspire creativity”
- Strava: “Record over 30 types of activities with features to help you explore, connect and measure your progress”
Search may even be understood this way. Foucault’s Technologies of the Self help explain these confessional technologies. Norm Friesen writes
“Foucault defines technologies of the self as “reflected and voluntary practices by which men not only fix rules of conduct for themselves but seek to transform themselves, to change themselves in their particular being, and to make their life an oeuvre.” These are practices or techniques, in other words, that are both undertaken by the self and directed toward it. Specifically confessional technologies involve a deliberate and often structured externalization of the self, often with the help of a confessor or a confessional text or context.”
We, too, share ourselves in seeking transformation with the help of a mediator, today TikTok, Twitter, Facebook, or ChatGPT. Each of these technologies enacts Foucault’s totalizing double bind, offering users a sense of individual expression while simultaneously shaping that expression according to the platform’s own controlling paradigms.
It's clear that interests in privacy can exacerbate Foucault’s double bind, as modern privacy is increasingly understood to be something like
- an oligopoly of companies
- each holding large slices of your data
- and it’s fine, because they don’t really share it with anyone
The data moats that members of this oligopoly build from our confessionals grow stronger the more we share, and the more privacy is understood as protection from sharing rather than as the individual exercise of control.
With our data taking on greater power in mediating our digital experiences with AI, we believe privacy as information control takes on even more importance as a tool of free agency and resistance against our present totalizing double bind.
“The algorithm is effectively a black box and … it can be moved and changed at any time. And because people become so dependent on it, it’s actually changing and impacting the free agency we have. The only answer to this is not to work harder at open sourcing algorithms or making them more explainable … but to give people choice, give people choice of what algorithm they want to use … that they can plug into these networks.”
Jack and Elon's call for greater user choice (“refuse what we are”) appears to be Foucauldian. If we are to resist the shaping of our subjectivity by algorithmic systems, we must have the ability to choose the applications we use and the algorithms they employ, without sacrificing convenience.
"The reason the Internet sucks now is because it has a different ruling class from the 90s-00s. A subject mindset now vs a citizen mindset then. Fix this, and you fix the Internet."
George Hotz tweeted last month.
A technology to enable applications with
- inference or data interoperability from any AI model
- onto any application surface
- for any purpose
- based on any context a user shares
- without costing user control or convenience
could provide the free agency Jack seeks and the citizen mindset Hotz suggests.
This is what we’re building at Crosshatch – a button that can turn anything and everything more you.
Crosshatch aims to be a technology of the self that subverts the double bind of modern tech platforms by giving users the ability to actively shape their own subjectivity, rather than simply being shaped by external forces. Data interoperability (e.g., via AI) could enable subversion of the totalizing lock-in of large platforms, as user context can be shaped by AI into whatever format or structure a given application requires.
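The interoperability idea can be sketched in miniature. Below is a toy Python illustration of reshaping user-held context into a destination application's schema; all field names and the mapping table are hypothetical, and in practice an AI model would infer the mapping rather than a hand-written table doing so.

```python
def reshape_context(user_context: dict, field_mapping: dict) -> dict:
    """Project a user's context onto an application's expected schema.

    field_mapping maps destination field names to source field names.
    Fields the user has not shared are simply omitted, so the user's
    choice of what to share bounds what any application can see.
    """
    return {
        dest: user_context[src]
        for dest, src in field_mapping.items()
        if src in user_context
    }

# Context a user chooses to share, in the user's own schema (hypothetical).
user_context = {
    "home_city": "Oakland",
    "favorite_cuisine": "Sichuan",
    "weekly_mileage": 25,
}

# A hypothetical restaurant app asks only for location and taste.
restaurant_schema = {"city": "home_city", "cuisine": "favorite_cuisine"}

print(reshape_context(user_context, restaurant_schema))
# {'city': 'Oakland', 'cuisine': 'Sichuan'}
```

The point of the sketch is the direction of control: the same context can be projected onto any number of application schemas, while fields the user withholds never leave the context at all.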
Deleuze and Guattari's insistence that "one wants and makes revolutions out of desire, not duty" is crucial here. For our proposed system of user choice and algorithmic transparency to succeed, it must be driven not by a sense of intellectual duty or purity, but by a genuine desire on the part of users. It must offer tangible benefits and satisfactions that draw people in, rather than relying on a sense of moral obligation.
Our experiences on the internet are shaped by the data collected about us by large platforms that subject us to their view of the context they happen to know about us.
Of course, we’re more complicated than any platform knows. A button of desire – that we push for new
- convenience
- connections
- value
- delight
could enable a new subjectivity – one that’s yours. Whenever it's not, simply disconnect and push the button on a new application, making it instantly more you.
We can't wait to experience this new internet.
Stay tuned for announcements this summer.