Notes for a talk I’m doing at this event tomorrow
An obvious place to start is with what I take ‘images of the human’ to be and what I take ‘digital social science’ to be. Images of the human are fairly straightforward: I mean everything from the most explicit claims about human properties and powers to implicit assumptions that go unrecognised by those who can be said to hold them.
What do I mean by digital social science? There is digital social science being conducted within the boundaries of the academy, both in terms of an explicit tendency towards a digital focus (digital sociology, digital geography, digital anthropology and the digital humanities) and that which is digital in its methods and/or objects but does not designate itself as such. There are also newer entrants to the intellectual scene, such as the computational social sciences and data science, often though by no means always led by physicists and mathematicians seeking to understand the social as a complex system, fundamentally no different to any other found in the natural world. Crucially, much digital social science takes place outside the academy and it is through this that the images of the human explicit or implicit within it can exercise an influence upon the world (and the humans) which they represent.
One of the most provocative offerings comes in Dataclysm, a book by Christian Rudder, a mathematician by training who co-founded the popular online dating site OkCupid. With somewhere between one million and thirty million users, depending on how one defines the level of activity that constitutes a ‘user’, it is the 425th highest ranked site on the internet. The subtitle of Dataclysm is Who We Are (When We Think No One’s Looking) and this captures the impulse of the book: as Rudder puts it, “instead of asking people survey questions or contriving small-scale experiments, which was how social science was often done in the past, I can actually go and look at what actually happens when, say, 100,000 white men and 100,000 black women interact in private” (loc 63). The new technological possibilities offered by transactional data are seen both to call into question ‘who we are’ and to offer us ways of looking behind this veil to the underlying reality.
In this sense, I think we misunderstand such an approach if we describe it simply as empiricism. There’s a critical theory here, implicitly claiming to disrupt what Hubert Dreyfus and Charles Taylor call “our ordinary pre-critical view of ourselves and the world”. The impulse here is to reveal the reality of human behaviour at scale: the conditions of both our everyday and scientific knowledge claims are implicitly understood to offer only a limited perspective on what it is humans actually do. Through the data we can reveal what they really do, as opposed to what they tell social surveys and interviewers they do. The reality of behaviour can be read from the numbers: ‘let the data speak for themselves’, at least if we are talking about the right kind of data. I’m at a very early stage of this work and I don’t want to impose a uniformity on the objects of my (meta-)critique, but I’ve chosen Rudder’s example because it’s the most explicit articulation of something which I believe is much more pervasive: an image of the human in which reflection and articulation are seen as mere froth, concealing the reality of what lies beneath.
Mark Andrejevic does a superb job in his book InfoGlut of unpicking the variety of ways in which those beholden to this image struggle with the “recalcitrant mind” which disguises the reality of who we are and why we do what we do. Interiority is rendered as a problem, one struggled with through the identification of ever more diverse proxies and ever more sophisticated measuring instruments, but one which continually returns as a frustrating opacity that eludes conceptualisation. It’s hard to make sense of what is being studied if we deny reflexivity: why did people choose OkCupid over competing sites, why are they using online dating rather than other alternatives, why are some demographics so over-represented amongst those who use online dating while others aren’t? Each of these questions can be met by ad hoc hypotheses, propping up the assumed transparency of the self, but doing so provides us with a truncated and flattened social world, devoid of reflexivity. Reasons for action are tacitly acknowledged but representations of those reasons are deemed methodologically illicit: the real things people do (and as a corollary the real reasons) are in fact the focus of inquiry, but to seek to represent these leaves us tangled up in the thickets of representation that the project of the transparent self is trying so hard to avoid.
What fascinates me is the reflexivity of those so committed to denying reflexivity. Why do they choose these topics? Why do they write these papers? Why do they talk at these conferences? Their trajectories of social action become inexplicable without reference to precisely the human capacity which their intellectual projects so thoroughly empty out from the social world. We need what Nick Couldry calls a hermeneutic of the anti-hermeneutic: what drives this impulse towards the dissolution of reasons and evaluations? How does it differ from, and what does it share with, similar projects to be found in postmodernism and behavioural science? Or indeed in something like phrenology, as Mark Andrejevic, Dan McQuillan and others have suggested. This investigation will be philosophical, recovering the implicit images of the human and engaging with the explicit ones, but it will also be sociological: looking at the designers, engineers, managers and data scientists and the intra-organisational and inter-organisational contexts within which they are embedded, as well as how digital capitalism as a whole is taking shape through their activity.
These questions matter at a political, as well as philosophical, level because, as the legal theorist Frank Pasquale suggests, the causal power of the algorithms that now govern the dynamics of reputation and visibility increasingly leads to calls for us to self-reflect in these terms in order to negotiate a social world governed through them: leading to what Pasquale describes as an “algorithmic self”. But crucially the operation of the algorithms remains opaque, subject to review and revision in a way that is always stubbornly outside our purview. Pasquale cites the example of “the self-promoter whose status updates on Facebook or LinkedIn gradually tip from informative to annoying” or “the search engine-optimizing website whose tactics become a bit too aggressive, thereby causing it to run afoul of Google’s web spam team and consequently sink into obscurity”.
This isn’t a case of people being moulded into a new shape by technology, such that we eventually become what the algorithm assumes that we already are. Reflexivity doesn’t go away. In fact it becomes more imperative than ever, but the risk is that it’s a new form of reflexivity, anticipatory in its focus but orientated towards ubiquitous algorithmic evaluation rather than our own projects and passions. A communicative reflexivity disciplined by the algorithmic environment rather than the evaluations of trusted interlocutors within a secure micro-context.
This is a peculiar kind of hyper-reflexivity which becomes necessary under circumstances that are, at least for now, isolated within particular tracts of daily life for most (e.g. social media use): a hyper-reflexivity which ensues when people are “constantly assessing how each word or deed will affect permanent reputational profiles”. Under such conditions, reflexivity becomes a matter of modulation, adjusting to a context within which the rules are both given and yet fluctuating. Crucially, as Pasquale observes, “we all sense that certain activities win the approval of assorted watchers and others do not”. Measurement is not new by any means, but what’s new is its ubiquity, driven it should be noted by near entirely commercial concerns, facilitated by the sheer efficiency with which this is possible when digitally mediated action generates transactional data subject to scrutiny.
Pasquale invokes the novel Super Sad True Love Story, in which credit scores from 400 to 1600 are displayed publicly on ‘credit poles’ at shops and on the streets, ‘personality’ and ‘sexiness’ are automatically ranked within any social space and ‘mood + stress indicators’ provide public hierarchical rankings within any workplace. There are similar themes explored in The Circle by Dave Eggers, as well as Whiskey Tango Foxtrot by David Shafer. This increasingly rich vein of fiction, beginning to find parallel representation in television, reflects an incipient unease about the place of measurement within social life. As Pasquale describes Super Sad True Love Story, “In an anomic world where social mores are adrift, the characters in the novel scramble to ‘find their place’ in the social pecking order by desperately comparing themselves with each other”. It seems to me that this could equally be a description of digital capitalism in late 2015. The ‘near future’ world these novels explore seems very near indeed.
The risk here is that the a-reflexive human subject becomes something close to a self-fulfilling prophecy. Reflexivity is not dissolved but truncated, restricted in its scope to orientating oneself within hierarchies that become axiomatic: a metric-driven neo-traditionalism in which even the most reflexive projects go no further than creative projects of self-optimization through the quantified self. When considered at the level of law and politics, we can see how reflexivity itself can become an object of suspicion in a way likely to become ever more common.
Predictive analytics function by inferring from past behaviour, licensing interventions in concrete fields (policing, education, health care) based on parameters of ‘normal behaviour’ elaborated in a profoundly opaque way: one’s own past behaviour is the most knowable thing about these emerging standards, and retreat into what we know we have done in the past is likely to become a safety zone, sought in order to avoid what Kate Crawford has called ‘predictive privacy harms’. Elsewhere she identifies what might be a forerunner of the coming conservatism in the tendency towards ‘normcore’: a style of dress that celebrates bland ubiquity, an aesthetic affirmation of the value of not standing out in any way whatsoever. Predictive privacy harms are by their nature unpredictable and, what is worse, we’re often unlikely to realise we have been harmed in this way. As Frank Pasquale writes, “we risk freezing into place a future that rigidly reenacts the past, as individuals find that replicating the captured patterns of past behavior is the only safe way to avoid future suspicion, stigma, and disadvantage.” We retain our reflexivity but become something less than human, enthusiastically objectifying ourselves – in a paradoxically reflexive way – while becoming objects to which things happen in wider social life.