- It’s important that I inventory concrete examples of [[real time behavioural science]] (such as the Australian Facebook leak of their claims about influencing teenagers and the emotional contagion study) in order to illustrate my claims about the front-stage and back-stage aspects of social platforms.
- This is a lovely expression from Zuboff on loc 969 about what Maggie would call the vexatious fact of society. Recognising this dynamic is central to my idea of [[Platform and agency]]:
- > Under this new regime, the precise moment at which our needs are met is also the precise moment at which our lives are plundered for behavioral data, and all for the sake of others’ gain. The result is a perverse amalgam of empowerment inextricably layered with diminishment.
- Zuboff argues that ‘data exhaust’ is a profoundly ideological term which obscures the commercial value and intimate character of this data.
- Her claim that this is a mutant (bad) form of capitalism is somewhat unconvincing, but there’s definitely leverage in her use of Harvey to argue that platform capitalism involves accumulation by dispossession of human experience.
- In common with [[Nick Srnicek]] she tries to historicise the platform, arguing there was an elective affinity between the surveillance state and the rise of surveillance capitalism.
- What she calls the “extraction of behavioural surplus” is what other people describe as [[datafication]]. What are the strengths and weaknesses of these two framings?
- She offers a convincing explanation that the apparent convergence of tech firms across a dizzying array of initiatives in fact reflects a single underlying business model: the systematic and ever-expanding extraction of behavioural surplus.
- She uses the idea of an ‘incursion strategy’ to explain how firms have developed a playbook for expanding their behavioural surplus extraction into new domains of social life. They proceed until they meet opposition, then try to charm or destroy that opposition, rarely backing down in the process. This is a set of corporate strategies developed through trial and error in cases like Facebook’s Beacon and Google’s Street View.
- There is a vibrant commercial sphere of surveillance-as-a-service which collates data on a B2B basis, often in ways opaque to end users.
- I thought her account was most powerful when it came to the internet of things. She frames this as a huge territorial expansion which is unwanted by consumers but pushed by surveillance capitalist firms because of the vast horizon of behavioural data which is opened up by creating an ambient infrastructure for the extraction of behavioural surplus. Voice assistants are competing to be the ubiquitous and trusted pipeline into this ambient infrastructure.
- A key feature of her analysis is asking why the fruits of surveillance capitalism were so gratefully received. What does it say about the socio-economic reality of most people’s lives that these promises were so entrancing? This is a story about neoliberalism but also one about austerity.
- The basic trade-off of surveillance capitalism: its capacity to be useful in your life depends on the legibility of your life to its infrastructure. These firms bring real benefits, but the loss of privacy and autonomy is inherent to them rather than some contingent downside that could be overcome with time.
- Digital technology hugely lowers the transaction costs of doing behavioural science experiments. It simply wasn’t practical to run millions of customised A/B tests through analogue means.
- The concept of the [[two texts]] is useful for something I tried to express in public and platforms as the front-stage and back-stage: the distinction between the public text which is celebrated in participatory terms and the private text which is mined for commercial interest and shielded from public view.
- This piece is a great summary of the concept of [[division of learning]] which I want to incorporate into my account of the [[The platformisation of knowledge production]]:
- > By this term Zuboff means to demarcate a shift in the ‘ordering principles’ of the workplace from the ‘division of labour’ to a ‘division of learning’ as workers are forced to adapt to an ‘information-rich environment’. Only those workers able to develop their intellectual skills are able to thrive in the new digitally-mediated workplace. Some workers are enabled (and are able) to learn to adapt to changing roles, tasks and responsibilities, while others are not. The division of learning, Zuboff argues, raises questions about (1) the distribution of knowledge and whether one is included or excluded from the opportunity to learn; (2) about which people, institutions or processes have the authority to determine who is included in learning, what they are able to learn, and how they are able to act on their knowledge; and (3) about what is the source of power that undergirds the authority to share or withhold knowledge (181)
- > But this division of learning, according to Zuboff, has now spilled out of the workplace to society at large. The elite experts of surveillance capitalism have given themselves authority to know and learn about society through data. Because surveillance capitalism has access to both the ‘material infrastructure and expert brainpower’ (187) to transform human experience into data and wealth, it has created huge asymmetries in knowledge, learning and power. A narrow band of ‘privately employed computational specialists, their privately owned machines, and the economic interests for whose sake they learn’ (190) has ultimately been authorized as the key source of knowledge over human affairs, and empowered to learn from the data in order to intervene in society in new ways.
- > Sociology of education researchers have, of course, asked these kinds of questions for decades. They are ultimately questions about the reproduction of knowledge and power. But in the context of surveillance capitalism such questions may need readdressing, as authority over what constitutes valuable and worthwhile knowledge for learning passes to elite computational specialists, the commercial companies they work for, and even to smart machines. As data-driven knowledge about individuals grows in predictive power, decisions about what kinds of knowledge an individual learner should receive may even be largely decided by ‘personalized learning platforms’–as current developments in learning analytics and adaptive learning already illustrate. The prospect of smart machines as educational engines of social reproduction should be the subject of serious future interrogation.
- I like this framing by Zuboff in Surveillance Capitalism as a way of talking about what Bacevic has described as independent irrational animals. It fits interestingly with what Ben Tarnoff and Moira Weigel describe as ‘tech humanism’: the belief that “unhealthy and inhumane” business models can be fixed through better design which rests on an ironically dehumanising language of our ‘lizard brains’ being ‘hijacked’:
- > In addition to inevitabilism, surveillance capitalism has eagerly weaponized behavioral economics’ ideology of human frailty, a worldview that frames human mentation as woefully irrational and incapable of noticing the regularity of its own failures. Surveillance capitalists employ this ideology to legitimate their means of behavior modification: tuning, herding, and conditioning individuals and populations in ways that are designed to elude awareness.
- This relates to what I’ve written about as the evisceration of the human under digital capitalism: the process by which an engagement with the behavioural traces of agency comes to substitute for an engagement with agents. In the work I’m doing with Andrea Maccarini on platform socialisation we’ll explore what this means during the intensified reliance upon platforms which the pandemic has given rise to.
- What Zuboff describes as the conscious deployment of velocity resembles the acceleration literature and its sensitivity to the relationship between speed and power. Not only can Big Tech move fast, it does so as a deliberate strategy to overwhelm and outrun potential constraints on its activity. This is connected to the asymmetrical character inherent to the [[division of learning]] and how it is amplified by “our scarcity of time, resources and support” (loc 6253) which was the central insight of my digital distraction book.
- It’s interesting to note that the Chinese social credit system was driven in part by a government concern about a crisis of trust in Chinese society, as a consequence of the pace of urbanization. In this sense it is a mode of social integration liable to be analysed in Durkheimian terms.
- Her observations about Pentland’s Social Physics are extremely interesting, particularly the sense in which he positions himself and his followers as members of an avant-garde who have already adapted to a new reality which others will gradually be forced to confront in their lives. She’s also astute in mapping out the extent of his commercial undertakings, most of which sit within the surveillance-as-a-service sector, and how they build directly on his research.
- I like her framing of social platforms in terms of the withdrawal of Goffman’s [[back stage]] (which belongs in [[Things I want to read]]) so that everything increasingly tends towards public performance. This draws people in even if they are not themselves users of social media, simply because photos, livestreams and the like capture them regardless.
- It’s interesting she talks about reflexivity explicitly towards the end of the book. It’s important I engage explicitly with her social psychological account of the implications of surveillance capitalism for [[personal morphogenesis]].
