My notes on Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 2053951717738104.
What do social scientists mean when they talk about ‘algorithms’? This insightful piece by Nick Seaver draws attention to the elephant in the room: much of the time the social sciences are not addressing actual algorithms but rather processes that have an algorithmic component. He suggests part of the enthusiasm for the term stems from a “renewed concern for technical specificity”, with algorithms appearing to be “the core stuff of computer science” (pg 1). But the phenomena under investigation (search systems, recommendation engines, predictive policing systems) are not, strictly speaking, algorithms. Had critical scholars misunderstood what it was they were investigating? This question can easily be coded in terms of technical expertise, but it is more deeply one of disciplinary boundaries. By failing to reproduce the specialised language of computer scientists, a certain relationship was opened up between this nascent group of experts and those with a more longstanding claim to the terrain.
Should critical scholars therefore correct their use of the term by cleaving it more closely to the thought and talk of computer scientists? Seaver argues not; instead, algorithms should be approached as ‘multiples’: “unstable objects that are enacted through the varied practices that people use to engage with them” (pg 1). Taking what is perceived to be expert terminology as a given implies a false uniformity; he cautions that “uncritical reliance on experts takes their coherence for granted and runs the risk of obscuring a key interest of critical scholars: what happens at the edges of knowledge regimes” (pg 2). Nonetheless, it can be argued that ‘algorithm’ is significant as an emic term, used by actors within a particular sphere of social life. But Seaver argues that the apparently singular definition actors offer when reporting on their technical understanding may obscure the plural ways in which they approach algorithms in everyday practice. His own fieldwork at a company producing algorithmic recommender systems for streaming music services found a vaguer usage, in which ‘algorithm’ was used to refer to various properties of an algorithmic system. He takes issue with what he calls the *algorithms in culture* approach because it carves out an abstract space for algorithms: a space we must access in order to understand how algorithms shape and are shaped by culture, yet one which remains distinct from culture itself.
In contrast, he calls for an *algorithms as culture* approach. Drawing on the practice turn in general, and Annemarie Mol in particular, Seaver draws a parallel between how the notion of ‘culture’ grew outside of anthropology and how the notion of ‘algorithm’ has gained increasing purchase beyond computer science. The fact that non-specialists have understandings of, and act on the basis of, ‘algorithms’ presents a methodological and epistemological problem for those who want to study the relationship between algorithms and culture. The multiple definitions do work in the world, shaping people’s practices, rather than being froth to be dispensed with in the name of definitional purity. Accepting the multiple character of ‘algorithm’ helps us see algorithms *as* culture, as opposed to simply being objects of cultural concern, mechanisms for shaping cultural trends or targets of strategic action. This approach will leave our “accounts more adequate to the practices they describe” because assuming that “algorithms must have a stable and coherent existence makes it harder, not easier, to grapple with their production and ongoing maintenance” (pg 5). He makes the powerful argument that audit approaches to algorithms, seeking to expose the links between inputs and outputs, help enact algorithms as secret. As he continues on pg 5:
“By treating the ‘inside’ of the algorithm as unknowable, these approaches participate in enacting an understanding of the algorithm as a black box, as knowable only through the relation between inputs and outputs. This is not to say that audit approaches are responsible for algorithmic secrecy—they are clearly responding to other efforts to keep algorithmic operations hidden—but they are part of a set of coordinated practices through which algorithms become understood as, and remain, secret.”
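To make the input/output framing concrete, here is a minimal, purely illustrative Python sketch of what an audit’s evidence base looks like. Nothing here comes from Seaver’s fieldwork: `recommend` is a hypothetical stand-in for any opaque system, and the probes are invented.

```python
# A minimal, purely illustrative sketch of a black-box audit.
# `recommend` is a hypothetical stand-in for an opaque recommender;
# in a real audit it would be an external service whose internals
# are hidden from the auditor.

def recommend(user_profile: dict) -> list[str]:
    # Placeholder behaviour; the auditor never sees this code.
    if "jazz" in user_profile.get("history", []):
        return ["Kind of Blue", "A Love Supreme"]
    return ["Top 40 Hits"]

def audit(probes: list[dict]) -> list[tuple[dict, list[str]]]:
    # The audit's entire evidence base is this list of (input, output)
    # pairs; the "inside" of the algorithm never enters the analysis.
    return [(probe, recommend(probe)) for probe in probes]

for inp, out in audit([{"history": ["jazz"]}, {"history": ["pop"]}]):
    print(inp, "->", out)
```

Everything the audit “knows” lives in those input/output pairs, which is precisely the sense in which, on Seaver’s argument, the approach enacts the algorithm as a black box.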
He goes on to make a compelling argument for the importance of ethnography “for apprehending the everyday practices that constitute them and keep them working and changing, not just for observing how they relate to a distinct and properly ‘cultural’ domain” (pg 6). In doing so, algorithms are not cast as unique or novel, but rather as objects that can be investigated in a similar way to others. This can involve *scavenging*: taking advantage of the heterogeneous sources of information which are available. As he says on pg 7, “There is much to be scavenged if we do not let ourselves be distracted by conspicuous barriers to access”, e.g. “off-the-record chats with engineers about industry scuttlebutt, triangulated with press releases and the social media updates of my interlocutors”, and “On mailing lists, in patent applications, and at hackathons, I found arguments, technical visions, and pragmatic bricolage”. Furthermore, such barriers to access as remain are part of the field site rather than contingent obstacles, constituting ways of generating knowledge in their own right. This includes the obstacles faced in the interview process, “turning the mundane mechanics of arranging and conducting conversations into material for analysis” (pg 8). The apparent banality of corporate messages can be a powerful source of insight, ably described by Seaver in terms of identifying the multiple voices found within them. He cautions on pg 9 against imputing singularity to a company:
“Linguistic anthropologists have made this case for people, but for corporations its necessity is even clearer: outsiders often attribute singular agency and voice to corporations composed of hundreds or thousands of employees, working in scores of internal groups. Thus, we hear that ‘Facebook’ has claimed or done something or that ‘Spotify’ has a particular point of view. But while managers may try to coordinate their work, nothing intrinsically binds an engineering team to a social media intern or a company founder. Especially in young companies or those in transformation, the institutionalizing forces that work to align these various voices are weak, and obvious heteroglossia in public statements is one notable consequence.”
This includes being aware of irony, given the risk that statements taken out of context miss the nuance of the actors we are interested in. He cautions on pg 9 that the “attribution of fundamentalisms—technological determinism, naive economism, or hyper-rationalism—to computer programmers may, in some cases, indicate more about critics’ inability to parse ironic speech than it does about technologists’ simplistic beliefs”.