Tagged: big data

  • Mark 4:00 pm on September 15, 2019 Permalink | Reply
    Tags: big data, data analytics, moneyball

    The Poetics of Data Analytics 

    Originally a 2003 book by financial journalist Michael Lewis, Moneyball, the 2011 film with Brad Pitt, tells the story of the Oakland Athletics’ 2002 season. Working with fewer resources than his competitors and unable to replace star players who have been poached, general manager Billy Beane embraces data analytics to assemble “an island of misfit toys”: unpopular players flawed in specific ways, whose superficial idiosyncrasies obscure an underlying solidity as players which registers empirically for those willing to look at the data. Even if you have no interest in baseball, the encounters between the orthodoxy of baseball management and the new world of data analytics are wonderfully engaging.

    Beane grows ever more frustrated as a room full of ageing coaches and scouts earnestly recount their intuitions and heuristics to him in lieu of evidence-based argument, encompassing such certainties as that a player whose girlfriend is unattractive obviously lacks confidence and thus cannot be depended upon on the field. Having turned down a full scholarship to Stanford as a young man because scouts cut from this same cloth hailed him as a future superstar, he has long nurtured a scepticism towards their intuitions as his own career failed to live up to expectations.

    Beane’s chance encounter with an economist furtively working for a rival team, withholding much of his analysis lest he upset his established colleagues, provides him with a way out of this impasse. The embrace of data analytics promises a way of gaining competitive advantage over the better resourced teams in the league, ultimately paying off with a hugely successful campaign regarded as amongst the most notable in the history of the sport.

    What’s notable about the film is the heroism it lends to a narrowly instrumental exercise, as what might once have seemed a dehumanising approach to the team’s players instead represents a brave struggle against incumbent irrationality. Beane is supported in this struggle by Peter Brand, whose numeracy carries an almost inhuman connotation, with the grizzled coaches and scouts struggling to recreate the maths as Beane’s human computer spits out arithmetic on command.

    It’s an entertaining film which is sociologically interesting for the way in which it prefigures a poetics of data analytics which has only grown in subsequent years. It introduces elements which we can see elsewhere (the inhumanity of epistemic bias, the relationship between intuition and incumbency, the bravery needed to call orthodoxy into question) into the familiar form of a sports narrative, producing something new in the process: the embrace of data as a mode of bravery.

     
    • momslovelearning 6:29 am on September 16, 2019 Permalink

      Interesting post. I had heard about this story already but I had not realized that there were a book and a film about it.

    • Mark 8:18 pm on September 19, 2019 Permalink

      it’s good!

  • Mark 12:47 pm on June 5, 2019 Permalink | Reply
    Tags: big data, digital socialism

    Digital capitalism or digital socialism? 

    My notes on Morozov, E. (2019) Digital Socialism? The Calculation Debate in the Age of Big Data. New Left Review 116/117, 33-66

    A range of terms have entered circulation in recent years which suggest a transformation in capitalism. Digital capitalism, platform capitalism, data capitalism and surveillance capitalism all point to a shift which is significant in its scope, even if the character of that transformation remains uncertain. It’s interesting that we haven’t seen the emergence of platform modernity, surveillance modernity, data modernity etc. These would certainly be unattractive coinages but they would also suggest something different: they would be more epochal in their implications, less grounded in an account of capitalist transformation. There’s a lot to unpack in how a digital transformation which is seemingly underway is coming to be talked about.

    As Evgeny Morozov argues in this thought-provoking piece, we have to understand discussions about capitalism’s (digital) future against the background of capitalism’s recent crisis, such that “promises of meritocracy and social mobility ring increasingly hollow” and “capitalist ideologues are eager for good publicity”. The tech sector occupies the most “prominent role on the horizon of the Western capitalist imaginary” and is the most “promising field for regenerative mythologies”. The capitalist ideology of the future is being inculcated within this “laboratory” of market solutions (33). Viktor Mayer-Schönberger, legal scholar and software entrepreneur, plays an increasingly prominent role in this process. His co-authored book on Big Data did more than most to define data ideology, and his more recent work seeks to outline how capitalism will be reinvented in an age of ubiquitous big data. It is telling that both books have been co-written with Economist journalists and self-consciously appeal to an audience of prominent policy and economic influencers.

    The price system is a prominent target in this later book. This is something which has been at the heart of the neoliberal case against socialism, with the claim being that the absence of the real-time signals provided by the price system means that socialist planning couldn’t keep up with the speed of economic and social change, leaving socialism fatally doomed at the level of logistics and operations. As Morozov points out, critics from the left have begun to point to the centralised planning undertaken by digital Goliaths like Amazon, identifying how digitalisation facilitates real-time signalling beyond the price system. Others have argued that big data clogs the price system, as the extent of digital subsidy (platforms subsidising users, and venture capital subsidising platforms) means there is increasingly little relationship between the cost of a service and the price attached to it.

    Mayer-Schönberger and Ramge argue that the price system was always limited, reducing multidimensional preferences into a single measure. It facilitates manipulations, frustrates nuance and loses context. They argue that it is now possible to move from price to data in the coordination of market activities. This will facilitate match-making with a degree of granularity which the price system cannot hope to meet, reducing inefficiencies at every level of capitalism. The problem is that we currently have monopoly ownership over the feedback systems through which this data is generated from commercial transactions, leaving data with value to many actors in the hands of the few largest digital corporations. They argue a New Deal on Data is necessary in order to force these firms to share the data with other actors for whom it might have value and interest. However as Morozov observes, there is “scant sense in this analysis of capitalism as a system, with a history, a present and a perceptible logic – of competition – that imposes significant constraints on its future paths” (39).

    In this sense, it is quite a dangerous vision which advocates profound reform with little engagement with the politics preceding or accompanying it. He draws a parallel to Zuboff’s recent book and suggests these texts postulate a previous stage of capitalism and then present the digital as a deus ex machina giving rise to a profound transformation, leaving them with a “presentist two-stage schema” (39). He predicts “we are likely to see further flurries of books that are nominally about the future of capitalism, but offer, at best, depictions of observed regularities in how capitalist firms expand their stocks of capital to include data” (42) (ouch).

    The problem is that many of the developments in an arena like FinTech which the authors seize upon are better explained by the dynamics of capital accumulation, in a way continuous with what we have seen before, as well as being driven by the existing players within finance. From pg 40:

    The big banks—heavyweight ambassadors of supposedly outdated finance capitalism—are spending large sums on tech: Citigroup’s tech budget was $8 billion in 2019; Wells Fargo’s, $9 billion; Bank of America’s, $10 billion; JP Morgan’s topped out at $11 billion. These are impressive figures, on a par with the tech giants themselves. Indeed, the top ten US spenders on technology last year were banks and tech firms, with the addition of Walmart. JP Morgan has launched a well-staffed AI team in New York and a 1,000-person FinTech campus in California, suggesting it’s on the cutting edge of innovation. Palo Alto now also hosts BlackRock Lab for Artificial Intelligence.

    However, Morozov claims much of this investment is going into maintaining legacy systems, rendered increasingly expensive by multiple rounds of mergers, rather than into innovation as such. The bigger banks are nonetheless spending more, and more of what they spend is being directed towards advanced technology. The promise is that FinTech can provide profits on a par with contemporary banking but at substantially lower costs, highlighting again how the pattern of investment is explained by existing dynamics of capital accumulation rather than something extrinsic to them. The future behaviour of these firms is presented in breathless terms as the future of capitalism, as opposed to being a predictable expression of already familiar dynamics.

    Morozov observes that the price system rests upon patterns of behaviour which render shifts in prices legible. If I understand correctly, it presupposes a common background of orientation towards prices and their significance. If data is to substitute for price, then what is the comparable common background of orientation? Where are the “new behavioural modes and frameworks of meaning” (46)? There was always a knowledge system underpinning the price system, facilitating coordination at a distance from the commodities which are circulating, and digitalisation merely formalised it. From pg 47:

    Read from a Hayekian perspective, the digital economy simply formalizes and improves earlier processes of opinion formation, making the reputations of market participants easier to update in real time, or simply alerting customers, via a notification on their phone, to the launch of a new taxi service where the driver would be happy to whistle the client’s favourite tune.

    Reducing platform services to technological innovation obscures the institutional change at work. Rather than price being replaced by information, facilitated by the magic of digital technology, collective-legal solutions are being replaced by individual-market ones, with the mystification of technology playing a role in legitimating this transition. As Morozov notes of taxi fares on pg 51:

    The rigidity of taxi fares was not a consequence of flawed assumptions about price and information, but a reflection of the legal conditions imposed on the cab owners: what they knew about passengers or changing market conditions was irrelevant, as they were legally compelled to offer the same service, at the same rates, to everyone.

    Morozov argues that what is at stake here are modes of social coordination. The significance of the technological change rests in feedback infrastructures and the politics surrounding their ownership, their operation and the use of the data they generate. As he writes on pg 53, “the active dismantling of existing forms of planned or law-based social coordination requires the ability to furnish alternative forms that would at least avoid complete anarchy and chaos”. The politics of feedback infrastructure begins here, as a struggle over the modes of social coordination which will become hegemonic as we move forward, over a decade after the financial crisis and with populisms of left and right emerging around the world.

    The project for the left is how to “find ways to deploy ‘feedback infrastructure’ for new, non-market forms of social coordination, thus challenging neoliberalism with the very tools it has helped to produce” (54). The Chinese social credit system is one example of the form this could take, at a cost which most people would find too high. A more liberal mechanism could be found in deliberative discovery mechanisms existing between competition and centralised planning. As he describes on pg 56:

    Social existence presents us with a plethora of problems to solve, some of them highly specific and only relevant to small groups of people, others of much wider importance. Digital ‘feedback infrastructure’ could be used to flag social problems and even to facilitate deliberation about them, by presenting different conceptual approaches to the issues involved. What counts as a ‘problem’ would also be open for debate: citizens could enlist allies and convince others of the virtues of their own readings of particular problems and proposed solutions to them. This framing would suggest that deliberation-based democratic procedures could themselves be modes of problem-solving and means of social coordination.

    A nascent form of this coordination between problem-providers and solution-providers can be seen in something like hackathons, notes Morozov, at least in their non-commercial variants. What he advocates is that these processes could be scaled through technological mediation in a way which leaves them independent of co-presence for a defined period of time. The other possibility would be designing ‘non-markets’ which find ways to distribute scarce resources without recourse to the price system. He offers a fascinating account of how Stafford Beer’s cybernetics helps us make sense of the modalities of such an initiative. But the underlying question is what social coordination might look like for socialist ends when it is liberated from the ideological baggage of the price system. As he writes on pg 63:

    Given this new context, it does not seem very productive for the left to keep advocating for the use of more powerful computers to calculate input prices for the Central Planning Board—or to retain a centralized bureaucracy, with all the political problems it entails. Why insist on central planning, when a more decentralized, automated and apparatchik-free alternative might be achievable by putting the digital feedback infrastructure to work? The most ambitious effort to sketch what such an alternative might look like—think ‘guild socialism’ in the era of Big Data—was undertaken by the American radical economist Daniel Saros, in his rigorous, lucid—and unjustly neglected—Information Technology and Socialist Construction.

    Aspects of this system are already in operation, as he writes on pg 65:

    How realistic is Saros’s system? An examination of how big technology firms organize their platforms reveals that some aspects of it are already in operation. Amazon, for example, rewards customers with lower prices for registering their expected future needs and ‘subscribing’ to periodic deliveries of regularly consumed products; it also carefully studies product searches and the offerings of other suppliers in its own ‘general catalogue’ to locate gaps in the market. Democratizing access to that information infrastructure, so that all producers can build on these emerging product insights, would surely result in a system that is far less centralized than today’s, where just one firm (Amazon) monopolizes all the planning based on that data.

    For Morozov, feedback infrastructures are the means for producing alternative modes of social coordination. What matters most is getting them out of the hands of the tech giants. The ecology of social coordination, the many kinds of institutions and technologies which can mediate it, has been obscured by the post-Cold War binary between central planning and the price system. As Morozov describes it, “The emancipatory promise of information technology is to rediscover and enrich this repertoire, while revealing the high invisible costs of relying on the current dominant mode of social coordination—capitalist competition” (67).

     
  • Mark 9:47 am on January 26, 2019 Permalink | Reply
    Tags: big data, data politics, politics of big data   

    CfP: Political epistemologies of Big Data 

    I’m giving serious thought to this, as much as I’m trying to save money and travel less:

    Call for Papers for the conference “Scraping the Demos”: Political
    epistemologies of Big Data

    Organizers: Research Group Quantification and Social Regulation
    (Weizenbaum Institute for the Networked Society) and DVPW Thematic Group
    “Internet and Politics. Electronic Governance”

    Date: 8-9 July 2019 (lunch-to-lunch)

    Conference location: WZB Berlin Social Science Center, Reichpietschufer
    50, D-10785 Berlin, Germany

    Responsible: Dr. Lena Ulbricht lena.ulbricht@wzb.eu

    The conference explores political epistemologies of big data. Political
    epistemologies are practices by which societies construct politically
    relevant knowledge and the criteria by which they evaluate it. These
    practices and criteria may originate in scientific, political, cultural,
    religious, and economic contexts. They evolve over time, vary between
    sectors, are inherently political and therefore subject to conflict. Big
    data is the practice of deriving socially relevant knowledge from
    massive and diverse digital trace data. The rise of digital technologies
    in all social spheres has enabled new epistemic practices which have
    important political implications: Political elites see digital
    technologies as sources of new and better tools for learning about the
    citizenry, for increasing political responsiveness and for improving the
    effectiveness of policies.

    Practices such as “big data analysis”, “web scraping”, “opinion mining”,
    “sentiment analysis”, “predictive analytics”, and “nowcasting” seem to
    be common currency in the public and academic debate about the present
    and future of evidence-based policy making and representative democracy.
    Data mining and web scraping, techniques to access information “hidden”
    behind the user interface of a website or device, seem to establish
    themselves as epistemic practices with political implications. They
    generate knowledge about populations and the citizenry which diverges
    in many ways from previous ways of “seeing” and constructing the demos.
    Data based on digital collection tools is often much more personal; it
    can relate different kinds of information and in many cases offers
    improved predictive capability. Therefore, survey methods and
    traditional administrative data may lose influence on political
    epistemologies. To rely on big data means to rely on data sources that
    accumulate information without awareness of the concerned individuals.
    This epistemic shift can be observed in policy advice, government and
    administration, and political campaigning. Emerging research strands
    such as “computational social sciences,” “social physics,” “policy
    analytics”, “policy informatics”, and “policy simulations” strive for
    better evidence, more transparency and responsiveness in policy making,
    while governments such as those in the UK and Australia have set up
    strategies of “open policy making”, “agile policy making” and “public
    service big data”.

    Political parties and advocacy groups use digital data to address
    citizens and muster support in a targeted manner; public authorities try
    to tailor public policy to public sentiment measured online, forecast
    and prevent events (as in predictive policing, preemptive security and
    predictive healthcare), and continuously adapt policies based on
    real-time monitoring. An entire industry of policy consultants and
    technology companies thrives on the promise related to the political
    power of digital data and analytics. And finally, academic research
    engages in digitally enhanced computational social sciences, digital
    methods and social physics on the basis of digital trace data, machine
    learning and computer simulations. The political implications of these
    epistemic practices have yet to be examined in detail. Indeed, the rise
    of digital technologies in all social spheres may alter the relations
    between citizens and political elites in various ways: it could improve,
    impoverish (or simply change) political participation, policy
    transparency, accountability of political elites, and decision-making.

    The aim of the conference is to bring together scholars from various
    related disciplines working on the topic, including, but not limited to:
    political communication, elections and party politics, science and
    technology studies, political theory, history, sociology and philosophy
    of science, critical data studies and computational social sciences.
    These fields of research have addressed various aspects related to
    political epistemologies in the digital age – but there have been only
    a few opportunities to relate them, to compare similar practices in
    different fields (for example in public policy and in political
    campaigning) and to examine the broader picture in order to generate
    theories about the political epistemologies of big data, algorithms and
    artificial intelligence. Contributions can be either conceptual or empirical.

    The conference is interested in research concerning the following
    questions and similar topics:
    •       What are the political epistemologies underlying the use of big data
    and related phenomena such as algorithms, machine learning and
    artificial intelligence in political contexts?
    •       Which scientific, political, social and economic practices make use of
    digital data and methods? How do these practices construct knowledge
    which is deemed as politically relevant?  By which
    (rhetoric/procedural/technical) means do these practices and the actors
    involved substantiate their claims to political relevance?
    •       What insights can we gain from the computational social sciences in
    relation to traditional social science methods when it comes to
    political behavior, public opinion, policy making etc.?
    •       How are digitally mediated political epistemologies related to other
    political epistemologies? How are they embedded in institutional
    practices and values?
    •       Which interpretive conflicts do we witness with regard to the
    knowledge produced and legitimized by digital technologies; which are
    its major challengers? In which ways do epistemic practices based on big
    data, compared to other epistemic practices, influence the chances for
    challenging political knowledge claims?
    •       How can we place political epistemologies in a historical or cultural
    perspective?
    •       What are the implications of digitally mediated political
    epistemologies for evidence-based policy making and for representative
    democracy? Which conceptions of participation, representation and good
    governance are embedded in the related practices? How do big
    data-related epistemic practices reconfigure democratic concepts? Do we
    witness a new form of technocracy?
    •       How should democratic societies shape and regulate big-data-based
    epistemic practices? Which contributions can we expect from algorithmic
    accountability, data protection and research ethics?
    The conference will provide academic reflections to current public
    debates about the state of democracy in the digital age, considering
    that in 2019 various elections take place in German speaking countries,
    at the level of the European Parliament and within the German federal
    states of Bremen, Hamburg, Saxony, Brandenburg and Thuringia, as well as
    in Austria and Switzerland (regional and federal level). The keynote
    will be held by professor Daniel Kreiss, the author of a seminal book
    about the use of data-related practices in political campaigning
    (“Prototype Politics” 2016). The conference will also include artistic
    interventions and a lab.

    The conference will offer childcare, will be video-recorded, and held in
    English. If the funding application is successful, the travel costs of
    paper presenters will be covered. The organizers plan on following up
    the conference with a publication project.

    Abstracts should make explicit on which theories, methods and, if
    applicable, empirical material the paper is based. Please send your
    abstract of 300-500 words by February 24 to the following address:
    demosscraping-weizenbaum@wzb.eu

    Preliminary program structure
    8 July 2019
    14.00   Welcome address
    14.15   Keynote by professor Daniel Kreiss + discussion
    15.30   Coffee
    16.00   Paper presentations
    17.30   Lab and art exhibition
    18.30   Reception
    9 July 2019
    9.00    Paper presentations
    10.30   Coffee
    11.00   Paper presentations
    12.30   Paper presentations or panel discussion
    14.00   Ending

     
  • Mark 9:58 am on January 25, 2019 Permalink | Reply
    Tags: big data

    The Politics of Big Data: Social Listening and Reclaiming the Future 

    From an interesting British Academy funded workshop last November:
    https://methods.sagepub.com/video/embed/srmpromo/oohinf/social-listening-and-reclaiming-the-future

     
  • Mark 10:56 am on December 19, 2018 Permalink | Reply
    Tags: Alison Hearn, big data, data mining, Helen Kennedy

    Cultural studies of data mining 

    My notes on Andrejevic, M., Hearn, A., & Kennedy, H. (2015). Cultural studies of data mining: Introduction, European Journal of Cultural Studies 18(4-5), 379-394

    In this introduction to an important special issue, Mark Andrejevic, Alison Hearn and Helen Kennedy argue that the ubiquity of data infrastructure in everyday life means that “we cannot afford to limit our thinking about data analysis technologies by approaching them solely as communication media”, and they offer a list of questions which we need to address:

    what kinds of data are gathered, constructed and sold; how these processes are designed and implemented; to what ends data are deployed; who gets access to them; how their analysis is regulated (boyd and Crawford, 2012) and what, if any, possibilities for agency and better accountability data mining and analytics open up. (pg 380)

    This creates a problem for cultural studies because data mining challenges established forms of representation, “promising to discern patterns that are so complex that they are beyond the reach of human perception, and in some cases of any meaningful explanation or interpretation”. It is “not only a highly technical practice, it also tends to be non-transparent in its applications, which are generally privately owned and controlled”. It poses an ontological challenge to cultural studies, as well as epistemological and methodological ones. In the absence of access to the products of data mining, the authors suggest cultural studies is left theorising their effects.

    If we approach data analysis technologies as communicative media, we miss a “shift away from interpretive approaches and meaning-making practices towards the project of arranging and sorting people (and things) in time and space” (pg 381). Data mining isn’t undertaken to understand the communication taking place, as much as to “arrange and sort people and their interactions”. They suggest that recent developments in social theory mirror this changing reality (pg 381-382):

    Perhaps not coincidentally, recent forms of social and cultural theory mirror developments in big data analytics; new materialism, object-oriented ontology, post-humanism and new medium theory – all of which are coming to play an important role in digital media studies – de-centre the human and her attendant political and cultural concerns in favour of a ‘flat’ ontology wherein humans are but one node, and perhaps not the most important, in complex networks of interactions and assemblages. Thus, analysis of the circulation of affects and effects rather than of meanings, content or representations, connected as they are to human-centred forms of meaning-making, has become a dominant trope in some influential current approaches to media. Such analyses tend to fashion themselves as anti-discursive in their rejection of a focus on representation and cognition and their turn towards bodies and things in their materiality (rather than their signification).

    They make the compelling argument that to “remain within the horizon of interpretation, explanation and narrative” can be a “strategic critical resource in the face of theoretical tendencies that reproduce the correlational logic of the database by focusing on patterns and effects rather than on interpretations or explanations” (pg 382). The promise of these new approaches to correct an excessively discursive focus risks an “over-correction” and a “view from nowhere” in which “the goal of comprehensiveness (the inclusion of all components of an endless network of inter-relations) tends towards a politically inert process of specification in which structures of power and influence dissipate into networks and assemblages” (pg 383). Pushing beyond human concerns too easily leads to ever more specific analyses which collapse the substance of interactions into their effects, leaving us with “no way of generating a dynamics of contestation and argument in a flat ontology of ever proliferating relations or objects” (pg 384).

    This is not a claim that there is nothing beyond culture, but rather a reminder that invoking this beyond is intrinsically cultural, and a call for “an interrogation of the embrace of a post-cultural imaginary within contemporary media theory” (pg 384). This imaginary often obscures the political economy of data infrastructure, compounding the existing tendency for the ‘virtual’ character of digital phenomena to distract from their socio-economic materiality; for all their opacity, complexity and power, they are just another phase in the technological development of human civilisation (pg 385). When we recognise this, it becomes easier to reject the “celebratory presentism” and remember that “technological forms, and the rhetorics and analytic practices that accompany them, do not come from nowhere – they have histories, which shape and condition them, and inevitably bear the marks of the cultural, social and political conditions surrounding their production and implementation” (pg 385). They end this wonderful paper with a call to action which I’d like to explore in the digital public sociology book I’m writing with Lambros Fatsis (pg 393):

    We need to develop new methodologies and new intellectual and critical competencies to tackle the embedded assumptions buried in the code and their political and cultural implications. Our ability to accomplish these things will require more than isolated scholarly effort; collaborative, politically engaged activist sensibilities will no doubt be required in order to push past the privatized digital enclosures and open up access to the algorithms, analytics, distributive regimes and infrastructural monopolies that are increasingly coming to condition the contours and substance of our daily lives.

     
  • Mark 4:12 pm on May 31, 2018 Permalink | Reply
    Tags: big data

    Expertise and the politics of discipline 

    In the last few days, I’ve spent a lot of time reflecting on a remark Susan Halford made at this event about the difference between expertise and discipline. If I understand her correctly, her point was that capacities for knowing and acting in the world (expertise) can have their reproduction organised socially in different ways (discipline) and this is crucial for understanding how knowledge production responds to novel developments. In some cases, discipline might support expertise but in other cases it might hinder it. In either case, expertise is dependent upon it because it requires a social organisation through which existing knowledge is codified, new knowledge incorporated and knowledge practitioners trained. This means that we can’t ever have ‘pure expertise’ as a response to novelty because experts are embedded, even if loosely or unorthodoxly, within disciplines. This is the problem Susan identifies with the politics of discipline generated by big data:

    How we define Big Data matters because it shapes our understanding of the expertise that is required to engage with it – to extract the value and deliver the promise. Is this the job for mathematicians and statisticians? Computer scientists? Or ‘domain experts’ – economists, sociologists or geographers – as appropriate to the real-world problems at hand? As the Big Data field forms we see the processes of occupational closure at play: who does this field belong to, who has the expertise, the right to practice? This is of observational interest for those of us who research professions, knowledge and the labour market, as we see how claims to expert knowledge are made by competing disciplines. But it is also of broader interest for those of us concerned with the future of Big Data: the outcome will shape the epistemological foundations of the field. Whether or not it is acknowledged, the disciplinary carve-up of big data will have profound consequences for the questions that are asked, the claims that are made and – ultimately – the value that is derived from this ‘new oil’ in the global economy.

    One response to this upheaval is to retreat into disciplinary silos and there’s inevitably a comfort to this. But not only does it cede terrain in a way which might allow narrow forms of expertise to become hegemonic, it also doubles down on a form of discipline unlikely to survive this transformation of expertise in its current form: an inevitably short-sighted strategy. This is how Felicity Callard and Des Fitzgerald describe the shifting plate tectonics of the human sciences in their book on interdisciplinarity:

    The more we wander down strange interdisciplinary tracks, the more apparent it becomes to us that being disciplined isn’t playing it safe: the truth is that staying within the narrow epistemological confines of –for example –mid-twentieth-century sociology, while it may produce short-term gains, is not, in fact, the best way to guarantee a career in the twenty-first century (and we mean ‘career’ in its most capacious sense here: we are not using it with the assumption that everyone wants a permanent post at a university, but to express an idea that many would like to find some way to advance their projects, ideas, and so on). The plate tectonics of the human sciences are shifting: we here describe our own forays into one small, circumscribed niche between the social and natural sciences, but expand this horizon to epigenetics, to the emergence of the human microbiome, to all kinds of translational research in mental health, to ‘big data’ and the devices that append it, to the breakdown of the barrier between creative practices and research, and to a whole host of other collapsing dichotomies, and it becomes apparent that ‘neuro-social science’ is only one local effect of a much broader reverberation.

    But there’s also a great deal of creativity in this space. It just means we have to consider projects of expertise alongside projects of discipline, mapping out these issues as neither purely matters of expertise nor purely matters of discipline. This is what I hope we’ll manage to explore in my session at the TSR conference on defending the social. It’s a Fireside Chat with Val Gillies and Ros Edwards, as well as their co-author who couldn’t make it when we recorded the podcast below.

     
  • Mark 11:46 am on April 28, 2018 Permalink | Reply
    Tags: , , big data, , economic man, , ,   

    From homo economicus to homo digitalis 

    In a recent paper, I’ve argued we find a cultural project underpinning ‘big data’: a commitment to reducing human being, in all its embodied affective complexity, to the behavioural traces which register through digital infrastructure, stripping it of any reality beyond them. Underlying method, methodology and theory there is a vision of how human beings are constituted, as well as how they can be influenced. In some cases, this is explicitly argued but it is often simply implicit, lurking beneath the surface of careful choices which nonetheless exceed their own stated criteria.

    It’s an argument I’m keen to take further than I have at present and reading Who Cooked Adam Smith’s Dinner by Katrine Marçal  has left me interested in exploring the parallels between homo economicus (and why we are invested in him) and the emerging homo digitalis. Marçal writes on pg 162 of the allure of the former, misunderstood if we see it as nothing more than an implausible theoretical construct or a mechanism to exercise influence over political decision-making:

    Many have criticized economic man’s one-dimensional perspective. He lacks depth, emotions, psychology and complexity, we think. He’s a simple, selfish calculator. A caricature. Why do we keep dragging this paper doll around? It’s ridiculous. What does he have to do with us? But his critics are missing something essential. He isn’t like us, but he clearly has emotions, depth, fears and dreams that we can completely identify with. Economic man can’t just be a simple paper doll, a run-of-the-mill psychopath or a random hallucination. Why, if he were, would we be so enchanted? Why would we so desperately try to align every part of existence with his view of the world, even though collected research shows that this model of human behaviour doesn’t cohere with reality? The desperation with which we want to align all parts of our lives with the fantasy says something about who we are. And what we are afraid of. This is what we have a hard time admitting to ourselves. Economic man’s parodically simple behaviour doesn’t mean that he isn’t conjured from deep inner conflicts

    What makes homo economicus so compelling? This allure has its roots in a denial of human dependence, describing on pg 155 how our fascination with “his self-sufficiency, his reason and the predictable universe that he inhabits” reflect discomfort with our once having been utterly dependent on others, “at the mercy of their hopes, demands, love, neuroses, traumas, disappointments and unrealized lives”, as well as the inevitability that we will be so again at the other end of the life-course. But he also embodies a vision of what life should be like between the two poles of dependency, as she writes on pg 163:

    His identity is said to be completely independent of other people. No man is an island, we say, and think that economic man’s total self-sufficiency is laughable. But then we haven’t understood his nature. You can’t construct a human identity except in relation to others. And whether economic man likes it or not –this applies to him as well. Because competition is central to his nature, his is an identity that is totally dependent on other people. Economic man is very much bound to others. But bound to them in a new way. Bound to them. Downright chained to them. In competition. If economic man doesn’t compete, he is nothing, and to compete he needs other people. He doesn’t live in a world without relationships. He lives in a world where all relationships are reduced to competition. He is aggressive and narcissistic. And he lives in conflict with himself. With nature and with other people. He thinks that conflict is the only thing that creates movement. Movement without risk. This is his life: filled with trials, tribulations and intense longing. He is a man on the run.

    If I’m right about the existence of homo digitalis, a clear vision of human constitution underpinning ‘big data’*, we can ask similar questions about this truncated, eviscerated, predictable monad. So complex when we look up close, so simple when we gaze down from on high. Our individuality melts away in the aggregate, leaving us no longer overwhelming but simply overwhelmed. Manageable, knowable, stripped back. Why might this be an appealing vision of humankind? Who might it be appealing to? I’m sure many can guess where I’m going with this, but it’s a topic for another post.

    *A term I use to encompass digital social science, commercial and academic, as well as the organisations and infrastructures which it facilitates.

     
  • Mark 8:49 pm on October 17, 2017 Permalink | Reply
    Tags: , big data, computational skills, , , quantitative methods, ,   

    The sociology of quantitative methods in the U.K.  

    Some tweets about this blog post worry me because it appears as if people think this is my analysis. It’s not. These are my notes on the excellent paper below which I’d strongly recommend reading in full. 

    This thought-provoking article by Malcolm Williams, Luke Sloan and Charlotte Brookfield offers a new spin on the familiar problem of the quantitative deficit within U.K. sociology. Many accounts of this sort are concerned with the explanatory implications of this deficit (the phenomena that defy explanation without quantitative terms) while digital sociology is concerned with its implications for computational skills. However, the authors look to a deeper level: the tradition within British sociology which defines itself against quantitative methods. They explore this possibility by drawing a contrast between analytical sociology and critical sociology:

    Analytic sociology is the term often used to describe a quite specific version of scientific sociology that combines theories and empirical data to produce sociological explanations (Bunge, 1997; Coleman, 1986; Hedström, 2005; Hedström and Swedberg, 1998). It mostly employs mechanistic explanation and variants on middle range theory. Our use of the term ‘analytic’ encompasses this specific use, but is also broader and meant solely to indicate a sociology that aims to produce descriptions and explanations of social phenomena. It does not exclude ‘understanding’ as methodological virtue, nor does it deny the role of ‘critique’ as an element in the methodological toolkit. It certainly does not exclude qualitative methods and indeed the research described here has qualitative elements

    http://journals.sagepub.com/doi/full/10.1177/1360780417734146

    Their distinction tracks familiar oppositions between explanation/understanding and positivism/hermeneutics. Their interest is in how the latter term in each pair was advantaged by the dynamics of expansion in U.K. universities, where (non-quantitative) sociology was a cheap route to expanded student numbers with little to no necessary capital investment. It was during this period of expansion during the 1960s that ‘scientific method’ began to be tied to militarism by the burgeoning anti-war movement. They argue that successive intellectual movements (postmodernism, the linguistic turn, the cultural turn) accentuated this antipathy, such that progressive thought came to be instinctively cautious about quantitative methods. This trend played out within the discipline, its students and teachers, rather than simply being located ‘out there’.

    They see this hostility as being dampened by the methodological pluralism encouraged by critical realism and mixed methods pragmatism. But for reasons I don’t understand, which seem to misread the motivations and methods of the critical realist project, they incorporate these into analytical sociology:

    While there are important differences in the analytic approach (say between realism, post-positivism, and positivism), there is a common core as treating social phenomena as real (or a proxy for real) (Kincaid, 1996) that can be caused, or can cause other social phenomena. The analytic approach shares the common foundations of science: description, explanation, and theory testing and, more specifically, that through the use of appropriate sampling we can generalise from sample to population or from one time or place to another.

    http://journals.sagepub.com/doi/full/10.1177/1360780417734146

    These are precisely the features which what they call critical sociology rejects as “either methodologically impossible to achieve, in the social world, or ethically undesirable”. More positively, it is concerned with situated meaning and the possibility of emancipation. Their characterisation here is much vaguer but they admit there is an element of strawman to each. Their concern is with how these sociological stereotypes enter into the understanding of students, as extreme versions of actually existing tendencies take hold in the imagination of those who are the next generation of sociologists and the cohorts which the discipline sets loose upon the world.

    This is an important possibility because evidence suggests that sociology students are not driven by a fear of number in choosing their degree. Or at least that other mechanisms are at work in bringing about the quantitative deficit within U.K. sociology. The evidence they present suggests a humanistic understanding of sociology is dominant within the student body:

    Table 2 clearly shows that the majority of students scored the discipline as closer to the arts/humanities than science/maths. It has been speculated that students taking a prior A-levels in art might be inclined to see sociology as closer to the arts and those taking a mathematics A-Level as closer to science. In fact, though there was some variation at the different measurement points, more students in both groups still thought sociology nearer to the arts/humanities than the sciences.

    All but one of the subsequent focus groups revealed a “proclivity towards the qualitative involving the theoretical and critique with scepticism about statistics and a clear preference from the students for doing discursive work”. The BSA survey, asking more nuanced questions than the aforementioned survey, produced a more cautious endorsement of sociology’s status:

    Table 4 shows that the majority of participants viewed the subject content (64.3%) and status (66.9%) of sociological research as closer to the arts and humanities. In terms of methodology, analytical tools, and public utility, sociology was seen as mid-way between the arts and humanities and the natural sciences

    Their overarching argument, supported by intriguing comparative data concerning sociology in the Netherlands and New Zealand, concerns how a cultural antipathy to quantitative methods gets reproduced across successive professional cohorts (compounded by the marginalisation of quantitative methods teaching within the broader curriculum):

    Many, if not most, sociologists in UK universities have themselves come from a culture of sociology that emphasises critique over analysis, theoretical positions, and qualitative over quantitative methods of enquiry that reflect the historical influences on the discipline, as described above. This culture exists at all levels of teaching, from pre-university A-level teaching through to postgraduate training. Their attitudes and practices incline them ideologically and practically to favour a humanistic and critical attitude towards the discipline, the selection of research questions that require interpretive methods, and often either an expertise in these methods or a preference for theoretical reasoning alone

    The result is an absence of methodological pluralism within U.K. sociology, held, it seems, as a point of principle. They suggest this might also be coupled with a vague sense of persecution, as critical sociology perceives itself as being under threat in a discipline it in fact dominates.

    The ensuing ‘split personality’ might be a source of strength for the discipline in troubled times:

    In the UK, quite apart from sociology ceding many of its former areas of interest to other disciplines, what sociology is depends on who you ask. The appearance is one of fragmentation. Nevertheless, a counterfactual argument may go something like this: a fragmented discipline might also be described as a diverse one, whose survivability does not depend on the adherence to any particular paradigm. Psychology, for example, which has long been largely associated with experimental method, faces something of a crisis as the statistical reasoning that underpin the experiment have been increasingly challenged in the last two decades (see, for example, Krueger, 2001). Sociology, in the UK, may actually be more agile as a result of its analytic/critique split personality

    But crucially there is a risk of quantitative practitioners exiting the discipline, even as its capacity to generate them increases:

    One might further speculate that those graduate sociologists, from universities with Q-Step centres or other more quantitatively inclined courses, will not necessarily work in sociology or identify as sociologists because they too see it as a primarily humanistic discipline based upon critique, but rather go to other disciplines or become generic ‘social researchers’ with a consequent continuation of the present situation where analytic sociology continues to be a minority pursuit within the UK discipline.

     
  • Mark 11:54 am on October 17, 2017 Permalink | Reply
    Tags: , big data, , , , office for students, , ,   

    The coming big data revolution within higher education 

    It seems passé to talk about the ‘big data revolution’ in 2017. Much of the initial hype has subsided, leaving us in a different situation to the one in which big data was expected to sweep away all that had come before. Instead, we have the emergence of data science as well as the institutionalisation of computational methods, albeit unevenly, across the full range of the natural and social sciences. Furthermore, addressing the challenge posed by early waves of big data evangelism to established methodologies, particularly those with a critical and/or hermeneutic focus, has generated a vast outpouring of creativity with the potential to generate significant reorientations within these disciplines. The ‘big data revolution’ has proceeded in a much more constructive way than those early prophets of epochal change were able to predict.

    However, we are still far from harmony within the academy. While the intellectual changes driven by big data are well underway, institutional changes of potentially greater importance are still in their infancy. This is how Susan Halford describes the politics of discipline surrounding big data:

    How we define Big Data matters because it shapes our understanding of the expertise that is required to engage with it – to extract the value and deliver the promise. Is this the job for mathematicians and statisticians? Computer scientists? Or ‘domain experts’ – economists, sociologists or geographers – as appropriate to the real-world problems at hand? As the Big Data field forms we see the processes of occupational closure at play: who does this field belong to, who has the expertise, the right to practice? This is of observational interest for those of us who research professions, knowledge and the labour market, as we see how claims to expert knowledge are made by competing disciplines. But it is also of broader interest for those of us concerned with the future of Big Data: the outcome will shape the epistemological foundations of the field. Whether or not it is acknowledged, the disciplinary carve-up of big data will have profound consequences for the questions that are asked, the claims that are made and – ultimately – the value that is derived from this ‘new oil’ in the global economy.

    https://discoversociety.org/2015/07/30/big-data-and-the-politics-of-discipline/

    We can see rapid transformation at this level, with expertise in the social and natural sciences responding to the opportunities and incentives which big data has brought with it. The institutional landscape has begun to change, most notably around funding, with important consequences for how individual and collective agents plan their career-path through this environment. However, this is still unfolding within organisations that have not themselves undergone change as a result of big data. It is this which is likely to change in the coming years. As WonkHe reported earlier this week of the consultation on how the Office for Students will regulate providers of higher education in England:

    The consultation will also be looking at the nuts and bolts of the OfS – how will it balance the demands of competition and autonomy while maintaining “proportionate” regulatory approaches? How will the remarkable new powers of entry (extreme audit?) be used? What sanctions will be available to the new regulator, and how will they be applied? Following strong ministerial direction, we can also expect measures on senior staff pay to feature prominently, but what form will they take, and will they have any real teeth? And how will approaches compare to other sectors?

    Widely expected is an end to regular institutional visits – the “periodic review” is likely to be replaced by a new method for the OfS to use live data to monitor institutions. It may well be easier than the annual submission, but now is a good time to be a big data wonk, as new systems and process will need to be established in institutions to respond to a new approach.

    This concern for real time metrics, institutionalising transactional data into the fabric of higher education itself, only seems likely to grow. What does this mean for the politics of discipline? My hunch is that the big data revolution within higher education has only just begun and that its eventual form will be different to that which most predicted.

     
  • Mark 4:24 pm on November 14, 2016 Permalink | Reply
    Tags: , , , big data, , , social legibility, , ,   

    The bureaucratic origins of algorithmic authoritarianism 

    I just came across this remarkable estimate in an Economist feature on surveillance. I knew digitalisation made surveillance cheaper but I didn’t realise quite how much cheaper. How much of the creeping authoritarianism which characterises the contemporary national security apparatus in the UK and US is driven by a familiar impulse towards efficiency?

    The agencies not only do more, they also spend less. According to Mr Schneier, to deploy agents on a tail costs $175,000 a month because it takes a lot of manpower. To put a GPS receiver in someone’s car takes $150 a month. But to tag a target’s mobile phone, with the help of a phone company, costs only $30 a month. And whereas paper records soon become unmanageable, electronic storage is so cheap that the agencies can afford to hang on to a lot of data that may one day come in useful.

    http://www.economist.com/news/special-report/21709773-who-benefiting-more-cyberisation-intelligence-spooks-or-their
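
    To make the scale of the difference concrete, here is a quick back-of-the-envelope calculation on the monthly per-target figures Schneier quotes above (a minimal sketch using only the numbers from the excerpt):

    ```python
    # Monthly surveillance costs per target, as quoted from Bruce Schneier
    # in The Economist piece above
    tail_cost = 175_000   # agents physically tailing a target
    gps_cost = 150        # GPS receiver placed in the target's car
    phone_cost = 30       # tagging the target's phone via the carrier

    print(f"GPS vs tail:   {tail_cost / gps_cost:,.0f}x cheaper")    # 1,167x
    print(f"Phone vs tail: {tail_cost / phone_cost:,.0f}x cheaper")  # 5,833x
    ```

    Tagging a phone is nearly six thousand times cheaper than a physical tail, which puts a number on the ‘impulse towards efficiency’ behind mass digital surveillance.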

    In reality, this efficiency is of course anything but narrow, instead heralding a potentially open-ended project to capture the world and achieve the utopia of total social legibility. An ambition which always makes me think of this short story:

    The story deals with the development of universe-scale computers called Multivacs and their relationships with humanity through the courses of seven historic settings, beginning in 2061. In each of the first six scenes a different character presents the computer with the same question; namely, how the threat to human existence posed by the heat death of the universe can be averted. The question was: “How can the net amount of entropy of the universe be massively decreased?” This is equivalent to asking: “Can the workings of the second law of thermodynamics (used in the story as the increase of the entropy of the universe) be reversed?” Multivac’s only response after much “thinking” is: “INSUFFICIENT DATA FOR MEANINGFUL ANSWER.”

    The story jumps forward in time into later eras of human and scientific development. In each of these eras someone decides to ask the ultimate “last question” regarding the reversal and decrease of entropy. Each time, in each new era, Multivac’s descendant is asked this question, and finds itself unable to solve the problem. Each time all it can answer is an (increasingly sophisticated, linguistically): “THERE IS AS YET INSUFFICIENT DATA FOR A MEANINGFUL ANSWER.”

    In the last scene, the god-like descendant of humanity (the unified mental process of over a trillion, trillion, trillion humans that have spread throughout the universe) watches the stars flicker out, one by one, as matter and energy ends, and with it, space and time. Humanity asks AC, Multivac’s ultimate descendant, which exists in hyperspace beyond the bounds of gravity or time, the entropy question one last time, before the last of humanity merges with AC and disappears. AC is still unable to answer, but continues to ponder the question even after space and time cease to exist. Eventually AC discovers the answer, but has nobody to report it to; the universe is already dead. It therefore decides to answer by demonstration. The story ends with AC’s pronouncement,

    And AC said: “LET THERE BE LIGHT!” And there was light

    https://en.wikipedia.org/wiki/The_Last_Question

     
  • Mark 2:19 pm on July 31, 2016 Permalink | Reply
    Tags: , , big data, , , ,   

    The economic limitations of the attention economy 

    From Douglas Rushkoff’s Throwing Rocks at the Google Bus, loc 2256:

    Besides, consumer research is all about winning some portion of a fixed number of purchases. It doesn’t create more consumption. If anything, technological solutions tend to make markets smaller and less likely to spawn associated industries in shipping, resource management, and labor services.

    Digital advertising might ultimately capture the entirety of advertising budgets, but it does nothing to expand these budgets. There are upper limits on the revenue growth of the corporations that define the ‘attention economy’: how are they going to respond to these?

     
  • Mark 6:58 pm on April 29, 2016 Permalink | Reply
    Tags: , , , , big data, , , , , ,   

    The Entlastung of the Quantified Self 

    I’m very interested in this concept, which I was introduced to through the work of Pierpaolo Donati and Andrea Maccarini earlier this year. It emerged from the work of Arnold Gehlen and refers to the role of human institutions in unburdening us from existential demands. This is quoted from his Human Beings and Institutions on pg 257 of Social Theory: Twenty Introductory Lectures by Hans Joas and Wolfgang Knobl. He writes that institutions

    are those entities which enable a being, a being at risk, unstable and affectively overburdened by nature, to put up with his fellows and with himself, something on the basis of which one can count on and rely on oneself and others. On the one hand, human objectives are jointly tackled and pursued within these institutions; on the other, people gear themselves toward definitive certainties of doing and to doing within them, with the extraordinary benefit that their inner life is stabilized, so that they do not have to deal with profound emotional issues or make fundamental decisions at every turn.

    In an interesting essay last year, Will Davies reflected on the ‘pleasure of dependence’ in a way which captures my understanding of entlastung. It can be a relief to trust in something outside of ourselves, settling into dependence on the understanding that our context is defined by a degree of reliability due to an agency other than our own:

    I have a memory from childhood, a happy memory — one of complete trust and comfort. It’s dark, and I’m kneeling in the tiny floor area of the back seat of a car, resting my head on the seat. I’m perhaps six years old. I look upward to the window, through which I can see streetlights and buildings rushing by in a foreign town whose name and location I’m completely unaware of. In the front seats sit my parents, and in front of them, the warm yellow and red glow of the dashboard, with my dad at the steering wheel.

    Contrary to the sentiment of so many ads and products, this memory reminds me that dependence can be a source of deep, almost visceral pleasure: to know nothing of where one is going, to have no responsibility for how one gets there or the risks involved. I must have knelt on the floor of the car backward to further increase that feeling of powerlessness as I stared up at the passing lights.

    http://thenewinquiry.com/essays/the-data-sublime/

    At a time when entlastung is failing, when institutions are coming to lose this capacity to unburden us, could faith in self-tracking, big data and digital technology fill the gap? The technological system as a whole comes to constitute the remaining possibility of entlastung and we enthusiastically throw ourselves into its embrace, as the only way left to feel some relief from the creeping anxiety that characterises daily life.

    The essay by Will Davies is really worth reading: http://thenewinquiry.com/essays/the-data-sublime/

     
  • Mark 9:48 am on January 13, 2016 Permalink
    Tags: , big data   

    Living by numbers: big data & society 


     