

The old web is dying, while the new web struggles to be born

This feels like an extremely important article by James Vincent about how rapidly AI-generated text is swamping the web, as well as what this means for its longer-term evolution. It hinges upon something I’ve been preoccupied by, namely that generative AI has arrived at exactly the point where the (possibly) mature business model of social media firms has taken shape, now that the ‘free trial period’ has ended. As Vincent writes, social platforms corralled the creative production of the open web into walled gardens in the hope of monetising user-generated content:

Years ago, the web used to be a place where individuals made things. They made homepages, forums, and mailing lists, and a small bit of money with it. Then companies decided they could do things better. They created slick and feature-rich platforms and threw their doors open for anyone to join. They put boxes in front of us, and we filled those boxes with text and images, and people came to see the content of those boxes. The companies chased scale, because once enough people gather anywhere, there’s usually a way to make money off them.

The content AI generates is parasitic upon this past activity, while also competing with it for attention on platforms built around the competition for visibility. Until there are costs involved in flooding platforms with AI-generated content, we are likely to see its rapid growth, because of the potential returns entailed by maximising output, particularly when it can be spread across a range of accounts. Playing around with generative AI video software recently, I was stunned by the ease with which I could set up an automated system to pump out videos which were variations on a theme, driven simply by adding columns to a spreadsheet. The capacity for industrial-scale content production is now in the hands of a single individual with no special resources or technical skills. I struggle to see how this could be anything other than a bad thing.
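To make the point concrete, here is a minimal sketch of what that spreadsheet-driven workflow looks like. The file name, column names and the `generate_video` function are all hypothetical stand-ins for whatever generative tool is actually used; the point is only that each new row or column multiplies output with essentially no extra effort:

```python
import csv

def generate_video(prompt: str, output_path: str) -> None:
    """Placeholder for a call to a generative video API or tool.
    In a real workflow this would submit the prompt and save the rendered clip."""
    print(f"would render '{prompt}' to {output_path}")

# Each spreadsheet row parameterises one variation on the theme.
with open("variations.csv", newline="") as f:
    for i, row in enumerate(csv.DictReader(f)):
        prompt = (
            "A short explainer about {topic}, in a {tone} tone, "
            "aimed at {audience}"
        ).format(**row)
        generate_video(prompt, f"video_{i:03d}.mp4")
```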

The switch to AI summaries in search embodies this parasitism in an even more transparent way. Whereas Google was once a gateway to visibility, accounting for a significant share of a website’s traffic even after social media became dominant, we may see that relation get lost as cultural outputs are drawn upon in non-attributed (and possibly non-attributable) ways in order to generate summaries. Vincent plausibly suggests this creates a significant incentive towards paywalls for those who have the capacity to leverage the perceived reliability of their content, leading to the most significant restructuring of the political economy of the web since social media.

The problem is that AI-generated content is “fluent but not grounded in real-world experience, and so it takes time and expertise to unpick”. It also risks fatally clogging the most useful platforms we have (e.g. Wikipedia, Stack Overflow, Reddit) for accumulating expertise around particular domains in open, civic and non-commercial ways. The deeper problem is that generative AI unsettles the advantage which scale has allowed social platforms, in ways which entirely change the dynamics that ensue on mass open platforms:

In each case, there’s something about AI’s ability to scale — the simple fact of its raw abundance — that changes a platform. Many of the web’s most successful sites are those that leverage scale to their advantage, either by multiplying social connections or product choice, or by sorting the huge conglomeration of information that constitutes the internet itself. But this scale relies on masses of humans to create the underlying value, and humans can’t beat AI when it comes to mass production.

This blog post’s title is a line from Vincent’s column, by the way, rather than my own creation.