Raiding the inarticulate since 2010


Why do people who worry about the existential risks of AGI refuse to talk about capitalism?

I was struck in this Lex Fridman interview with Max Tegmark by how the latter simply refuses to talk about capitalism when accounting for the existential risks he perceives as generated by AGI. In making sense of the competition between capitalist firms he doesn’t reach for political economy, or even neoclassical economics, as an explanation, but rather for the folkloric notion of Moloch: “an accelerating race towards a goal that has both a tremendous payoff and that guarantees our destruction”:

The answer is Moloch is tricking us into doing it. And it’s such a clever trick that even though we see the trick we still have no choice but to fall for it, right?

At points in the interview he basically uses Moloch as a synonym for capitalist competition, but at other points it’s invoked in an actively mystical way, explaining away the destructive motivations of human actors as malign expressions of Moloch’s influence. For someone who could fairly be characterised as an ultra-rationalist, I find this incredibly weird. Not least because I’m pretty certain he’s not self-consciously using this notion in an allegorical way, but is in a real sense suggesting that ‘Moloch’ is an efficacious force within the social world.

Has anyone seen other examples of this? It’s an interesting route into the broader question, most recently posed by danah boyd, about the anthropological underpinnings of the ‘AGI will destroy us’ mythology amongst digital elites.