Large language models as a corporate pissing contest

This is an excellent interview with Timnit Gebru about the current hype surrounding generative AI. She describes the rush towards ever larger language models as a corporate pissing contest, driven by executives scared of being left behind:

And then the higher ups be like: Why are we not the biggest? Why don’t we have the biggest model? Why don’t we? I’m just like: Oh, my God, why? What’s the purpose of having the biggest? What kind of pissing contest is this?

https://techwontsave.us/episode/151_dont_fall_for_the_ai_hype_w_timnit_gebru

I found the conversation in the latter half of the podcast particularly helpful for thinking about how the current economic environment (high interest rates, declining consumer spending, tanking share prices) means generative AI will be seized upon as the technological innovation of the moment, after crypto and the metaverse failed to catch fire as the ‘next big thing’. OpenAI exploited this expertly, launching ChatGPT upon the world in a way primed to create a viral sensation. The impending release of GPT-4 will push this contest for bigness to the next level:

We’re possibly just weeks away from the debut of GPT-4. While OpenAI has been mum on the details of this much-anticipated release, it has been rumored that GPT-4 will contain 100 trillion parameters, which would make it the largest LLM in the world.

While it’s been fashionable to downplay the importance of big data in recent years, the “bigness” of the LLMs is the precise source of all the new capabilities and the excitement. In fact, researchers are eagerly awaiting what new capabilities they might squeeze out of LLMs as they crank the size even higher.

https://www.datanami.com/2023/02/03/like-chatgpt-you-havent-seen-anything-yet/