Raiding the inarticulate since 2010


Some thoughts on GAI, trust and the future of the human sciences

  • GAI has emerged at a point of crisis within the behavioural and social sciences. Epistemic parameters have been upended by data science, there’s a crisis of reproducibility, and the publishing system is creaking. Each of these in different ways has foregrounded the relationship between trust and epistemology: the trust that makes findings viable, the trust involved in the production and reuse of data, the trust required to engage with the literature.
  • This crisis is occurring against a backdrop of financial pressures. University systems with fixed charges to students suffer under conditions of double-digit inflation, creating pressure on staff to do more in the same amount of time (e.g. teach more students) or with less (e.g. fewer colleagues). Corner-cutting inevitably thrives under such time pressure because it offers an immediate response to the intensification of labour, particularly when there’s no oversight.
  • The promise that GAI can streamline research processes, potentially cutting costs by substituting for administrative and research assistant services, is likely to be compelling under these circumstances. These financial incentives represent a form of corner-cutting at the institutional level, likely to compound rather than create the problems of reproducibility inherent to the use of GAI in the research process, as well as the problems concerning data governance and data justice.
  • The likely flood of GAI content on social platforms whose moderation procedures have been decimated challenges the basic assumption that online content bears some relationship to a human actor. The range of possible relations expands vastly, even though ultimately someone still sets the GAI system into motion, at least for now. But the epistemic parameters of GAI content are far more uncertain than those of user-generated content.