Raiding the inarticulate since 2010


Science fiction publications are drowning in a sea of GAI-written submissions. Will academic journals be next?

I argued in a keynote last December that a wave of automation in journal publishing is all but inevitable if academic authors use GAI to increase their rate of publication. There’s a bleak but realistic prospect of AI-written papers being AI-reviewed by journals before being AI-summarised for readers.

The sci-fi publication Clarkesworld recently closed submissions because it is struggling to cope with the volume of GAI-written submissions. It’s particularly interesting that GAI influencers are seen as driving the trend:

Clarke says they’ve seen this problem growing for a while, but they took the time to analyze the data before talking about it publicly. “The reason we’re getting these is a lot of the side-hustle community,” he says. “‘Make money using ChatGPT.’ They’re not science fiction writers—they’re not even writers, for the most part. They’re just people who are trying to make some money on some of these things, and they’re following people who make it sound like they know what they’re doing.” He adds that having seen some of the how-to videos in question, “There’s no way what they’re hawking is going to work.”

https://www.wired.com/story/sci-fi-story-submissions-generative-ai-problem/

Is the academic equivalent the people writing guidance about how to use GAI to work more productively? Generative AI for Academics genuinely isn’t this. If anything, it’s a book-length argument about why you shouldn’t do this, intended to appeal to people who might otherwise be inclined to try. But I’m still worried about the prospects for scholarly publishing over the next few years, given the incentives attached to publication.