Raiding the inarticulate since 2010


The pseudo-singularity of generative search answers

Given how transfixed I am by Rings of Power season 2 (so much better than the first) I’ve been asking Perplexity for background information about Tolkien lore to address my uncertainty about elements of the show e.g. if Sauron is a spirit then why does he turn into Venom-esque black goo when he dies? There’s no question too obscure for Perplexity to provide an answer, with ‘suggested questions’ rapidly spiralling off into hyper-obscurity as you go down a rabbit hole. It satisfies my curiosity without leading me to spend hours lost in fan wikis, even though it poses obvious questions about the ethics of what generative search trained on those wikis will be doing to their visibility.

So I thought I’d test it in an area where I know the lore inside and out. In Jonathan Hickman’s epic Secret Wars there were lots of threads which I know were never fully elaborated. I’ve been asking Perplexity questions about these unresolved plot threads and it will consistently provide a singular and definitive answer for each one, even when they weren’t actually shown in the story arc. The problem as far as I can see isn’t hallucination in the classic GAI sense but rather stitching together partial inferences from sources accessed through retrieval-augmented generation.

There’s a hole in knowledge, part of a story that was gestured to but never actually told, which various people have filled in through more or less speculative means. Their speculative answers are drawn upon by Perplexity in order to provide a singular answer to a question which is in reality unanswerable. It resides as an unrealised creative intention in Jonathan Hickman’s mind rather than something out there in the world which can be veridically described. Yet Perplexity treats every question as having an answer, generating those answers in a way that papers over the fractures and gaps in the knowledge system.

The combination of GAI hallucination and the combinatorial dynamics of RAG is very interesting. I don’t feel I yet have the language to adequately describe this, but this was a first attempt to put the dynamic into words, because I believe it is inherent in generative search and will manifest in different ways in other RAG systems.
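To make the dynamic concrete, here is a minimal toy sketch of it in Python. Everything in it is invented for illustration: there is no real retriever or language model, just a fake corpus of speculative fan claims and a synthesis step that stitches them into one fluent answer. The point it demonstrates is structural, not an account of how Perplexity actually works: the moment the claims are concatenated into a single answer, the marker recording that each source was speculative is silently discarded.

```python
# Toy illustration of the RAG dynamic described above.
# All sources, claims, and functions are hypothetical stand-ins.

# A tiny "corpus" of fan speculation about an unanswerable question.
# Each entry carries a flag marking it as speculative.
CORPUS = [
    {"source": "fan-wiki-A", "speculative": True,
     "claim": "The thread resolves off-panel between issues."},
    {"source": "fan-wiki-B", "speculative": True,
     "claim": "The resolution was hinted at in a later series."},
    {"source": "fan-forum", "speculative": True,
     "claim": "The thread was simply dropped."},
]

def retrieve(query, corpus):
    """Stand-in retriever: returns every document.
    A real system would rank by semantic similarity to the query."""
    return corpus

def synthesise(docs):
    """Stand-in generator: stitches the retrieved claims into one
    fluent, definitive-sounding answer. Note what it drops: the
    'speculative' flag never makes it into the output. This is the
    lossy step that papers over gaps in the knowledge system."""
    stitched = " ".join(d["claim"] for d in docs)
    return f"Answer: {stitched}"

answer = synthesise(retrieve("What happened to the unresolved thread?", CORPUS))
print(answer)
# Every source was flagged speculative, but no trace of that
# uncertainty survives in the singular answer the user sees.
```

The interesting failure is not that any individual claim is hallucinated; each one genuinely exists in the (toy) corpus. It is that the synthesis step combines mutually incompatible speculations into a single confident statement, with the provenance metadata that would signal uncertainty stripped out along the way.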
