Raiding the inarticulate since 2010


The enshittification of Grindr and the coming wave of platform addiction

This is interesting from Platformer about how Grindr are responding to their increasingly strained post-pandemic financial situation:

Since its initial public offering in 2022, Grindr has been on a rocky road financially. Its stock has fallen 70 percent since its SPAC. After hitting an IPO-high of $71.51, it currently sits at $10.13. Last summer, employees announced plans to unionize, amid industry layoffs and worries that the company was losing its progressive culture. Two weeks later, CEO George Arison abruptly ordered his mostly remote workforce of 180 people back to the office. About half the company left and Grindr paid out more than $9 million in severance.

Now, Grindr plans to boost revenue by monetizing the app more aggressively, putting previously free features behind a paywall, and rolling out new in-app purchases, employees say. The company is currently working on an AI chatbot that can engage in sexually explicit conversations with users, Platformer has learned. According to employees with knowledge of the project, the bot may train in part on private chats with other human users, pending their consent.

https://www.platformer.news/grindr-ai-boyfriend-wingman-monetization-paid-taps/?ref=platformer-newsletter

It’s not quite enshittification in Doctorow’s original sense (i.e. referring to the prioritisation of business customers over end users in multi-sided marketplaces, before the platform starts fucking over both groups) but I think his original framing is giving way to a (better, in my view) idiomatic use referring to platforms degrading as the economic conditions that enabled ‘loss leader’ offers vanish. I didn’t know about this lawsuit against Match Group’s desperate pursuit of engagement features:

Then last month, Match Group was sued by a group of users who argued in a complaint that “Match intentionally designs the platforms with addictive, game-like design features, which lock users into a perpetual pay-to-play loop that prioritizes corporate profits over its marketing promises and customers’ relationship goals.” A longstanding complaint about dating apps — that they are incentivized to keep users from meeting a match for as long as possible, so as to maximize their revenue — had now become a legal case.

https://www.platformer.news/grindr-ai-boyfriend-wingman-monetization-paid-taps/?ref=platformer-newsletter

I spent an afternoon earlier this year playing with Replika in preparation for the next book. Beyond the sheer creepiness of selling underwear for your AI, what stood out to me most was the gamification built into the app. It was eerily reminiscent of Candy Crush in how hard it pushed experience points, which can obviously be topped up with payment. If we start from the assumption* that there are attachment injuries underpinning a willingness to form close bonds with an AI avatar (i.e. it’s a wilful misrecognition revealing of interpersonal wounds) then the persuasive design starts to look deeply problematic.

Are we going to see this approach pushed into ever wider areas of social life, as platforms desperately try to pump user engagement, leading to a split between users who flee and those who get hooked? I think addiction is a term we need to use extremely cautiously with what is, essentially, software. But there’s also clearly the capacity to get stuck in a closed circuit of drive in Lacan’s sense, as well as people presenting clinically with an experience of being completely out of control in their use of platforms.

Perhaps the problem Grindr faces is the impatience of its user base. It’s not a ‘virtual’ app, in the sense that it’s orientated towards meeting people in the real world, in the near future. It’s locative media as a lifestyle accompaniment rather than a means to access some future life which is tantalisingly out of reach, after which the app can be dispensed with. This suggests to me the psychodynamics of AI on Grindr are quite different from those which define something like Hinge, for example. It’s not quite as simple as saying that the former is about drive and the latter is about desire, because these elements can never ultimately be separated. But the objet a will tend to figure differently in the search for a long-term partner than it will in the search for a one-night stand (though there are, of course, exceptions). It’s difficult to imagine that something by its nature fleeting will complete you, which has important implications for the psychodynamics. The capacity for a user base to be manipulated differs in a way corresponding to what they are searching for in the first place.

I was astonished recently to discover that Tinder has an API which, if this is true of dating apps more broadly, suggests an open field of heterogeneous contributions to the ecology of these platforms. Consider the Russian developer who claims to have used a GPT-powered bot to match with thousands of women. What interests me is less the truth of the claim and more the fact such claims are in circulation. Given the ennui which surrounds dating apps post-pandemic, what will a public awareness of surreptitious bots do to trust and willingness to engage? How will the firms’ own introduction of bots into the space contribute to this collapse of trust? How do you know the person who ghosted you recently wasn’t a bot, for that matter? The introduction of GAI-actors** will come at exactly the point where enshittification is leading to a breakdown of trust in the platform, with results likely to be comical in parts and tragic in others.

The rationale guiding Grindr’s GAI experiments suggests how GAI will be taken up as part of the enshittification wave sweeping the platform economy:

In December, Grindr announced a partnership with Ex-human, a startup that makes customizable chatbots, including an AI girlfriend and an AI dating coach. Grindr’s initial plan was to build an AI wingman to help people navigate the gay dating scene. It might suggest a restaurant to take a date, or music to play after they come over, CEO George Arison told Bloomberg. “If we don’t do it, someone else will,” he said. “If you’re not first to market with something in AI, you’re going to miss out.”

https://www.platformer.news/grindr-ai-boyfriend-wingman-monetization-paid-taps/?ref=platformer-newsletter

“Number one, on the generative side, our users produce incredible amount[s] of content. We had 111 billion chats sent last year in the product. We have 5.5 million daily active users. So that’s 600 messages per person per day … We can help them write those messages to save time and we can understand who they are better through all those messages.”

The last line jumped out to me. To build an AI that can write messages for users, Grindr would likely want to train AI models on the billions of chats that they are sending each month. And in fact, that is what the company hopes to do. Grindr is currently revising its terms of service to ask people explicitly if the company can train its AI models on their personal data, which could include direct messages, Platformer has learned.

This information will likely be used to train another paid product: an AI boyfriend based on Ex-human technology that can sext, flirt, and maintain an ongoing relationship with paid users. The AI boyfriend is in the early stages of development, according to employees with direct knowledge of the project.

*Though I’m not convinced I still hold that assumption, which is partly where the next book starts.

**I would suggest this is distinct from GAI agents who are empowered to do things on your behalf, combining a capacity for problem-solving with the authentication needed to access external services.