This is a point which seemed so obvious to me I’m surprised to realise it needs to be spelled out. Rather than ‘AI slop’ being some exogenous factor now swamping previously functional social media platforms, we need to see it as an outcome of existing practices of engagement farming. The political economy of social platforms has over many years inculcated a strategic orientation towards engagement, because of the direct monetary and indirect status rewards which come from maximising it. What this means in practice is using whatever techniques are available to maximise engagement with your content while minimising the cost of producing it. In essence it treats other people’s attention as a resource to be farmed, with the ‘farming’ being a matter of strategic action which makes it more likely their attention will be translated into engagement with specific content.
In practice this is almost painfully mundane. It’s a matter of tweaking content and its framings in ways which are likely to increase engagement. When people say that the algorithm creates certain effects on platforms (e.g. increasing the amount of emotive content), this is the missing step through which platform architectures bring about human action. It’s because strategic actors recognise that the algorithm rewards certain things (or at least imagine it does; there’s plenty of folk theory here) that they create content intended to exploit that characteristic. There’s also the direct preparation of content to appeal to individual actors without relying on the mediation of the algorithm. Indeed the most effective engagement farming involves speaking to both ‘audiences’ at the same time: producing content which directly grabs people and feels ‘authentic’ while also being optimised for algorithmic distribution.
The flood of AI slop we now see on platforms reflects a shift in engagement farming practices. It’s now possible to do engagement farming effectively at scale because LLMs make content creation so easy. There’s also a disturbing lack of AI literacy, creating attentional markets ripe for exploitation by AI content which is startlingly obvious if you have any sense of what you’re looking for. The problem is the political economy of the social platform rather than the AI content per se, even if in practice the two things run together. This matters because we can’t have a meaningful conversation about the problem of ‘AI slop’ without talking about how fundamentally broken social media platforms are.
