I’ve been dwelling on this passage from Trump University’s sales manual, republished at loc 980 of this insider account of the ill-fated ‘university’, which, it should be added, had a MOOC system (in its first phase) and a recruitment strategy (in its second phase) that were extreme manifestations of what can be found in US higher education rather than definitive breaks with it. The manual briefs sales staff on how to hook prospects who’ve attended free ‘taster’ sessions in order to get them to pay for expensive workshops:

Experience Is On Our Side: •   Because we decide what happens in the training, an attendee must react to what we say. They don’t have a choice. For example, we can spend hours and hours planning a question that they must deal with and give an answer to within seconds. We also have the advantage of testing the question out on hundreds of people and adjusting it to increase our chances for a desirable response. The attendee does not have the luxury of “practicing” his or her answer. However, we are losing this advantage if we don’t take time to develop what we say and consciously practice what we say.

What is the nature of the power being exercised here? It could be read as the second face of power in Steven Lukes’ typology, exercising influence by setting the agenda. But this fails to capture the temporal character of its exercise. What puts people on the back foot is that they have no time to prepare against the manipulations of people who have all the time they need to prepare. It is a temporal advantage being leveraged to ensure the interests of A are served by the actions of B. It is chronopower.

But it is chronopower in analogue mode. How different is it from the manipulative infrastructures we find on social media? We constantly fall into interactions with actors who’ve had plenty of time to design their interventions, taking advantage of a vast informational asymmetry in which they have a great deal of data about us and we have none about them. But even if this weren’t the case, we wouldn’t have time to prepare for the interaction because we don’t know it’s coming. This, I’d suggest, is digital chronopower.

Saving this interesting CfP to look up later.

 

Contributors are invited to submit abstracts (about 200 words) toward our new edited collection entitled *Social Media and the Production and Spread of Spurious Deceptive Contents*, to be published by IGI Global (Hershey, PA), under the series *Advances in Digital Crime, Forensics, and Cyber Terrorism* (ADCFCT).

Topics being covered include:

· *History, literature, perspectives and the prevalence of online deception*

· *Methods, techniques and approaches to researching digital deception*

· *Fake news, disinformation/misinformation and misleading reports on Facebook and Twitter (case studies are encouraged here)*

· *Defamation and character assassination (case studies are encouraged here)*

· *Phishing*

· *Business falsehood, employment scam and commercial lies*

· *Investment/financial scam; Ponzi/Pyramid schemes*

· *Deceptive online dating, romance scam and fake marriage*

· *Religious deception and political lies (case studies are encouraged here)*

· *Deceptive contents by extremist and terrorist groups (other online platforms are inclusive here)*

· *Deception detection and behavioral control methods.*

· *Etc.*

All proposals are to be submitted through the *eEditorial Discovery®* online submission manager. Please click on this link to submit an abstract and for the full description of the CFP: https://www.igi-global.com/publish/call-for-papers/call-details/3356.

*Important Dates*

*June 30, 2018:* Proposal Submission Deadline
*October 31, 2018:* Full Chapter Submission

*Inquiries can be forwarded to:*

Sergei Samoilenko

George Mason University, Virginia, USA

ssamoyle@masonlive.gmu.edu

In the last few years, I’ve become interested in what I think of as shadow mobilisation: assembling people under false pretences and/or in a way intended to create a misleading impression of the mobilisation. This is often framed in terms of astroturfing – fake grass roots – but it appears to me to extend beyond this. It would be a mistake to see it as a new thing, but it may be that our present conditions are making it easier and more likely.

It implies a relationship between the instigators and those mobilised, whether through manipulation or reimbursement, which is fundamentally asymmetrical. One group has the capacity to plan, enact and reflect on these mobilisations, while the other is a mere aggregate, induced to action on an individual-by-individual basis, furthering an agenda which might cohere with their own individual concerns but has no basis in collective concerns. In this sense, shadow mobilisations are a facsimile of collectivity.

If we accept the adequacy of this concept, it raises many questions. Foremost amongst them is how widespread such shadow mobilisations are, as well as the conditions which facilitate them. I’ve come across examples in many sectors and wish I’d been recording them systematically. The most recent comes in Anna Minton’s Big Capital, an illuminating study of how global capital is transforming London. From loc 1281-1297:

In a House of Commons debate in 2013, Labour MP Thomas Docherty, a former lobbyist, shared with Parliament some of the techniques of his former colleagues, recounting stories of lobbyists being planted in public meetings to heckle people who opposed their clients’ schemes. His stories chime with a wealth of anecdotal evidence of dirty tricks, including fake letter-writing campaigns and even actors attending planning meetings. Martyn, a film maker from Brighton, described to me how he had been offered ‘cash in brown envelopes’ to attend a planning meeting and pose as a supporter of Frank Gehry’s controversial plans for an iconic new development of 750 luxury apartments on the seafront. He remembers how ‘at least five of us’ from the drama school where he was studying were approached by an events company and asked if they’d like to participate. ‘We were told to go there and shout down the local opposition to the development. A couple of people were pointed out to us – residents, leaders of the local opposition – and we were told to be louder than them and be positive about the development. We were paid on exit, cash in hand, I think it was £50 or £100. I was there and I’m not proud of it. It is something that horrifies me,’ he said. In Parliament, Docherty described dirty tricks as ‘utterly unacceptable’, although ‘not a crime’.

While each particular case of this manipulation of the planning process occurs on a small scale, it reflects an asymmetry we can see in other cases of shadow mobilisation. Residents who coordinate their action, potentially constituting an organised collective in the process, confront organisations which deploy their resources towards drowning out this nascent collectivity through a shadow mobilisation. As Minton points out, such activity sometimes occurs alongside organised harassment, suggesting the ethical climate in which shadow mobilisation is seen as a viable strategy by those pursuing private profit.

In The Making of Donald Trump, David Johnston identifies the tactics used by Trump to deflect inquiries into his many shady dealings and questionable decisions. Sometimes this is a matter of outright threats, with an enthusiasm for litigation (1,900 suits as plaintiff) coupled with an explicitly articulated philosophy of vengeance proving a dangerous combination for any who dare to cross him. But somewhat contrary to his public image as a blundering fool, he is often much more subtle than this, engaging in strategies of deflection and misdirection with all the deftness of the most accomplished public relations manager. In other cases, it just becomes weird, with Trump willing to publicly deny that a recording he had previously admitted to be of his own voice was anything other than a hoax.

This combination of viciousness, skilfulness and brazenness has left him insulated from meaningful scrutiny. But what has he averted in this way? What might have happened but hasn’t? On page 154 Johnston offers a description which has caught my imagination:

Together, these strategies – muddying the facts and deflecting inquiries into past conduct – help ensure that Trump’s carefully crafted public persona will not be unmade. He will not suffer the curtain to be pulled back to reveal a man who tricked society into thinking he was all wise and all powerful.

This public persona, crafted sometimes deliberately and at other times impulsively, remains intact. I’m interested in what such a ‘pulling back of the curtain’ requires to be effective: the sustained attention of an audience, a sufficient familiarity with the person(a) in question, a prolonged campaign to sort fact from fiction and a lack of contestation concerning this process of sorting.

What is being framed, somewhat unhelpfully, as a ‘post-truth era’ is the set of conditions under which this ceases to be possible. There are many ways in which we could try to explain these conditions, not all of which are mutually exclusive. The collapse of authority in late modernity. The acceleration of communication. The weakening of journalism and the dominance of public relations. Theories of social change should be able to account for the specifics of such cases, rather than simply allowing them to be rendered thematically.

In InfoGlut, Mark Andrejevic takes issue with the assumption that fostering ‘disbelief’ or ‘challenge’ is necessarily subversive. As he puts it, “strategies of debunkery and information proliferation can work to reinforce, rather than threaten, relations of power and control” (loc 293). Recognising this in the abstract is important, but I intend to read more about the specific cases in which these tactics are used regressively, as I’m increasingly fascinated by the extent to which they are informed (or not) by epistemological and ontological understandings (even if these words are not used).

Under these conditions, what Andrejevic describes as the ‘big data divide’ seems more prescient by the day. From loc 464:

The dystopian version of information glut anticipates a world in which control over the tremendous amount of information generated by interactive devices is concentrated in the hands of the few who use it to sort, manage, and manipulate. Those without access to the database are left with the “poor person’s” strategies for cutting through the clutter: gut instinct, affective response, and “thin-slicing” (making a snap decision based on a tiny fraction of the evidence). The asymmetric strategies for using data highlight an all-too-often overlooked truth of the digital era: infrastructure matters. Behind the airy rhetoric of “the cloud,” the factories of the big data era are sprouting up across the landscape: huge server farms that consume as much energy as a small city. Here is where data is put to work – generating correlations and patterns, shaping decisions and sorting people into categories for marketers, employers, intelligence agencies, healthcare providers, financial institutions, the police, and so on. Herein resides an important dimension of the knowledge asymmetry of the big data era – the divide between those who generate the data and those who put it to use by turning it back upon the population. This divide is, at least in part, an infrastructural one shaped by ownership and control of the material resources for data storage and mining. But it is also an epistemological one – a difference in the forms of practical knowledge available to those with access to the database, in the way they think about and use information.

 

There’s an interesting summary in Mediated Memories, by Jose van Dijck, pg 100-101, detailing research into the power of doctored media to shape false narratives:

In the early 1990s, researchers from America and New Zealand persuaded experimental subjects into believing false narratives about their childhoods, written or told by family members and substantiated by “true” photographs. Over the next decade, these findings were corroborated by experiments in which doctored pictures were used; more than 50 percent of all subjects constructed false memories out of old personal photographs that were carefully retouched to depict a scene that had never happened in that person’s life.

I took part in an experiment a few years ago, under false pretences, in which I was ultimately presented with a fake video of myself stealing money. I told them that “I can’t explain the video but I didn’t steal your money”. This makes me lean towards a narrative view of what’s going on here: I didn’t feel able to deny the reality of the video but, sans narrative, I couldn’t accept that it portrayed what it purported to.