A recurrent theme in stories about Facebook is the privilege which Mark Zuckerberg accords himself but which his radical transparency denies to others. My favourite example had been the opaque meeting room hidden away at the back of his glass-fronted office, allowing him to retreat into privacy while everyone around him stands exposed. But this example from Roger McNamee’s Zucked (loc 2955) is even better:

One particularly awkward story that week revealed that Facebook had been deleting Zuck’s Messenger messages from the inboxes of recipients, a feature not available to users. Facebook initially claimed it made the change for security purposes, but that was patently unbelievable, so the next day it announced plans to extend the “unsend” feature to users.

My notes on Caplan, R., & Boyd, D. (2018). Isomorphism through algorithms: Institutional dependencies in the case of Facebook. Big Data & Society, 5(1), 2053951718757253.

Are data-driven technologies leading organisations to take on shared characteristics? This is the fascinating question addressed in this paper by Robyn Caplan and danah boyd, which they begin with the example of news media. The popularity of social media platforms as intermediaries has forced many news media producers to change their operations, increasingly producing with a view to popularity on these platforms. As they put it, “these platforms have upended the organizational practices of news-producing platforms, altering how both the newsroom and individual journalists operate” (2). They use the concept of isomorphism to understand how “algorithms structure disparate businesses and aims into an organizational field, leading them to change their goals and adopt new practices” (2). This is a process of homogenisation, as organisations reconstruct themselves into a field orientated around the assumptions embedded in the mediating platform. The ensuing ambiguity has regulatory consequences, as social media platforms are not straightforward media actors but nor are they mere intermediaries. By theorising algorithmic mediation as akin to bureaucratisation, it becomes easier to identify the precise character of the role platforms play within it. It also makes clear the continuities with earlier isomorphic processes, for instance as corporate software platforms introduced common features to organisations.

The roots of this connection are deep. They argue that “algorithms that serve to pre-process, categorize, and classify individuals and organizations should be viewed as extensions of bureaucratic tools such as forms, that have been associated with the state in the past” (3). Software like Lotus 1-2-3 and Microsoft Office restructured business activity through the affordances it offered to digitalise bureaucratic processes, and algorithmic technologies should be seen as a further extension of this process. The neutrality which animated the promise of bureaucracy is also often expressed in the belief that algorithmic judgement will negate the role of subjectivity and bias in decision-making processes. This is obscured by the familiar black box of the algorithm but also by the mythology of its uniqueness, seeing it as something distinct from previous organisational processes. However, if we see algorithms as organisational phenomena then the problem comes to look quite different: simultaneously more straightforward but also more challenging, because the problems will likely spiral outwards across dependent organisations.

They use DiMaggio and Powell’s concept of isomorphism, which considers how a common environment can lead otherwise different units of a population facing that environment to come to resemble one another. For organisations this occurs through one organisation becoming dependent on another, with the expected degree of resemblance tracking the degree of that dependence. For instance, in the case of Facebook’s newsfeed, the concept of what is ‘relevant’ has been redefined by the vast size of the audience whose access is mediated through this mechanism. The dependence of the news media on that mechanism means they come to reproduce its characteristics, increasingly operating with a view towards metrics like clicks, likes and shares. The early winners in the Facebook ecosystem were publishers like Buzzfeed and Upworthy who “subsumed their own organizational practices to the logic of Facebook’s algorithms” (5). But Facebook’s attempts to modulate this mechanism in order to produce what they deemed better quality results inevitably lead the actors dependent upon it to make adaptive changes in response to these modulations. Mimesis thrives in this environment, as they explain on pg 6-7:

“Changes stemming from coercive forces, especially when frequent, lead to an environment of uncertainty that prompts dependent organizations to learn from other dependent organizations that have successfully conformed to the structuring mechanisms. This process of “mimesis,” or imitating models for success, is another process DiMaggio and Powell (1983: 151) argue will induce similarity across an organizational field. In this sense, the dominant organization’s incentives or goals become embedded across an industry through the borrowing of practices that lead to success over the network. In the case of Facebook, this was seen in the adoption of data-driven metrics and analytics into newsrooms, as well as the growth of a new set of intermediaries that were fed directly by the Facebook API, whose role it was to analyze and communicate Facebook metrics back to publishers”

A further ecosystem of intermediaries thrives under these circumstances, as new players emerge who help the firms concerned address their common problems. These responses to uncertainty are driven by a concern to “demonstrate to others that they are working to change their practices to be in-line with those of the dominant organization“ (7), as well as by the increased possibilities for success. The discussion of professionalisation is really important for my interests. The roles themselves changed as a result of isomorphism, with normative pressure to enact new functions and perform new skills which contribute to the success of the organisation. This is my concern about the institutionalisation of social media within higher education. There’s a lot here which I’m going to need to go back to and I think it’s crucial for my developing project on the digital university.
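
To make the mechanism concrete for myself, here is a minimal sketch in Python of the dynamic Caplan and boyd describe. It is purely illustrative: the metric names, weights and posts are all hypothetical rather than Facebook’s actual ranking, but it shows how a platform’s definition of ‘relevance’ becomes the optimisation target of every organisation dependent on it, and why a change to the weights forces them all to adapt at once.

```python
# Illustrative only: a toy model of the dynamic Caplan and boyd describe,
# not Facebook's actual ranking. All metric names, weights and posts are hypothetical.

def relevance_score(post, weights):
    """Score a post by engagement metrics, as a platform's feed ranking might."""
    return sum(weight * post.get(metric, 0) for metric, weight in weights.items())

# The platform defines 'relevance' through the weights it chooses...
platform_weights = {"clicks": 1.0, "likes": 2.0, "shares": 5.0}

# ...and publishers whose reach is mediated by that score select and shape
# content to maximise it, which is where mimesis and homogenisation set in.
candidate_posts = [
    {"title": "In-depth investigation", "clicks": 400, "likes": 20, "shares": 3},
    {"title": "Listicle built for sharing", "clicks": 150, "likes": 60, "shares": 80},
]

best = max(candidate_posts, key=lambda p: relevance_score(p, platform_weights))
print(best["title"])  # the share-heavy listicle wins, so everyone makes listicles

# When the platform 'modulates' the mechanism (down-weighting shares, say),
# every dependent organisation has to adapt again, producing the uncertainty
# that encourages imitation of whoever appears to have adapted successfully.
platform_weights["shares"] = 0.5
best = max(candidate_posts, key=lambda p: relevance_score(p, platform_weights))
print(best["title"])  # now the in-depth piece wins and practices shift once more
```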

There’s an interesting extract in Roger McNamee’s Zucked about how strategically Facebook have reduced the significance of organic reach (i.e. unpaid distribution of content) on the platform. The promise of being able to communicate directly to a vast audience through Facebook pages has been central to the motivation of individuals, networks and organisations who have directed their resources towards engaging on the site. But there has been a steady decline in the percentage of followers who will see a post for free. As he describes on loc 1147-1161:

Every year or so, Facebook would adjust the algorithm to reduce organic reach. The company made its money from advertising, and having convinced millions of organizations to set up shop on the platform, Facebook held all the cards. The biggest beneficiaries of organic reach had no choice but to buy ads to maintain their overall reach. They had invested too much time and had established too much brand equity on Facebook to abandon the platform. Organic reach declined in fits and starts until it finally bottomed at about 1 percent or less. Fortunately, Facebook would periodically introduce a new product—the Facebook Live video service, for example—and give those new products greater organic reach to persuade people like us to use them.

There’s a difficult balance to strike here between nudging people into using paid advertising and obliterating the economic rationale for using Facebook in the first place. Once there is a long-term trajectory of engagement, the effectiveness of the platform will still beat the alternatives even with declining organic reach. For many actors organic reach is what brings them to the platform in the first place, in spite of the fact the business model relies on soliciting advertising income. But will the long, slow decline of organic reach ultimately end with its demise? Almost certainly not, but this tension sits at the heart of the business model and the dynamic it gives rise to is extremely subtle.
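
To get a feel for the numbers, here is a rough back-of-the-envelope sketch of the squeeze McNamee describes. Only the roughly 1 percent organic-reach floor comes from his account: the follower count, the target reach, the decline schedule and the advertising cost are all invented for illustration.

```python
# A rough illustration of the squeeze described above, not real Facebook figures.
# Only the ~1 percent organic-reach floor comes from McNamee; everything else
# (followers, target reach, decline schedule, ad cost) is a hypothetical assumption.

followers = 1_000_000
target_reach = 100_000        # impressions the page owner wants to keep hitting
hypothetical_cpm = 5.00       # assumed cost per thousand paid impressions, in dollars

for organic_rate in (0.16, 0.08, 0.04, 0.02, 0.01):
    organic_reach = int(followers * organic_rate)
    shortfall = max(0, target_reach - organic_reach)
    ad_spend = shortfall / 1000 * hypothetical_cpm
    print(f"organic {organic_rate:.0%}: free reach {organic_reach:>7,}, "
          f"spend to hold target ${ad_spend:,.0f}")

# As organic reach falls, buying back the shortfall looks cheaper than walking
# away from the audience already built on the platform, which is exactly the
# tension between nudging pages towards ads and destroying their reason to stay.
```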

I’m enjoying Zucked by Roger McNamee more than I expected to. What’s useful about his account is the stress it places on the engineering capacity of startups. What makes cloud computing significant is not just enabling growth without major capital investment in infrastructure, but also “eliminating what had previously been a very costly and time-consuming process of upgrading individual PCs and servers“ (loc 674). What makes open source software significant is not just the reduced cost of free off-the-shelf components, but the time saved when everything can be built on open source stacks. I’d tended to see these things as a matter of reduced costs and increased flexibility, without grasping their significance for how limited resources could be deployed within the firm.

Both mean that the engineering capacity of a startup can be directed mainly at “the valuable functionality of their app, rather than building infrastructure from the ground up“ (loc 660). This led to the lean startup model and the capacity to go live then iterate in response to what happens. Without this modus operandi and the infrastructure supporting it, would the harvesting and use of what Zuboff calls behavioural surplus even have been able to happen? A little earlier he frames this in terms of an epochal shift in the engineering constraints which tech startups had been subject to:

For the fifty years before 2000, Silicon Valley operated in a world of tight engineering constraints. Engineers never had enough processing power, memory, storage, or bandwidth to do what customers wanted, so they had to make trade-offs. Engineering and software programming in that era rewarded skill and experience. The best engineers and programmers were artists. Just as Facebook came along, however, processing power, memory, storage, and bandwidth went from being engineering limits to turbochargers of growth. The technology industry changed dramatically in less than a decade, but in ways few people recognized. What happened with Facebook and the other internet platforms could not have happened in prior generations of technology.

Under these conditions, it’s much easier to experiment and it’s easier to fail. As he points out, Facebook’s infamous motto Move Fast and Break Things is emblematic of the freedom which decreasing engineering constraints offer to a nascent firm. But what does this mean culturally? I think this is a crucial question to explore, both in terms of the firm itself and in terms of the expectations investors then have of what constitutes adequate growth for a lean startup.

To what extent does the obsessive drive for increased user engagement have its origins in the changing expectations of investors and the challenge of meeting them? McNamee observes that it makes investors more inclined to identify what they perceive as losers and kill the startup before it consumes too much money. The rewards on offer for successes are so immense that many failures can be tolerated if they lead to one success.

It also makes it much less onerous for the firm to scale, with important consequences for a nascent corporate culture. Inexperienced staff can be a plus under these circumstances, as long as they have the relevant technical skills. McNamee argues that the success of Facebook’s staffing policy had a big impact on human resources culture in Silicon Valley.

On the subject of the collapse of the tech mythology, a wonderful Slate headline succinctly conveys the significance of what is taking place: Facebook is a normal sleazy company now.  As Siva Vaidhyanathan puts it, “Facebook is now just another normal sleazy American company run by normal sleazy executives, engaged in normal sleazy lobbying and corporate propaganda”. He lists the controversies which have surrounded Facebook in the last few years and the founder’s response to them:

Over the past three years, Facebook has been outed for abusing the trust of its users, sharing personal data with third parties like Cambridge Analytica, unwittingly hosting Russian-backed propaganda intended to undermine American democracy, amplifying calls for religious and ethnic violence in places like Sri Lanka and Myanmar, and promoting violent authoritarian and nationalist leaders like Rodrigo Duterte in the Philippines and Narendra Modi in India. As these stories piled up and public trust eroded, the Times reports, Zuckerberg consistently exempted himself from crucial discussions with the Facebook security team and acted generally baffled that anyone would question his baby. After all, didn’t he just want, in his words, to “bring the world closer together?”

In contrast, Sandberg initiated a lobbying operation with a particularly unseemly propaganda exercise attached to it, obviously at odds with the lofty rhetoric accompanying Facebook’s public pronouncements in the face of mounting scandal. Vaidhyanathan’s case is that the transition to sleaze is a recent phenomenon, reflecting the growing panic of a company which had formerly “made too much money to care about money and had too strong a reputation to care about its reputation”. Nonetheless, the mounting controversies are created by the platform working in the way it was designed to. As Vaidhyanathan says, “The problem with Facebook is Facebook.”

However my suggestion is that we have to recognise the collapse of the tech mythology as a distinct factor, beyond the current crisis in Facebook. There is an increasing politicisation of Big Tech, as firms which positioned themselves as outside the normal rules of capitalism come to be recognised as driving a shift in capitalism itself. Their epochal rhetoric of disruptive innovation, bringing the world together through the power of their platforms, decreasingly obscures the material interests they embody. Without this broader collapse of the tech mythology, it would be easier for Facebook to make it through their present storm.

An important idea offered by Mike Caulfield. The embrace of frictionless sharing and the relentless pursuit of engagement have created the problems which are now being naturalised by the emerging ‘did Facebook lead to Trump’ discourse:

We have prayed at the altar of virality a long time, and I’m not sure it’s working out for us as a society. If reliance on virality is creating the incentives to create a culture of disinformation, then consider dialing down virality.

We know how to do this. Slow people down. Incentivize them to read. Increase friction, instead of relentlessly removing it.

Facebook is a viral sharing platform, and has spent hundreds of millions getting you to share virally. And here we are.

What if Facebook saw itself as a deep reading platform? What if it spent hundreds of millions of dollars getting you to read articles carefully rather than sharing them thoughtlessly?

What if Facebook saw itself as a deep research platform? What if it spent its hundreds of millions of dollars of R & D building tools to help you research what you read?

https://hapgood.us/2016/11/15/maybe-rethink-the-cult-of-virality/

A fascinating article on the LSE’s Media Policy Blog about the global ambitions of contemporary technology giants and the corporate structures which facilitate them:

The folks who run these companies understand this. For if there is one thing that characterises the leaders of Google and Facebook it is their determination to take the long, strategic view. This is partly a matter of temperament, but it is powerfully boosted by the way their companies are structured: the founders hold the ‘golden shares’ which ensures their continued control, regardless of the opinions of Wall Street analysts or ordinary shareholders. So if you own Google or Facebook stock and you don’t like what Larry Page or Mark Zuckerberg are up to, then your only option is to dispose of your shares.

Being strategic thinkers, these corporate bosses are positioning their organisations to make the leap from the relatively small ICT industry into the much bigger worlds of healthcare, energy and transportation. That’s why Google, for example, has significant investments in each of these sectors. Underpinning these commitments is an understanding that their unique mastery of cloud computing, big data analytics, sensor technology, machine learning and artificial intelligence will enable them to disrupt established industries and ways of working in these sectors and thereby greatly widen their industrial bases. So in that sense mastery of the ‘digital’ is just a means to much bigger ends. This is where the puck is headed.

http://blogs.lse.ac.uk/mediapolicyproject/2016/07/12/digital-dominance-forget-the-digital-bit/

A slogan more frequently encountered on pro-police demos has been repeatedly daubed inside the Facebook headquarters, creating embarrassment for a corporation whose staff are overwhelmingly white and male:

Facebook CEO Mark Zuckerberg has reprimanded employees following several incidents in which the slogan “black lives matter” was crossed out and replaced with “all lives matter” on the walls of the company’s Menlo Park headquarters.

“‘Black lives matter’ doesn’t mean other lives don’t – it’s simply asking that the black community also achieves the justice they deserve,” Zuckerberg wrote in an internal Facebook post obtained by Gizmodo.

http://www.theguardian.com/technology/2016/feb/25/mark-zuckerberg-facebook-defacing-black-lives-matter-signs

Will such attitudes inevitably thrive under the conditions of meritocratic elitism which characterise much of the technology world?

This interesting article (HT Nick Couldry) explores the challenge faced by Facebook in imposing standards on a user base distributed around the globe:

As Facebook has tentacled out from Palo Alto, Calif., gaining control of an ever-larger slice of the global commons, the network has found itself in a tenuous and culturally awkward position: how to determine a single standard of what is and is not acceptable — and apply it uniformly, from Maui to Morocco.

For Facebook and other platforms like it, incidents such as the bullfighting kerfuffle betray a larger, existential difficulty: How can you possibly impose a single moral framework on a vast and varying patchwork of global communities?

If you ask Facebook this question, the social-media behemoth will deny doing any such thing. Facebook says its community standards are inert, universal, agnostic to place and time. The site doesn’t advance any worldview, it claims, besides the non-controversial opinion that people should “connect” online.

https://www.washingtonpost.com/news/the-intersect/wp/2016/01/28/the-big-myth-facebook-needs-everyone-to-believe/

Facebook’s ‘global community standards’ are the mechanism through which the digital activity of over one and a half billion users is policed. But these regulations have an uncertain grounding in the normative judgements of the user base: the aggregate of users is far too heterogeneous (to say the least) to furnish any layer of moral intuition which could reliably buttress the legitimacy of the global community standards. This problem is amplified by two factors:

Facebook has modified its standards several times in response to pressure from advocacy groups — although the site has deliberately obscured those edits, and the process by which Facebook determines its guidelines remains stubbornly obtuse. On top of that, at least some of the low-level contract workers who enforce Facebook’s rules are embedded in the region — or at least the time zone — whose content they moderate. The social network staffs its moderation team in 24 languages, 24 hours a day.

https://www.washingtonpost.com/news/the-intersect/wp/2016/01/28/the-big-myth-facebook-needs-everyone-to-believe/

Having moderators embedded in a region might help on occasion. But this assumes the normativity of the region is any less fragmented than that of the user base as a whole and, as the Centre for Social Ontology’s recent book explores, we cannot assume this to be true. What’s more likely is that this vast army of poorly paid moderators will exercise little to no autonomy over their tasks, with the Facebook standards nonetheless being inflected through their variable judgement: they won’t try to deviate from the global standards but they inevitably will, in unpredictable ways, as any individual evaluator necessarily does when applying a rule to particular cases.

So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.

This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards

https://www.washingtonpost.com/news/the-intersect/wp/2016/01/28/the-big-myth-facebook-needs-everyone-to-believe/

Is there any accountability here? It’s certainly possible to influence the global community standards but, as the article notes, this influence is profoundly opaque. Meanwhile, there are good reasons to think that challenge and adjudication simply couldn’t work at this scale. How would it operate? Given that content moderators seem to comprise as much as half the workforce of social media sites, it’s worth thinking about how labour intensive a potential appeals process would be. Why go to that trouble when you can err on the side of simply taking something down on the grounds that someone thinks it’s offensive? Without finding some way to solve the normativity problem described earlier, how to underwrite legitimacy within an aggregate characterised by low social integration, there’s also no obvious ethical counterbalance to this organisational tendency.

An absolutely fascinating account of developments in the newsfeed algorithm at Facebook since its introduction:

Adam Mosseri, Facebook’s 32-year-old director of product for news feed, is Alison’s less technical counterpart—a “fuzzie” rather than a “techie,” in Silicon Valley parlance. He traffics in problems and generalities, where Alison deals in solutions and specifics. He’s the news feed’s resident philosopher.

The push to humanize the news feed’s inputs and outputs began under Mosseri’s predecessor, Will Cathcart. (I wrote about several of those innovations here.) Cathcart started by gathering more subtle forms of behavioral data: not just whether someone clicked, but how long he spent reading a story once he clicked on it; not just whether he liked it, but whether he liked it before or after reading. For instance: Liking a post before you’ve read it, Facebook learned, corresponds much more weakly to your actual sentiment than liking it afterward.

After taking the reins in late 2013, Mosseri’s big initiative was to set up what Facebook calls its “feed quality panel.” It began in summer 2014 as a group of several hundred people in Knoxville whom the company paid to come in to an office every day and provide continual, detailed feedback on what they saw in their news feeds. (Their location was, Facebook says, a “historical accident” that grew out of a pilot project in which the company partnered with an unnamed third-party subcontractor.) Mosseri and his team didn’t just study their behavior. They also asked them questions to try to get at why they liked or didn’t like a given post, how much they liked it, and what they would have preferred to see instead. “They actually write a little paragraph about every story in their news feed,” notes Greg Marra, product manager for the news feed ranking team. (This is the group that’s becoming Facebook’s equivalent of Nielsen families.)

“The question was, ‘What might we be missing?’ ” Mosseri says. “‘Do we have any blind spots?’” For instance, he adds, “We know there are some things you see in your feed that you loved and you were excited about, but you didn’t actually interact with.” Without a way to measure that, the algorithm would devalue such posts in favor of others that lend themselves more naturally to likes and clicks. But what signal could Facebook use to capture that information?

http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.single.html
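
As a way of thinking through what the Slate piece describes, here is a minimal sketch of behavioural signals being folded into a single ranking score. It is not Facebook’s actual model: the signal names and weights are hypothetical, and the only substantive point it encodes is the reported finding that a like given after reading a story says more about real sentiment than a like given before.

```python
# A minimal sketch of the kind of signal-weighting the piece describes, not
# Facebook's actual model. The signal names and weights are hypothetical; the one
# substantive point encoded is the reported finding that a like given after
# reading says more about real sentiment than a like given before reading.

def predicted_interest(signals, weights):
    """Combine behavioural signals for one story into a single ranking score."""
    return sum(weights[name] * value for name, value in signals.items())

weights = {
    "clicked": 0.5,
    "seconds_read": 0.01,          # dwell time after the click
    "liked_before_reading": 0.2,   # weak evidence of real interest
    "liked_after_reading": 1.0,    # much stronger evidence
    "panel_rating": 2.0,           # explicit feedback, e.g. a feed quality panel score
}

story = {
    "clicked": 1,
    "seconds_read": 95,
    "liked_before_reading": 0,
    "liked_after_reading": 1,
    "panel_rating": 4,
}

print(predicted_interest(story, weights))   # higher scores rank higher in the feed
```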

From The Boy Kings, by Katherine Losse, pg 201. Losse was asked to write blog posts about Mark Zuckerberg’s philosophy, something which he outlined to her in general terms:

“It means that the best thing to do now, if you want to change the world, is to start a company. It’s the best model for getting things done and bringing your vision to the world.” He said this with what sounded like an interesting dismissal of the other models of changing the world. I could imagine, like he may have, that countries were archaic, small, confined to one area or charter. On the other hand, companies—in the age of globalization—can be everywhere, total, unregulated by any particular government constitution or an electorate. Companies can go where no single country has gone before. “I think we are moving to a world in which we all become cells in a single organism, where we can communicate automatically and can all work together seamlessly,” he said, by way of explaining the end goal of Facebook’s “big theory.”

My sense is this view of ‘companies over countries’ is a relatively common one amongst the digital elite. It’s also key to understanding philanthrocapitalism: the political complexity of the world melts away in the face of a single-minded concern to discover the most efficient way of bringing your vision to the world.

But what are the political implications of this? As Losse goes on to write, “It sounded like he was arguing for a kind of nouveau totalitarianism, in which the world would become a technical, privately owned network run by young “technical” people who believe wholeheartedly in technology’s and their own inherent goodness, and in which every technical advancement is heralded as a step forward for humanity.” This is, roughly speaking, what I’m trying to explore with the techno-fascism idea: how will what is currently a vague musing on the part of digital elites develop as part of the broader set of social transformations underway? How will their own vested interests, against a background of growing social upheaval and threats to their accumulated wealth, shape the development of what is at present little more than a mystical faith in solutionism and the singularity?

The extent to which this idea is discursive can be overstated. As Losse explains, she told Zuckerberg that she struggled to articulate his philosophy for him in a coherent way:

The question was what did any of these values actually mean, and why should we want them? This was something only Mark could explain. I told him that I was having trouble coming up with satisfactory essays on the topics he’d assigned, and asked him to schedule time to explain his ideas in more detail, but he was too busy or wasn’t inclined to explain further—it was hard to tell. I came to the conclusion that perhaps he thought I could invent these arguments of whole cloth, or that we already were cells in a single organism and I should be attuned enough to intuit what he meant, but I couldn’t, and so the essays were never written or posted

From The Boy Kings, by Katherine Losse, pg 200:

Most employees I talked with seemed not to be particularly bothered by the company’s decision to forcibly adjust people’s expectations of privacy, preferring instead to focus on the light and almost childlike-sounding goals of sharing and connecting people. “She just doesn’t get it,” a user support manager told me about one employee who was soon to be terminated. “She doesn’t believe in the mission. She thinks that Facebook is for people without any real problems and isn’t actually changing the world. Can you believe that? This afternoon I’m going to have to let her go.”

I wondered who the heretic employee was. I guessed that she must have been like all of the user support team members: well-educated in the humanities at an Ivy League school, and probably unaware when hired that she had walked into a new kind of technical cult. At any rate, her awareness of issues beyond Facebook was a problem. The company wasn’t paying anyone to be aware of the world beyond the screen. The only questions you were supposed to ask or ideas you were supposed to have at work, as a good citizen of the Facebook nation, were about new ways to technologize daily life, new ways to route our lives through the web.

Do other prominent social media companies have a comparable sense of mission? Do they demand and/or inspire similar loyalty amongst their staff?

One final snippet from The Boy Kings, by Katherine Losse, that I can’t resist posting. It seems that Mark Zuckerberg has a secret back room in his private Facebook office, allowing him to retreat into opacity while sustaining the glass-fronted and open-plan layout of the corporate offices:

Mark’s office sat adjacent to our pod, with its secret back room (for especially important meetings, because the front room of his office had a glass window onto the hallway that made meetings transparent) hidden behind a wallpapered door and a single table illuminated by a Mad Men–style modern lamp, receiving a constant stream of celebrities and tech luminaries and wealthy Russians in silk suits. (Pg 196)

This is the same Zuckerberg who bought four homes adjacent to his in order to ensure his own privacy. His own power dramatically illustrates the politics of transparency and opacity in digital capitalism. We can see this even more dramatically in the private retreats of the digital elites: if transparency gets tiring, why not just head off to your super yacht or Hawaii estate for a while? As Zuckerberg describes it, quoted on pg 198: “We are pushing the world in the direction of making it a more open and transparent place, this is where the world is going and at Facebook we need to lead in that direction.” The key terms here are pushing and lead. The pushers and the leaders are able to take a break when they’d like, without worrying about someone else perpetually trying to push and lead them.

I think this could be analysed in a similar way to how Bauman explored mobility in his work on globalisation: those at the bottom of the hierarchy are transparent because they lack the resources to escape the filter bubble, while those at the top of the hierarchy are usually transparent as a function of their own commercial success. But the former condition is forced, leaving the people in question susceptible to manipulation, while the latter is chosen and can be voluntarily withdrawn from in private life.

From The Boy Kings, by Katherine Losse, pg 194:

The floor around Sheryl’s desk was piled with the endless gifts that she received from business contacts in lofty positions at Fortune 500 companies. People sent her Louboutin heels and Frette candles the diameter of dinner plates, which she unpacked while on speakerphone with some CEO or another. Sometimes, she passed them over the desk to me offhandedly, just trying to get rid of them, but usually they just sat in piles under the desk until someone cleared them away, to be replaced by new, just as superfluous, luxury gifts. Mark’s desk was similarly surrounded by boxes and gifts, but they were more boyish: a sports jersey signed by a soccer star, some video game that hadn’t been released yet. I didn’t have any presents (other than Sheryl’s cast-offs), but I had a front-row view of the business lives of the extremely rich and powerful, whom I now knew spend much of their days managing the world’s desire to be close to them.

From The Boy Kings, by Katherine Losse, pg 191-193:

The catch for Facebook was that the more successful we became (and we were still, despite all the competition, dominant), the more likely employees were to be distracted by money and the new pastimes it enabled: fine dining, bar hopping, five-star vacations, expensive cars. In this sense, winning the game completely was a bit of a curse, because as our user numbers climbed quickly to 250 million in July 2009 and 350 million in December 2009, early employees had less incentive to work constantly, and more leeway to play games and party earlier in the night instead of waiting until the dead hours of two in the morning to socialize like we used to. New engineers were being hired all the time to take up the slack of bug fixing and code development from employees who had been there longer. The Facebook product itself made staying on task difficult: With the steady stream of pictures flowing down our pages, how could we be expected to focus on anything but planning our next photo opportunities and status updates? Looking cool, rich, and well-liked was actually our job, and that job took a lot of work.

I’d like to know if this goes hand-in-hand with a ratcheting up of corporate perks, at least for the engineers and executives, in an attempt to keep the hedonism in house.

From The Boy Kings, by Katherine Losse, pg 146:

My career upgrade from dungeon department to quasi-technical role meant, along with a better salary and more respect from the technical echelon of the company, that I was now on engineering time. This meant that while I could come to work later, as late as lunchtime, I was expected to stay up until all hours answering emails and devoting myself even more monastically to our new enterprise. However, even as the respect and pay were higher, which was a huge relief, genuflecting to external application developers, even if I didn’t agree with what they were doing, felt a lot like the eternal reverence we nontechnical employees were all expected to exhibit for Mark and the engineering department.

From pg 152:

Becoming a fully fledged member of the engineering team that winter felt, as I long dreamed of doing, like going from being slave to being conqueror. Suddenly, I could arrive at work on my own time, as long as I was working late into the night, because it was assumed that I, like all the engineers, was upholding and advancing a whole new world, even if sometimes we were just sitting around in the office eating snacks and playing games. In engineering, getting to work late was cool, even necessary. It meant, in the ideology of the lone and maverick hacker, that you weren’t beholden to authority, and that you might have been up late coding something brilliant and life-changing and disruptive (even if you were just trolling Facebook or watching porn). Being in engineering wasn’t an escape from the game so much as the ultimate playground.

From pg 155-156:

I spent days with the professional translators while they read through pages of translations and made corrections as needed. They were working by the hour, clocking out at six o’clock, and thought it strange that I seemed perennially online the entire week, answering chats, reading Facebook, talking with them, answering questions, and responding to emails at all hours. When they left the office at the end of the day, they were done until the next morning. That, in turn, seemed strange to me. I couldn’t remember when the last time was that I wasn’t within spitting distance of my computer and smart phone. As much as I had once made fun of the Facebook boys for staring at their phones more often than they looked up, I had become one of them.

A fascinating snippet from The Boy Kings, by Katherine Losse, describing the approach of a new operations director joining Facebook in 2007. From pg 144:

The next week, Chamath asked me and my management colleagues in customer support to do an evaluation exercise in which we ranked everyone on the Customer Support Team from highest to lowest. Sitting up late that night in the office, I assigned a score to each person on the team. Some were easy to score: They were either spectacularly hard workers or rather lazy, preferring to play company-sponsored Beirut games to the alternately hard and tedious work of solving user problems, but for most it was a queasy and difficult process of comparing apples to oranges, which, in this case, might be one person’s quickness at answering emails versus another’s thoroughness and accuracy.

When the results were in, Chamath came back to deliver a speech. “Look around you,” he told us. “In a few weeks, some of the people in this room won’t be here. They will be moved to other departments, because they’ve worked hard and have made themselves valuable to the company. Other people in this room won’t be here, because they haven’t worked hard enough. I’m telling you this because you need to understand that this is how it works: You are always being ranked, and it’s your job to perform. How you do here is up to you, but no one’s going to let you get away with not pulling your weight.”

From The Boy Kings, by Katherine Losse, pg 134:

That Sunday, after I’d slept off our long night, I logged in to Facebook to see an endless stream of videos that the boys had filmed at the club. In them, the boys were not chatting up or kissing girls they had met, as I had expected. Instead, they were performing an elaborate ritual only they would have the strange, cold vanity to invent, in which they would methodically chat up and reject girls that the bouncers had brought to their table. “Leave! You’re not pretty enough!” one of them seemed to say over the din of the club as he shooed the girls away in succession like so many servants. Even though I had been living in this boys’ world for almost two years, I was still a bit shocked. Their products ultimately reflected their real-life behavior. Instead of making a technology of understanding, we seemed sometimes to be making a technology of the opposite: pure, dehumanizing objectification. We were optimizing ways to judge and use and dispose of people, without having to consider their feelings, or that they had feelings at all.

The intriguing suggestion made by Losse is that these tech bros represent an epochal transformation in American alpha masculinity. She doesn’t really follow it up but I’m completely persuaded that tech bros, as well as bro culture in general, represent something of profound sociological significance.

From The Boy Kings, by Katherine Losse, pg 25:

For example, on Mark’s birthday, in May 2006, I received an email from his administrative assistant telling me that it would be my job that day, along with all the other women in the office, to wear a T-shirt with Mark’s picture on it. Wait, what? I thought, he’s not my god or my president; I just work here. The men in the office were told that they would be wearing Adidas sandals that day, also in homage to Mark. The gender coding was clear: women were to declare allegiance to Mark, and men were to become Mark, or to at least dress like him. I decided that this was more than I could stomach and stayed home to play sick that day. I was the only one. The other women in the office, including Mark’s girlfriend, who did not work at Facebook, but had come to the office to celebrate his birthday, happily posed for pictures wearing identical shirts printed with Mark’s picture, like teenage girls at an *NSYNC concert or more disturbingly, like so many polygamous wives in a cult.

Gawker apparently posted photos of this at the time but I’m struggling to find them.