This was an exciting day for Digital Sociology, as an esteemed group of speakers gathered in the august surroundings of the Churchill Room in the Treasury to discuss sociology’s contribution to understanding and defining our digital future. As BSA President Susan Halford explained in her introduction, the event was intended to pool the expertise of (digital) sociologists and bring this into dialogue with officials. It’s important to have these conversations because digitalisation is a much more open process than conversations about it tend to assume. Halford put this very powerfully, reminding us “there is nothing inevitable about digital society and there is nothing inevitable about digital futures” because “technologies on their own do nothing”. The event was co-chaired by Phil Howard from the Oxford Internet Institute and David De Roure from the Turing Institute, who each explained their sense of sociology’s importance. Howard described sociology as among the most agile disciplines, well suited to working with new domains of data which didn’t exist only a decade ago. He described sociology as being at the leading edge of crafting new forms of data and well suited to producing action-orientated research. He reflected on the rewards and risks of sending out research without peer review: filtered only through internal review, but with the advantage of getting findings out to policy makers and others at speed. De Roure stressed that computer science alone is insufficient for building contemporary systems, involving as they do a combination of computers and people. These ‘social machines’ require an understanding of the social if we are to grasp their operations. Three panels over the course of the afternoon went a long way to illustrating what that understanding looks like and how it can be applied.

Panel 1: Youth Futures

The first speaker was Sonia Livingstone from the Department of Media and Communications at LSE. Reflecting Phil Howard’s claim that sociology is bridging the quantitative/qualitative divide, Livingstone’s work draws on qualitative and quantitative data to elucidate what digital technology means for parents and childhoods. Parents seek to equip their children for what they imagine will be a digital future, often framed in terms of exaggerated risks which digital technology is assumed to carry for children. Media and policy debates make extreme claims with weak groundings in research, exacerbating the problems found in families over issues such as how much screen time is suitable for children each week. Underlying these challenges is a question: who is meant to guide parents in negotiating the challenges and opportunities of digital parenting? Livingstone explained how parents don’t know how to offer positive messages to their children about technology, and the major finding of her research has been that parents are effectively on their own when it comes to the potential of digital technology to enrich their children’s futures. This gap has created a huge market for tools and services which aim to help parents, but it’s extremely difficult for them to assess these offers and know which might be beneficial to their children.

The second speaker was Huw Davies from the Oxford Internet Institute, who is also co-convenor of the BSA’s Digital Sociology group. He identified two reasons why it’s important to study how children and young people use social media. Firstly, researching young people can help us anticipate the future of media consumption. Secondly, teens often use media in ways which subvert attempts to control and regulate users, in the process offering strategies from which all of us can learn. His research into how young people understand the internet has found that many inhabit a profoundly appified web, with little sense of how the internet works beyond the particular apps they use. However, there is also evidence of a remarkable literacy amongst at least some of this cohort, with a well-developed capacity to use the functionality which tends to be subsumed into the unhelpful category of the ‘dark web’. Nonetheless, teens are often not as savvy as they assume they are, and their capacity to enter these semi-legal online spaces can leave them vulnerable to some of the ill-motivated actors who can be found within them.

The third speaker was Josie Frasier from the Department for Digital, Culture, Media and Sport (DCMS). She began by talking about the digital charter and the importance of supporting people to participate in digital spaces. There are huge benefits to digital participation but, as the speakers thus far had stressed, it can also exacerbate social inequalities in ways which are immensely important to recognise. Her talk covered a range of initiatives currently underway within government which seek to recognise this duality, informed by a growing awareness that ‘online’ problems inevitably have ‘offline’ manifestations. For this and other reasons, the problems posed by digitalisation are interconnected. As Frasier put it in response to a question, “These are not internet problems, these are social problems which are acted out in the space of the internet”. Frasier stressed how DCMS is building on the work of digital humanities and is looking to the sociological community for further conversations. The upcoming white paper offers an immediate means through which we can do this.

Panel 2: Work Futures

The next session began with Phil Brown from Cardiff University talking about the reality underlying the rhetoric of automation. Claims about the impending reality of mass unemployment driven by automation circulate widely, with a significant risk of exaggeration. Nonetheless, the general direction of travel is clear and there will be a declining demand for labour, posing problems of how we divide up the fruits of that labour in terms of productivity and wealth. The real problem we have today is not skill scarcity, explained Brown. It is a jobs mismatch rather than a skills mismatch which will create social problems as automation proceeds. The decisions made (or not) today already shape the future and there is a real risk they will concentrate diminishing rewards from labour in the hands of the few. Rather than the digital economy being a bounded phenomenon, it represents a transformation in the whole policy process. The only way we can address this is by being clear about what our institutions are for and what they stand for. If we can’t address these fundamental questions then we will inevitably address these problems in a piecemeal way. He ended with a fascinating argument about the potential of digital analytics for an active industrial policy, no longer reliant on asking employers what they want. It is a powerful idea with some exciting consequences.

The second speaker was Jacqueline O’Reilly from the University of Sussex Business School. She recently completed a major work, Work in the Digital Age, offering a comparative outlook on digital development across Europe. O’Reilly went on to discuss the Digital Economy and Society Index (DESI) ranking that “summarises relevant indicators on Europe’s digital performance and tracks the evolution of EU member states in digital competitiveness”. On this measure, Denmark, Sweden, Finland, and the Netherlands have the most advanced digital economies in Europe. Looking to these and other measures through a comparative framework helps open up a range of crucial policy questions, cutting through the clutter which usually gets in the way of our conversations about practical responses. For instance, the UK does well on digital skills but evidence suggests that employers are not taking up these skills, inviting analysis of why this is the case. O’Reilly ended with a discussion of how to produce something akin to a DESI ranking that extended beyond Europe and what this would mean for our capacity to address the global challenges which digitalisation is producing.

The final speaker was Xander Mahoney, a Policy Advisor at the Department for Digital, Culture, Media and Sport, talking about the longer-term challenge of automation. He argued that it is unlikely that we are seeing the ‘end of work’ and that we need to be realistic about how advanced technology is going to become. Nonetheless, the rate of development of technology is ever-increasing and this means we are going to be left with different jobs but the same workers. What support should be offered to workers who have been made technologically redundant? They will need training and welfare, directed towards opportunities which are difficult to predict in advance.

Panel 3: Data Futures

The final session kicked off with Ben Williamson from the University of Edinburgh talking about how digital data is transforming the university. These institutions are increasingly imagined as ‘smart’ organisations built around data infrastructure, with a whole range of innovations being pushed by a diverse array of actors. This has included the Department for Education commissioning developers to produce apps which provide students with data-driven ways to navigate the application process. The problem from a sociological perspective is that the data involved is being treated as an objective window onto the reality of higher education. Data is produced through a range of activities and expresses prior interests, obscured by platforms and services which present it in a naive way: data visualisations, for example, distance our attention from the organisational processes which produce them. This means a narrow quantitative representation of a university comes to replace the messy organisational reality, leading to profound limitations for policy and practice. Williamson discussed how we can respond to this by developing new methodologies which better represent the complexity of the university, while replicating some of the advantages which the aforementioned data-driven methods are seen to have.

The second speaker was Helen Kennedy from the University of Sheffield, reflecting on why understanding people’s perceptions and experience of data matters for data futures. While it’s true that we won’t get data policy and practice right unless we listen to expert views, there is unfortunately not a lot of evidence about how data practices are perceived by non-experts. We see increasing evidence from sociological research about the capacity of digital systems to reinforce and entrench existing inequalities. Kennedy described her current research on public perceptions of the BBC’s uses of personal data, undertaken with the BBC itself. This research has found that trust in an organisation’s data practices has little to do with those practices themselves and in fact reflects the broader perception of the organisation’s values and activity. Nonetheless, there is not a straightforward relationship between the two, as high levels of trust in an organisation don’t necessarily lead to high levels of trust in the organisation’s data usage, a state of affairs described by Kennedy in terms of ‘complex ecologies of trust’. For these reasons, we need to do data literacy differently, involving people’s emotional relationship to data rather than relying on narrow cognitive models.

The final speaker was Farah Ahmed, Head of Data Ethics at DCMS, who began with the important question of whether we overestimate the novelty of these issues. They described the experience of working on the government’s open data strategy, one of a kind at that point in time, reflecting on the challenges they faced along the way. Similar issues can be found now in the recently established Centre for Data Ethics, which has already undertaken a range of projects. It was an engaging way to end the day, bringing us back to the realities of the policy process and the role which research can play if it can cross the academic/government interface in an effective way. It left me feeling extremely optimistic about the future influence of sociology, as the policy officials were consistently responsive to the work presented and were keen to expand the conversation beyond the day itself.

The robots are coming! The robots are coming! After watching More Human Than Human, I’ve woken up preoccupied by the rise of the robots narrative and how inadequate it is for making sense of the cultural politics and political economy of automation. The film is an engaging exploration of artificial intelligence and its social significance. While its analysis is often superficial, it foregrounds the agency of the roboticists and thinkers who are shaping emerging technologies, and this feels important to me. Nonetheless, it sits uneasily with the film’s tendency to frame technological change as inexorable, able to be steered for good or evil but nonetheless impossible to constrain. This is a tension at the heart of disruption rhetoric, celebrating innovation as a form of creativity while holding it to be unavoidable. But this is just one way in which the film so starkly embodies a broader trend.

One reason it is important to see the figures shaping these developments is that it makes clear how white, male and Anglo-American they are. As Jana Bacevic observed, the film manifestly fails the Bechdel test. There are three women with speaking roles in the film, only one of whom talks about her own work, and even then in a way framed through the lens of the man whose memory powers it. As far as I can recall, every single person in the film is white, mostly American with a few northern Europeans thrown in for good measure. The only exception is a Russian-born woman who now works as an entrepreneur in Silicon Valley. This is problematic for many reasons, not least of all that much cutting-edge work in artificial intelligence is taking place in China. By ignoring these developments, not only does the film undermine its own investigative mission but it further evacuates the political questions it raises by robbing them of their geopolitical dimension. Disruptive innovation is bound up in techno-nationalism, as machine learning becomes an arms race with epochal significance at a time when American power seemingly enters a state of terminal decline after years of domination without hegemony.

The film ends in a contemplative mode, reiterating familiar ruminations about our future. Every sentence in the closing scene repeatedly invokes ‘we’ and ‘our’. Who are we? How does the white American author in his early 30s who provides the intellectual narration for the film come to articulate the agenda of this we? How does the older white American director who provides its substantive narration, with the film framed around his own personal project in disruptive innovation, come to articulate the agenda of this we? The ‘we’ here is devoid of politics. It is a we without a they, as Chantal Mouffe would put it. At a time when the liberal order is in chaos, we ought to be suspicious to the point of paranoia about the emergence of a powerful narrative of civilisational renewal in which we can save ourselves or we can doom ourselves. It is Anglo-American capitalism mystifying its own bleeding edge, making a religion out of its own products and celebrating them as world-making or fearing them as world-breaking. None of this is to deny that hugely significant technological advances are occurring. But the rise of the robots narrative actively frustrates our understanding of them, systematically shutting down the intellectual space in which it becomes possible to think through the cultural politics and political economy of automation. Provincialising disruption is unavoidable if we want to understand the reality of putatively disruptive technologies.

There’s a full explanation of this on Russ Kick’s blog. If I understand correctly, there is a formal process in which federal agencies coordinate with the National Archives to determine the status of public records. These requests are usually green-lit by the National Archives & Records Administration (NARA), though it theoretically has the power to refuse them. This is how Russ Kick describes what is happening:

The Department of the Interior has sent NARA a massive Request for Records Disposition Authority.

Interior’s request involves documents about oil and gas leases, mining, dams, wells, timber sales, marine conservation, fishing, endangered species, non-endangered species, critical habitats, land acquisition, and lots more.

The request covers these categories of documents from every agency within the Interior Department, including the Bureau of Land Management, National Park Service, US Fish & Wildlife Service, US Geological Survey, Bureau of Safety and Environmental Enforcement, Bureau of Indian Affairs, and others.

The request covers already-existing documents going back more than 50 years. Thousands of cubic feet of paper documents. Gigabytes of digital documents. Besides existing documents, as usual the proposed schedule will also apply to all future documents created in these categories (whether on paper or born digital).

It’s hard not to wonder if this might be the start of requests by other agencies. For all the centrism running through the latest book by Michael Lewis, I still found it a powerful account of the institutional vandalism currently underway. People who don’t understand the federal agencies they have been appointed to are nonetheless committed to eviscerating them from the inside, with varying degrees of direct self-interest. In the context of these appointments, it would be naive to assume anything other than the worst in response to this request.