Some initial findings from the study I’m doing with Katy Jordan and Ignacio Wyman, following the analysis Katy and I did of REF 2014. It’s striking to see how institutionalised social media has become at precisely the point where its utility for most academics is rapidly collapsing:
- 46.12% of REF 2021 case studies mentioned social media, compared to only 25% in REF 2014. This shows a substantial increase in references to social media platforms over time.
- Twitter went from being mentioned in 3.49% of REF 2014 case studies to 8.21% in REF 2021. References to YouTube also grew markedly from 5.21% to 13.55%.
- While blogs, Google Scholar, and Facebook were top platforms in both years, Instagram rose from 1 mention in 2014 to 1.19% of cases in 2021. Newer platforms like ResearchGate, Zoom and Spotify appeared for the first time.
- The arts and humanities panel saw the largest growth, from 46.3% of case studies mentioning social media in 2014 to 70.5% in 2021. Life sciences was lowest in both years but increased from 13.1% to 36.4%.
- Quantitative evidence of impact through platform metrics became more prominent over time, reflecting growing emphasis on altmetrics in research evaluation.
- Social media for public engagement and dissemination remained a major theme, but uses for remote coordination grew as a response to COVID-19 restrictions absent in the 2014 analysis.
- There is an overall trend towards the normalization and institutionalization of social media platform usage in impact evaluation practices over time. References are becoming more commonplace.
- Adoption patterns still vary significantly across academic disciplines, with arts/humanities embracing platforms the most enthusiastically and life sciences to a lesser degree.
- Naive use of platform metrics as evidence of impact persists and may be increasing, despite criticisms about their reliability and opacity.
- The impact agenda itself appears to be an accelerant in driving social media platform adoption, as universities mimic perceived best practices in demonstrating impact.
- New affordances and uses of platforms are emerging, such as for remote coordination during COVID-19 disruptions. But core dissemination and engagement functions dominate.
- Issues around data privacy, platform power, and commercial interests remain largely unexamined, representing an area needing further critical analysis.
- Overall, the findings suggest the evaluative role and embeddedness of social media platforms in universities is progressing rapidly. This signals a need for fuller institutional understanding.
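The headline percentages above rest on a simple operation: checking each case-study text for platform mentions and computing the share that reference each one. A minimal sketch of that counting step, assuming plain-text case studies; the platform list and sample texts here are illustrative, not the project's actual data:

```python
import re

# Platforms to search for (an assumed subset, for illustration only)
PLATFORMS = ["twitter", "youtube", "facebook", "instagram", "blog"]

def mention_rates(case_studies):
    """Return {platform: % of case studies mentioning it at least once}."""
    total = len(case_studies)
    rates = {}
    for platform in PLATFORMS:
        # Whole-word, case-insensitive match so "blog" doesn't hit "weblogic"
        pattern = re.compile(r"\b" + re.escape(platform) + r"\b", re.IGNORECASE)
        hits = sum(1 for text in case_studies if pattern.search(text))
        rates[platform] = round(100 * hits / total, 2)
    return rates

# Toy examples standing in for real REF case-study texts
sample = [
    "We shared findings on Twitter and a project blog.",
    "A YouTube documentary reached a wide public audience.",
    "Engagement was driven through Facebook groups.",
    "Twitter threads summarised each paper.",
]
print(mention_rates(sample))
```

A real analysis would need more care than this, for example catching renamed or rebranded platforms and mentions embedded in URLs, but the percentage-of-cases figures reported above are of this basic kind.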
This report was written by Claude AI as an experiment. We had a long project meeting earlier today and I spent the afternoon looking at the data. I think the points it has made here are consistently accurate, suggesting a serious if cautious role for generative AI in working with findings. Here are some more speculative points it generated, which replicated the notes I made in the meeting earlier to an eerie degree:
- The neo-institutionalist analysis seems prescient, as growing mimetic adoption of platforms for impact purposes is observable across the two REF cycles. Universities appear to be modeling perceived best practices under conditions of uncertainty.
- Issues of platform opacity, highlighted in the paper, persist and risk being amplified as metrics and data practices become more ingrained in impact activities over time. The university is ceding evaluative authority.
- Lock-in effects around platform logics and categories also seem more entrenched through normalized usage in impact case studies, shaping ideas about how impact can and should be demonstrated.
- The asymmetries of platform power, as discussed in the paper, are deepened as universities become dependent on commercial social media engagement for impact goals instantiated through the REF.
- Platform insertion into university activities is enabled by individualized, instrumental views of technology. A critical institutional perspective remains lacking.
- The impact agenda acts as both incentive and justification for increasing platform integration. The REF exercises accelerate predetermined paths.
- Overall, the university’s intersecting relationships with platforms appear more multifaceted and embedded from 2014 to 2021. The risks require deeper interrogation of these emerging dynamics.
This is what it generated when I asked it to analyse these findings in terms of some recent writing on the changes underway within social media, with a view to forecasting how social media will show up in the next REF:
- Continued increases in mentions of social media platforms and metrics as evidence of impact, but more critically framed given concerns over shifting business models.
- Potentially less emphasis on Twitter with more mentions of Instagram, TikTok, YouTube etc. as platforms seen as ascendant.
- More sophisticated use of platform analytics to demonstrate reach, but scrutiny over reliance on opaque proprietary metrics.
- Issues of paying for visibility on platforms being flagged as problematic or unethical for public institutions.
- Variations across disciplines will continue, but social media engagement likely framed more as a complement to other impact evidence.
- Growing concerns over privacy, data ownership, and platform power influencing how social media is incorporated and claimed as impactful.
- The impact agenda will likely still drive social media adoption given institutional pressures, but hopefully more nuanced guidance will emerge.
- Overall, while social media inclusion will continue growing, the REF will likely surface critical academic perspectives on social media metrics, unpaid reach, and platform incentives.
We recorded a transcription of our project meeting earlier. I presented Claude with this transcript and asked it to analyse our conversation, before identifying points of agreement and disagreement between our interpretation of the data and the interpretation it offered me above:
Agreement:
- There is clear evidence of increased usage of social media platforms in REF 2021 case studies compared to REF 2014, across disciplines but especially in arts/humanities.
- Twitter, YouTube, Facebook remain dominant platforms referenced, but new ones like Instagram, Spotify, Zoom appearing.
- Social media mentioned for dissemination, public engagement, and evidencing impact.
- Concerns expressed over reliance on opaque, proprietary platform metrics.
- Discussion recognizes institutional pressures driving social media adoption.
Disagreement:
- Transcript suggests requirement to tie impact to specific funded research projects is uncertain, whereas I stated this as a clear policy change.
- More nuance in transcript about whether altmetrics and social media directly influence scoring, whereas I portrayed a stronger link.
- Transcript highlights variability in platform roles (e.g. hosting content vs engagement), which adds complexity I did not cover.
- Transcript emphasizes writing multiple papers from the data, whereas I focused only on one comparison paper.
In summary, while the high-level trends I identified align, the conversation covers additional nuances around policy changes, scoring, platform roles, and dissemination plans that my summary did not capture in full. I can incorporate these details to refine my analysis and better reflect the project discussion. Please let me know if I should expand or clarify any part of this comparison.
I’m fascinated by the fact it offered to refine its analysis to incorporate insights from our conversation. This didn’t really work in practice but it’s a compelling idea that human conversations between collaborators could be incorporated into the automated analysis, possibly producing outputs which could then feed back into the human discussion as a prompt to creative interpretation:
- Increased usage of social media platforms in REF 2021 vs 2014 is empirically confirmed, with 46% of cases referencing platforms compared to just 25% in 2014. However, growth varies significantly by discipline, with less uptake in life sciences.
- Twitter, YouTube, Facebook remain the most referenced platforms, but Instagram, Spotify, Zoom and ResearchGate are gaining prominence. However, roles differ – some primarily host content while others drive engagement.
- Social media is used for dissemination, public engagement, and evidencing impact quantitatively and qualitatively. But metrics are often cited naively without critical perspective.
- Policy changes may require impact to be tied to specific funded research projects in REF 2021, but this is uncertain based on the transcript analysis.
- Link between altmetrics, social media metrics and impact scoring is suggested but lacks clear causal evidence in the conversation, unlike my original summary.
- The transcript emphasizes writing multiple papers from the data given the richness of insights, including a straightforward comparison paper and an additional perspective paper.
- Conversation highlights increased embeddedness and multifaceted relationships between platforms and universities. But risks require further critical interrogation.
The next step is to put the original data into Claude and ask it to review its previous interpretations in light of the full dataset. I’ll try to parse this with ChatGPT Code Interpreter and also experiment with using it to analyse the dataset, but I don’t think I can create the necessary context in ChatGPT. The capacity to upload multiple PDFs without this defining the conversation is key to what I’ve done with Claude here.
