
ResearchPod
How are sociodigital futures being claimed?
This second of three podcasts exploring ‘Claiming Tomorrow – Sociodigital Futures in the Making’ asks: how are sociodigital futures being claimed?
Claims about the future shape government policies, the investments that are made by companies, and how all of us think about our lives. But how are these claims being made?
Listen to Susan Halford, Jessica Pykett, Debbie Watson, Paul Clarke and Becky Coleman as they explore this timely subject.
This podcast is brought to you by the Centre for Sociodigital Futures – a flagship research centre, funded by the ESRC and led by the University of Bristol in collaboration with 12 other Universities in the UK and globally. The support of the Economic and Social Research Council (ESRC) is gratefully acknowledged.
00:00:11 Narrator
This podcast is brought to you by the ESRC Centre for Sociodigital Futures, or CENSOF, a flagship research centre led by the University of Bristol in collaboration with 12 other universities, exploring sociodigital futures in the making. The support of the Economic and Social Research Council is gratefully acknowledged.
00:00:31 Susan Halford
Welcome to Claiming Tomorrow: Sociodigital Futures in the Making. I'm Susan Halford, co-director of the ESRC Centre for Sociodigital Futures, or CENSOF as we call it. And before I introduce the panel that I have with me today, just a quick word about what we're discussing in this podcast and in this series of podcasts.
00:00:52 Susan Halford
We've been talking about digital technologies, social life and possible futures. Our starting point in all of them is that sociodigital futures are impossible to predict in any detail. Nonetheless, we're surrounded all the time by claims about the future.
00:01:08 Susan Halford
And these claims are important. What kinds of claims are made, who's making them, and how those claims are made about digital technologies and society in the future has really big effects in the present. For example, it shapes government policies. It shapes the investments that are made by companies, and quite possibly it shapes
00:01:28 Susan Halford
How all of us think about our lives and the decisions we make. In turn, this opens up some kinds of futures and closes down others.
00:01:39 Susan Halford
In this podcast, what we're going to focus on is how claims about sociodigital futures are made and why that really matters.
00:01:48 Susan Halford
Before we get into it, I want to introduce the panel. Actually, I'm going to ask the panel to introduce themselves. Just tell us a little bit about who you are.
00:01:57 Debbie Watson
So I'm Debbie Watson. I'm professor of child and family welfare, based in the School for Policy Studies here at Bristol.
00:02:02 Becky Coleman
I'm Becky Coleman. I'm professor in the Bristol Digital Futures Institute, in CENSOF, and in the School of Sociology, Politics and International Studies here at Bristol.
00:02:13 Jessica Pykett
I'm Jessica Pykett, professor of social and political geography at the University of Birmingham and co-director of the Centre for Urban Wellbeing.
00:02:20 Paul Clarke
And I'm Paul Clarke, professor of performance and creative technologies, one of the co-directors of the Centre for Creative Technologies here at Bristol, and in the centre I work on design.
00:02:33 Susan Halford
Well, what a wonderful panel. Welcome, everybody, and thank you for being here. So let's start with an example to try and bring all of this to life. I'm wondering, Becky and Debbie, can you give us some examples of the kinds of claims that are made about sociodigital futures, and in particular why we're interested in how those claims are being made?
00:02:54 Becky Coleman
Yeah. So before I get to an example, I think it's interesting - I've been thinking about claims as a term, and often I think we think about a claim as an announcement or a speech or somebody saying something. But perhaps what's new, and what's definitely important, about sociodigital future claims and how they're being made now
00:03:14 Becky Coleman
is that a claim can be a kind of doing. So I'm thinking about how AI, for example, is being rolled out, and that this might be a way of thinking about how claims are
00:03:27 Becky Coleman
being made. So we're currently seeing AI being added to all kinds of things that we - and we need to think about who that 'we' is - use in the course of our day-to-day lives. So things like WhatsApp, or at the top of Google searches, or Microsoft products. So either they're new additions, or they're making
00:03:48 Becky Coleman
AI, and what AI can do, really visible in new ways.
00:03:53 Becky Coleman
So I think this is a kind of claim-making: that these tools or these softwares are important - that's one claim - that they're needed perhaps, and also that we want them, and that this is being done as they're being rolled out. So there's no kind of checking or consultation beforehand.
00:04:13 Becky Coleman
Actually, that happens once they're already in use.
00:04:16 Becky Coleman
So I also think this idea about how claims are being made also makes us think about things like testing, and how we're evaluating these sorts of things. So it's being done 'in the wild' - I put that in inverted commas - or testing beyond the lab, or in the streets.
00:04:36 Becky Coleman
And this opens up lots of new questions for us, I think, in terms of the how of claims.
00:04:43 Debbie Watson
That's, yeah, really interesting. I think ‘claims as doing’ is very much how I'm thinking about sociodigital futures as well. And certainly in the areas of work that I'm interested in, that extends to claims that actually people's personal data has become a kind of public property and a public right, and that those data that we have,
00:05:04 Debbie Watson
The data points we have about us are accessible to governments, to local governments, to companies, et cetera, to be able to build a picture about us and make claims about us.
00:05:17 Debbie Watson
And there's very little critique of that, particularly in the context of children and young people, where children and young people's data is being brought together under claims of efficiencies and resource-saving and safeguarding. You know, we're going to look after children better by knowing them better through their data,
00:05:36 Debbie Watson
and operating through AI systems and so forth in order to make assumptions about people. And those claims matter, because they predict certain futures,
00:05:47 Debbie Watson
and we see, you know, the kind of worries about predictive policing. Those sorts of things are happening in every aspect of social life, and across the kind of social care and welfare contexts which touch all of our lives. You know, predictions are being made, and that for me is really deterministic. We end up in a situation where
00:06:08 Debbie Watson
people can't escape the data that has defined them at certain points in their lives. It means that for children who are born in certain circumstances, who have certain difficulties in early life, there is always an assumption that they're not gonna do better. And for me that goes completely against the grain of
00:06:27 Debbie Watson
Being an educator, being passionate about children's outcomes and the possibilities for their futures.
00:06:34 Susan Halford
Thanks very much. I mean, that's a really, really important point I think you're making: that claims are not just embedded in spoken words or policy statements, although they might be, but they also become embedded in the material infrastructures, the technical systems
00:06:48 Susan Halford
Of our everyday lives and they become embedded in the way that things get done.
00:06:52 Susan Halford
So they become embedded in healthcare or in education or in social work, and become part of the set of ways in which things just happen. And I really liked, when both of you were talking, that yes, that's happening in the present, but it's
00:07:06 Susan Halford
legitimated in some ways by ideas about a better future all the time.
00:07:12 Susan Halford
Who gets to name that better future? And at the same time it's making a future, because investments are being put into these systems; they're not going to just be thrown away, almost certainly, and reinvented tomorrow. There are legacies and sunk costs to these things. So thank you so, so much. I thought that was really, really interesting, and
00:07:32 Susan Halford
probably all of us, including people listening, can start to think about the many, many ways in which that happens. It becomes huge. You know, these kinds of future claims, both spoken and embedded and done, are all around us. In the centre, of course, we're grappling with how to analyse that and think about that. I'm wondering, Paul, Debbie,
00:07:51 Susan Halford
if you've got any thoughts on how we come to grips with that, in terms of whether there are kinds of patterns or analytical themes that we can use to think about all this diversity of claim-making?
00:08:03 Paul Clarke
Yeah, I'm going to pick up on some of the things that Debbie and Becky said, and also that you've picked up on, particularly around claims being made through doing. And I'm going to use the example of OpenAI and their models, and think a little bit more about those models like ChatGPT, which I think we're all pretty familiar with. They're becoming fairly ubiquitous.
00:08:24 Paul Clarke
And I keep hearing lots of conversations and opinions about them, including in pubs and on buses.
00:08:30 Paul Clarke
So OpenAI - it's not just researchers within universities that are publishing. OpenAI are publishing their own research and stories in which they're making claims about the potential of their gen AI models and how they'll transform different sectors, our jobs and our lives.
00:08:51 Paul Clarke
But as we've already talked about, the claiming is not just about stating claims in publications. A claim can also be made by creating something, by releasing a new software tool for people to
00:09:04 Paul Clarke
use, which is, in the case of this software, pretty disruptive and changes our practices, like how we work or how we write. So we can think about how those claims are materialised, or physicalised if you like. They're embedded in hardware and in software that performs or enacts those claims, and
00:09:25 Paul Clarke
also changes what we do. And that makes the claims, the stories and the narratives that OpenAI are telling - the futures they want - more likely. And it also brings in investment, as you've said.
00:09:38 Paul Clarke
It's interesting, I think, with platform business models, that the software is often released for free and we're given free access. It's a gift to us, and
00:09:48 Paul Clarke
at least to start with
00:09:49 Paul Clarke
it's given for free, and then of course we're encouraged to subscribe. And the developer or the technology company doesn't necessarily know what the use of their tool is
00:10:00 Paul Clarke
until it's released into the world.
00:10:03 Paul Clarke
And in a way we become the kind of unpaid testers, and also come up with use cases for their software. And then investment follows those use cases. So in a way the tech company puts something out to discover if or how it can be used. And in some ways, I would say, AI is a solution in search of problems,
00:10:25 Paul Clarke
Which users are working backwards to invent, and it's also a solution that's generating new problems, including social, legal, moral and ethical ones.
00:10:36 Paul Clarke
We're all probably finding it pretty hard to resist AI, for some of the reasons that Becky's mentioned. Using it becomes inevitable, and it's difficult to make an informed choice about, because it's built into other platforms - like the example you mentioned, the Google search engine.
00:10:54 Paul Clarke
We’re probably all used to now seeing the AI summaries that appear at the top.
00:10:58 Paul Clarke
ChatGPT is integrated into our devices, like the iPhone I have. So I bought into OpenAI's claims about its value - I bought the new iPhone - even though I don't feel like I've decided to. So in the centre, as I said in my intro, I work on design. I'm exploring the role of design in sociodigital futures.
00:11:21 Paul Clarke
And I'd say that design is always political and ideological. So ideologies, or claims if you like, are built in by design. They're baked into the software, they're embedded into our devices - as are, for instance, existing biases, which we know AI tends to amplify.
00:11:41 Paul Clarke
And I looked it up the other day, and OpenAI talked about their mission. So they say their mission is to ensure that artificial intelligence benefits all of humanity,
00:11:51 Paul Clarke
and they want to build AI tools that help people solve really hard human problems. So that's what their mission is. And I'd say it's a pretty techno-utopian mission, that AI will help create a better society. But it seems really important - and it's been coming up in some of the things
00:12:11 Paul Clarke
Debbie was saying - to ask who will benefit, and who it will be better for.
00:12:17 Paul Clarke
It would be hard to say that everyone is included in, or shares, OpenAI's vision. Just think, for instance, about how threatened creatives are feeling by gen AI - songwriters or screenwriters - and the issues it's creating around copyright at the moment.
00:12:34 Paul Clarke
I noticed that earlier this year even the Daily Mail picked up on this. So the Daily Mail was encouraging readers to email their MP about protecting creatives' copyright from AI,
00:12:45 Paul Clarke
and campaigning against the idea that musicians, authors and filmmakers would have to opt out of their work being used in datasets to train AI models - opt out of their creative work being included, and claimed if you like, for free.
00:13:04 Paul Clarke
The Daily Mail's typically dramatic headline was ‘Don't let big tech steal the UK's creative genius’.
00:13:12 Paul Clarke
And I guess the issue of consent for artwork to be in training datasets, and permissions, is really important, particularly for me since I work in the creative industries. But with AI being built into our devices and platforms, we're all having to opt out rather than in.
00:13:30 Susan Halford
Thanks, Paul. There's a lot in there to digest, and really interesting. I was just thinking about the question about patterns of how claims are made, and I heard a few things in what you said. One was around
00:13:40 Susan Halford
markets and business models - so some repeated ways in which business models harness markets in order to claim the future, as with your free software and then subscription, but also hoovering up people's data as part of that process. And there was another cluster, I thought, around design and design values, and that kind of cuts across
00:14:00 Susan Halford
the ideologies and politics that are embedded in that future claiming, which come largely from, well, let's say Silicon Valley, but that as a metaphor for broader ideas about technical innovation. Also aesthetics, I thought, was maybe in there as well - the appeal of the iPhone, and that's another kind of claim; you didn't quite say that.
00:14:20 Susan Halford
Just two other things, briefly. One is about knowledge and ownership, and who owns what, and how claims are made in that way. So the state's backing of a change in copyright rules - that's a pretty big claim on the future, isn't it? And that applies to lots of different areas. And then lastly, and I think possibly, Debbie, this will speak really directly to you:
00:14:40 Susan Halford
there's not one set of claims. We might be able to identify a strong narrative, and maybe a dominant narrative, but there are actually competing claims here. For example, the Daily Mail,
00:14:53 Susan Halford
like you're saying, is saying don't let generative AI steal from our creatives. On the other hand, the government is saying we're going to be at the forefront of developing and implementing generative AI. So you know, we've got this quite competitive landscape. Debbie, I wonder if you had anything to add from your own work on those patterns of how futures are claimed.
00:15:16 Debbie Watson
Sure. I think there's also patterns of social control that are coming through, particularly in the context of some of the kind of predictive behaviours of AI systems. And, you know, Paul was pointing towards the social inequalities in terms of access to AI, and gen AI in particular, and very much Paul was talking about
00:15:37 Debbie Watson
you know, the kind of consumption of AI, be that through creative industries, be it through aesthetic kind of activity, or through everyday ownership of devices and how we're using them. But there's other patterns, which are
00:15:50 Debbie Watson
kind of claims that are coming through the systems, in terms of 'this will make society fairer, because people can't make fair judgments but computers can'. And I think we've really, really got to resist that sort of, you know, claim of objectivity:
00:16:09 Debbie Watson
that a social worker can't objectively make an assessment of a child and family, but actually, if we compile their data together, the system itself can make a better judgment about what is needed for that family.
00:16:21 Debbie Watson
And we know that professionals are really important in all sorts of social, health and welfare contexts. You know, they know the people they work with, they understand the complexities of people's lives. And what we're doing is we're jettisoning a lot of that information, that kind of intuitive knowledge that people bring,
00:16:42 Debbie Watson
and we're allowing computer systems to take over. We're allowing the data to define who these children and families are and what they
00:16:50 Debbie Watson
require. And often those claims are around increased efficiencies - you know, better resource allocation to those that really need support, we don't want any more children falling through the net. Every local authority, understandably, does not want a child death on their hands. But there's always a threshold, there's always a cut-off point at which your data
00:17:10 Debbie Watson
either is in a system or it's not in a system. And actually, if you look at the kind of data that is held by local authorities, by statutory services,
00:17:18 Debbie Watson
not all of us have that data about us. So it really exacerbates the implications for children and families who are already surveilled, and who are being increasingly surveilled through the data that is held on them as well. So I think the social impacts are absolutely massive, and it is about social control,
00:17:39 Debbie Watson
And it is about defining who gets access to resources and who doesn’t.
00:17:42 Debbie Watson
And we've got similar systems emerging, not just for child welfare, but also in terms of making disability assessments and making assessments around homelessness, and who has the right to a bed for the night. All of these systems are being driven by data, which you could also unpick and say, well,
00:18:02 Debbie Watson
It's often incomplete.
00:18:04 Debbie Watson
it's often kind of missing data, it's often quite inaccurate. And you know, there's lots and lots of stories about data being used to make decisions about people's futures that is really questionable. So we're relying on this objectivity, and I think we really need to be much, much more critical of that as a public.
00:18:24 Susan Halford
Jess, if you want to add to that?
00:18:24 Jessica Pykett
Could I pick up on that? Yes, because I think what you're talking about there is these patterns and systems, and we see that very much repeating across different sectors. And I think Paul used the phrase of
00:18:34 Jessica Pykett
these systems going in search of problems - that's something we see in the education sector. So the same patterns are emerging across these different spheres, and in education, for instance, we see claims about efficiency - automation being a more efficient way of organising learning - claims about interoperable systems, so platforms which run across and
00:18:55 Jessica Pykett
do lots of different jobs, and seamless technologies. So there's a claim there that this is going to improve the effectiveness of learning. And then there's all kinds of claims around immersive learning,
00:19:05 Jessica Pykett
and that's billed as being much more engaging, much more emotionally kind of important, and brings the whole body into the educational relationship. And then another claim is being made that education should be more biological, should be informed by genomics and genetics, should be more precise. So there's a search for accuracy
00:19:27 Jessica Pykett
as well. So in all these situations, these technologies are seen as tools which can solve problems. But I think what we do across the centre is try and rethink these technologies as social systems, which have their own histories and which capture certain kinds of ways of developing futures.
00:19:45 Becky Coleman
Fantastic, Becky. Yeah, that's really interesting. And I was just thinking about one of the distinctive things, I think, about CENSOF, which is the social and the digital together. And so a lot of what we've maybe been discussing so far are the ways in
00:20:00 Becky Coleman
Which technology is seen as this solution to social problems, but actually
00:20:05 Becky Coleman
what we need to do is think about these as also social problems, or the social and digital, or the social and
00:20:13 Becky Coleman
technical, as kind of entwined in some ways. So I was thinking about that in relation to some of the, again, kind of claims that are made about what AI can do - that it can replace humans because it is more objective, as well as being quicker and more efficient. And it also introduces
00:20:33 Becky Coleman
Lots of questions about what we think humans can do, what we value about what humans can do, what we might want to preserve and kind of keep maintaining about what humans can do in those kinds of technological systems, I think.
00:20:48 Susan Halford
Really interesting. When we're thinking about the patterns across how claims are made, there's this question of power and politics and inequalities. So there are recurrent patterns there in terms of how inequalities are embedded in and worked through the claiming - the kinds of claims, but also how the claims are made. In particular, I think,
00:21:08 Susan Halford
whose knowledge counts. So what kinds of discourses - I'm going to say, you know, like 'technology will make the world a better place' - but also whose knowledge counts in this situation, seems to be a really repeated pattern across
00:21:22 Susan Halford
the mechanisms of future claiming. So I'd like to turn the conversation around a bit to that more directly, because so far we've concentrated on the claims that are made by big companies or governments, or perhaps the mass media - the Daily Mail, for example. And those are really important. You know, we need to look to those places to understand how sociodigital futures are in the making.
00:21:43 Susan Halford
But claims might also be made, in a more everyday sense, by individuals or by social groups who don't fit into those big, powerful institutions.
00:21:51 Susan Halford
I'm wondering if we could hear a bit more about if and how those claims are made and where the centre is involved in that, and I'd like to ask you, Jess, to comment on that.
00:21:59 Jessica Pykett
Yeah. So we've been looking particularly at the sphere of education - formal learning, also informal learning - and that's taken us to some interesting spaces, looking at cultural institutions like museums and art galleries, but also kind of creative producers, artists, activists, whose aim is to cultivate
00:22:19 Jessica Pykett
Different ways of looking at the future to evade this kind of.
00:22:21 Jessica Pykett
Sort of capture that you described. So we've been looking at their role in shaping collective imagination. Some of them informed by historical trains of thought around utopianism. Black utopias for instance. Queer futures, feminist futures, and so on. And we've been thinking about how this focus on imagination helps to is a kind of essential.
00:22:42 Jessica Pykett
human trait - to think imaginatively - and a way of organising societies as well, so highly relevant to the social sciences.
00:22:50 Jessica Pykett
So we're looking at the kind of techniques that these organisations and activists are using to cultivate different, alternative understandings - use of storytelling, different communication devices, but also themselves using technologies, particularly immersive technologies, in new kinds of ways, in order to build movements around
00:23:10 Jessica Pykett
A particular kind of desirable or preferable future that they see.
00:23:15 Jessica Pykett
So these groups are very much interested in thinking about how certain groups are marginalised, the structural, systematic reproduction of disadvantage and inequalities, and how the collective imagination could help tackle those kinds of things and evade this capture. So how can we generate new futures through different ways of collectively imagining futures? Some of the spaces that we've been looking at include
00:23:37 Jessica Pykett
museums of the future - so a whole new and recent trend within the museum sector to move away from heritage and towards futures. So there are some examples, for instance in Brazil, the Museum of Tomorrow,
00:23:50 Jessica Pykett
where we've been looking at certain exhibits which are actually playing with generative AI. So in one exhibit that we visited, there was a generative AI interactive table where people could go and ask the AI to come up with a future utopian vision for love, or for justice, or for democracy or freedom. So that's one example.
00:24:10 Jessica Pykett
Another example is the world of kind of immersive theatre and kind of pop-up performances.
00:24:15 Jessica Pykett
So some examples, including the Museum of Austerity, which is using augmented reality with real-time performances to take participants into the future, and in particular shedding light on the political decisions that are made today and how they might affect the future of people with disabilities, for instance. So some really interesting
00:24:35 Jessica Pykett
examples within the creative sector, and these new worlds being made.
00:24:39 Susan Halford
Fantastic examples. And I think there's another one that we might bring into the conversation, Paul, which is your work with the Future Places Toolkit?
00:24:47 Paul Clarke
Well, I'm also going to pick up on some of the things that Jess has been talking about. I'm coming, as I said, from a performance and design background, so I'm also interested in creative methods and how those can involve people in futures thinking who might not be otherwise. One of the approaches that we've been taking is using augmented reality,
00:25:07 Paul Clarke
which, it has to be said, makes some of the claims
00:25:11 Paul Clarke
that Jess has been talking about, and maybe we should also be critical of.
00:25:15 Paul Clarke
But yeah, we've been using augmented reality - people probably know of augmented reality from playing Pokémon Go. We've been using it to imagine futures with communities - there have been fewer monsters involved in the futures that they've imagined. But yeah, that's probably how you've previously encountered augmented reality, which kind of layers the real world
00:25:36 Paul Clarke
with a digital layer. So yeah, with the centre, we've been using augmented reality to do some place-based futures work with different communities. Those have included Birmingham Settlement, and work that we did with Jess in the context of their Neighbourhood Futures Festival, and also we've been working in Knowle West in Bristol with communities there. And Future Places Toolkit, which you've just mentioned, Susan, is an augmented reality tool for planning
00:26:06 Paul Clarke
consultation and getting people involved in shaping the future of their places. To explain a little bit what happens: as people describe their vision for their neighbourhood, they see illustrations of what they imagine start to appear around them in 3D.
00:26:23 Paul Clarke
There's an artist who's listening in on the conversation that we're hosting and drawing what people describe in VR, so the people participating can then explore these future possibilities spatially, and they can get a sense of what they would be like in their neighbourhood and what they'd be like to live with.
00:26:43 Paul Clarke
So I guess the idea with Future Places Toolkit is to help people imagine futures they want - preferable futures, as Jess has said - and to feel a sense of agency over these, and also to consider the changes, including technologies, that would enable them or make these
00:27:01 Paul Clarke
desirable futures possible. And the aim is to open up the dominant narratives about sociodigital futures, to diversify the claims and what's envisioned or imagined. As we've been talking about, it tends to be governments and corporates, including tech companies, who use horizon scanning and futures practices.
00:27:21 Paul Clarke
We want to intervene in how futures is done and by whom - so to build futures literacies and scaffold imagination elsewhere in society, to inspire the people we collaborate with in communities to make their own futures claims. Counterclaims, if you like,
00:27:41 Paul Clarke
That intervene in the corporate stories we were talking about earlier, the corporate stories about futures that impact their digital and social lives.
00:27:47 Paul Clarke
So we're exploring - as Jess is also, I know - more participatory and inclusive ways of doing futures, particularly sociodigital futures.
00:28:01 Susan Halford
Thanks very much. Becky, I just wanted to bring you in on that same question, because I know you've done a lot of work - not exactly the same, but in this same space - about opening up other kinds of claims to the future. I wonder if you could tell us a bit about how that's done.
00:28:15 Becky Coleman
Yeah, thank you. And it picks up a lot on what Jess and Paul have been saying. So I'm really interested in how creative approaches can enable people to step back from the futures that seem inevitable - that this is just the way it is, or that this is just what's
00:28:31 Becky Coleman
happening - and to see that maybe this is just one way that things could be organised, and that there are alternative futures that can be made. So I'm thinking especially of work that I've been doing with Knowle West Media Centre, which is based in Knowle West, as Paul just mentioned a minute ago, and their mission is to make fairer futures through art,
00:28:51 Becky Coleman
tech and care. So really thinking about how arts-led approaches can frame things differently, and also, really importantly, bring different people into those kinds of imaginations and the making of futures.
00:29:04 Becky Coleman
They have so many brilliant examples of how they're doing that. You know, one is that a few years ago they were experimenting with a community chatbot. So what would a Knowle West chatbot look like, and feel like, and sound like? Whose voice should be there? What information should it include? They've also been experimenting with a feminist chatbot
00:29:27 Becky Coleman
in a similar kind of spirit - yeah, thinking about those questions around data and ownership, and where does the data travel and who does it benefit. They've also done projects around women, data and the future as well - kind of thinking about how we share our lives online and what that actually involves.
00:29:48 Becky Coleman
I think the other part of that work is really thinking about what it means, and what's involved, in doing the kinds of collaborative work that I think all of us here are doing. So,
00:30:00 Becky Coleman
working with different partners - how do you do that in a kind of sustained and meaningful way, without assuming that certain futures are necessarily, you know, inevitable, or that certain futures are desirable? So how do you open up those sorts of questions as a researcher? Who do you need to work with?
00:30:21 Becky Coleman
What approaches, what methodologies do you need to do that?
00:30:25 Susan Halford
That's a fantastic place to end, although it's really sad to end - it feels like we've just started. I think, to summarise or to sum up: you know, it is this interplay that we're interested in, between digital technologies, social life and emerging futures. What I've heard across what all of you have said is that there are some really strong, dominant repertoires of future claiming.
00:30:45 Susan Halford
And those really matter.
00:30:48 Susan Halford
But the futures that they describe are not inevitable. The sociodigital futures - as you reminded us, to come back to this interplay, this embeddedness of the social and the technical - are not a done deal. And repurposing the methods that are used by the very powerful institutional actors, with other groups, or imagining new methods,
00:31:09 Susan Halford
new ways of claiming futures - which is really strong in what all of you have said as well -
00:31:15 Susan Halford
is a really important activity, both for doing research but also for thinking about the actual futures that might emerge. So thank you so much for drawing attention to this question of the how - in particular, how sociodigital futures are claimed. If our listeners have enjoyed that aspect, concentrating on the how,
00:31:35 Susan Halford
but have also heard a little bit, in what you've all said, about what futures, and who's claiming those futures,
00:31:41 Susan Halford
then please do listen to some of the other podcasts that have been recently recorded. You'll find those on our website, along with lots of other information about our work and the research aims of the centre. But for now, huge thanks to the panel. Thank you so much for joining me, and thank you for a really wonderful and interesting discussion.
00:32:02 Narrator
To find out more about the Centre for Sociodigital Futures, visit the University of Bristol's website, where you can read about our research, follow us on social media and sign up to our mailing list. Thanks so much for listening.