Alternative configurations
This project set out to explore how ethical AI is being figured in healthcare, with a particular focus on the UK, and with the aim of opening up possibilities for alternative configurations. Such a sentiment is at the heart of the adopted feminist Science and Technology Studies (STS) perspective, which seeks to unsettle dominant narratives and voices and to open up possibilities for new ways of understanding, designing and living with technology.
As the Covid-19 pandemic unfolded, putting healthcare professionals and other stakeholders, along with the rest of us, under immense pressure, an unexpected opportunity arose to think more carefully about questions such as: where do we find these alternative figurations and stories, and what is their nature? Our research showed that stories, even dominant ones, are messy and full of contradictions, yet can still travel far. With the help of professional designers, we therefore developed a workshop that would allow us to experiment with this messiness and openness, and to recognise that these stories do not simply pre-exist but are performative: they come into being through our very own methodological explorations.
For the project’s workshop, we collaborated with professional designers, Dr. Joe Lindley and Hayley Alter, who used Design Research methods to create and pilot a new methodological research study which aimed to:
- explore and reflect on the stories that figure and shape ethical AI in healthcare
- invite participants to co-create alternative ones
- test a new design methodology for reflecting on, capturing and creating new AI stories.
The study comprised two phases. Both were designed to be run remotely and required no preparation from participants.
Phase 1
Phase 1 invited participants to write, with our guidance, their own short fictional story or stories about ethical AI in healthcare. These insights would help us understand more about (a) the messiness and complexity of the stories that shape ethical AI, and (b) the work that is needed to imagine and create alternative ones.
Phase 2
Phase 2 invited participants to a scheduled, facilitated online workshop. This was a playful workshop in which participants joined one of two teams, a 'generator' or a 'discriminator', to imaginatively approximate the dialogic workings of an AI programme in a healthcare setting. Through the interplay between these two teams, and using the stories co-created in Phase 1 as our basis, we deliberately moved away from questions of 'what is an ethical AI?' by inviting participants to discuss and reflect on what an ethical world is, and how we can build, imagine and reconfigure ethical technological worlds within a healthcare context.
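The generator/discriminator pairing borrows its structure from generative adversarial networks (GANs), in which one component proposes candidates and another critiques them. As a loose, hypothetical sketch of the dynamic the two teams were role-playing (the story fragments and the critique rule below are invented purely for illustration, not drawn from the workshop material):

```python
import random

# Hypothetical story fragments a 'generator' team might propose.
FRAGMENTS = [
    "an AI triage tool that defers to the patient's own account",
    "a diagnostic model whose uncertainty is always shown to clinicians",
    "a scheduling system co-designed with hospital cleaners",
    "a black-box algorithm optimised purely for throughput",
]

def discriminate(proposal: str) -> bool:
    """Invented critique rule for the 'discriminator' team: accept a
    proposal only if it names the people affected, not just the tech."""
    people = ("patient", "clinician", "cleaner", "nurse")
    return any(word in proposal for word in people)

def workshop_round(rng: random.Random) -> str:
    """One round of interplay: propose, critique, retry until accepted."""
    while True:
        proposal = rng.choice(FRAGMENTS)
        if discriminate(proposal):
            return proposal

print(workshop_round(random.Random(0)))
```

In the actual workshop, of course, both roles were played by people and the 'rules' emerged through discussion; the sketch only shows the generate-critique loop the teams were approximating.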
Alternative visualisations
Some of the discussions and reflections from both phases were then captured as illustrations. Their aim is to act as methodological instruments that can be freely shared and disseminated, and hence 'travel far and wide', as a way to reflect on and challenge the dominant AI stories and figurations in healthcare and beyond, and as a further invitation to imagine alternative ones.
To find out more about the process and the data collection, click here [Participant Information Sheet]. For any questions, please contact the PI at xaroula.kerasidou[at]lancaster.ac.uk.