This is a written interview provided by Semeli Hadjiloizou to Premiere’s Communication and Dissemination team, between December 2024 and January 2025. It is part of our efforts to bring forward critical perspectives about the culture of technology, in order to enrich the conversation about the relationship between the arts and technology.

We invited Semeli Hadjiloizou, Researcher in Data Justice and Global Ethical Futures at The Alan Turing Institute, to contribute to our blog and approach two main topics of interest: How does the concept of data justice resonate with art professionals? What is the role of the arts, as practice and as institutions, with regard to the development and deployment of emerging technologies?

We’re very happy about this collaboration, grateful to Semeli, and wish you an enjoyable read!

Elena Ananiadou: Can you give some references of projects that you and The Alan Turing Institute’s Ethics and Responsible Innovation team are involved in?

Semeli Hadjiloizou: The projects I work on range from co-developing a human rights risk and impact assessment for AI systems with the Council of Europe’s Committee on Artificial Intelligence to working with UNESCO’s Ethics of AI team to co-design the Global AI Ethics and Governance Observatory, which aims to foster more inclusive knowledge exchange in this area.

It has also been amazing to contribute to the ‘Advancing Data Justice Research and Practice’ (ADJRP) project, which has fostered incredible partnerships with civil society organisations based across the globe and has resulted in a diverse range of resources, including a three-part documentary series. Building on this project, we are currently curating this work into an openly available learning course as part of the Turing Commons. There is always such interesting and important work happening within the team, and it is a privilege to be able to contribute to these projects alongside incredible colleagues.

Elena: What do you do in the AI & Arts Interest Group? What is its scope regarding AI development and ethical considerations?

Semeli: The Turing’s AI & Arts interest group convenes a community of over 400 creative practitioners, researchers, and arts organisations across the world. As a Co-Organiser and an interdisciplinary social sciences, arts, and humanities researcher, I am particularly interested in exploring how artistic practice offers an important route to interrogating the crucial questions arising with emerging technologies, such as generative AI.

These concerns do indeed overlap with the work my team in Ethics and Responsible Innovation carries out, especially as they relate to examining how sociocultural narratives inform data- and AI-intensive practices and vice versa.

Elena: Can you explain what the concept ‘data justice’ means and what questions are brought forward by initiatives that adopt this lens?

Semeli: Data justice can be understood as a framework for examining the widespread impacts of data-intensive practices through a social justice lens.

“Although advances in data-intensive technologies, such as AI, may offer various opportunities, the data justice movement highlights how the proliferation of these systems often exacerbates longstanding structures of discrimination, inequality, and exploitation.”

This framework emphasises the inherently sociotechnical nature of data-intensive practices and technological innovation more broadly, demonstrating how these processes are inevitably influenced by human values, interests, and norms (Leslie et al., 2022).

There is an extensive body of work that captures the range of these concerns as well as ongoing initiatives and interventions that advocate for more just innovation ecosystems.

For example, the Data Workers’ Inquiry is a community-based research project run by the Distributed AI Research Institute (DAIR) that is shedding light on the lived experiences of the people who perform the crucial data work, such as labelling and sorting, that is required for the operation of AI systems. It is common for large private technology companies, which are largely based in the Global North, to outsource this labour to low-paid data workers who are routinely subjected to precarious and exploitative working conditions (see the journalistic investigations of Karen Hao). The Data Workers’ Inquiry exposes this hidden labour and how it functions within larger systems of unjust power relations, providing a platform for the data workers to share their own stories firsthand.

In this way, the data justice movement has been integral to revealing how seemingly ‘neutral’ or ‘apolitical’ technical decisions are inextricably entangled within social practices, raising a host of ethical, political, economic, material, and cultural concerns. Importantly, it provides avenues for exploring how sociotechnical practices can be proactively reshaped to cultivate more equitable and inclusive innovation ecosystems.

Elena: What practical and ethical issues do art professionals face, and how do they relate to the data justice movement?

Semeli: The relevance of the data justice movement to artistic practice and the cultural industries has been brought to the fore more recently with the rise of commercial generative AI tools, such as Stable Diffusion or DALL-E. As multimodal systems, these machine learning-based technologies are trained on massive amounts of data that are largely scraped from the internet to produce a variety of outputs, including text, image, video, voice, or music.

On the one hand, the rapid roll-out of generative AI tools has been viewed as a way to widen access to advanced technologies for cultural production and dissemination. A survey of creative professionals by Inie et al. (2023) indicates how some practitioners are optimistic about the integration of generative AI in their work as it can be used to automate repetitive tasks or offer inspiration throughout the creative process.

In our latest AI & Arts event, the sound artist and composer, Foteini Tryferopoulou, illuminated how she integrates AI tools to enhance her process of musical composition. Foteini comments on how AI systems helped to expand her creative process, explaining how for her, “AI generators function less like experimental tools and more like collaborators similar to how performers interpret a score”. Thinking about cultural heritage for example, AI-intensive technologies may offer researchers and art historians the opportunity to analyse large amounts of data to “uncover precious patterns, connections, and insights that might otherwise remain elusive” (Tiribelli et al., 2024, p. 293), opening up new possibilities for cultural heritage preservation practices.

On the other hand, the proliferation of generative AI tools is situated within a wider technological ecosystem that is dominated by a limited number of high-resourced private companies largely based in the Global North, raising a host of ethical concerns around unequal power structures.

“Linking back to the labour issues mentioned earlier, the mainstream discourse around generative AI tends to abstract the labour-intensive nature of artistic practice, reducing creativity to a form of capital that can seemingly be automated (Lee, 2022).”

This reduction disregards the fact that these technologies have been trained on the free labour of creative practitioners—whose work has been labelled and sorted by an exploited data workforce—and contributes to the displacement of creative labour while technology companies rake in millions in profit (Jiang et al., 2023). The repercussions of these extractive business models for creative practitioners not only include economic losses but also significant challenges for creative ownership, reputation, and autonomy, with historically marginalised individuals and communities disproportionately impacted (Goetze, 2024).

Paying close attention, therefore, to the intersection of the arts, technology, and society is crucial as these advanced systems become increasingly integrated in our day-to-day practices and lead to significant impacts on people’s livelihoods. At the same time, the sociocultural shifts occurring with the proliferation of generative AI tools may serve as a motivator for shaping alternative, more equitable pathways that ensure practitioners and their work are respected and valued. Leaning towards this optimistic view, I particularly like how Payal Arora’s understanding of ‘creative data justice’ (2024) frames this turning point as an exciting chance to ensure we co-design sociotechnical systems that are culturally diverse and inclusive.

Elena: The archives of cultural institutions have recently received renewed attention as large-scale datasets valuable for training AI systems. What responsibilities do cultural institutions hold within this technological landscape?

Semeli: Since cultural institutions, such as galleries, libraries, archives, and museums, play an integral role in recording and maintaining society’s artefacts across time—serving essentially as stewards of cultural knowledge—it is essential that the public interest is at the heart of these endeavours.

“This necessitates inclusive and equitable approaches to collecting, managing, curating, and providing access to heritage collections, ensuring that these practices serve the common good rather than being driven solely by current market forces.”

There have already been instances of cultural institutions, including public media bodies, outsourcing their digitalisation efforts to large technology companies (Declercq, 2024), which diminishes their control over how the data from these archives will be used within proprietary systems.

Given the overwhelming accumulation of resources and infrastructure by multinational enterprises in the technological innovation space, a key concern that comes up for the cultural sector is how it can stimulate alternative pathways for a flourishing creative technology ecosystem. This may take many forms: from bolstering independent capabilities for creative practitioners to access and build advanced technologies, and strengthening multidisciplinary collaborations for forging new ownership models (Ivanova et al., 2024), to reshaping cultural policies so that they highlight the undeniable value artistic practice provides in furthering AI innovation (Andrews & Hawcroft, 2024).

For example, the latest volume of the Future Art Ecosystems series by the Serpentine Arts Technologies department, summarised in a blog post by Alasdair Milne on our Medium, illustrates this opportunity to nourish public resources and infrastructures for the creative industries. The authors of the report outline how the cultural sector can “benefit from forming coalitions around mutual objectives, new use cases, and governance practices” (Ivanova et al., 2024, p. 86) for fostering public AI at the intersection of the arts. Artistic practice and the cultural sector have always been integral to driving creative innovation; it is essential that these contributions are foregrounded and that the sector asserts its leading role in furthering the critical arts-technology landscape.

Elena: Mainstream narratives around emerging AI technologies tend to oscillate between utopian and dystopian visions. What would a more productive approach, potentially informed by the creative arts, look like?

Semeli: It has become clear that the cycles of hype and doom around AI essentially work to achieve the same goal: to conceal the power structures at play and prevent meaningful interventions from the wider public. Instead, these popular narratives reinforce the necessity of AI for one purpose or another, whilst deftly shifting attention away from the real-world impacts data-intensive technologies are having on vulnerable individuals and communities.

Narratives of technosolutionism assume we should be applying AI to nearly every issue, failing to engage with the longer-term root causes of ongoing societal challenges. Even the term “AI” itself is applied so ambiguously that researchers such as Meredith Whittaker argue that it is essentially a marketing gimmick (The New Institute, 2024).

“Fortunately, there are a plethora of practitioners, researchers, artists, and community-led organisations that are foregrounding alternative ways of seeing and thinking about data-intensive technologies such as AI.”

Oftentimes these initiatives merge disciplines to interrogate, expose, or generate new sociotechnical relations. For example, Mimi Ọnụọha is an artist and academic whose work pries into the assumed separateness of the social and the technical through installations such as ‘The Library of Missing Datasets’ (2016) that reveals the politics of blank spots in our technological systems. Similarly, installation work by Jake Elwes uses machine learning systems to reveal discriminatory patterns of representing queer identities. Taking a broader ecosystem-level perspective, Vladan Joler and Kate Crawford collaborated to create Anatomy of an AI System, which underlines the vast interrelatedness of labour, data, and planetary resources in producing AI technologies. These are just a few examples of how interdisciplinary creative practice stimulates generative questions and discussions around the shaping of emerging technologies.

For creative practitioners or arts organisations looking to dive deeper into practical guidance for navigating the sociotechnical concerns raised by data and AI technologies, my team, Ethics and Responsible Innovation, has produced a series of Data Justice in Practice guides (2022) for impacted communities, policymakers, and developers. These provide both the conceptual foundations for exploring these questions as well as resources for putting these ideas into practice.

Semeli Hadjiloizou is a Researcher in Data Justice and Global Ethical Futures at The Alan Turing Institute, the UK’s national institute for data science and artificial intelligence (AI). As part of the Turing’s Public Policy Programme, her work centres on building ethical foundations for the design, development, and deployment of data- and AI-intensive technologies. Specifically, she is based in the Ethics and Responsible Innovation team, which works with public sector bodies across the UK and internationally to operationalise equitable approaches to AI ethics, governance, and policy. She works on a variety of projects that promote the equitable, inclusive, and responsible design, development, and deployment of emerging technological innovation.

References

Andrews, H. & Hawcroft, A. (2024). Articulating arts-led AI: artists and technological development in cultural policy. European Journal of Cultural Management and Policy. https://doi.org/10.3389/ejcmp.2024.12820

Arora, P. (2024). Creative data justice: A decolonial and indigenous framework to assess creativity and artificial intelligence. Information, Communication & Society. https://doi.org/10.1080/1369118X.2024.2420041

Declercq, B. (2024). Neck-deep in digital oil? Public broadcaster’s archives as AI training datasets. FIAT/IFTA. https://fiatifta.org/broadcast-archives-as-datasets

Goetze, T. S. (2024). AI art is theft: Labour, extraction, and exploitation: Or, on the dangers of stochastic pollocks. The 2024 ACM Conference on Fairness, Accountability, and Transparency, 186–196. https://doi.org/10.1145/3630106.3658898

Inie, N., Falk, J., & Tanimoto, S. (2023). Designing participatory AI: Creative professionals’ worries and expectations about generative AI. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA ’23). Association for Computing Machinery, Article 82, 1–8. https://doi.org/10.1145/3544549.3585657

Ivanova, V., Jäger, E., Milne, A., & Zhang, G. Z. (2024). Future art ecosystems: Art x public AI. (Report No. 4). Serpentine Galleries. https://reader.futureartecosystems.org/briefing/fae4

Jiang, H., Brown, L., Cheng, J., Khan, M., Gupta, A., Workman, D., Hanna, A., Flowers, J., & Gebru, T. (2023). AI art and its impact on artists. Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society (AIES ’23). Association for Computing Machinery, 363–374. https://doi.org/10.1145/3600211.3604681

Lee, H. K. (2022). Rethinking creativity: creative industries, AI and everyday creativity. Media, Culture & Society, 44(3), 601–612. https://doi.org/10.1177/01634437221077009

Leslie, D., Katell, M., Aitken, M., Singh, J., Briggs, M., Powell, R., Rincón, C., Chengeta, T., Birhane, A., Perini, A., Jayadeva, S., & Mazumder, A. (2022). Advancing data justice research and practice: An integrated literature review. The Alan Turing Institute in collaboration with The Global Partnership on AI. https://doi.org/10.5281/zenodo.6408304

Leslie, D., Katell, M., Aitken, M., Singh, J., Briggs, M., Powell, R., Rincón, C., Perini, A., Jayadeva, S., & Burr, C. (2022). Data justice in practice: A guide for developers. http://dx.doi.org/10.2139/ssrn.4080058

Leslie, D., Katell, M., Aitken, M., Singh, J., Briggs, M., Powell, R., Rincón, C., Perini, A., & Jayadeva, S. (2022). Data justice in practice: A guide for impacted communities. http://dx.doi.org/10.2139/ssrn.4080046

Leslie, D., Katell, M., Aitken, M., Singh, J., Briggs, M., Powell, R., Rincón, C., Perini, A., & Jayadeva, S. (2022). Data justice in practice: A guide for policymakers. http://dx.doi.org/10.2139/ssrn.4080050

The New Institute. (2024, June 25). Meredith Whittaker on AI, Tech Power and Surveillance [Video]. YouTube. https://www.youtube.com/watch?v=kHL_Z0EERlM&t=21s

Tiribelli, S., Pansoni, S., Frontoni, E., & Giovanola, B. (2024). Ethics of artificial intelligence for cultural heritage: Opportunities and challenges. IEEE Transactions on Technology and Society, 5(3), 293–305. https://doi.org/10.1109/TTS.2024.3432407