Curating Data: Infrastructures of Control and Affect … and Possible Beyonds

Magda Tyzlik-Carver

My body is a structure that carries my feelings. I have nothing else to do this job. But in recent months, my feelings have not been able to cope with the task of holding my body. The tension that has taken over my body since the start of the Covid pandemic changes in intensity depending on too many factors. Sometimes these are obvious, but most often the task of identifying them all is laborious, time consuming and simply boring, and risks losing sight of life as it happens. The feeling of being split, divided, extended and spread across networks, Zoom calls and business still unfinished is overwhelming. I take vitamin C, D and echinacea; I make ferments to grow microbes that nourish my body. And I am increasingly attached to my computer, which has become a window onto life that happens somewhere else, while becoming my reality.

Yet, of course, my life also happens here, where I sit. I keep some of my interactions away from the computer, but the number is falling. Even now, in writing this text, I am capturing my feelings with my computer. I have situated my body in time and space while mapping feelings and attaching them here: selecting sensations, archiving moods, displaying moments. Curating. And in the process I remain connected and react to tweets, to WhatsApp or Signal messages from friends, looking at Instagram pictures via Facebook and so on. I choose tools and apps to make connections or simply to stay connected, weaving networks, engaging with systems and becoming part of them. I am sensing how it feels to become posthuman, a body of data and affect.

To suggest that (digital) life today revolves around curating might be too much of a reduction. Life, even the digital one, is so much more complex than a process of selection and arranging things. It is more than an archive or a database. Such a contraction, however, allows us to consider two important elements in contemporary curating and information networks: control and affect. On the one hand, curating is a method of exerting some form of control in a system that requires arranging relations into directories. In the process of generating data and creating folders, daily curatorial decisions include naming files, deciding on their format (.txt, .jpg, .png, .avi, etc.), organising them and choosing storage devices, and so on. Users collect and archive data routinely, and in the process, their lives too become data to be managed and organised. On the other hand, while network connections allow expansion beyond one’s own computer or mobile phone, the volume, velocity and veracity of links made, files exchanged and data captured is such that trying to make sense of these data is a task optimised predominantly for machines, with platform profit as the main objective.[1] It is simply not possible for users to make sense of big data, especially if such sense-making is based on models that require ‘the capture and manipulation of people’s activities and environments’.[2] And as curating becomes increasingly posthuman, it takes place at different levels.[3] Curating has become an organised form of control executed by algorithms and made possible by big data, while also directly affecting people whose lives have been incorporated into digital infrastructures that maintain the system, a necessary element for the profitable performance of Facebook, Google, Amazon, Microsoft and Apple, to name only the biggest five.

No wonder that today sense-making has become a project that is engineered by computer science. For many users of digital devices, feeling the world is synonymous with checking social media status. The main form of interaction with the distributed information system that is the worldwide web is based on algorithmic models for news delivery and purchasing products. This is a structural transformation. But affect and emotions are as much part of this system as algorithms and data. News feeds and personalised results offered when logging in online are managed by optimisation models designed to ‘treat the world not as a static place to be known, but as one to sense and co-create’, which too often results in ‘social risks and harms such as social sorting, mass manipulation, asymmetrical concentration of resources, majority dominance, and minority erasure’.[4] The poverty of possibilities that computer science models offer is currently displayed online and made visible as bias in AI systems and image data sets, polarisation of opinions that feed and display hate while dividing communities and people, sensationalist conspiratorial narratives that feed the click-bait economy, and many more effects that play out algorithmic scenarios and forms of co-creation of the new world. Making sense is about truth games played by computer science, driven by solutionism without ever asking ‘What is it for?’, ‘Who will it harm?’, ‘Who does it serve?’, ‘Should it be introduced into the world?’.

These structural transformations of life take place through processes of digitisation, digitalisation, datafication, AI and Big Data. And while predominantly driven by poor computer science in the service of a market logic that wants more clicks and even more data with which to play and which to model for growth that profits a few, we are reminded to think about transitional forms and structural transformations. For Lauren Berlant, structure is ‘that which organizes transformation’ and infrastructures maintain it.[5] Infrastructure is ‘the living mediation of what organizes life: the lifeworld of structure binds us to the world in movement and keeps the world practically bound to itself’.[6] Infrastructures are living forms for relating and connecting to other humans and nonhumans. That’s where living happens, where lives collide and encounter harm and pleasure, damage and nourishment. And today, in the time of pandemic and a fight for power where people are continuously counted, the daily practice of being alive is becoming a practice of being online, en masse, yet alone, often at home, or migrating, but always counted.[7] It is the experience of digital life on social-media platforms where there is no community but algorithms released by the union of computer science with the market.

Figure 1. Maintaining Indeterminacy, Loren Britton (2020), image courtesy of the artist.

This damaged (digital) life needs other imaginaries that can take into account the terms for transformation with digital structural forms that hold the conditions for ‘infrastructures of sociality’.[8] Different ways of knowing, and of not knowing, need to be part of this. We need to imagine what could be possible, how to draw connections, and how to be in touch, yet with less certainty. These are the questions that Loren Britton and Helen Pritchard ask in their call for:

multiple CS practices: computer science, chance and scandal, committed survival, care and shelter, chocolate and strawberries, cushions and support, collective strategies, chancer scientist, cohabitation and sharing, conditions and structures, choice and scandal, careful slug, collective scandal, crip studies, composed silliness, compulsory sleep, cancelled stories, crying sabotage, carceral states, cut and scale, considerable scaffolding, collapsing species, collective suffering, companion story.[9]

Britton and Pritchard recognise that computer science is a practice that can be queered, that can be challenged, and that it does not need to be in the service of world destructions. With their challenge to Computer Science, they explain:

What we are making space for with our CS figure is the destabilization of what CS is defined and known to be. CS has been debated historically and can be debated still, but who gets to determine what comes to matter is still very much dependent on moments of translation, moments in which that which might not be recognized as CS becomes so. Instead of asking who gets to have a voice, we ask: Which practices get to produce knowledge?[10]

These questions and their dream of ‘computer science otherwise’ mobilise desires for another CS beyond the field of informatics and computation. [Fig.1] Reclaiming CS as a figure and not just a discipline resists its normalisation and turns it into an active proposal, with indeterminacy as the value that defines the possible, and unknowing as a process of situating and recognising what one knows while choosing to move away from it and to start anew.[11]

The point of not knowing makes space to account for knowledges that have been excluded by a Eurocentric tradition of science defined by a colonial mindset. The Indigenous Protocols and Artificial Intelligence Working Group offers another challenge to computer science and its impotent imaginaries. Situated within and speaking from the position of many different Indigenous concerns and communities in Aotearoa, Australia, North America and the Pacific, the group asks a series of questions that situate Indigenous knowledges and their communities as directly engaging in ‘conceptual and practical approaches to building the next generation of A.I. systems’.[12] This is a project not undertaken for diversity’s sake, but rather because of the belief ‘that Indigenous epistemologies are much better at respectfully accommodating the non-human’.[13]

One such epistemology is Lakota ethical protocols that look seven generations ahead, thus already establishing ethical relations with the future from a situated present. As Suzanne Kite explains, her research into Lakota protocols is based in the ontological status of stones for the Lakota people.[14] Kite, herself a member of the Lakota Nation, works with Lakota stone epistemologies as a framework that also defines relations with raw materials used in building computers, by asking at what point materials, objects or nonhumans are given respect, and how ethical relationships with raw materials are established when building computers of all kinds. Kite explains how Lakota knowledge is not static, and so working with that knowledge is a practice of change and of identifying shifting networks of relations that have to be accounted for. As she says:

The effects our decisions – and technologies – have on the world can help us identify the stakeholders in what is being made and how it is used. Stakeholders in Indigenous communities are identified as our extended circle of relations, while stakeholders in technology companies are identified as the board of directors, shareholders, employees and consumers. It is necessary to identify how all those – both human and nonhuman – are affected by what is made, and to take responsibility for those it affects.[15]

This possibility to account for change by accounting for relations with humans and nonhumans is part of the Lakota responsibility that manifests as a commitment to do things the Good Way.[16] Protocols offer a specific guide and Kite, in collaboration with Corey Stover and Melita Stover Janis, and with notes from Scott Benesiinaabandan, gives an example of how to build a physical computing device in a Lakota way. But such a guide extends to all elements of the AI system, from training sets to interfaces, because ethical AI is not possible if any one of its elements is extracted through exploitation.[17]

CS otherwise and Indigenous Protocols are examples of how structural transformations can be imagined, desired and made. The two are epistemological practices that build on the histories and embedded knowledges of people in queer and Indigenous communities in art and science. And they are also a call to extend these as living experiences that inspire other imaginaries for practices and knowledges, creating infrastructures of computer science and AI that would otherwise remain inflexible and harmful.

The practice of curating data is also an epistemological practice that needs interventions to consider futures but also to account for the past. In curating data, knowledge results from engaging with data, and establishing material relations with and between them. Decisions about how to process, organise and name data can be identified, and the practice itself can be made accountable. When organising data such as images, videos and texts on a personal computer, this seems like a straightforward task, since one simply has to take the time and effort to organise these files into directories. When curating bigger collections of data and other digital things, sustaining accountability becomes a task of sharing responsibilities not just for the future of a collection but also for its origins.

This can be done by asking: where do data come from? How are they connected to the communities that are their source? How is the data set or database created? What is the reason for curating these data, and how will this curating take place? It is true that similar questions could also be asked of colonial objects taken from communities and locations and moved into imperial museums to narrate the story of colonial empire. Here too we can ask about the conditions of this takeover, and how these histories are part of displaying objects. Responding to such questions accounts for the fact that data, too, come from somewhere, and that as well as being part of a database or a set, data start with bodies, human or not, and index relations between them in the most abstract way. The task in curating data is to reclaim their traceability, and to account for their lineage.
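At the scale of a personal collection, this accounting can be made concrete. The following Python sketch (all names and the file layout are hypothetical, chosen only for illustration) sorts files into per-format folders while writing a small provenance log that records where each file came from and when it was moved: a minimal gesture towards the traceability and lineage described above, not a definitive tool.

```python
import json
from datetime import datetime, timezone
from pathlib import Path


def curate(source: Path, archive: Path) -> list[dict]:
    """Sort files from `source` into per-format folders under `archive`,
    keeping a provenance record for each file alongside the collection."""
    provenance = []
    for item in sorted(source.iterdir()):
        if not item.is_file():
            continue
        # Derive a folder name from the file format (.txt, .jpg, ...).
        fmt = item.suffix.lstrip(".").lower() or "unknown"
        dest_dir = archive / fmt
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / item.name
        item.rename(dest)
        # Record the file's origin and the moment of the curatorial decision.
        provenance.append({
            "file": item.name,
            "origin": str(item),
            "stored_as": str(dest),
            "moved_at": datetime.now(timezone.utc).isoformat(),
        })
    # The log lives inside the archive itself, so the collection
    # carries its own account of where its contents came from.
    (archive / "provenance.json").write_text(json.dumps(provenance, indent=2))
    return provenance
```

The point of the sketch is the second file it writes: the archive does not only hold the data but also a record of the decisions that produced it, which is what makes the practice accountable.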

Figure 2. The Intimate Earthquake Archive, Sissel Marie Tonn (2016-ongoing). Photo by Peter Cox, Van Abbemuseum, 2016, NL.

Curating data is a material practice of embedding data and also of exploring how it could be embodied. In other words, curating data is about how data becomes part of the world. The interactive installation The Intimate Earthquake Archive (2016-ongoing) by artist Sissel Marie Tonn, developed in collaboration with sound artist Jonathan Reus, engages these questions in an attempt to understand the effects of long-term gas mining in the Groningen area of the Netherlands. [Fig.2] The installation[18] and related ongoing research project involve data collected during man-made earthquakes that have occurred in the area over the last thirty-four years. The core sample data includes sand and soil tests and digital records of seismic activity stored by the Dutch Meteorological Institute.

The artists put data sets together with stories of how earthquakes are experienced by inhabitants of this area, some of whom regularly report waking up seconds before the earthquake, or whose personal perceptions of the place are affected by continuous anxiety about the possibility of an earthquake occurring at any time. By connecting digital data of seismic activities with the experience of the sensing body, Tonn and Reus produce a multimedia installation that offers a very different experience of this geo-located phenomenon from one that could be extracted from meteorological records with traditional forms of data analytics and visualisation. [Fig.3]

Figure 3. The Intimate Earthquake Archive, Sissel Marie Tonn (2016-ongoing), Photo by Stella Dekker, Grasnapolsky Festival, 2019, Groningen, NL. Courtesy of the artist.

Earthquake data is mediated into sonic vibrations that can be sensed or felt with the use of a wearable interface – a body vest with embedded surface/skin and bone conduction transducers that produce tremors on the surface of the body, mirroring the way the seismic waves move across the land. Datasets are manipulated into sonic vibrations that encourage deep listening with and within the body while moving through the installation area. According to the artist, this is a testing ground for attunement to future living in the Anthropocene. As such, the project offers a practical exploration of transitions that can be tested, and where different kinds of data can perform together and can be sensed.

In the computational regime, resources such as minerals and data are extracted, bodies moved, geological phenomena undergone, social realities constructed. In the environment simulated by The Intimate Earthquake Archive, not only does the body not end with the skin but it becomes part of the archive, extending it and transforming towards a practice of sensing, accessing affect, constructing affective data bodies. The archive becomes an environment for feeling and being moved by data that themselves are residues of movements of the ground. Here, data is curated into shifting, vibrating and sonified relations that roam fastened onto the bodies of visitors. And so here data can be felt not as emotion, but as a sensation in the body, a feeling of the body being moved. The artwork’s structure accommodates motion as an infrastructure for transforming data back into what can be sensed and located in the body. [Fig.4]

Figure 4. The Intimate Earthquake Archive, Sissel Marie Tonn (2016-ongoing), Photo by Stella Dekker, Grasnapolsky Festival, 2019, Groningen, NL. Courtesy of the artist.

The overwhelming grip of corporations over networked infrastructures, driven by a desire for exponential growth, disregarding injustice and distributing inequity through the system with the help of computer science, displays a colonising logic.[19] Such logic harms already vulnerable groups, communities and their environments, and there is an urgent need to look for other models and visions of life and its organisation online and off. Curating data, as a process of feeling with data rather than optimising data for profit, opens new political possibilities for enacting common becoming with CS as computer science, chance and scandal, committed survival, care and shelter, chocolate and strawberries, cushions and support, collective strategies, and many more. Britton and Pritchard, Berlant, Kite, the Indigenous AI Working Group, and Tonn and Reus, among many others, challenge and direct us towards our sensing in common. Curating data is common and it needs to be accounted for as a material practice and intervention into computational realities that can be otherwise, while also demanding responsibility (from ourselves and others and especially CS) to do things the good way.

[1] These are three of the five Vs that define Big Data. For an overview of the features of Big Data, see ‘5 V’s of Big Data’, GeeksforGeeks (blog), 10 January 2019. For critical questions see danah boyd and Kate Crawford, ‘Critical Questions for Big Data’, Information, Communication & Society 15, no. 5 (2012): 662–79.

[2] Seda Gürses, Rebekah Overdorf and Ero Balsa, ‘POTs: Protective Optimization Technologies’, Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 27 January 2020, 177–88.

[3] Magdalena Tyźlik-Carver, ‘Curating in/as Common/s. Posthuman Curating and Computational Cultures’, PhD thesis (Aarhus: Aarhus University, 2016); Magdalena Tyźlik-Carver, ‘Posthuman Curating and Its Biopolitical Executions: The Case of Curating Content’, in Executing Practices, ed. Helen Pritchard, Eric Snodgrass and Magdalena Tyźlik-Carver (London: Open Humanities Press, 2018), 171–89.

[4] Gürses, Overdorf and Balsa, ‘POTs’.

[5] Lauren Berlant, ‘The Commons: Infrastructures for Troubling Times’, Environment and Planning D: Society and Space 34, no. 3 (June 2016): 394. Italics in the original.

[6] Berlant, ‘The Commons’, 394.

[7] As I am writing this, the counting of ballots in the US 2020 presidential election is still going on, and there are already calls to stop counting in some of the states.

[8] Berlant, ‘The Commons’, 394.

[9] Loren Britton and Helen Pritchard, ‘For CS’, IX Interactions (blog), July 2020.

[10] Britton and Pritchard, ‘For CS’.

[11] Ibid.

[12] ‘INDIGENOUS AI’, INDIGENOUS AI (blog) (accessed 11 February 2020); Jason Edward Lewis et al., ‘Making Kin with the Machines’, 16 July 2018.

[13] Lewis et al., ‘Making Kin with the Machines’.

[14] Suzanne Kite, ‘How to Build Anything Ethically’, in Indigenous Protocol and Artificial Intelligence Position Paper (Honolulu: The Initiative for Indigenous Futures and the Canadian Institute for Advanced Research, 2020), 75–85.

[15] Ibid., 76.

[16] Ibid.

[17] Ibid.

[18] The installation received an honorary mention at Ars Electronica in 2020.

[19] Recent scholarly work on these matters includes Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham: Duke University Press Books, 2015); Safiya Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: NYU Press, 2018); Mél Hogan, ‘Big Data Ecologies’, Ephemera: Theory & Politics in Organization 18, no. 3, Landscapes of Political Action (August 2018): 631–57; Ariella Aïsha Azoulay, Potential History: Unlearning Imperialism (London: Verso, 2019); Ruha Benjamin, Race after Technology: Abolitionist Tools for the New Jim Code (Medford, MA: Polity, 2019); Catherine D’Ignazio and Lauren F. Klein, Data Feminism, Ideas Series (Cambridge, Mass.: The MIT Press, 2020).

Magda Tyźlik-Carver

Magda Tyźlik-Carver is Assistant Professor in the Department of Digital Design and Information Studies at the School of Communication and Culture at Aarhus University. She is also an independent curator. Her recently curated exhibitions and events include ScreenShots: Desire and Automated Image (2019), Movement Code Notation (2018), Corrupting Data (2017), Ghost Factory: performative exhibition with humans and machines (2015) and Common Practice (2010, 2013). She is co-editor (with Helen Pritchard and Eric Snodgrass) of Executing Practices (2018), a collection of essays by artists, programmers and theorists engaging in a critical intervention into the broad concept of execution in software. She is a member of the Critical Software Thing group and of the editorial board for the Data Browser series. She is also Associate Researcher with the Centre for the Study of the Networked Image at London South Bank University.