In this seven-chapter book, Loukissas sets out to demystify the notion of digital universalism by emphasizing data and their local connections. The guiding question of the book is: How do local conditions matter for understanding data in everyday practice? Simply put, All Data Are Local is not only the title of the book but also the central claim that guides it, and one of its most important, if not the most important, take-away messages.
Data universalism: the ideology that leads us to falsely believe that, despite our varying circumstances, “once online, all users could be granted the same agencies on a single network, all differences could dissolve, and everyone could be treated alike” (p.10, citing Anita Chan). This perspective might institute a new form of colonialism, in which practitioners at the periphery have to conform to the standards and expectations of a dominant technological culture (p.10).
In the introduction, Loukissas proposes to alter the view on data sets and instead discuss data settings. The reason for this is that the term data set evokes the notion of “something discrete, complete and readily portable”, according to the author. “But this is not the case. I contend that we must rethink our terms and habits around public data by learning to analyse data settings instead of data sets” (p.2). Loukissas further states that “too often we attempt to use a given data set as a complete work, such as a book, rather than viewing it as an index to something greater” (p.3). Data, therefore, are indexes of local knowledge. In chapter 2, Loukissas further discusses data, describing that “in common parlance, the term data can be used to mean secondary, digital representations of objects that hold scientific and cultural import. But data can also create an ontological ‘looping effect’ whereby they help to shape practices and institutions that create them” (p.52).
Overall, the book has six theoretical principles, four of which are exemplified by reference to concrete case studies:
- All data are local
- Arnold Arboretum: data have complex attachments to the world
- DPLA: data are collected from heterogeneous sources, with local attachments
- NewsScape: data and algorithms are entangled
- Zillow: Interfaces recontextualize data
- Data are indexes to local knowledge
Based on the four case studies as well as additional practical examples in chapter 6, Loukissas provides a set of practical guidelines and implications to follow (see below).
Loukissas devotes the first chapter to discussing data: the terminology and language around data, as well as ontological, epistemological and theoretical implications and approaches. Among them are STS scholar Latour’s notions of “immutable mobiles” (with which Loukissas disagrees) and inscription, as well as the Critical Data Studies (CDS) approach of “data assemblages”.
In addition, Loukissas discusses Big Data and presents contrasting arguments. According to the author, “the ideology of Big Data has infiltrated workaday practices with datasets that measure only in the tens of millions of entries. Big Data is not anxiety-inducing big, as big data also is represented by small, discrete signal, represented as 0 and 1s in computers” (p.16). This perspective on Big Data is connected to Loukissas’s disagreement with the opinion that anything can become data. In this particular context, Borgman is cited: anything can become data “if it is taken up as evidence in an argument, including text, photographs and even traces of pigment from an archaeological field site. Making data means bringing a subject into a preexisting system, defined by durable conditions of data collection as well as storage, analysis and dissemination. Aspects of the original subject are inevitably lost in translation” (p.17). Instead, Loukissas presents his own perspective on data, which he views as plural, small, operational and material.
After setting the stage for the understanding of data, the central point of the book is discussed: what it means to be local. According to the author, the term local has a long tradition in the social sciences and describes that “knowledge practices [are] grounded in particular places, usually those inhabited by small, indigenous, marginal or non-Western cultures” (p.19). The deep connection to knowledge becomes even more apparent when looking at local knowledge as a way to notice and acknowledge the different meaning-making practices of different communities. This notion of knowledge practices seems similar to the widely (often synonymously) used term “situated”. However, while situated “is sometimes interpreted as being about social and material conditions exclusively, local puts more weight on the relevance of place” (p.19). And although place is emphasised in local, local is more than place, “as local is contingent on experience, defined by meaning and susceptible to changing social designations […] local transcends geolocations” (p.20). Place comes up again in chapter 2, where it is defined as “an institutionally defined framework with social, technological and spatial dimensions, in which data are created, displayed and/or managed, and that reciprocally, is shaped by those practices” (p.52).
In the fourth chapter, Loukissas describes the tension between global and local: “Understanding the relativity of the local also helps us to understand the ways the local is connected to the global. The local creates data, but the data produced may travel globally, running the risk that local origins become obscured when data is examined out of context. It is these contexts or local data settings that are examined to explicate the remaining principles of All data are local” (Tupling, 2020, p.1).
Here, Big Data comes into play, as “seemingly impersonal, large-scale data sets are also local” (p.21) and therefore “we should talk about all data sets in terms of their data settings” (p.1). Furthermore, the local coexists with the global, “as data do not serve exclusively local needs, however, there is no global experience of data, only an expanding variety of local encounters. Data travel widely, but wherever they go, that’s where data are. For even when data escape their origins, they are always encountered within other significant local settings” (p.22). While Loukissas highlights the necessity of considering the local perspective when talking about data, he also advises caution, as local does not necessarily equal good. It might also “mean exclusionary, narrow, or even oppressive” (p.22) and can lead to so-called filter bubbles.
The second chapter focuses on the “complex attachments to the world” of data, exemplified by a case study of a data setting in the Arnold Arboretum at Harvard University. The chapter demonstrates the complex local attachments of data through diverse visualisations. By creating these visualisations, Loukissas shows how different visualisations can highlight variations in data, while at the same time challenging conventions in data visualisation. Instead of removing anomalies and glitches (in short, cleaning and therefore obscuring data), data visualisations should embrace these “artefacts”. In this chapter it becomes explicit what Loukissas meant in the introduction when he wrote that “in engaging with visualisations, the reader should be ready (as they must be with any evidence) to do some of their interpretative work. Visualisations are also texts” (p.8).
The third chapter focuses on the principle “data are collected from heterogeneous sources, with local attachments” and looks at the Digital Public Library of America. In this chapter, data are recognized as cultural markers of data collection practices, differing across eras and between institutions (Tupling, 2020). Loukissas shows that “even displaced and agglomerated data retain traces of their origins, embedded in classifications, schemata, constraints, errors, absences and rituals that resist simple translation and normalization” (p.53).
Following this, the fourth chapter looks at NewsScape as a case study of the entanglement of data and algorithms.
While Loukissas acknowledges the power of algorithms, he simultaneously opposes common characteristics ascribed to algorithms, such as being anonymous and opaque, “concealing the local conditions in which algorithms are produced” (Tupling, 2020, p.2). Furthermore, “algorithms are not processes, but artefacts, created by humans, whose decision making becomes ‘delegated’ or embedded into algorithms” (Tupling, 2020, p.2). Loukissas also reminds us that algorithms are a form of ‘human reading’ (p.115) and are local, since they rely on data to function. In addition, algorithms and data are symbiotic: “together, data structures and algorithms are two halves of the ontology of the world according to the computer” (p.104).
Drawing on some of the theoretical approaches introduced in chapter 1, Loukissas demonstrates that neither realist nor constructivist approaches have clear grounds to explain algorithms, databases and reality. Instead, he proposes network perspectives, such as ANT, as an alternative entry point, in which reality is viewed neither as the beginning (input) nor the end (output) of a process. In a network perspective, elements such as reality, data, databases and algorithms instead coevolve.
The fifth chapter focuses on interfaces and how they recontextualize data: “Users engage most often with the interface layer formed by tightly curated user experiences, to shield the audience from the messy sociotechnical conditions of data collection as well as the implications of its use. Interfaces delocalise existing data sets, remove all traces of the places in which they are made, managed and otherwise put to use. Then they present uprooted data within new contexts: unimpeded by the details of data production, unburdened by ethical quandaries that might accompany their use, free from concerns about their unintended consequences. Such interfaces are known by user experience designers as being ‘frictionless’” (p.125). “Interfaces establish the subject positions that users of data are expected to adopt” (p.126).
As data need interpretation, the idea of letting data speak for themselves is challenged by interfaces, in particular frictionless ones, which recontextualise data; users then make decisions that are guided by how the data are presented to them through the interface.
In the sixth chapter, Loukissas draws together the six principles, presents six (methodological) implications, and stresses the importance of viewing data as cultural artefacts that are entangled with local settings and are not simply given. Furthermore, he highlights the necessity of thinking critically about the data we encounter, of reflecting on the otherwise invisible attachments, values, absences and biases in data. One approach to making the invisible visible is a comparative approach to data, in which local data settings are compared to another locality, as the local condition is most productively understood not in relation to some imagined universal but instead relative to another locality (p.8).
Loukissas furthermore reminds us that “we are all creating data just by living. But data aren’t simply a by-product of life. They are deliberately designed, as much as any visualization, in order to represent selected events or experiences” (p.169).
In the last, summarizing chapter, Loukissas unites the principles of the book and sets out to create a practical and applicable ethics of data. He ends by outlining five steps towards achieving this: read, inquire, represent, unfold and contextualize.
The book by Loukissas is an excellent first read for anyone interested in the social sciences, critical approaches and digital data. It covers crucial theoretical points and approaches without becoming too “dry” to read. On the contrary, Loukissas provides vivid case studies to support his points, along with beautifully crafted visualizations. In addition, the literature list that comes with the book covers many of the seminal texts in, e.g., Critical Data Studies and is a good starting point for finding additional literature. For me personally, the book did not deliver substantially new or revolutionary insights; however, I have to point out that I have been reading quite a bit in the area of critical data studies and science and technology studies. While I did recognize some of the arguments and literature, I applaud Loukissas for writing a book that makes this topic approachable for the broader public as well. Some of the points he made were definitely new to me, e.g. using the data setting instead of the data set, and the other theoretical and methodological implications he presented. Nevertheless, there are a couple of questions I am left with after finishing the book, parts I disagree with, and other points I want to highlight.
1. Local or indexes of local knowledge
Loukissas talks about data being local, but at other points in the book he writes that data should be viewed as indexes to local knowledge. To me, there is a difference between the claim that all data are local and the claim that data are indexes to local knowledge. While reading the book, the distinction did not become clear to me.
2. Practical applicability
First of all, I do understand that the book Loukissas has written is not meant as a “cookbook” or “roadmap” for how to practically implement the claims and implications he describes. Nevertheless, describing data settings is a cumbersome approach to using data, which would defy the positive aspects of using data in certain contexts. I do not believe that it is possible to actually put any of this to use on a larger scale. On a small scale, in specific workshops or research projects, definitely yes, but in general? What exactly is the impact of the book, beyond being thought-provoking? It does not challenge questions of accessibility TO data, of data interpretation, or of the relative powerlessness of users.
3. Data are interpreted by frictionless interfaces that make decisions “for” the user
Specifically in the chapter about interfaces, Loukissas highlights the impact of design on users. How would designers design, e.g., a platform that could cause friction? Is that feasible? While Loukissas ascribes some of the responsibility to designers, Williamson (2017), for example, ascribes the responsibility in relation to data and ethics to programmers.
4. Lost in translation
Throughout the book, Loukissas problematizes data and algorithms and their local origins and connections. He views data as text and favours reading them in an interpretative context, taking, e.g., their historical attachments into consideration. However, I wonder whether that is necessary. When writing scientific articles or books (language in general!), there is also a translation step involved, and attachments are lost. I am not sure how that has been problematized and to what extent historical attachments are taken into consideration. Looking at a book’s content from a more Foucauldian perspective, it contains concepts and notions that might have been appropriate in a specific era, place and so forth. So I suppose that whenever there is a dissonance between what is accepted now and what appears in a book, that might be when we take other attachments into consideration, a kind of temporal comparative approach. However, even though we know OF the attachments, we still interpret the book from our current point of view.
5. Algorithms and Data
I would agree, to a certain extent, with Loukissas and his view that algorithms are artefacts created by humans. While this might be true for algorithms written by a person, I doubt that it is applicable to machine learning. Yes, these systems are also programmed by a human, but they are also designed to train independently, such that not even their initial programmers understand what is going on. In addition, Loukissas writes that “data aren’t simply a by-product of life. They are deliberately designed, as much as any visualization, in order to represent selected events or experiences” (p.169), and again, I would say that this might still be applicable today, but (to entertain a weird idea) it might not be true in the future anymore. Simply viewing data as “plural, small, operational and material” does not change the fact that there might be deliberately designed, yet independently developing, machine-learning technologies that will generate data as a by-product of life. HOW these data are then viewed, whether as a means for “objective” decision making or rather as an index to local knowledge, is crucial though.
6. Nobody is perfect
The book, to me, calls for a bigger change than is actually explicitly stated in it. While it demands that we not clean data of “artefacts”, I would take a couple more steps. We should not stop at NOT cleaning data; instead, we should admit failure in more general terms. What I mean is that academic articles, for example, should not only report positive findings, not only what worked, but also what did NOT work. Report failures, admit that things go wrong, basically accept that we are not perfect and the world is not perfect. This, in a way, collides with the current trend of self-optimization, the interest in “objective” decision making based on data, and the notion that AI is going to be a revolution. If we do not start by integrating failures, admitting prejudices and deeply embedded structural discrimination, how on earth can we expect algorithms and AI to learn? What do we expect them to learn from us? We are not just good; there are negative things we need to address, and if we do not rethink our own attitude on a larger scale, it will not be enough to view scattered data settings.
7. The user?
Loukissas focuses almost exclusively on data; however, data, interfaces and platforms are produced and used by programmers, users, teachers, researchers… Who sits at the other end, according to Loukissas? The “user” in Loukissas is quite generic, and there is not much written about who the user is; nor are the skills necessary to, e.g., interpret data visualizations or engage with data addressed, discussed or problematized. Had Loukissas done that, the book would be twice as long, though; maybe a second edition is coming 😉
Loukissas, Yanni A. All Data Are Local: Thinking Critically in a Data-Driven Society. Cambridge, Massachusetts: The MIT Press, 2019.
Tupling, Claire. “All Data Are Local: Thinking Critically in a Data-Driven Society: By Yanni Alexander Loukissas, London, The MIT Press, 2019, Pp. 272, Cloth: £24.00, ISBN 978-0-262-03966-6.” Information, Communication & Society, September 10, 2020, 1–3. https://doi.org/10.1080/1369118X.2020.1817523.