Authors:Morten Hansen Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. The economic logic of the attention economy is frequently used to critique and respond to the dangers of unfettered technological expansion, including nascent platforms and products powered by generative artificial intelligence. This commentary warns that while large parts of the internet have been financed through such business models, there is no guarantee that emerging generative artificial intelligence products will be commercialized in this way too. Instead, I argue, we must look beyond the attention economy to predict the future of monetization of an industry already mired in anti-competitive practices. Using popular large language models such as OpenAI’s ChatGPT as a case, I discuss how some platforms are developing computational dependencies between technology and their users. I propose the term ‘cognitive lock-in’ to help us unpack the implications of such technological dependencies, and redirect the study of this nascent business model. Citation: Big Data & Society PubDate: 2024-08-19T06:05:03Z DOI: 10.1177/20539517241275878 Issue No:Vol. 11, No. 3 (2024)
Authors:Aaron Tucker Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Beginning in the 1990s, the National Institute of Standards and Technology (NIST) leveraged the 1980s’ American War on Drugs to improve and expand facial recognition technology (FRT) infrastructure, including the domestic building of FRTs reliant on mugshots. When examining mugshot databases gathered by the NIST, such as the Multiple Encounters Dataset (MEDS) I and II (2010) and Special Database 18 Mugshot Identification Database (SD-18) (2016), it is clear that the same gendered and racialized dynamics present in policing practices related to the War on Drugs are reflected in the mugshot databases that continue to be used for FRT research and evaluation into the contemporary moment. This paper details the SD-18 and MEDS databases, as well as the MORPH database, showcasing how their representational, technical and political protocols operate. The desires for frictionless interoperability built into the images’ technical protocols supersede concerns about their eugenic political and representational protocols, resulting in a current moment in which mugshot datasets cannot be contained to their originally intended use with FRTs but leak into other forms of algorithmic governance, as well as into algorithmic image-making and visual culture, including generative artificial intelligence systems such as DALL-E. Citation: Big Data & Society PubDate: 2024-08-16T11:06:34Z DOI: 10.1177/20539517241274593 Issue No:Vol. 11, No. 3 (2024)
Authors:Lindsay Poirier Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. This article documents the “context cultures” underpinning efforts to develop regulations for collecting and reporting data in a United States public database known as Open Payments. Open Payments is a dataset published annually by the US Centers for Medicare and Medicaid Services that documents the transfers of value from pharmaceutical and medical device manufacturers to physicians, prescribing non-physicians, and teaching hospitals. In the article, I show how context became a manifold concern as differentially situated actors engaged in modes of public advocacy and social action around not only what data meant, but also what it meant to make data meaningful. I show how “context” took on multiple meanings as it was brought into relationship with certain concepts (such as “light,” “transparency,” and “interpretation”) and as stakeholders developed arguments for where they believed meaning should originate. In presenting this case, I call for further ethnographic attention to the ways in which meaning-making is enacted in relation to datasets—particularly those datasets intended to hold institutions accountable. I conclude the article by meditating on the political significance of attending to various “context cultures” when putting data signification in context, along with the implications for how critical data studies scholars historicize big data epistemologies and rhetoric. Citation: Big Data & Society PubDate: 2024-08-13T06:43:09Z DOI: 10.1177/20539517241270656 Issue No:Vol. 11, No. 3 (2024)
Authors:Semire Yekta, Daniel Neyland Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Recent years have seen significant increases in online fraud. Because online fraud poses a major challenge to law enforcement due to its complexity and global impact, other organisations – such as Customer Service Centres – have emerged to take a key role in ‘policing’ fraudulent activities. However, the responses made by these specialist organisations remain opaque and outside the scope of regimes that regulate law enforcement agencies. In this article we address this opacity through our study of a Customer Service Centre that makes decisions on what constitutes online fraud in cases of card-not-present payments. We carefully work through these decision-making processes to explore the immediate pressures – around, for example, cost and timeliness – made manifest in decisions on fraud. These pressures become apparent in the particular arrangements of accountability and responsibility in decisions on online fraud and cut short what might otherwise be lengthy procedures following these decisions. As a result, we suggest that accountability in these fraud cases is managed and held ‘in the moment’ within the Centre. The article contributes to our understanding of online fraud and to the growing debate on digital accountability. It provides avenues for further exploration of the challenges of moving from internal to external accountability in relation to largely opaque and data-sensitive settings where accountability relations are held ‘in the moment.’ Citation: Big Data & Society PubDate: 2024-08-13T06:42:30Z DOI: 10.1177/20539517241266403 Issue No:Vol. 11, No. 3 (2024)
Authors:Anders Kristian Munk, Mathieu Jacomy, Matilde Ficozzi, Torben Elgaard Jensen Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Mounting critique of the way AI is framed in mainstream media calls for less sensationalist coverage, be it jubilant or apocalyptic, and more attention to the concrete situations in which AI becomes controversial in different ways. This is supposedly achieved by making coverage more expert-informed. We therefore explore how experts contribute to the issuefication of AI through the scientific literature. We provide a semantic, visual network analysis of a corpus of 1M scientific abstracts about machine learning algorithms and artificial intelligence. Through a systematic quali-quantitative exploration of 235 co-word clusters and a subsequent structured search based on 18 issue-specific queries, for which we devise a novel method with a custom-built datascape, we explore how algorithms have agency. We find that scientific discourse is highly situated and rarely about AI in general. It overwhelmingly charges algorithms with the capacity to solve problems, and these problems rarely originate with algorithms themselves. Conversely, it rarely charges algorithms with the capacity to cause problems, and when it does, other algorithms are typically charged with the capacity to solve them. Based on these findings, we argue that while a more expert-informed coverage of AI is likely to be less sensationalist and show greater attention to the specific situations where algorithms make a difference, it is unlikely to stage AI as particularly controversial. Consequently, we suggest conceptualising AI as a political situation rather than something inherently controversial. Citation: Big Data & Society PubDate: 2024-08-07T01:34:26Z DOI: 10.1177/20539517241255107 Issue No:Vol. 11, No. 3 (2024)
Authors:Sophie Mützel, Markus Unternährer Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Research on the digital economy has highlighted the assetization of data. This article argues for expanding existing research on data and datafication processes by focusing on how relationships are made and unmade through and from data. We introduce a general analytic model of “relationing” and show how relationships between users, companies, and products are created in three different moments—entanglement, dissection, and matching—first in the digital economy, then in physical stores. We show how payments with mobile phones connect the digital to the brick-and-mortar economy. Applying our model, we illustrate how a mobile phone's various data streams, money's record-keeping function, and retailers’ loyalty programs produce qualitatively and quantitatively new relations between customers, retailers, banks, app providers, and payment intermediaries. We argue that “relational embedding” captures the inherent relationality between users, their data points, and other economic actors: algorithmically relating users’ data profiles to other users’ profiles yields personalized recommendations, ads, or rebates, continuing the relationship between retailers and customers. Citation: Big Data & Society PubDate: 2024-08-06T09:20:10Z DOI: 10.1177/20539517241266432 Issue No:Vol. 11, No. 3 (2024)
Authors:Nora A Draper, Christian Pieter Hoffmann, Christoph Lutz, Giulia Ranzini, Joseph Turow Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. The growing trend of collecting data about individuals to track past actions and infer future attitudes and behaviors has fueled popular and scholarly interest in the erosion of privacy. Recent shifts in technologies around machine learning and artificial intelligence have intensified these concerns. This editorial introduces the articles in the special theme on digital resignation and privacy cynicism: concepts developed in the past decade to explain the growing powerlessness individuals feel in relation to their digital privacy even as they continue to experience consternation over the collection and use of their personal information. The papers in this special theme engage and extend existing research on these topics. The original articles and commentaries pose theoretical and practical questions related to the ways people confront the powerful institutional forces that increasingly shape many aspects of the information environment. They employ several methodologies and theoretical perspectives and extend the range of geographic, political, cultural, and institutional contexts in which privacy cynicism and digital resignation can be identified and examined. In addition to contextualizing these contributions, this editorial maps a range of related concepts including digital resignation, privacy cynicism, privacy apathy, surveillance realism, privacy fatigue, and privacy helplessness. It concludes by identifying key themes across the papers in this collection and provides directions for future research. Citation: Big Data & Society PubDate: 2024-08-06T09:19:38Z DOI: 10.1177/20539517241270663 Issue No:Vol. 11, No. 3 (2024)
Authors:Hannah Knox Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Building on work which has shown the role of digital technologies in reframing environmental relations, this paper explores ethnographically how environmental data is reconfiguring the concept of place. The paper takes as its focus an action-research project within a UK-based, citizen-oriented initiative called Newtown Energy Futures, in which we sought to enfold climate and energy data into a social-justice-informed attempt at climate action. By exploring how the project used data as an invitation for citizens to engage with and participate in local infrastructural and environmental dynamics, the paper sheds light on how environmental data came to participate in the making of place and, in doing so, raised questions about how to rebuild the socio-material relations through which ‘a sense of place’ might be reproduced. As climate and energy data increasingly demand that places become enrolled into environmental projects, our findings suggest that data enables place to emerge as a ‘socio-technical potentiality’, an observation that has implications both for engagement with, and for the study of, data and place. In practical terms, we suggest that this refiguration of place has the effect of creating hopeful trajectories for change, whilst also posing difficult questions about the limits of participation in a data-infused form of place-based politics. Citation: Big Data & Society PubDate: 2024-08-06T09:19:08Z DOI: 10.1177/20539517241266404 Issue No:Vol. 11, No. 3 (2024)
Authors:Lena Ulbricht, Simon Egbert Abstract: Big Data & Society, Volume 11, Issue 3, July-September 2024. Organizations increasingly rely on digital technologies to perform tasks. To do so, they have to integrate databases to make the data usable. We argue that there is a growing, academically underexplored market consisting of data integration and analysis platforms. We explain that, especially in the public sector, the regulatory implications of data integration and analysis must be studied because they affect vulnerable citizens and because it is not just a matter of state agencies overseeing technology companies but also of the state overseeing itself. We propose a platform-theory-based conceptual approach that directs our attention towards the specific characteristics of platforms, such as datafication, modularity, and multilaterality, and the associated regulatory challenges. Due to a scarcity of empirical analyses of how public sector platforms are regulated, we undertake an in-depth case study of a data integration and analysis platform operated by Palantir Technologies in the German federal state of Hesse. Our analysis of the regulatory activities and conflicts uncovers many obstacles to effective platform regulation. Drawing on recent initiatives to improve intermediary liability, we ultimately point to additional paths for regulating public sector platforms. Our findings also highlight the importance of political factors in platform regulation-as-a-practice. We conclude that platform regulation in the public sector is not only about technology-specific regulation but also about general mechanisms of democratic control, such as the separation of powers, public transparency, and civil rights. Citation: Big Data & Society PubDate: 2024-08-06T09:18:49Z DOI: 10.1177/20539517241255108 Issue No:Vol. 11, No. 3 (2024)