Participants

  • Kathleen Fitzpatrick, Director of Digital Humanities and Professor of English, Michigan State University

  • Trevor Muñoz, Interim Director of the Maryland Institute for Technology in the Humanities and Assistant Dean for Digital Humanities Research at the University of Maryland Libraries, University of Maryland

  • Lisa Nakamura, Gwendolyn Calvert Baker Collegiate Professor of American Cultures, University of Michigan

  • Catherine D’Ignazio/kanarinka, Assistant Professor of Civic Media and Data Visualization at Emerson College (and a Faculty Director at the Engagement Lab and a research affiliate at the MIT Center for Civic Media & MIT Media Lab)

Data Feminism

As visualization designers and as scholars of information more generally, we (Catherine D’Ignazio and Lauren Klein) are keenly aware of how data visualizations can dazzle, inform, and persuade. It is precisely this power that makes it worth asking: “Visualization by whom? For whom? In whose interest? Informed by whose values?” These are some of the questions that emerge from what we call data feminism, a way of thinking about data and its visualization that is informed by the past several decades of feminist activism and critical thought. Data feminism prompts questions about how, for instance, challenges to the male/female binary can also help challenge other binary and hierarchical classification systems. It encourages us to ask how the concept of invisible labor can help to expose the invisible forms of labor associated with data work. And it points to how an understanding of affective and embodied knowledge can help to expand the notion of what constitutes an effective data visualization and what does not.

Using visualization as a starting point, this paper works backwards through the data-processing pipeline in order to show how a feminist approach to thinking about data not only exposes how power and privilege presently operate in visualization work, but also suggests how different design principles can help to mitigate inequality and work towards justice. To do so, we draw upon examples of our own and others’ activist projects, such as D’Ignazio’s Mapping the Globe (2013), pictured below, which employs natural language processing techniques and mapping to call attention to socioeconomic disparities in local newspaper reporting; and Klein’s Floor Chart (2018), which employs physical computing materials to reimagine a lost historical visualization technique. We also draw on feminist scholarship from the fields of Science & Technology Studies, Geography/GIS, the Digital Humanities, and Human-Computer Interaction, among others (e.g. Bardzell and Bardzell 2011, Hill et al. 2016, Kwan 2002, and Wernimont and Losh 2018). And we build upon our prior work on the subject of feminist data visualization (2016) in order to elaborate six principles of data feminism: rethink binaries, consider context, legitimize embodiment and affect, embrace pluralism, examine power and aspire to empowerment, and make labor visible. Here, we will introduce these principles and explain how they can help to mitigate and rebalance the unequal distribution of power and privilege in the world today.

  • Erik Loyer, Creative Director of The Alliance for Networking Visual Culture and Founder and Director of Opertoon

What You See Is Not What You Get: The Virtues of Delaying Design

A look at how two composition tools, Scalar and Stepworks, encourage authors to focus on content and structure independently of visual form, with the goal of increasing transparency, mobility, and reinterpretation.

  • Matthew Jones, Professor of Contemporary Civilization, Columbia University

Learning to Love Black Boxes: From AI to Machine Learning and Back Again

In the last two decades, a highly instrumentalist form of statistical and machine learning has achieved extraordinary success as the computational heart of the phenomenon glossed as “predictive analytics,” “data mining,” or “data science.” How and why has this happened? The current movement of data-focused computational analysis emerged from the loose confederation of a range of areas of inquiry focused on data that developed through the Cold War on both sides of the Iron Curtain, domains that have exploded in the commercial, national security, and academic worlds since the early 1990s.

  • Safiya Noble, Assistant Professor, Annenberg School for Communication and Journalism, University of Southern California

A Social Framework for Understanding Algorithmic-Determination

In this paper, I discuss some of the consequences of algorithmic-determination, which is often thought of as offering automated ease in computation, yet holds cultural import and meaning as well. I offer frameworks for thinking about how “big data” projects might perpetuate the uneven distribution of resources or obscure important features of the social, political, and economic landscape.

  • Liz Losh, Associate Professor of English and American Studies, William and Mary

#

Celebration and criticism of so-called “hashtag activism” rarely addresses the hashtag itself as an artifact or tries to locate its place in the history of information design. Although the story of the hashtag tends to be associated with Silicon Valley invention myths or power users like celebrities, the hashtag is actually the result of accreted sets of practices and invisible labor that involve negotiating competing claims about identity, ownership, and naming conventions. This talk discusses how the #hashtag actually exists in two pieces, with two separate but related design histories. The # is a special kind of character used to facilitate non-human machine-to-machine communication, with a prehistory in teletype machines, touch-tone telephones, and IRC chat. The letters after the # are also part of a bigger narrative: the human-to-human story of metadata. The history of technological adoption and adaptation by social movements, and by hashtag feminism in particular, offers a new way to think about theories of political performance and assembly by Judith Butler and others.

  • Virginia Eubanks, Associate Professor of Political Science, University at Albany, SUNY

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor

In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lies dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile. “This book is downright scary,” says Naomi Klein, “but with its striking research and moving, indelible portraits of life in the ‘digital poorhouse,’ you will emerge smarter and more empowered to demand justice.” Join us for a rousing conversation about this timely and provocative book.

  • Richard So, English/Cultural Analytics, McGill University

The Geometry of Whiteness: Bridging the Gap between Critical Race Studies and Cultural Analytics

This paper attempts to imagine a version of cultural analytics (the use of computers and quantitative models to study culture) focused on race that is commensurable with critical race studies. The two are often seen as incompatible, the former relying on the quantification and creation of artificial categories of identity and the latter fixated on the deconstruction of those categories. The long history of the use of statistics (and more recently computational algorithms) to degrade racial minorities looms over any new attempts to quantify race and racial difference. I develop a case study focused on Random House publishing and race to argue that a conceptual and methodological meeting is possible between these two fields, one that generatively pushes the limits of both.

  • Jo Guldi, Assistant Professor of History, Southern Methodist University

From Critical Thinking to Critical Search: Working between microhistory and macrohistory with big data

The tools of text mining promise the power to synthesize enormous swaths of time, but these overarching narratives leave out many of the local, specific, and personal interactions that historians most value for their power to reveal. How does the critical-thinking scholar identify particularly symptomatic episodes within a large canon of text? This paper introduces a model of “critical search,” which relies on an iterative engagement with secondary sources, critical theory, and digitally enabled synthesis to move from macrohistorical questions to microhistorical episodes.