“It takes more than one gender to have gender
inequality; and more than one gender to work towards justice." In their introduction to Data Feminism, researchers Catherine D'Ignazio and Lauren Klein
elucidate the need to look at data inclusively. Drawing on decades of
intersectional feminist theory, the two professors shed light
on how data collection and analysis in the modern world are biased.
They are biased against women, people of colour,
immigrants, queer people, people with disabilities, and anyone who does not fit
the mould of the upper-class white male. Data feminism breaks these boundaries
down and builds on the common experience of exclusion. It is for everyone, just
as feminism is; it is not only about women, it is not restricted to gender, and,
most importantly, it is about power. The
concept identifies the gatekeepers of power and the people who have
historically had the least access to it, and applies that analysis to a new world built on data.
As a student of international relations; studio art, design, and theory; and
media arts and sciences, D'Ignazio is an impressively interdisciplinary
researcher. It is only natural that, as director of the Data + Feminism Lab at
MIT, she developed the concept of data feminism with Lauren Klein. An
equally seasoned all-rounder, Klein holds a PhD in English and has segued from
early American literature to racial justice in the digital humanities. Through
data feminism, the authors have given a name and a voice to the feeling of power
imbalance in science and technology.
In their opening chapter, they write of cautious optimism about
data that does 'good' in the world. The reader is urged to think of datasets
holistically, in terms of the bodies that constitute the data and those that are
left out. Data Feminism is flush with examples of predictive models built on
exclusionary benchmarks. For instance, computer scientist Joy Buolamwini set
off an entire movement, the Algorithmic Justice League, by exposing how
a JavaScript software library failed to recognise her face through her camera,
because it was not 'white enough'. The free software library had been built on a
dataset with too few Black people; like much facial analysis technology, it
used pale, male faces as its benchmark.
A larger, unsettling problem is brewing as machine learning
advances by leaps and bounds, one that speaks to the oversights of
society: too few women in STEM and a lack of sensitisation among engineers.
Unless the boys' clubs of computer engineering are made accessible to all,
our algorithms will hold a mirror to the shortcomings of our civilisation,
shortcomings that replicate bias not only against women but against all
minorities. As Data Feminism puts it,
“Feminism is unfinished and urgent work, in data and technology as well as in
our most powerful political institutions.”
Trisha Pande is a Policy Manager at The Dialogue and can be reached at trisha@thedialogue.co.