Why data is never neutral
The conversation is not over: we need to confront the important questions raised by any discussion of digital identity and data use, argues Amos.
By Amos Doornbos
SAGE recently published an article by the brilliant Keren Weitzberg, Margie Cheesman, Aaron Martin and Emrys Schoemaker. Entitled 'Between surveillance and recognition: Rethinking digital identity in aid', it raises vital questions we, as humanitarians, need to be considering when we collect, process and use data.
Data about people is collected daily, and it has the potential to be both helpful and harmful. The technology used to collect data shapes that data and is shaped by it in turn; the technology itself is shaped by its creators. And when we use technology to collect data, we bring our own worldview into the process. Neither the data nor the technology is neutral. However, we need various pieces of data to implement projects, so abandoning data and technology altogether is not a viable option.
Data and technology can be used for surveillance. Almost every project does surveillance; we just call it monitoring. At the same time, we try to avoid surveillance capitalism. The journal article makes a similar argument: we need more research, especially in the humanitarian space, into the nuances of power and complexity in data collection, and it is not an either/or choice. As the article states:
…components of data collection and identification are essential in delivering aid, and there are potential benefits to using digital technology for aid distribution – both for humanitarian institutions and recipients of aid. Surveillance for purposes of care is not simply a narrowly medical practice. We need more nuanced research that recognises and unravels the complex motivations and practices of aid organisations as well as the variety of experiences and perspectives that aid subjects have with data and technology.
I couldn’t agree more. So much of this is wrapped up in questions of data and digital governance. But governance should not be defined by a single organisation; it should be shaped in collaboration with a diversity of contributors. And governance needs to look beyond collection alone, addressing both the use and the disposal of data.
For example, biometrics can be hugely invasive, and their use has a high potential for harm. But they can also be the easiest and most efficient way to ensure a person receives her whole course of treatment. Governance thinking can help us ensure that once her treatment is over, the biometric data is deleted, as it is no longer needed.
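To make that governance rule concrete, here is a minimal sketch of what "delete the biometric once the treatment is over" could look like in code. All names here are hypothetical (a toy in-memory registry, not any real humanitarian system); the point is that deletion is tied to purpose in the logic itself, rather than left to a later clean-up that may never happen.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Enrolment:
    """One person's enrolment in a treatment programme (hypothetical model)."""
    person_id: str
    biometric_template: Optional[bytes]  # held only while the purpose exists
    treatment_complete: bool = False


class BiometricRegistry:
    """Toy registry enforcing a simple retention rule: no purpose, no biometric."""

    def __init__(self) -> None:
        self._enrolments: dict[str, Enrolment] = {}

    def enrol(self, person_id: str, template: bytes) -> None:
        self._enrolments[person_id] = Enrolment(person_id, template)

    def complete_treatment(self, person_id: str) -> None:
        # The governance rule in code: completing treatment immediately
        # removes the biometric, because its purpose has expired.
        enrolment = self._enrolments[person_id]
        enrolment.treatment_complete = True
        enrolment.biometric_template = None

    def has_biometric(self, person_id: str) -> bool:
        return self._enrolments[person_id].biometric_template is not None
```

The design choice worth noticing is that deletion is a side effect of the programme milestone, not a separate administrative task, so scope creep has nowhere to hide the data.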
One of the critical nuances we need to explore further is the long tail of data and technology, often created by speculation and scope creep. These challenges sit within the circles of complexity and power, or, as the article puts it, 'the complex motivations and practices of aid organisations'.
And no, this is not about privacy. It is about human rights, justice, and how to live wisely in a digital world. This thinking is needed more than ever.
The choice is up to us.
This is an edited version of a personal blog post by Amos; you can read the original here.