Data Colonialism: Critiquing Consent and Control in “Tech for Social Change”
Lack of ethical processes around data collection and management, and ongoing Western control over data, continue the legacy of colonialism within aid work.
Author’s Note: Some of the details have been altered to protect the identity of the author and partners the author has worked with.
While a lot of US-based startups say they’re “saving the world,” there is a whole field of study devoted to just that. ICT4D (information and communication technologies for [international] development) is an interdisciplinary practice that combines tech with international development, human rights, and public health. Projects range from building out technology infrastructure, especially in ‘restrictive environments’ like Cuba,1 to monitoring election violence through social media. ICT4D can also refer to the use of technology to improve operations (e.g., building out a data management platform) and to build employees’ digital skills (e.g., training on using Excel for data visualization). Funding for ICT4D comes primarily from governments,2 intergovernmental organizations (e.g., the UN), corporations, NGOs and private foundations, and private individuals (through micro-donation platforms and crowdfunding); the funding available is easily in the billions.3
Like in other sectors of tech, data is a core part of ICT4D inquiry and practice – and far from being a neutral process, the ways that data is collected, stored, processed, analyzed and shared in the field deeply reflect politics, power dynamics and ongoing patterns of privilege and marginalization on a global scale. For the purposes of this article, we argue that two specific trends in ICT4D – a lack of ethical processes around data collection and management (in particular, informed consent and opt-out procedures), and ongoing Western control over data (including what technologies are used, where data is stored, etc.) – continue the legacy of colonialism within aid work.
The Hidden Politics of ICT4D Data
Photo CC-BY the Submarine Cable Map.
ICT4D projects often involve collecting, monitoring, managing and analyzing large amounts of sensitive data about the people, communities and countries they work in. While the specific data will vary depending on the project,4 data collected can include health care data, demographic information, and – with advancing technologies – communications data (e.g., phone metadata), sensor data and biometric data. Household surveys are also common, collecting information about the location of the house (sometimes through exact GPS coordinates), demographic information on the household, and data related to the project intervention (e.g., how many times children went to school per week).
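To make the sensitivity of this data concrete, here is a minimal, hypothetical sketch of what a single household survey record might look like. The field names and values are illustrative assumptions rather than data from any real project, but they mirror the kinds of personally identifiable information routinely collected.

```python
# Hypothetical household survey record (field names and values are assumptions,
# not taken from any real project).
household_record = {
    "household_id": "HH-0042",
    "gps": {"lat": -1.286389, "lon": 36.817223},    # exact coordinates of the home
    "head_of_household_name": "Jane Doe",           # direct identifier
    "phone_number": "+254700000000",                # direct identifier
    "household_size": 6,
    "children_school_attendance_per_week": 3,       # project intervention data
    "hiv_status_disclosed": True,                   # highly sensitive health data
}

# Even if the name is dropped, the exact GPS point plus household size and
# phone number make re-identifying the household straightforward.
print(household_record["gps"], household_record["household_size"])
```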
Despite the fact that these data sets contain a great deal of personally identifiable information (PII), data protection is often misunderstood and is not a priority for the field. Data is typically stored and managed in un-anonymized Excel workbooks and cloud platforms, most of which are unaudited and contain a number of basic security flaws.5 Since most ICT4D efforts are organized from outside the country they operate in, they are often able to collect and manage data in ways that would be unacceptable under the more rigorous privacy laws of the US (e.g., global health projects do not have to abide by HIPAA, and local data protection policies might not exist). As Sean McDonald stated in the executive summary of “Ebola: A Big Data Disaster” (discussed later in this article): “[M]any humanitarian organizations actively encourage governments, charitable foundations, technology companies, and mobile networks to share data in ways that are illegal without user consent or the invocation of governmental emergency powers.” To this point, ICT4D projects often fail to seek or obtain informed consent, and project participants are unable to opt out. The fact that data is often managed outside of the country of origin, without the input, participation, or design involvement of the populations it represents, is itself oppressive. As Anjuan Simmons discusses in “Technology Colonialism”: “there is a danger of setting up a form of imperialism based on personal data. Just as the royal powers of old reached far into the lives of distant colonized people, technology companies gain immense control with every terabyte of personal data they store and analyze.”
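For contrast with that un-anonymized storage, the sketch below shows the kind of minimal pseudonymization step (hashing direct identifiers and coarsening GPS coordinates before data is stored or shared) that is frequently absent in practice. It is an illustrative assumption about one possible mitigation, not a description of any specific platform, and hashing alone does not make data anonymous.

```python
import hashlib

def pseudonymize(record: dict, salt: str) -> dict:
    """Hash direct identifiers and coarsen GPS before sharing (illustrative sketch).

    This reduces risk but is not true anonymization: hashed values can still be
    linked across datasets, and coarse locations can still single out small
    communities.
    """
    out = dict(record)
    for field in ("head_of_household_name", "phone_number"):
        if field in out:
            out[field] = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
    if "gps" in out:
        # Round to two decimal places (roughly 1 km) instead of keeping the exact house location.
        out["gps"] = {axis: round(value, 2) for axis, value in out["gps"].items()}
    return out

# A minimal record along the lines of the previous sketch:
example = {
    "head_of_household_name": "Jane Doe",
    "phone_number": "+254700000000",
    "gps": {"lat": -1.286389, "lon": 36.817223},
    "household_size": 6,
}
print(pseudonymize(example, salt="project-specific-secret"))
```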
Further, without a centralized governing body or standard method for creating, implementing, and managing ICT4D projects or research, ethics and review boards for project implementation and research can’t be standardized. While resources such as the Responsible Data Forum and this recent data protection case study exist, they are typically driven by implementers. Where they do exist, policies around data protection are focused on the donor’s reputation, investment, and ability to operate, rather than on protecting participant data. This is especially true for projects with foreign relations implications (e.g., funding democratization projects in Iran). This approach to data follows previous colonial models in which protecting ‘Western’ interests is of paramount importance, despite the risks that those interests intentionally or unintentionally create.6
The author has worked in ICT4D and helped develop systems for both headquarters and project country offices, with a particular emphasis on data collection, management, and protection. They have observed these issues firsthand and heard similar experiences from concerned ICT4D practitioners across the globe. To provide some insight, this article highlights two mini case-studies on how ICT4D data practices around local ownership and consent reaffirm colonialism on a global scale.
Data Ownership and Colonialism
A.) Background
International development has had a complicated relationship with US foreign policy, and in the past was more overt about its partisan nature. Under the guise of international development, the US government funded Creative Associates International to support the Contra rebels in Nicaragua and then to assist the leaders of the 1991 Haitian coup against democratically-elected Jean-Bertrand Aristide.7 This partisan nature of international development work has been, and continues to be, viewed as a threat to the sovereignty of States that are recipients of ‘aid’.
Local ownership and local solutions have been a recent trend meant to separate current international development models from this overtly political past. USAID, for example, unveiled a new agenda in 2010 (USAID Forward) with greater emphasis on local solutions and partnerships. Under USAID Forward, there have been solicitations that require a local implementing partner;8 this emphasis on local solutions is one of the ways USAID is moving to create systems that can exist without aid and to counter the perception that political advantage is the primary purpose of international development. Despite these claims, and some movement towards building local systems across the ICT4D field, this focus has not been extended to data ownership and data systems, leaving a growing area of the industry enmeshed within a colonial system of Western and/or donor control.
B.) Mini Case-Study
The author was part of a USAID proposal aimed at strengthening an African country’s data management system. The project would work closely with the government and implementing partners in-country to strengthen monitoring and evaluation (M&E) systems, local capacity, and accountability. In other words, the implementer would shape data management and analysis systems for current and future projects in the country. Throughout the proposal, there was a strong emphasis on ‘local’ and on improving local systems and capacity. Yet the solicitation explicitly referred to and asked for the use of a US-based data management platform. That platform, DevResults, is a closed-source application with an entirely male, predominantly white engineering team; none of the DevResults team is African or lives in Africa, making this solution inherently non-local.
Finding this problematic, the proposal team (including the author) instead suggested an open source platform (DHIS2) already in use in the country, with residents of the country as the key staff. There would be US-based companies involved as a support team, but the project would focus on building local capacity and local systems with local data ownership. Using an open source platform would also be cheaper than using DevResults.
In spite of these factors, USAID ultimately went with the proposal that used DevResults.
While there could be other factors involved in the decision-making process, budget and key personnel tend to be weighted highly during evaluation (even if not explicitly stated). Three conclusions stand out from this process:
- If “data is the new oil”, then USAID is invested in owning the data. As with much of international development programming, a valuable resource is at stake, with an eventual goal of ‘Western’ ownership. Technology and data become mechanisms of a less overt colonialism.
- USAID would prefer to have non-local key personnel and management for data projects. If the goal were truly to build local capacity, then having more local staff in higher positions as key personnel would be valued more highly. While this might hold for non-data-related projects, there is a desire for ‘Western’ paternalism when it comes to managing data.
- There is a preference for white males at the top of the hierarchy. Much like in the international development and technology industries, marginalized people are typically given supporting roles to white male leaders and decision makers. While never explicitly stated, the donor’s choices and comments on the proposal indicated a pattern of discriminating against marginalized people in key leadership roles.
C.) Discussion
From a funder’s perspective, there is little incentive to truly allow local ownership of data. International development is an opportunity to create new markets that are monopolized by, or strongly benefit, the funding organization – whether a government or a corporation. Take Internet.org/Free Basics, Facebook’s effort to provide people in Africa, the Middle East, Asia Pacific and Latin America with free access to basic websites and web tools… including, of course, Facebook. While altruistic on the surface, this allows Facebook (a majority white, Western company) to create a new global market for social media and unilaterally push Facebook adoption. In this and other cases, local ownership of data threatens the ability of funders to freely benefit from the data local communities create; thus, we see non-local ownership prioritized throughout the ICT4D ecosystem.
Informed Consent and Colonialism
A.) Background
The idea of informed consent is not new to international development. However, new forms of data collection (via SMS, sensors, and digital metadata) that do not involve human survey collectors have changed what informed consent means. Even when human enumerators are still present, informed consent is more complicated and not always obtained, due to a lack of understanding of what data is collected with digital tools.
One of the key components of informed consent is the availability of a simple opt-out, where the project participant can choose not to provide sensitive data and still receive services. In implementation this can be problematic: an opt-out is not always provided, and even when it is, participants may perceive handing over their data as a prerequisite to receiving services. Because most humanitarian organizations work or coordinate with governments, the inability to opt out of giving up data could either put participants at risk from the government or leave them unable to receive needed services, such as food and water.
The use of biometrics in refugee camps has been criticized for these very reasons; yet, due to their perceived success, biometrics are increasingly being required for other humanitarian assistance projects, particularly in the area of fraud prevention. As these and other emerging forms of technology converge and intersect with donor interests, issues of informed consent come to the forefront.
B.) Mini Case-Study
When the most recent Ebola crisis in West Africa broke out in 2014, international organizations deviated from their typical way of coordinating the response (i.e., sharing data among responding organizations) to using telecommunications data (i.e., ‘big data’) as a new method for responding to humanitarian crises. In the case of the Ebola response, call detail records (CDRs) were used for contact tracing, a method of identifying and diagnosing everyone who came into contact with a person carrying a specific disease. A recent study of the use of data in the Ebola response by Sean McDonald looked at the ethics of using detailed telecommunications data, which is typically illegal for non-governmental organizations to use. While McDonald only tacitly states that the humanitarian organizations acted illegally, he does note that CDRs are typically off-limits due to a number of data protection and privacy laws. The general illegality of using CDRs stems from how difficult they are to de-identify or anonymize; in the case of the Ebola response, CDRs used for contact tracing needed to identify a single user and their locations, making any sort of de-identification dubious at best.
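To illustrate why CDRs resist de-identification, here is a minimal, hypothetical sketch of what a call detail record contains and how contact tracing uses it. The fields are assumptions based on the general structure of CDRs, not a description of any operator’s actual schema; the point is that the caller/callee link, timestamp and cell tower location that make contact tracing possible are the same attributes that re-identify individuals, even if phone numbers are hashed.

```python
from dataclasses import dataclass

@dataclass
class CDR:
    """Hypothetical call detail record (fields are illustrative assumptions)."""
    caller: str      # phone number or hashed identifier
    callee: str
    timestamp: str   # ISO 8601
    cell_tower: str  # tower ID, which maps to a physical location

# Contact tracing needs exactly this linkable, located view of a person:
# everyone a known case called, and where/when the calls happened.
records = [
    CDR("HASH_A", "HASH_B", "2014-09-01T10:15:00", "tower_17"),
    CDR("HASH_A", "HASH_C", "2014-09-02T08:40:00", "tower_03"),
]

def contacts_of(case: str, cdrs: list[CDR]) -> set[str]:
    """Return every identifier the case communicated with."""
    return {r.callee for r in cdrs if r.caller == case} | \
           {r.caller for r in cdrs if r.callee == case}

# Even with hashed numbers, the movement pattern (tower_17 then tower_03 at
# known times) is typically unique to one person, so the records cannot be
# meaningfully de-identified without destroying their value for tracing.
print(contacts_of("HASH_A", records))
```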
Contact tracing has been used successfully in past Ebola responses; however, McDonald notes that the technical capacity of the responders using CDRs was limited, making the exercise of dubious value. The data were entered manually and therefore could not provide real-time insight. His conclusion is that CDRs provided questionable benefit, or none at all, over other sources of data, such as burial locations, which would have been more helpful and had been used in the past.9 If we follow McDonald’s conclusion that CDRs were ineffective, the only substantial value of their use was to create the infrastructure for large-scale surveillance of the citizens of those countries.
C.) Discussion
The desire of humanitarian organizations to reap the benefits of data led, in the case of Ebola, to questionable practice that valued methodological experimentation over the human beings being provided with ‘aid’. The author has spoken to a number of accountability activists in the West African countries affected by Ebola, whose work against government corruption puts their lives in danger. ICT4D’s failed data experiment and the surveillance network it created further endanger these activists by handing their governments a ready-made infrastructure for surveillance.
Even if the use of CDR data had been a resounding success, no one asked citizens whether they would have been willing to give their consent, given the issues around government corruption. By taking a patronizing perspective, ICT4D removes citizens’ agency for change and potentially damages their ability to organize against corruption in the future. By acting without consent, donors and implementers demonstrate that their true interest lies not in protecting project participants but in serving themselves, even if that interest is merely to experiment with methodology.
Conclusion
Photo CC-BY CWCS Managed Hosting.
Profound issues of local ownership and consent underlie ICT4D’s practices of data collection and management. It seems clear that this lack of protections functions as another form of exploitation of the ‘Global South’ under the guise of aid, and that the primary benefit is not intended for the project participants.
At a minimum, local data ownership and consent are paramount to breaking this cycle. As a starting point, technology activists in the ‘West’ could work to create sustainable models that respect users’ data and give users ownership of it. Since ICT4D and international development will follow ‘innovation’, providing alternatives to the current exploitative models reduces the excuses ICT4D funders and implementers can give for the objectification and exploitation of marginalized and vulnerable people around the globe.
If technology is going to be used for social good, whether in ICT4D or any other sector, questions around data ownership and use need to be considered to prevent it from becoming another mode of exploitation and colonialism.
Footnotes
- One such example is programming implemented by Development Alternatives, Inc. (DAI) to bring communication technology and infrastructure access to the Cuban Jewish community. This project led to the arrest of one of DAI’s employees, Alan Gross, who was viewed as a spy by the Cuban government. He was later released in 2014 as part of the normalization of relations with Cuba.
- The US government is one of the largest funders (if not the largest funder) of international development.
- Foreign assistance typically receives around 1% of the annual US federal budget. For the 2017 budget, that is $42.4 billion, with $25.6 billion for “Economic and Development” funding, which includes USAID. More information: Washington Post, “The U.S. foreign-aid budget, visualized”; USAID 2017 Budget.
- For examples of datasets collected, see USAID’s Development Data Library. These datasets will most likely be de-identified and might be aggregated, but they show the extent of data collected for different project types.
- At least one of the organizations that the author worked at had a data collection platform that did not use HTTPS.
- To the author’s knowledge, there has only been a response to an aid worker’s arrest or kidnapping when the worker was employed by a US-based organization. As in the case of Alan Gross, mentioned above, the release of a US aid worker can be part of larger political deals, indicating the value placed on, and the lengths a government will go to protect, an aid worker who is heavily connected to the US.
- Dr. Kenneth J. Saltman, “Creative Associates International: Corporate Education and ‘Democracy Promotion’ in Iraq”, The Review of Education, Pedagogy, and Cultural Studies, Volume 28, pp. 25–65, 2006.
- There is at least one instance the author knows of where, at award, USAID made a US-based firm the prime contractor instead of the local partner.
- In conversation with McDonald, the author learned that one of the main benefits of gaining access to CDRs is that a large-scale surveillance system has now been created within these countries. While this can be helpful in the case of medical emergencies and responses, it also provides a tool ready for abuse.