By Olivia Iannelli, Research Analyst at Trilateral Research Ltd.


Recent years have seen an increase in the development and implementation of information and communication technologies (ICTs) in the humanitarian sphere. Non-governmental organisations (NGOs), international agencies, humanitarian organisations, governments and private sector actors have begun “designing, adopting, and employing ICTs including smartphone apps, remote sensing platforms such as satellite imagery analysis, surveillance drones and other forms of digital data collection and analytics, as standard components of sectoral and cross-sectoral responses to both the threat and alleged committal of mass atrocities in a variety of operational and geographic contexts.” [1] ICTs have thus become common components of the humanitarian sector [2], making up what has been coined digital humanitarianism.
Although they are created and deployed by humanitarian organisations with the best of intentions, these ICTs “are occurring in the absence of agreed normative frameworks and accepted theory to guide their ethical and responsible use.” [3] There have been instances where the presence and use of such, often experimental, ICT tools and technologies could cause more ‘harm’ than good. [4] For instance, Rohingya refugees in Bangladesh fear that “smart cards” containing biometric data (fingerprints, iris scans) will be shared with their persecutors in Myanmar. [5]
Social and cultural differences relate to the habits, beliefs and traditions that characterise a society. Necessary considerations include gender issues, social impact, liability, trust, and religious and cultural issues.


Although there is a large body of literature covering digital humanitarianism, there is no framework for understanding these nuanced social and cultural differences, and according to Kristin Bergtora Sandvik from the University of Oslo and Nathaniel A. Raymond from Harvard there seems to be “a very weak community-wide interest so far in the ethical dimensions of ICT use for mass atrocity-producing contexts.” [6]


iTRACK is a three-year project funded by the European Commission under the H2020 Secure Societies Programme. Its focus is on improving the protection of humanitarian personnel as well as the assets of humanitarian, public or private organisations through an integrated socio-technical tracking solution to support threat detection, navigation, logistics and coordination in humanitarian disasters.


For this reason, the iTRACK consortium decided to provide an evaluation of the socio-cultural considerations that should inform the future development and planned exploitation of the iTRACK system. To compile this evaluation, the consortium undertook desk-based research and interviews with internal project partners and external actors, including academics, humanitarian workers, data, privacy and ethics practitioners, and technology developers. The evaluation culminated in a report entitled “social-cultural consideration for future development”, which has been published on the iTRACK website and which this blog post seeks to summarise.



Amongst its components, iTRACK will allow humanitarian organisations to track convoys, vehicles and humanitarian workers during their missions in order to increase their protection. Amongst its findings, the report states that this tracking may affect the working patterns of humanitarian workers in the field. This is particularly problematic in a humanitarian context, where the environment is unpredictable and volatile. On humanitarian missions, workers often make unexpected detours or lengthy stops for a variety of reasons, and these can be critical to the success or failure of the mission. However, the humanitarian organisation tracking the convoy’s journey through iTRACK may not necessarily understand the cultural context and nuances, and thus the need for such pit stops or diversions.

For this reason, remote observers may consider such an apparent deviation an inefficiency and a disturbance to the mission. They may also conclude, for instance, that the driver has placed the mission in jeopardy by carrying out personal errands or meetings during work hours. Drivers are one example, but affected actors also include local and non-local humanitarian workers who may have to divert from a pre-determined route, make an unscheduled stop due to unforeseeable circumstances, or use assets as bargaining chips to be able to cross a checkpoint.
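To see why a remote observer so easily reads a stop as an inefficiency, it helps to consider what a tracking system can actually compute from GPS pings. The sketch below is purely illustrative and hypothetical (it is not the iTRACK implementation, and all names in it are invented): a naive rule that flags any stop longer than a threshold. The point is that such a rule can only report *that* a convoy stopped, never *why* it stopped.

```python
from dataclasses import dataclass

@dataclass
class Ping:
    t: float    # seconds since mission start
    lat: float  # degrees
    lon: float  # degrees

def flag_long_stops(pings, max_stop_s=900, eps_deg=1e-4):
    """Flag periods where consecutive pings barely move for longer
    than max_stop_s seconds. A hypothetical sketch: it detects the
    stop, but carries no information about its purpose or context."""
    flags = []
    start = 0  # index of the first ping in the current stationary run
    for i in range(1, len(pings)):
        moved = (abs(pings[i].lat - pings[start].lat) > eps_deg or
                 abs(pings[i].lon - pings[start].lon) > eps_deg)
        if moved:
            # the stationary run ended at the previous ping
            if pings[i - 1].t - pings[start].t >= max_stop_s:
                flags.append((pings[start].t, pings[i - 1].t))
            start = i
    # handle a stationary run that lasts until the final ping
    if pings and pings[-1].t - pings[start].t >= max_stop_s:
        flags.append((pings[start].t, pings[-1].t))
    return flags

# A convoy that drives, stays put for 30 minutes, then drives on:
route = [Ping(0, 0.00, 0.0), Ping(600, 0.01, 0.0),
         Ping(1200, 0.01, 0.0), Ping(2400, 0.01, 0.0),
         Ping(3000, 0.02, 0.0)]
print(flag_long_stops(route))  # one flagged stop from t=600 to t=2400
```

Whether that flagged interval was time wasted or the key to the mission’s success is exactly the judgement the data cannot make, which is the tension the interviewees describe below.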

One of the stakeholders interviewed for the report stated that many missions are laced with social and cultural considerations that humanitarian technology may be unable to grasp, and gave the following example:

“We were driving in Sri Lanka and our driver starts plucking leaves from a bush. He was collecting neems plants which if you give it to the local people it is considered an act of hospitality. The driver said that to enter into the community we should bring the women these neem leaves and they will take this as a sign that we are being culturally appropriate. When we arrived, they were totally at ease and happy to see us and we had a successful meeting. Now you run that through iTRACK you stopped by the side of the road to go play in the bushes for 25 minutes is that an inefficiency or is that a mission success factor? The fact is your system is never going to be able to detect that. There is no algorithm to detect whether you are hanging out with someone or doing something because you want to or because it is helping you get to your objective.” [7]


This divide, and often tension, between the reality experienced by humanitarian workers on the ground and the expectations of other actors at headquarters is nothing new to the humanitarian field. There are often disagreements about procedures and ways of operating. The introduction of new technologies, however, has the potential to exacerbate this. One of the stakeholders interviewed for the report outlined this divide and the difficulties that could come with a tracking device:

“In the Netherlands or Geneva you will have people who have no context of what you are doing, so you are going to have 20-30 confounders a week saying ‘what the hell, why are you stopping at that coffee shop or hardware store’ you were supposed to be there three hours ago but you went to see his cousin. Well, his cousin is the local deputy agricultural minister and doing that means we get to go see the agricultural project one day earlier. Yes, we spent three hours talking to someone, but they made all the difference, were we off schedule or were we ahead of plan?”  [8]

As the consortium’s evaluation pointed out, however, there are other components within the iTRACK system that may allow such unexpected diversions, and the reasons behind them, to be successfully communicated to Mission Control Rooms and then to Headquarters.

The report outlines several other issues related to the implementation of iTRACK and its multiple components, revolving mostly around safety and security, privacy, trust and transparency. Other issues, such as reliance on technology and potential misinterpretation, are also considered. It concludes that many of these issues are already present due to existing technologies. Indeed, some stakeholders interviewed for the report expressed indifference to being tracked, noting that tracking is already possible through mobile phones and other devices. The fitness-tracking app Strava is a recent example: it gave away the locations of foreign military bases in Afghanistan, Djibouti and Syria. Other stakeholders even expressed a desire to be tracked more, in order to improve accountability and liability.

At the forefront of its recommendations, the report states that even before new technologies are implemented, understanding their potential social and cultural effects should be at the centre of the work done by designers, developers and humanitarians. This requires critical questions not just about the risks, biases and negative consequences of bringing a new technology into practice, but also a deep understanding of the novel opportunities, openings and promises that develop.


[1] Sandvik, Kristin Bergtora and Raymond, Nathaniel A. (2017) “Beyond the Protective Effect: Towards a Theory of Harm for Information Communication Technologies in Mass Atrocity Response,” Genocide Studies and Prevention: An International Journal: Vol. 11: Iss. 1: 9-24, pg. 8.

[2] Ibid.

[3] See n. 1, pg. 9.

[4] Lindskov Jacobsen, K. (2015) “Humanitarian technology: revisiting the ‘do no harm’ debate,” Humanitarian Practice Network, 18 September 2015, available at: https://odihpn.org/blog/humanitarian-technology-revisiting-the-%C2%91do-no-harm%C2%92-debate/ , last accessed on 12 July 2018.

[5] Rohingya Refugees Strike for Rights, Press Release, 26 November 2018, available at: https://twitter.com/KenRoth/status/1067108125143121922

[6] Kate Crawford and Megan Finn, (2015) “The Limits of Crisis Data: Analytical and Ethical Challenges of Using Social and Mobile Data to Understand Disasters,” GeoJournal 80, no. 4, 491-502 in: Sandvik, K. B. and Raymond, N., A. (2017) “Beyond the Protective Effect: Towards a Theory of Harm for Information Communication Technologies in Mass Atrocity Response,” Genocide Studies and Prevention: An International Journal: Vol. 11: Iss. 1: 9-24

[7] Interview, May 2018, conducted by Olivia Iannelli.

[8] Ibid.