Data are not dangerous: A response to recent MSF CRASH critiques
Abby Stoddard, Adele Harmer & Katherine Haver
Abby Stoddard has coordinated research on international humanitarian action since 2000 at New York University's Center on International Cooperation, where she now holds the title of Senior Program Advisor. Prior to that, she was Program Director for Doctors of the World USA, and worked in field and headquarters positions at CARE USA. She is the author of Humanitarian Alert: NGO Information and its Impact on US Foreign Policy (Kumarian Press, 2006), along with numerous articles, reports, and book chapters on humanitarian action and non-governmental organizations. As of late 2011 she is also serving on the Board of Directors of Doctors of the World (MDM-USA).
Adele Harmer has worked on humanitarian aid policy issues, both as a researcher and for the Australian Government. Prior to joining Humanitarian Outcomes, Adele was a Research Fellow with the Humanitarian Policy Group at the Overseas Development Institute. Her work includes research on donor behavior, humanitarian principles, civil-military relations and operational security. Adele previously worked for the Australian Government's international aid agency (AusAID). She has authored numerous publications on humanitarian policy issues.
Katherine Haver holds a Master's degree in International Affairs from Columbia University. She has worked on humanitarian policy and practice, focusing on delivering aid in insecure environments as well as risk management, humanitarian financing and institutions. Before joining Humanitarian Outcomes, Katherine worked as a policy advisor for Oxfam GB based in DR Congo, where she authored reports on supporting displaced people living in host families and strengthening community self-protection, among other topics. Katherine currently serves on the US Board of Directors of ALIMA.
In a May 4 post, Michaël Neuman of MSF CRASH warns that misleading data suggest humanitarian aid work has become more dangerous, taking particular aim at the Aid Worker Security Database (AWSD) for helping to perpetuate this myth. As the creators of the AWSD and longtime researchers of operational security, we welcome the chance to respond to this latest piece, as well as to the book it draws from, Saving Lives and Staying Alive, co-edited by Neuman with Fabrice Weissman. Starting with the points on which we agree:
Aid work is not becoming more dangerous overall. Correct. This is a point we have made in virtually every report released on AWSD figures. The rise in the total number of major attacks is driven by a small number of extreme cases, which for many years has included Afghanistan and Somalia and currently also includes South Sudan and Syria. On balance, the humanitarian presence in these countries has shrunk over successive years, as our multi-year field research study has demonstrated, yet kidnappings, killings, and other attacks on aid workers there have all increased. If we remove those cases, however, the overall attack rate remains fairly stable, in some years even declining, suggesting that in most places in the world providing humanitarian aid has become safer.
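To illustrate the arithmetic behind this point, here is a minimal sketch using entirely hypothetical incident counts (not AWSD figures) of how a handful of extreme contexts can drive the global total upward even while the trend everywhere else is flat or declining:

```python
# Hypothetical incident counts per context for two years
# (illustrative numbers only; these are not AWSD data)
incidents_2013 = {
    "Afghanistan": 80, "Somalia": 45, "South Sudan": 50, "Syria": 40,
    "Kenya": 6, "Haiti": 4, "Philippines": 5,
}
incidents_2014 = {
    "Afghanistan": 95, "Somalia": 50, "South Sudan": 60, "Syria": 55,
    "Kenya": 5, "Haiti": 4, "Philippines": 4,
}

# The small set of extreme-case contexts driving the global total
EXTREME = {"Afghanistan", "Somalia", "South Sudan", "Syria"}

def totals(incidents):
    """Return (overall total, total excluding the extreme contexts)."""
    overall = sum(incidents.values())
    excluding = sum(n for ctx, n in incidents.items() if ctx not in EXTREME)
    return overall, excluding

# Overall count rises year on year (230 -> 273), but with the four
# extreme contexts removed, the remainder actually declines (15 -> 13).
print(totals(incidents_2013))  # (230, 15)
print(totals(incidents_2014))  # (273, 13)
```

The same exclusion logic applies whether one is working with raw counts or, with a population denominator, attack rates.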
It is unfortunate that the media pays the most attention to our numbers when they are on the upswing. It is also frustrating that the nuance of our analysis is often lost in press coverage. However, it is good for political actors to be publicly reminded that their failure to resolve conflicts not only kills civilians, but also those seeking to help them. Yes, there always has been - and always will be - danger involved in aid work, but that doesn't mean anyone should stop loudly lamenting aid worker attacks.
Context is everything. Amen. A global dataset can never be truly useful for field-level decision-making, a point we ourselves have repeatedly made. In our field research we have never met a practitioner who treated the AWSD as anything but a single reference point, with only limited relevance to their particular situation.
In this regard, we have been encouraged by the recent development of field-level platforms to track and share information on security incidents. These platforms are more comprehensive in their scope than the AWSD; for example, they record incidents of threatening letters and harassment, which may be bellwethers of changing security conditions. At the same time, the AWSD offers practitioners the ability to compare differing contexts, which can help inform the allocation of scarce resources.
Now let's move on to the areas of the Neuman/Weissman analysis with which we fundamentally disagree.
Methodological issues. The authors' critique is weakest in its discussion of methodology, revealing only a superficial familiarity with how the AWSD incident data are gathered, coded, and verified with the affected agencies and field security consortia. It further misstates how the aid worker population denominator used to calculate attack rates is estimated. What is most disappointing about the book and associated articles, however, is that they do not attempt a rigorous technical critique of the methodology or suggest more robust alternatives. Rather, the authors take a curiously ideological stance against global data-gathering and the formalised practice of risk management.
The article correctly observes that 'there is no consensus on what defines humanitarian work. It is often very hard to know on what basis the employee of a humanitarian organization was targeted by violence. Was it as a private individual, as someone from a particular country, as a member of a relief organization, or something else?' This conflates the different questions of unit of analysis and motive, but it is certainly true that humanitarian work and aid workers can be defined very differently depending on whom you speak to. The AWSD uses a very specific definition but for the purpose of measuring trends accurately, it is important mainly that definitions be clear and internally consistent.
The criticism is that this definition encompasses 'a very wide range of people,' leading to ambiguous findings. But this ignores the fact that the AWSD data are sorted - and hence can be disaggregated - by type of agency affected (UN, Red Cross/Crescent, INGO, NNGO), the gender and type of staff (national or international) as well as the means and context of the attacks and presumed motives (where possible to state). All of these data are verified with the agency concerned on an annual basis.
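As a simple illustration of what such disaggregation looks like in practice, here is a sketch using hypothetical records and field names (not the actual AWSD schema):

```python
# Hypothetical incident records; field names and values are illustrative
# stand-ins for the kinds of coded attributes described above.
records = [
    {"agency": "INGO", "staff": "national",      "means": "kidnapping"},
    {"agency": "UN",   "staff": "international", "means": "shooting"},
    {"agency": "NNGO", "staff": "national",      "means": "bodily assault"},
    {"agency": "INGO", "staff": "international", "means": "shooting"},
]

# Disaggregate: count incidents affecting national staff, by agency type
national_by_agency = {}
for r in records:
    if r["staff"] == "national":
        national_by_agency[r["agency"]] = national_by_agency.get(r["agency"], 0) + 1

print(national_by_agency)  # {'INGO': 1, 'NNGO': 1}
```

Because each incident carries these coded fields, a finding about "aid workers" in the aggregate can always be broken down into narrower, less ambiguous categories.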
‘Studies that attempt to calculate attack and victim rates... are hampered by the lack of a reliable denominator.' Indeed, this is the most difficult component of all our data initiatives. For this reason, each year we endeavor to make a more rigorous estimate of the aid worker population. In pursuit of this aim, we have developed a methodology that combines human research and a technique called conditional mean imputation, which divides NGOs into similarly sized tiers; tier averages are then used to extrapolate missing data. Rather than making the imputation algorithm proprietary, we have published it in studies and online, and shared it extensively among colleagues in the sector, to encourage feedback and constructive critique to help guide its continual development. To date, the methodology has been peer-reviewed by three different statisticians who have deemed it sound, but we still feel it could be stronger. Our invitation to colleagues to improve upon these methods remains open.
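For readers unfamiliar with the technique, here is a minimal sketch of tier-based conditional mean imputation, using hypothetical staff counts and tier assignments; it illustrates the general idea only, not the published algorithm:

```python
from statistics import mean

# Known staff counts for NGOs grouped into size tiers
# (hypothetical figures for illustration)
tiers = {
    "large":  [5200, 4800, 6100],
    "medium": [900, 1100, 850],
    "small":  [60, 45, 80],
}

# Conditional mean: the average staff count within each tier
tier_means = {tier: mean(counts) for tier, counts in tiers.items()}

# NGOs with unknown staff counts, assigned to a tier by other indicators
missing = {"NGO_A": "medium", "NGO_B": "small"}

# Impute each missing value with its tier's mean
imputed = {name: tier_means[tier] for name, tier in missing.items()}

print(imputed["NGO_A"])  # 950
```

Summing the known and imputed counts then yields the aid worker population estimate used as the denominator for attack rates.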
Give aid workers a bit more credit. Rather puzzlingly, the authors seem to implicate the practices of incident tracking and risk management in what they see as increased risk aversion in the field. Apart from not presenting empirical evidence to support their assertion, they make some illogical assumptions about the influence and consequences of these practices and 'global figures' like AWSD. According to the book, ‘global figures convey the misleading notion that the violence is a global phenomenon obeying general laws.' Do they? We would argue that this is no more the case than knowing the number of war casualties would lead one to think that all wars are caused by the same thing. Numbers are neutral, and the fact that they can be misinterpreted or misused is no reason to dispense with them.
Similarly, the book implies that formalised systems of risk management, increasingly used by aid agencies, do more harm than good. Such systems, the authors maintain, bureaucratise decision-making and rob aid workers of their independence and ingenuity. This runs counter to the widespread opinion of both national and international humanitarian field workers, as evidenced in surveys and hundreds of interviews across multiple studies. Practitioners understand perfectly well that guidelines, such as the widely used Good Practice Review 8, are simply tools. It is ungenerous, at best, to suggest that the typical aid worker will abandon their own judgment and personal agency to blindly follow a manual, or will allow their situational awareness to be solely determined by numbers in a global dataset. And it betrays a lack of in-depth inquiry on how these measures are actually applied in the field.
Humanitarian values and risk management are not mutually exclusive. Fears that humanitarian action can be compromised by risk aversion and creeping corporatism are not unjustified. But fretting, without persuasive evidence, that risk management ‘has led to disenchantment with humanitarian action, whose chivalrous spirit has been drowned in the icy waters of actuarial calculation and remote control' just indulges romantic notions. The qualities of altruism, empathy, and compassion, which underpin humanitarian action, are not threatened by a clear-eyed appraisal of risks, and the means to manage them.
Making the problem about datasets misses the mark. The authors are right to point out that the humanitarian sector needs to get better at contextual analysis. But placing the blame on global databases and risk assessment templates is off base. The more direct causes - as over two years of sustained field research in our forthcoming study will show - are a lack of investment in negotiations with armed actors, inappropriate staff profiles, and the under-development of necessary skillsets.
Some final words on the construction and maintenance of datasets and denominators. It's boring. It's painstaking, tedious and time-consuming. On the whole, it is far more satisfying to write opinion pieces and to follow the ‘research by chatting' model that is standard in much of the humanitarian literature. But before the AWSD existed there was much rhetoric and speculation about the 'shrinking of humanitarian space' and the increasing targeting of aid workers, with virtually nothing in the way of supporting evidence. We would argue that the AWSD has contributed to the sector knowing a bit better what it is talking about. The answer is never less information, but more and better information, and more thoughtful interpretations of it.
To cite this content :
Abby Stoddard, Adele Harmer, Katherine Haver, “Data are not dangerous: A response to recent MSF CRASH critiques”, 11 May 2016, URL : https://msf-crash.org/en/blog/war-and-humanitarianism/data-are-not-dangerous-response-recent-msf-crash-critiques-0
If you want to criticize or develop this content, you can find us on Twitter or directly on our site.