
Five recommendations to advance implementation science for humanitarian settings: the next frontier of humanitarian research


Challenges in delivering evidence-based programming in humanitarian crises require new strategies to enhance implementation science for better decision-making. A recent scoping review highlights the scarcity of peer-reviewed studies on implementation in conflict zones. In this commentary, we build on this scoping review and make five recommendations for advancing implementation science for humanitarian settings. These include (1) expanding existing frameworks and tailoring them to humanitarian dynamics, (2) utilizing hybrid study designs for effectiveness-implementation studies, (3) testing implementation strategies, (4) leveraging recent methodological advancements in social and data science, and (5) enhancing training and community engagement. These approaches aim to address gaps in understanding intervention effectiveness, scale, sustainability, and equity in humanitarian settings. Integrating implementation science into humanitarian research is essential for informed decision-making and improving outcomes for affected populations.


Delivering evidence-based programming in humanitarian crises grappling with war, climate change, and disaster is a formidable challenge. To support this goal, rigorous impact evaluations are increasingly used to deliver better aid, from evaluating the impact of cash transfers [1] and promoting better education for children [2] to advancing mental health and psychosocial programming for people affected by disaster [3] and improving the safety and wellbeing of women and children in conflict-affected zones [4, 5]. Methodologies have also advanced substantially: studies can now be pre-positioned for acute crises to assess the effectiveness of anticipatory action, or can build comparison groups without delaying receipt of aid, for example by ethically exploiting organic programmatic delivery cycles and targeting procedures [6] or by using propensity score matching [7]. Nonetheless, while the generation of rigorous evidence of impact has increased, the ability of humanitarian programming to reach scale, sustainability, and equity across settings has been suboptimal.

Identifying and addressing gaps in implementation science for humanitarian settings

To address this challenge, the recently published scoping review by Leresche and colleagues (2023) focuses on how implementation science or operational research may be used to improve evidence-based decision-making by humanitarian practitioners [8]. The authors find only 22 studies across 34 conflict-affected countries that provide data on implementation or operational research processes. To assess the implementation of new interventions, studies in the review employed mixed methods, with the vast majority using qualitative methods (n = 21) and others drawing on routine data such as retrospective record reviews (n = 9) or monitoring data (n = 10). Only a few studies used evaluative methods such as a cluster randomised trial (n = 1) or cost analysis (n = 2) to document implementation. The authors acknowledge that a major limitation of the scoping review is the exclusion of studies that were not peer reviewed. Indeed, in humanitarian settings, frontline providers collect valuable data to track progress on activities, record outputs, and monitor for potential adverse effects of interventions. These data are often synthesised into reports and lessons-learned documents and sometimes used to refine implementation approaches in the short term. However, this information is rarely captured and stored systematically and almost never peer reviewed. Moreover, the tools used to collect these data are not informed by implementation science frameworks. As such, there remains a gap in the approaches used by practitioners and researchers to systematically document implementation processes, and this gap impedes our ability to build evidence on humanitarian interventions. The authors conclude by proposing an important contribution: an adapted normalization process theory that highlights the role of frontline workers and communities in the implementation of evidence-based practices.

Based on our collective experience evaluating and implementing research in humanitarian settings, we are unsurprised by the limited number of studies included in the review. We further hypothesize that challenges related to evidence-based decision-making may also arise because there is a lack of implementation science frameworks, models, or theories that account for humanitarian dynamics and that can be readily integrated into routine data collection efforts or (quasi-)experimental designs examining the effectiveness of programming. The lack of appropriate frameworks, and of their application, diminishes the ability to understand how, why, and under what circumstances interventions are effective, and ultimately limits the humanitarian field's ability to integrate new knowledge into programming approaches. Therefore, to improve the evidence base so that humanitarian action can achieve impact, scale, sustainability, and equity, we recommend the following research avenues in unstable humanitarian settings:

Expand current implementation science frameworks to more holistically address humanitarian crises factors that influence program impact, scale, sustainability, and equity

The findings of the scoping review indicate that the conflict context itself interrupts the uptake of evidence-based practices, and that conflict-related disruptions require implementing actors to constantly adapt and negotiate to sustain the provision of humanitarian response. The authors identified the need for interventions to be flexible enough to account for these disruptions and to balance effectiveness with feasibility, acceptability, and validity for both frontline service providers and communities. Given the recurrent operational, organizational, and security hurdles that impede or restrict research and implementation in conflict zones, it is imperative to evaluate the impact of the conflict context on the implementation process itself. However, current implementation frameworks are suboptimal in accounting for conflict-related disruptions and for how these events also influence service providers and communities.

As such, several existing frameworks could be expanded to incorporate these considerations. For example, the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework evaluates multiple domains of implementation outcomes and their associated social-behavioural outcomes [9, 10]. Though it has not been used in humanitarian settings, the RE-AIM framework has been applied to the evaluation of community-based health interventions [11, 12] and to assess stakeholder engagement as an outcome of feasibility [13], which could be beneficial for assessing the engagement of implementing actors early in the implementation of humanitarian interventions. Determinant frameworks such as EPIS (Exploration, Preparation, Implementation, and Sustainment) and the Consolidated Framework for Implementation Research (CFIR), which consider the outer and inner context when examining implementation across multiple domains and stages, could be used to assess the humanitarian context specifically [14, 15]. While the EPIS framework has not yet been applied in a humanitarian setting, its inclusion of the outer context as a dynamic component of implementation for complex public service interventions could serve as a model for examining the influence of conflict-related factors on implementation [16], as well as their effects on frontline service providers and communities [17]. We contend that the Dynamic Sustainability Framework, with its particular attention to how changes in context may influence the sustainment of interventions, may be especially advantageous within studies and programs that are continually refined over time and place [18].

Leresche and colleagues’ Extended Normalization Process Theory (ENPT) proposal offers a promising avenue through which to incorporate humanitarian frontline worker and community insights into these frameworks. Building on determinant frameworks, the ENPT theorises how the dynamic elements of a context play a role in shaping service providers’ opportunity and potential for implementation of an intervention [19]. To apply this and other implementation science frameworks to humanitarian action, efforts are needed to operationalise the conflict context to adequately account for its influence on the agency and capabilities of service providers as they negotiate the many challenges of the contexts in which they live and work; for example, measuring disruptions, displacement, policy shifts, or constraints on resources and how these interact with the feasibility of an intervention.

Increase the use of effectiveness-implementation hybrid designs in experimental programming

Three types of hybrid designs have been used to examine the impact of interventions alongside their implementation [20]. Type 1 tests the effectiveness of an intervention while gathering implementation data; Type 2 jointly tests effectiveness and implementation strategies; and Type 3 tests an implementation strategy while observing effectiveness outcomes. In experimental studies, researchers should gather data and publish inferences on implementation alongside effectiveness, linking intervention outcomes with implementation factors. To date, the limitations of the evidence base on the implementation of humanitarian interventions stem largely from the lack of peer-reviewed studies examining implementation outcomes such as adoption, fidelity, and sustainability. Within humanitarian action, where flexibility in intervention modalities, delivery, and resourcing is key, it is no longer sufficient to publish conclusions on intervention efficacy alone. These effectiveness-implementation designs can be strengthened by embedding implementation tools, grounded in clear conceptual frameworks, into ongoing trials. Conversely, the approaches used by frontline providers to produce practice-based knowledge on implementation can be strengthened through methodological advancements that enable the systematic documentation of implementation outcomes even in routine assessments. In doing so, these data can be used more effectively to answer implementation science questions.

Across all effectiveness-implementation hybrid designs, mixed methods approaches should be used to combine quantitative and qualitative data on both behavioural and implementation outcomes. For example, participatory qualitative techniques with frontline providers and communities, such as rapid appraisal, concept mapping, and process analyses, can be used not only to explore whether statistical findings reflect community perceptions of outcomes but also to examine whether null or negative findings are related to design or implementation failure [21]. Mixed methods approaches can also enhance flexibility in research design and dissemination for different audiences, which is valuable for advancing humanitarian research and action.

Systematically test implementation strategies and core components within humanitarian programming

Partnering with local actors to deliver programming with supportive coaching, training, or other quality assurance processes is increasingly practiced within the larger humanitarian architecture. Simultaneously, delivery strategies have evolved to enable rapid response, such as providing phone-based information services or emergency cash transfers to reach more people in less time. Other new strategies focus on increasing the quality of the humanitarian workforce, including task shifting [22, 23], varying financial incentives, or other strategies such as training programmes for health providers supported by technology and artificial intelligence. Developed based on need rather than evidence, these implementation strategies must be tested so that leaders are able to make informed decisions about programming models, delivery mechanisms, and human resourcing while also understanding the impact these decisions may have on effectiveness, reach, sustainability, equity, and cost efficiency.

Relatedly, testing implementation strategies should occur alongside research that seeks to understand the core components and mechanisms of change so that interventions are more easily transferable across contexts and types of emergencies (e.g., protracted or acute, camp or urban setting, etc.) [24]. Transferability of evidence-based interventions to humanitarian programming assumes that experimental, precision-based treatments are appropriate in settings where frontline providers must continually and quickly adapt to meet the needs of populations in fragile contexts. Implementation science must be expanded to consider how interventions can be designed and evaluated to allow for flexibility across implementation domains. Complementary research designs such as the Multiphase Optimization Strategy [25] or A/B testing may be particularly well-suited for humanitarian contexts where identifying core intervention components can improve flexibility and increase the plasticity of implementation [19], allowing for scaling and sustainment within humanitarian systems. Practitioners are more likely to be open to this type of research given its increased contextual relevance and usefulness in making critical design and implementation decisions.
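To make the factorial logic behind MOST-style component testing concrete, a toy simulation can illustrate how main effects of individual intervention components are estimated. Everything below is hypothetical: the component names ("coaching", "cash top-up"), effect sizes, and outcome scale are invented for illustration, and the sketch shows only the analytic idea, not a method drawn from the studies cited above.

```python
import random

# Hypothetical 2x2 factorial (MOST-style) comparison of two intervention
# components. All names and effect sizes are invented for illustration.
random.seed(0)

def simulate_outcome(coaching: int, cash: int) -> float:
    """Simulated participant outcome under assumed component effects:
    coaching adds 0.4, a cash top-up adds 0.1, plus Gaussian noise."""
    return 1.0 + 0.4 * coaching + 0.1 * cash + random.gauss(0, 0.2)

# Every combination of the two components is fielded (on/off crossed).
conditions = [(c, k) for c in (0, 1) for k in (0, 1)]
data = {cond: [simulate_outcome(*cond) for _ in range(500)]
        for cond in conditions}

def main_effect(component: int) -> float:
    """Mean outcome with the component on minus mean with it off,
    averaging over the levels of the other component."""
    on = [y for cond, ys in data.items() if cond[component] == 1 for y in ys]
    off = [y for cond, ys in data.items() if cond[component] == 0 for y in ys]
    return sum(on) / len(on) - sum(off) / len(off)

print(f"estimated coaching main effect: {main_effect(0):.2f}")
print(f"estimated cash main effect:     {main_effect(1):.2f}")
```

The point of the factorial structure is that a single trial recovers the contribution of each component, so that implementers can drop weak components and retain strong ones when adapting a package across contexts.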

Leverage methodological advancements to support implementation science in humanitarian settings

Implementation science frameworks can be tested and strengthened through methodological advancements that will improve the quality and usability of evidence in humanitarian settings. For example, the incorporation of systems science could be used to model complex and interrelated factors to improve decision-making for humanitarian action [26]. Through the application of systems thinking tools, researchers could integrate practice-based knowledge in the conceptualisation and testing of causal relationships and underlying structures within intervention systems. Additionally, advancements in machine learning and data-driven artificial intelligence could be applied to implementation science frameworks to enable anticipatory humanitarian action and improve accountability and effectiveness of interventions for crisis-affected populations [27]. These techniques could also be applied to the analysis of non-experimental data to strengthen insights from routinely collected data, enabling better prediction of intervention effect. Combining these approaches with participatory methods with frontline workers and affected communities could improve the rigor of data while upholding considerations of equity and ethics and participation in the development and execution of these new approaches [28].

While such recent methodological advancements in social and data sciences spark innovation, at the most basic level, the dearth of systematic approaches for measuring, collecting, and analysing data on implementation substantially impedes our understanding of how different implementation domains support or hinder interventions from achieving their intended outcomes for affected populations. Few standardised tools for measuring implementation factors exist [29,30,31], and there is little understanding of the psychometric properties of existing tools for implementation evaluations. Measurement must be advanced to improve the usefulness and relevance of implementation outcomes in real-world settings [32], particularly in humanitarian settings where feasibility and context-validity of measures are essential. Improvements in data systems could also enhance multilevel modelling of complex interventions within dynamic humanitarian systems using some of the advanced data science techniques mentioned above [33].

Advance implementation science training approaches and systematize practice-based learning and community engagement in collaboration with researchers in humanitarian settings

Training programs for researchers from and living within humanitarian settings should be developed and invested in to elevate their leadership and advance contextually grounded implementation insights. Efforts to systematize practice-based knowledge from practitioners and to meaningfully engage communities should also be undertaken to promote their role in the design, execution, and dissemination of humanitarian research [34]. As Leresche and colleagues identify, service providers and communities are essential not only to designing and adapting interventions but also to informing implementation and compensating for conflict-related disruptions. The successful completion of both research and implementation requires collaboration to overcome the numerous challenges that impede programming in humanitarian settings. Action- and practice-based approaches can promote co-design, community ownership, and supervision through linkages with local provider systems, preventing the imposition of externally designed interventions and impractical research designs while also reducing the risk of harm [35, 36]. These approaches also promote the generation of evidence for action, whereby research is disseminated to a target audience with concrete programmatic steps and is rapidly used to inform implementation decisions [35].


While the humanitarian field must continue to build the experimental evidence base to understand what works to support populations affected by emergencies, it is critical that humanitarian research goes beyond effectiveness studies to systematically incorporate implementation science to better inform decision-making and strategies to reach impact, scale, sustainability, and equity. This can be achieved by advancing implementation science frameworks to account for the time-varying macro- and micro-factors that typify humanitarian settings, elevating the use of effectiveness-implementation hybrid designs, testing implementation strategies and core components, leveraging methodological advancements from other fields, and supporting the training and leadership of researchers, practitioners, and communities living within humanitarian settings. This next frontier of humanitarian research should be supported by appropriate donor investments that recognize the utility of implementation science as part of rigorous research and learning [37]. Only then can we bridge the gap from effectiveness studies to action and accountability and improve the lives and wellbeing of populations affected by war and disaster.

Data availability

Not applicable.


  1. Özler B, Çelik Ç, Cunningham S, et al. Children on the move: progressive redistribution of humanitarian cash transfers among refugees. J Dev Econ. 2021;153:102733.

  2. Tubbs Dolan C, Kim HY, Brown L, et al. Supporting Syrian refugee children’s academic and social-emotional learning in national education systems: a cluster randomized controlled trial of nonformal remedial support and mindfulness programs in Lebanon. Am Educ Res J. 2021;59(3):419–60.

  3. Jordans MJ, Kohrt BA, Sangraula M, et al. Effectiveness of Group Problem Management Plus, a brief psychological intervention for adults affected by humanitarian disasters in Nepal: a cluster randomized controlled trial. PLoS Med. 2021;18(6):e1003621.

  4. Falb KL, Asghar K, Blackwell A, et al. Improving family functioning and reducing violence in the home in North Kivu, Democratic Republic of Congo: a pilot cluster-randomised controlled trial of Safe at Home. BMJ Open. 2023;13(3).

  5. Puffer ES, Annan J, Sim AL, et al. The impact of a family skills training intervention among Burmese migrant families in Thailand: a randomized controlled trial. PLoS ONE. 2017;12(3):e0172611.

  6. Falb K, Annan J. Pre-positioning an evaluation of cash assistance programming in an acute emergency: strategies and lessons learned from a study in Raqqa Governorate, Syria. Confl Health. 2021;15:1–6.

  7. Burkey M, Kohrt BA, Koirala S, et al. Alternative approaches for studying humanitarian interventions: propensity score methods to evaluate reintegration packages’ impact on depression, PTSD, and function impairment among child soldiers in Nepal. Glob Ment Health. 2015;2:e16.

  8. Leresche E, Hossain M, De Rubeis ML, et al. How is the implementation of empirical research results documented in conflict-affected settings? Findings from a scoping review of peer-reviewed literature. Confl Health. 2023;17(1):39.

  9. Glasgow RE, Harden SM, Gaglio B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64.

  10. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–27.

  11. Schwingel A, Gálvez P, Linares D, et al. Using a mixed-methods RE-AIM framework to evaluate community health programs for older Latinas. J Aging Health. 2017;29(4):551–93.

  12. Shaw RB, Sweet SN, McBride CB, et al. Operationalizing the reach, effectiveness, adoption, implementation, maintenance (RE-AIM) framework to evaluate the collective impact of autonomous community programs that promote health and well-being. BMC Public Health. 2019;19:1–14.

  13. Jansen E, Frantz I, Hutchings J, et al. Preventing child mental health problems in southeastern Europe: feasibility study (phase 1 of MOST framework). Fam Process. 2022;61(3):1162–79.

  14. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38:4–23.

  15. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):1–15.

  16. Moullin JC, Dickson KS, Stadnick NA, et al. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1):1–16.

  17. Becan JE, Bartkowski JP, Knight DK, et al. A model for rigorously applying the Exploration, Preparation, Implementation, Sustainment (EPIS) framework in the design and measurement of a large-scale collaborative multi-site study. Health Justice. 2018;6(1):9.

  18. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):1–11.

  19. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11(1):141.

  20. Curran GM, Bauer M, Mittman B, et al. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3).

  21. Bamberger M, Rao V, Woolcock M. Using mixed methods in monitoring and evaluation: experiences from international development. World Bank Policy Research Working Paper. 2010;(5245).

  22. Cohen F, Yaeger L. Task-shifting for refugee mental health and psychosocial support: a scoping review of services in humanitarian settings through the lens of RE-AIM. Implement Res Pract. 2021;2:2633489521998790.

  23. Abujaber N, Vallières F, McBride KA, et al. Examining the evidence for best practice guidelines in supportive supervision of lay health care providers in humanitarian emergencies: a systematic scoping review. J Glob Health. 2022;12:04017.

  24. Tol WA, Ager A, Bizouerne C, et al. Improving mental health and psychosocial wellbeing in humanitarian settings: reflections on research funded through R2HC. Confl Health. 2020;14(1):71.

  25. Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am J Prev Med. 2007;32(5):S112–18.

  26. Campbell L. Systems thinking for humanitarians: an introduction for the complete beginner. London, UK: ALNAP; 2022.

  27. Beduschi A. Harnessing the potential of artificial intelligence for humanitarian action: opportunities and risks. Int Rev Red Cross. 2022;104(919):1149–69.

  28. Prabhakaran V, Martin D Jr. Participatory machine learning using community-based system dynamics. Health Hum Rights. 2020;22(2):71–4.

  29. Martin M, Steele B, Lachman JM, et al. Measures of facilitator competent adherence used in parenting programs and their psychometric properties: a systematic review. Clin Child Fam Psychol Rev. 2021;24(4):834–53.

  30. Michie S, Johnston M, Abraham C, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26.

  31. Michie S, Pilling S, Garety P, et al. Difficulties implementing a mental health guideline: an exploratory investigation using psychological theory. Implement Sci. 2007;2(1):1–8.

  32. Lengnick-Hall R, Gerke DR, Proctor EK, et al. Six practical recommendations for improved implementation outcomes reporting. Implement Sci. 2022;17(1):1–8.

  33. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38:65–76.

  34. Bain LE, Ngwayu Nkfusai C, Nehwu Kiseh P, et al. Community engagement in research in humanitarian settings. Front Public Health. 2023;11:1208684.

  35. Wessells M. Reflections on ethical and practical challenges of conducting research with children in war zones: toward a grounded approach. In: Research methods in conflict settings: a view from below. 2013:81.

  36. Brun C. ‘I love my soldier’: developing responsible and ethically sound research strategies in a militarized society. In: Research methods in conflict settings: a view from below. 2013:129.

  37. Glasgow RE, Vinson C, Chambers D, et al. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–81.




Author information




KF and AB conceptualised the paper. KF and AB wrote the first draft with inputs from SK and CY. All authors contributed to the subsequent drafts and agreed on the final version. AB is corresponding author.

Corresponding author

Correspondence to Alexandra Blackwell.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors have no competing interests to declare.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Falb, K., Kullenberg, S., Yuan, C.T. et al. Five recommendations to advance implementation science for humanitarian settings: the next frontier of humanitarian research. Confl Health 18, 41 (2024).
