How Can We Make Disaster Management Evaluations More Useful? An Empirical Study of Dutch Exercise Evaluations

Ralf Josef Johanna Beerens, Henrik Tehler, Ben Pelzer

International Journal of Disaster Risk Science, 2020, Vol. 11, Issue 5: 578-591. DOI: 10.1007/s13753-020-00286-7


Abstract

Evaluating simulated disasters (for example, exercises) and real responses is an important activity. However, little attention has been paid to how the reports documenting such events should be written, and a key issue is how to make them as useful as possible to professionals working in disaster risk management. Here, we focus on three aspects of a written evaluation: how the object of the evaluation is described, how the analysis is described, and how the conclusions are described. In an empirical experiment based on real evaluation documents, 84 Dutch mayors and crisis management professionals assessed the perceived usefulness of the three aspects noted above. The results show that how evaluations are written does matter. Specifically, the usefulness of an evaluation intended for learning purposes improves when its analysis and conclusions are clearer, whereas the usefulness of an evaluation used for accountability purposes improves only with the clarity of its conclusions. These findings have implications for the way disaster management evaluations should be documented.
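The abstract describes respondents rating the perceived usefulness of evaluation reports whose object description, analysis, and conclusions varied. As an illustration only, the sketch below shows how such rating data could be analyzed with a mixed-effects regression (random intercept per respondent), fitted separately for the learning and accountability purposes. The file name, column names, and model specification are assumptions for the sketch, not the authors' actual data or procedure.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: one row per respondent-report pair, with a usefulness rating,
# clarity indicators for each report aspect, and the stated evaluation purpose.
ratings = pd.read_csv("vignette_ratings.csv")

# Fit one model per evaluation purpose to mirror the learning vs. accountability contrast.
for purpose in ("learning", "accountability"):
    subset = ratings[ratings["purpose"] == purpose]
    model = smf.mixedlm(
        "usefulness ~ object_clarity + analysis_clarity + conclusion_clarity",
        data=subset,
        groups="respondent",  # random intercept per respondent (84 in the study)
    )
    print(purpose, model.fit().summary())

In such a model, the fixed-effect coefficients indicate how much each clarity aspect shifts the expected usefulness rating for a given purpose, while the random intercept absorbs differences in how leniently individual respondents rate.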

Keywords

Disaster management evaluation / Evaluation design / Evaluation report / Exercise evaluation / The Netherlands / Usefulness

Cite this article

Ralf Josef Johanna Beerens, Henrik Tehler, Ben Pelzer. How Can We Make Disaster Management Evaluations More Useful? An Empirical Study of Dutch Exercise Evaluations. International Journal of Disaster Risk Science, 2020, 11(5): 578-591. DOI: 10.1007/s13753-020-00286-7


