How Can We Make Disaster Management Evaluations More Useful? An Empirical Study of Dutch Exercise Evaluations
Author: | |
---|---|
Document type: | Article |
Publication date: | 2020 |
Series/Periodical: | International Journal of Disaster Risk Science, Vol 11, Iss 5, Pp 578-591 (2020) |
Publisher: | SpringerOpen |
Keywords: | Disaster management evaluation / Evaluation design / Evaluation report / Exercise evaluation / The Netherlands / Usefulness / Disasters and engineering / TA495 |
Language: | English |
Permalink: | https://search.fid-benelux.de/Record/base-29590658 |
Data source: | BASE; original catalog |
Link(s): | https://doi.org/10.1007/s13753-020-00286-7 |
Abstract: Evaluating simulated disasters (for example, exercises) and real responses is an important activity. However, little attention has been paid to how reports documenting such events should be written. A key issue is how to make them as useful as possible to professionals working in disaster risk management. Here, we focus on three aspects of a written evaluation: how the object of the evaluation is described, how the analysis is described, and how the conclusions are described. This empirical experiment, based on real evaluation documents, asked 84 Dutch mayors and crisis management professionals to rate the perceived usefulness of the three aspects noted above. The results showed that how evaluations are written does matter. Specifically, the usefulness of an evaluation intended for learning purposes improves when its analysis and conclusions are clearer. In contrast, evaluations used for accountability purposes are improved only by the clarity of the conclusions. These findings have implications for the way disaster management evaluations should be documented.