Failure of foresight in crisis management: a secondary analysis of the Mari disaster
|Article code||Publication year||English article||Persian translation||Word count|
|1071||2012||17-page PDF||available to order||14,420 words|
Publisher : Elsevier - Science Direct (الزویر - ساینس دایرکت)
Journal : Technological Forecasting and Social Change, Available online 11 November 2012
Foresight, the ability to plan and think systematically about future scenarios in order to inform decision-making in the present, has been applied extensively in crisis management by corporations and governments alike. Foresight can be complicated because dispersed groups hold diverse, non-overlapping pieces of information, which affects an organization's ability to detect, mitigate, and recover from failures. This paper explores the failure of foresight in crisis management by drawing on data on the events that preceded and followed the Mari disaster at a naval base in Cyprus in July 2011, where a large explosion killed 13 people and injured 62 others, while completely destroying the island's main power plant. The paper examines how foresight into crisis management decisions was compromised by a conscious effort by high-ranking decision-makers to minimize the emergent danger and avoid responsibility for the crisis, together with red tape, bureaucracy, and poor coordination and information flows. The paper explores the notion of the operational and political responsibility of individual decision-makers and discusses an alternative approach to foresight in crisis management, one built on multiple layers of decision-making.
Crisis management is the systematic attempt to identify and detect possible crises and to take actions and measures to prevent them, contain their effects or disruption, and finally recover. Crises range from natural or environmental disasters such as earthquakes and floods (e.g. Hurricane Katrina and the 2010 Haiti earthquake), to biological threats such as pandemics (e.g. the 2009 H1N1 pandemic), mechanical or technological failures (e.g. the 2010 oil spill in the Gulf of Mexico), and human-induced disasters such as terrorism (e.g. 9/11) and economic crises (e.g. the 2008 subprime market collapse). In fact, human error is present in most cases, with devastating outcomes disturbing established patterns of working and dominant assumptions about the way aspects of society operate. The crisis management literature can generally be divided into two principal schools of thought: Normal Accidents Theory and High Reliability Organizations Theory. The dialogue between the two schools revolves around the question of whether reliability, resilience, and learning can be inscribed in organizations by design. The High Reliability Organizations school argues that “trial and error learning, supplemented by an active search for improvements, can eventually lead to improved safety even in operations involving hazardous technologies” [6: p. 207]. This learning capability is essential to achieving high reliability against a possible disaster. The Normal Accidents school, on the other hand, argues that the real world, due to its complexity, often gives ambiguous feedback and that learning takes place in political environments. In other words, high complexity and tight coupling in organizations are structural conditions that hamper learning. It is this combination of high complexity and tight coupling in some systems that makes accidents inevitable or “normal”. There are, thus, limits to crisis management and safety.
Part of this complexity is that the causes of a crisis reside within an organizational system of governance and often remain unnoticed until it is too late to prevent them from spiraling out of control. This is what happened with the subprime mortgage crisis, which remained unnoticed (at least by most government and corporate agencies) until it created a global economic crisis. Such crises happen because, as the system governing decision-making grows in complexity, it becomes easier for the system's component parts to tightly couple with those of other systems, allowing interactions (and errors) to proliferate rapidly throughout all systems. This complexity is further exacerbated by the executive political decisions made before, during, and after a crisis, which often take no account of foresight into the possible causes and aftermath of the crisis but are primarily driven by red tape, bureaucracy, and poor coordination and information flows. The problematic political decision-making before and during Hurricane Katrina, and the catastrophic aftermath for the city of New Orleans, is a case in point. One way to understand this complexity is to identify the conditions that contribute to the failure of foresight, i.e. the ability to plan and think systematically about future scenarios in order to inform decision-making in the present. Foresight can be complicated by a “variable disjunction of information”, which refers to “a complex situation in which a number of parties handling a problem are unable to obtain precisely the same information about the problem, so that many differing interpretations of the situation exist” [19: p. 40]. This information dispersion is a consequence of organizational structures of decision-making.
That is, problems that produce crises can ramify in unexpected ways because dispersed groups hold diverse, non-overlapping pieces of information: each group has partial information that is incomprehensible because crucial pieces are missing. It is the distribution and flow of information that affect an organization's ability to detect, mitigate, and recover from failures. In addition, the tendency of people to make do with the information at hand and to simplify interpretations creates collective blind spots that obscure problems which may be brewing, eventually leading to a crisis. This paper examines the failure of foresight in crisis management. The intention of this examination is to identify the set of organizational patterns that precede crises by drawing on Turner's failure of foresight framework, while also acknowledging the complexity of the decision-making processes of crisis management. The paper carries out a secondary analysis of a public inquiry report into the detainment of the vessel Monchegorsk by the Cypriot government in 2009, at the request of the US government. The vessel was found to be carrying 98 containers filled with arms-related material, sent by Iran to Syria. Following the UN Security Council Sanctions Committee resolutions, the Cypriot government decided to keep the 98 containers in Cyprus until the details of the shipment became clearer. The containers were stored at the Mari naval base next to a power plant until, as a result of exposure to high temperatures, they exploded on July 11th, 2011. The explosion killed 13 people and injured 62 others, while completely destroying the neighboring power plant. In the next section, the paper provides a description of Turner's failure of foresight framework and explains how it was applied in the analysis of the public inquiry report on the Mari disaster. Section 3 provides a chronology of the Mari disaster by drawing on the secondary data.
Section 4 then provides an analysis and discussion of the failure of foresight in the Mari disaster, drawing links to other similar disasters. Finally, Section 5 explores the notion of the operational and political responsibility of individual decision-makers and discusses an alternative approach to foresight in crisis management, one built on multiple layers of decision-making.
Conclusion
This paper explored the failure of foresight in crisis management by drawing on data on the events that preceded and followed the Mari disaster at a naval base in Cyprus in July 2011, where a large explosion killed 13 people and injured 62 others, while completely destroying the island's main power plant. Drawing on Turner's failure of foresight framework, the paper examined how foresight into crisis management decisions was compromised by a conscious effort by high-ranking decision-makers to minimize the emergent danger and avoid responsibility for the crisis, together with red tape, bureaucracy, and poor coordination and information flows. While linking the findings from the secondary data analysis of the Mari disaster to other similar crises, the paper explored the notion of responsibility and discussed an alternative approach to foresight in crisis management. This alternative approach to foresighted decision-making builds on the work of institutional theorists on collective action problems to present a framework of decision-making across three layers, namely operational, collective-choice, and constitutional. Drawing on the example of the Mari disaster and discussing how decision choices could have been structured otherwise, the paper argued that foresight can be improved through multi-layered decision-making. It is acknowledged, however, that multi-layered decision-making can only be implemented after significant restructuring of institutional bodies of governance, including public policies. Consequently, multi-layered decision-making can only be administered on a case-by-case basis, according to contextual attributes, situational practices, and associated politics. This is not to say, however, that politics are necessarily a constraint on implementing multi-layered decision-making.
In recent years, political leaders have been found to exploit the chaotic situations caused by crises, treating them as opportunities for desired change and political reform. Recent events in the European agriculture sector underscore this “crisis-reform thesis”. Discussions at both the national and European levels have essentially reconceptualized crisis management as a matter of political reform. In conclusion, it is acknowledged that the analysis of the failure of foresight in crisis management and the resulting framework of multi-layered decision-making are based primarily on data from a single public inquiry report on a single case. These limitations were recognized early on, and the paper has explicitly sought to link its methodological choices to other similar analyses drawing on public inquiry reports, as well as to link its findings to other similar crises. In particular, the reliance on a public inquiry report as the only source of data is in line with Turner's early studies of the failure of foresight, as well as Weick's secondary data analysis of the Mann Gulch disaster and Brown's analysis of the Cullen Report into the Piper Alpha disaster, among other examples. In addition, the failure of foresight in the Mari disaster has been linked to similar failures of foresight in the foot-and-mouth disease outbreak in the UK and the disaster of New Orleans in the aftermath of Hurricane Katrina. Future research could further explore the failure of foresight across a variety of settings, drawing on both empirical and secondary data. More importantly for policy makers, further research could explore the implementation of multi-layered decision-making as a means of preventing foresight failures and improving the outcomes of crisis management.