Learning from adverse events in the nuclear power industry: Organizational learning, policy making and normalization
Article code | Publication year | English article length |
---|---|---|
4088 | 2012 | 12-page PDF |
Publisher: Elsevier – ScienceDirect
Journal: Technology in Society, Volume 34, Issue 3, August 2012, Pages 239–250
English abstract
Nuclear power accidents repeatedly reveal that the industry has an incomplete understanding of the complex risks involved in its operation. Through analyzing the investigation of a nuclear power incident in Sweden in 2006, I show how the industry's learning practices shape recurrent normalization of risk regulation after such surprises. Learning is shaped through institutionalized measures of sufficiency and particular “risk objects” (e.g. human factors and safety culture) created through learning from previous events. Subsequent regulatory measures are shaped through improvement scripts associated with these risk objects. These learning practices exclude alternative conceptual perspectives to understand and address safety-critical incidents. Latent risks will therefore produce similar events in the future. The article contributes to the literature on organizational learning, policy making, sensemaking and normalization in complex systems. To improve learning from incidents and regulation in high-hazard industries, social scientists and a wider circle of stakeholders should be included in the regulatory and post-incident examination processes.
English introduction
In contrast to nuclear waste management, nuclear power operation is most often de-politicized. Even if accidents such as those at Three Mile Island, Chernobyl and Fukushima at first challenged current risk analyses within the nuclear power industry, they are eventually normalized, whereby regulatory practices become "politics by other means". This paper addresses what shapes such normalization by analyzing the industry's investigation of a Swedish nuclear power incident on July 25, 2006. The analysis shows that learning in the nuclear power industry is framed through institutionalized measures of sufficiency and particular "risk objects" (e.g. human factors and safety culture) created when learning from previous events. Subsequent regulatory measures are shaped through improvement scripts associated with these risk objects. These learning practices excluded conceptual perspectives that could be used to understand and address adverse events. Latent risks will therefore produce similar events in the future. Because of close similarities in nuclear power expert rationales and regulatory regimes between different countries, and given the ongoing debate regarding complex technology systems and the risks associated with nuclear energy, this paper adds to the discourse regarding the regulation of high-hazard industries. More specifically, it offers concrete recommendations for learning from and investigating catastrophic and near-catastrophic events in high-hazard industries.

Reactor 1 at the Forsmark nuclear power plant was shut down and the emergency power supply (to cool residual heat) was half-blocked due to the interaction between a maintenance error (the trigger) and the failure of three technical subsystems (latent design errors), according to the plant's report [1]. The maintenance error caused an outside switchyard to uncouple reactor 1 from the national grid; inadequately installed low-frequency protections for the turbines caused a transient electrical surge through the plant; and two of the four emergency power systems failed due to faulty design. Half of the instrument panels in the control room went black due to the loss of electric power. Moreover, instrumentation in the control room was misleading due to deficiencies in the human–machine interface, training and manuals. The operators did not know whether the control rods had been fed into the reactor, nor the reactor's water level or pressure. However, owing to their instructions, experience and training, they were able to deduce what had happened, and after 22 minutes they manually connected the regional 70 kV grid to the plant. The water level decreased by two meters but the core was still covered. If more than two emergency power systems had failed, the operators' intervention would have been essential to save the core from damage.

The nuclear power industry worldwide was surprised by the effects of the surge on the emergency power system and concerned because many other plants shared design features with Forsmark [2]. Moreover, the different parts of the emergency power system were interdependent, violating prescribed design rules. However, about a year after the incident, the Forsmark case seemed to be closed, the problems fixed, and public trust in nuclear power restored [3]. The plant's report and documents from the regulator, Statens Kärnkraftinspektion (SKI), the Swedish nuclear safety agency, show a very varied response to the event.
The industry had not inquired into the causes behind the control room operators' successful work, and it had inquired into some but not all of the failed installation and design processes. Even when investigating causes, answers and remedies were vague. The technical errors were fixed and, for some processes, procedures were changed or efforts were made to improve adherence to existing procedures, while other processes were left unchanged. This response left many unaddressed questions: Why was the incident surprising to the industry but later closed? What informed the learning and remedial measures drawn from this incident? What consequences might it have? How can learning from incidents be improved in order to match the complexity of nuclear power risks?

To answer these questions I have used an ethnographic approach to identify and explain the rationales that guide industry experts' learning efforts [4]. To this end, I have contrasted investigation reports with interview data, seminars and dialog with industry experts. The Forsmark incident was a valuable "window of opportunity" for the nuclear power industry to learn from both success and error. At the same time, the investigation is a frequent type of practice that provides an opportunity to analyze how learning practices within the industry are shaped. The incident story was spread worldwide and the regulator (SKI) arranged an international conference that provided ample learning opportunities. The incident investigation produced a wealth of data, and it also aroused strong public and media interest since it raised questions of institutional trustworthiness on behalf of both the plant and SKI. For these reasons, the analysis contributes significantly to the understanding of policy learning, organizational learning in high-hazard industries and normalization of error, and it suggests how to improve regulatory practices.

The literature on organizational learning in high-hazard industries [5] and [6] focuses on how investigation teams' disciplinary frameworks shape their findings in terms of calculated logics, a fixing orientation and the like. It is often found that organizations are not able to advance beyond single-loop learning. The normalization literature addresses the pre- or post-investigation processes. Data that in hindsight proved to signal impending danger are often normalized in advance of accidents because the relevant audience does not recognize their significance appropriately [6] and [8]. After an accident, normalization is often seen as attempts to cover up wrongdoings that hindsight reveals [9]. Moreover, the surprising nature of the Forsmark incident suggested that it could have become a "focusing event" that called for radically reconfiguring regulatory practices and structures [10], [11] and [12]. Policy learning is often seen as triggered by such events and shaped by political dynamics favoring symbolic and speedy, but often misguided, measures [10]. This analysis indicates that there was both pre- and post-incident normalization, caused by the same framing processes and not by attempts to cover up wrongdoings revealed in hindsight. These framing processes also prevented the incident from becoming a "focusing event" that could have led to double-loop learning and regulatory reform.
These findings suggest that the nuclear power industry's learning practices should be challenged and complemented by inviting social scientists and other stakeholders into the process of learning from accidents, so that it matches the complex risks of nuclear power design and operation. This would provide the critical questions and powerful conceptual tools necessary to identify and understand both successful processes (such as the control room work) and failed processes (such as the design processes).
English conclusion
The case can be summarized in three findings, which are readily generalizable because nuclear power industry experts worldwide share the same risk objects and improvement scripts [5] and [6], and because many countries share the same regulatory approach (the safety case). First, the case shows how learning practices in a high-hazard industry are shaped by specific, well-established risk objects formed after a previous event and through requirements of sufficiency. Second, it shows how risk regulation is shaped by these learning practices through improvement scripts associated with these risk objects. The organizational processes behind the successful control room work were not investigated; instead, the official investigation focused on engineering safety, informed by a 'human error' and safety culture framework developed after the Three Mile Island incident and the Chernobyl accident. The human factors lessons drawn from these events not only pointed out the importance of these issues for the nuclear power industry, but also framed the risks associated with it, the data collected, and the measures taken to manage them. The design errors that caused the incident were analyzed differently, as a consequence of the available risk objects and improvement scripts. Third, through interviews, seminars and feedback from preliminary analysis, the specifics of organizational learning in the case of Forsmark were revealed. This made it possible to see how regulatory science was operationalized (and eventually legitimized by SKI's approval) through notions of sufficiency. In addition, there were clearly alternative avenues that could have enhanced learning, which would have revealed possible consequences for risk analysis and risk regulation. Sufficiency was achieved through the investigators' quick-fix orientation in the first phase, using well-established risk objects and their associated scripts. Thus, the analysis explains why and how the investigation transformed the initial surprise and uneasiness the incident caused (in terms of poor design) into an account that conformed to and satisfied the readers' expectations of familiar causes and remedies. As a consequence, the investigation omitted possible lessons from the Forsmark event, such as the positive human contribution in the control room and the failed design of the emergency power system.

These findings contribute to the literature and theories of sensemaking and organizational learning in high-hazard industries, normalization and policy learning. The analysis specifically contributes to policy making and organizational learning in high-hazard industries [5] and [6]. In addition to disciplinary training and political dynamics, this study shows how organizational learning and risk regulation are shaped by measures of sufficiency and decisiveness, through risk objects and their improvement scripts. Moreover, the analysis shows that the notions of quick fixes and narrow regulatory science [5] and [31] need to be nuanced: to restart the plant, the regulator (SKI) needed to trust the equipment, the people and the management to a certain degree. In the long run, though, the regulator asked for more extensive answers to some pertinent questions, and there is an awareness of the need to learn from successful work and failed processes. However, it seems that the available risk objects and improvement scripts, together with questions around sufficiency, considerably shaped how different kinds of issues were addressed.
This analysis also confirms and expands upon theories of normalization. Data that in hindsight seemed to signal the need for revising the understanding of complex organizational processes are normalized not only because of how experts' risk objects and associated improvement scripts shape their work. Normalization is also due to industry experts doing their best (doing what they are assigned) with the tools (risk objects and improvement scripts) that they have developed to that end, informed by demands for sufficiency and decisiveness based upon relevant regulation, demands which allow decisions regarding resumed operation [25], [26] and [27]. Thus, learning from error is as much a cognitive as a political process that systematically produces certain findings at the expense of others, as well as certain improvements at the expense of others. In this case, there was evidence of transparency (rather than any attempt to cover up mistakes), but the learning and feedback cycle could have gone further with the improved understanding and framework suggested in this analysis.

Since the incident showed major inadequacies in the industry's pre-incident understanding of relevant risks and design processes, as well as in its learning from successful control room work, it could have been expected to create both strong incentives and great opportunities for double-loop learning and "full cultural re-adjustment" [5], [7], [59] and [60]. However, the learning practices among industry experts transformed an unexpected and uneasy event into familiar concepts within two years, which preempted calls for radical changes in the understanding of risks in nuclear power plants as well as in the means to address them. Organizational learning in fields like nuclear power is thus likely to be heavily shaped not only by professional values and calculated logics but also by measures of regulatory sufficiency and by particular risk objects and improvement scripts from academics and safety management consultants.

For the same reasons, one could have expected the incident to become a "focusing event" that called for radically reconfiguring regulatory practices and structures [9], [10], [11], [12], [61] and [62]. In the Forsmark case, policy logics favoring quick, symbolic reactions did operate with some effect, probably informed by the safety culture risk object and its associated improvement script. Consequently, the CEO and the Board were fired; Vattenfall, the plant owner, instituted a governing board over nuclear activities in its management structure; and SKI, the regulator, was formally scrutinized for suspicions of corruption. However, when the investigation transformed the event into familiar patterns, it dampened the incentives for radical changes in risk regulation. Thus, learning practices among industry experts in fields such as nuclear power are likely to impede the effects of policy logics.

Finally, the case suggests that organizational learning from nuclear power incidents does not match the complexity of the risks the industry faces and more than likely reproduces latent errors. It is therefore hoped that this ethnographic analysis contributes to the literature and practice of safety-case risk regulation [63] and [64]. Experts are aware of these problems, but they lack the conceptual and regulatory means to address them.
The contemporary regulatory regime relies on mechanistic models for understanding organizational processes, where instructions and procedures predominate as tools for understanding and controlling processes, complemented with an attitudinal or moralistic understanding of safety culture and trust in the robustness of the physical plant to withstand errors. The effectiveness of these tools can be questioned. This analysis suggests that there is a need to reform the conceptual and institutional means for safety-case regulation by inviting new groups of experts and opening up the regulatory process to other stakeholders. The literature on learning in high-hazard industries promotes improvement through involving different practitioners within the plants as a way to widen their frames of reference [65]. This study also indicates that regulatory work should be based not only on physics, engineering and human factors but also on contemporary social science, e.g. STS, organizational studies, situated learning, anthropology and sociology, in order to improve high-hazard incident response. In this way, the like-minded experts in the nuclear power industry are complemented with other perspectives that might challenge the embedded assumptions of risk objects and improvement scripts. These additional perspectives would expand the theoretical and methodological tools for institutional learning, enabling an improved understanding of the real-time logics that shape design and operation.