Promise and dismay: The state of strategic environmental assessment systems and practices in Canada
|Article code||Publication year||English article length||Persian translation|
|5687||2009||10-page PDF||Available on request|
The English article contains approximately 9,420 words.
Publisher : Elsevier - Science Direct
Journal : Environmental Impact Assessment Review, Volume 29, Issue 1, January 2009, Pages 66–75
Has strategic environmental assessment (SEA) finally reached a point of maturity in Canada? Or is it still stumbling to find its place in the impact assessment family? Strategic environmental assessment has been ongoing in Canada for a number of years, both formally and informally, and under a variety of labels and institutional models. The result is a system of SEA that is diverse, founded on a range of principles and frameworks, and not well understood. This paper provides a critical review of Canadian SEA systems and practices. To accomplish this objective, a manageable and diverse set of past and recent SEA and SEA-like frameworks and applications are described and critically analyzed based on a set of input, process, and output evaluation criteria. Results suggest considerable variability in SEA experience and value added. This is due in large part to the institutional and methodological pluralism of SEA, the boundaries of which are not well defined. Under the federal system, since the formalization of SEA, many applications have been disappointing in light of broader SEA good-practice principles and criteria. Indeed, some of the better examples of SEA have neither carried the SEA name tag nor occurred under formal SEA requirements. Further, many of the same challenges to project-based impact assessment also plague the development and value added of SEA. Of particular concern is the systematic separation of SEA from downstream decision inputs and assessment activities. As Canada commences review of its federal SEA Directive in preparation for the next generation of SEA, this paper reflects on what has been achieved thus far.
Various forms of strategic environmental assessment (SEA) have been ongoing in Canada for a number of years; yet at the same time SEA remains the least understood of the impact assessment family. The beginnings of SEA in Canada date back to the Environmental Assessment and Review Process (EARP) of the early 1970s and the subsequent Guidelines Order of 1984, which defined the reach of environmental assessment to extend well beyond individual projects and encompass broader regional, conceptual, and policy-level review processes (Noble, 2002 and Sadler, 2005). Early strategic forms of impact assessment, such as the Mackenzie Valley Pipeline inquiry (1974–1977), the Beaufort Sea hydrocarbon review (1982–1984), and the Atomic Energy of Canada Limited's nuclear fuel waste management concept (1988–1994), were operationalized as area-wide reviews, public review panels, and concept-based assessments. Although none of these early assessments was formally recognized as SEA, they have much to offer the future of SEA development. It was not until 1990 that SEA was formally established by way of a federal Cabinet Directive and as a separate process from project impact assessment, “making it the first of the new generation of SEA systems that evolved in the 1990s” (Dalal-Clayton and Sadler, 2005: 61). In many respects, however, the formalization of SEA in Canada was a step backwards for impact assessment in general insofar as the Directive created a non-statutory system for policy, plan, and program (PPP) assessment that would remain separate from any legislated environmental assessment process to come. Procedural guidance for SEA was provided in The Environmental Assessment Process for Policy and Programme Proposals (FEARO, 1993), with implementation subject to oversight by the Federal Environmental Assessment Review Office and later the Canadian Environmental Assessment Agency.
Critiqued for inconsistencies and inadequacies in its application, a revised Directive was issued in 1999 to strengthen the role of SEA in PPP decision making and to clarify the obligations of federal departments and agencies. From 2000 onward SEA experienced considerable growth. This new era of SEA, however, is in sharp contrast to the conceptual, public, and area-wide reviews conducted under EARP; SEA under the Directive is narrowly focused on the implications of federal government initiatives and confidential memoranda submitted to Cabinet. It was not until January 2004 that Canadian federal departments and agencies were required to prepare a public statement whenever a full SEA had been completed. Outside the federal process, SEA is practiced largely on an ad hoc basis, and much less is known about assessment experiences, frameworks, and outcomes. As such, notwithstanding decades of SEA development in Canada, there remains only limited knowledge of the diverse nature and scope of SEA systems and practices and the value added to PPP development and decision making. Aside from selected reviews of individual applications under the Canadian federal system (e.g. Auditor General, 2004, Hazell and Benevides, 1998, Noble, 2004, Noble, 2003 and Sadler, 2005), there has not been an examination of Canadian SEA models and frameworks that includes both formal and informal applications across a range of federal and provincial PPP initiatives. In response, this paper provides a critical review of formal and informal SEA systems and practices in Canada. More specifically, the objective is to present and evaluate a range of SEA case applications, characteristic of a variety of SEA models and frameworks, with a view to understanding how each incorporated a number of proposed SEA principles and design criteria and contributed to improved decision making.
The case analysis is based on work completed by the author for the Canadian Minister of Environment's Regulatory Advisory Committee, Sub-Committee on SEA (herein referred to as the SEA Sub-Committee), in preparation for the review of the Canadian SEA Directive—the Canadian Cabinet Directive on the Environmental Assessment of Policy, Plan, and Program Proposals. Lessons learnt from the case reviews, together with the Directive review, will set the stage for discussions concerning the ‘next generation’ of SEA in Canada. This paper is presented in five sections, including the Introduction. In the sections that follow the study approach and review framework and criteria are presented. This is followed by a critical review of selected Canadian SEA experiences, and the results of the review framework application. The paper concludes with a number of observations concerning the state of SEA systems and practices in Canada, and opportunities and challenges for the next generation of SEA.
English Conclusion
Based on the case studies, and in considering the lessons learnt from SEA experiences reported elsewhere in the literature, a number of observations are ventured concerning the state of SEA and its development in Canada. Some of these observations point toward critical decisions and actions that must be taken if SEA is to advance; others address general challenges to SEA systems and practices that have plagued environmental assessment in general since its inception. First, the notion of a strategic-level assessment is not new to Canada, but skepticism remains as to the benefits of SEA. In large part, this skepticism is due to a lack of common understanding of the roles SEA can and should play in decision making, the limited availability of tested methodological frameworks, and, perhaps most significantly, a lack of cases clearly demonstrating the added value of SEA to PPP development or downstream assessment. The cumulative result is difficulty in conceptualizing how to apply SEA and, when applied, applications that often fall short of expectations. That being said, ‘SEA-type’ practices are ongoing in Canada, many of which carry no SEA label but are based, purposefully or not, on relatively sound principles and methodology. This suggests that there must be some real benefits to SEA; the problem is that very little is known about such applications, because SEA exists nowhere in a formal context outside of the federal Directive. In those instances where SEA is applied in compliance with the Directive, there is a greater tendency for its application to be perceived as something that “must be done”, an ad hoc exercise in policy review with limited influence over, or contribution to, PPP development or downstream actions. Given the current state of development and understanding of SEA in Canada, there is not a definitive answer as to whether a more formalized and legislated SEA would be advantageous.
The assessment of PPPs in Canada was separated from formal EA with the development and implementation of the Canadian Environmental Assessment Act, enacted in 1992 and brought into force in 1995. There is little evidence to suggest that a federally legislated requirement for SEA would translate to ‘better’ assessment or have any influence over SEA systems and practices under provincially regulated processes or within the private realm of industry. Some of the ‘better’ SEA experiences in Canada to date have neither carried the SEA name tag nor occurred under formal SEA requirements; rather, such cases have been integrated with government or private sector PPP development, often adopting SEA principles and methodology ‘accidentally’, and tailor-made to the particular needs and objectives of the planning system and problem at hand. Second, part of the challenge to realizing the benefits of SEA is the currently limited tiering of strategic- and project-level assessment and decision outputs. Indeed, SEA is often touted as a tool that can complement and support lower-tiered assessments by identifying preferred options and directions for decision making. In practice, however, Canadian SEA remains relatively static, limited to a single tier at a time, and with only marginal input to subsequent assessment processes. While in most cases SEA is intended to influence or guide subsequent actions and decisions, there is often no clear connection between systems of SEA and downstream environmental assessment input requirements. In this sense, the systematic separation of SEA has constrained its ability to effectively influence downstream activities. Third, in those cases where SEA has demonstrated at least some success, it unfolded as an integrated process with PPP development.
Even as a parallel process to planning and decision making, the added value of SEA has proven to be limited in terms of its ability to adequately integrate and coordinate assessment methods, objectives, and outputs with those of the existing planning and decision making process. That being said, an integrated approach does require that SEA become an accepted part of PPP development. This will be difficult to achieve in Canadian practice, where environmental assessment has long been an add-on process or yardstick against which the acceptability of proposals is measured, rather than an integrated decision support tool to develop better ones. Fourth, although SEA is said to be something different from traditional EIA, it is still plagued by many of the same problems—in particular post-decision follow-up and monitoring. The promise of SEA follow-up is a popular theme in recent literature (e.g. Arts and Morrison-Saunders, 2004 and Partidario and Fischer, 2004), but thus far there has been little guidance for real implementation. As such, SEA is still very much an ex ante evaluation and rarely carries over to the post-decision stages to address PPP implementation effects. In the Canadian context, this appears to be more so the result of limited institutional capacity (or desire) to link SEA outputs to subsequent PPP and project inputs rather than the “splash effect” (see Partidario and Arts, 2005) of strategic initiatives per se. Finally, context is critical. Understanding context is essential to understanding the value of SEA. Hildén et al. (2004) argue that different perspectives and understandings of planning and decision making lead to very different views of what SEA is about and what it should be delivering. SEA systems, from the formal to informal, are designed very differently across Canada and intended to serve very different ends.
The nature and characteristics of any SEA system or application are dictated in large part by the institutional framework and how and where SEA fits into the relevant decision making process (Sheate et al., 2001). To return to the CNSOPB offshore Misaine Bank assessment as a case in point, the intent of SEA was to streamline downstream assessment. From an outsider's view the system may seem narrowly focused; however, in practice SEA is operating within the context in which it was intended to operate and under the constraints of the offshore regulatory system. While general guiding principles for SEA may be established, a single best one-size-fits-all framework is not likely to emerge.
5.1. Conclusion
The overall state of SEA in Canada is difficult to assess objectively given that the majority of SEAs at the federal level are not publicly available, there is no central registry of SEAs as there is for EIA, and a large number of SEA applications do not occur under the SEA name tag or under the federal SEA framework. However, based on known experiences to date, Canadian SEA systems and practices are diverse, far from consolidated in scope and function, and encompass a range of models and practices. As such, there is considerable variability in outcomes and expectations. In part, this may be due to how SEA was introduced and evolved in Canada—as a ‘good concept’, but one that lacked the necessary methodological guidance and institutional support. Canadian SEA is currently characterized by methodological and institutional pluralism, the boundaries of which are not well defined. There is no single model of SEA that can be unequivocally applied to all SEA systems and practices under the various PPP regulatory systems that exist; rather, attention must be given to custom-designed SEA sensitive to the tier of application and to the specific nature, objectives, and constraints under which SEA is operating.
Bina (2003) goes so far as to argue for the conceptualization of SEA at the level of organisations, not of PPP tiers or of economic sectors alone, framing the purpose of SEA by how it fits the decision framework. This is not to say that ‘good practice’ SEA should not be defined by an agreed-upon set of principles and criteria, but rather that SEA operates in diverse forms, under a range of institutional and methodological frameworks and expectations; evaluations must be sensitive to context. In conclusion, the results of this study suggest that SEA practice is ongoing in Canada, both formal and informal, but under varied systems and frameworks and with mixed success. In light of the upcoming review of the Canadian federal SEA system and Cabinet Directive, there is a need to examine a larger number of cases across a broader range of sectors, at different tiers of decision making, and under both formal and informal SEA systems in order to gain a comprehensive understanding of SEA experiences, the state of the art, and requirements for the next generation of SEA. To date, the Canadian SEA report card might read as follows: considerable promise, but falling short of its full potential.