A CDF-based method of sensitivity analysis for ranking the influential parameters in the total-system performance assessment of the proposed high-level waste repository at Yucca Mountain, Nevada, USA
|Article code||Publication year||English article||Persian translation||Word count|
|25572||2001||10-page PDF||available on order||5,954 words|
Publisher : Elsevier - Science Direct (الزویر - ساینس دایرکت)
Journal : Reliability Engineering & System Safety, Volume 73, Issue 2, August 2001, Pages 167–176
A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total-system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the input parameters with the greatest influence on the model output. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of its license application. The model uses a probabilistic framework, implemented through Monte Carlo or Latin hypercube sampling (LHS), to propagate the uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which those that significantly influence the response, or the uncertainty of the response, must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters, with a domain of integration defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity depends on the magnitude of the response. The method allows sensitivity to be calculated over a wide range of the response and is not limited to the mean value.
The US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) recently developed a general computer model for conducting total-system performance assessments of the proposed high-level waste repository at Yucca Mountain (YM) in Nevada, USA. This computer model is a tool for evaluating the performance assessments conducted by the US Department of Energy (DOE) in support of its license application. The relationship between the output and the input parameters in such a model is highly nonlinear and involves strong interactions among the input parameters. Since it is often difficult to describe precisely the governing physical processes, the coupling among these processes, the parameters defining the physical system, and the evolution of the repository system over the long time period of interest (TPI), significant uncertainties exist in the computer model. The performance assessment model is therefore designed so that these uncertainties can be analyzed by using alternative conceptual models and assigning uncertainties to model parameters. The performance assessment model also has a large number of input parameters that are described by ranges and probability distribution functions representing uncertainty and variability. Sensitivity analysis of the performance assessment model is conducted to determine the uncertainty in the output due to uncertainties in the model (not considered in this paper) and in the input parameters, and to determine the most influential input parameters that control the behavior of the output.
Knowledge of the most influential input parameters is important because it can be used for (i) providing insight into where more effort should be devoted to reducing the uncertainties of the output, thereby significantly improving understanding of the system, (ii) comparing results from calculations conducted by different research groups working on the same problem, and (iii) aiding design improvements by iteratively refining the design criteria or requirements to reduce vulnerability to a particular design parameter. The methods available for identifying influential parameters, such as linear and stepwise regression, non-parametric approaches, one-at-a-time approaches, response surface analysis, and the Fourier amplitude sensitivity test (FAST), have their advantages and disadvantages. For example, while linear regression-based sensitivity analysis methods provide a ranking of the parameters regardless of the values of the response (an advantage), these methods are most suitable when the response is approximately a linear function of the input parameters (a disadvantage). Similarly, while a one-at-a-time approach relates output sensitivity to an input parameter in an unambiguous way, it may not capture interaction effects well. The limitations of other methods are documented in Mohanty et al. and Lu and Mohanty. This paper presents an alternative approach to identifying and ranking influential parameters based on the sensitivity of the cumulative distribution function (CDF) of the model response to the parameter distributions. The approach does not assume a linear or other explicit functional relationship between the response and the input parameters, and the sensitivity depends on the magnitude of the response. A key feature of this method is that it allows sensitivity to be calculated over a wide range of the response and is not limited to the mean value.
This approach is more general and provides more information than the regression-based methods. In the alternative approach, the responses and the corresponding random samples of the parameters are ordered so that for each selected response percentile, a corresponding subset of the ordered samples can be identified. Based on a reliability sensitivity concept, the response CDF is defined as the integral of the joint probability density function of the parameters, with a domain of integration defined by a subset of the samples. The response CDF sensitivities are then calculated from the derivatives of the probability integral. The derivatives are statistically estimated from the samples and used to identify and rank the influential random variables. Section 2 presents the major functions and components of the performance assessment model for a geologic high-level nuclear waste repository, to give the reader background on the computer model to be analyzed. Section 3 presents the details of the CDF-based method used to conduct sensitivity analysis on the computer model. The application of the CDF-based method to the performance assessment model is presented in Section 4, along with verification of the results, and the conclusions are presented in Section 5.
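The derivative-of-the-probability-integral idea above can be sketched in a few lines of code. A minimal sketch, assuming independent normal input parameters and a toy nonlinear model (the TPA code itself is not reproduced here): the derivative of the response CDF at a chosen level is estimated as the sample average of the log-density gradient over the subset of samples whose response falls below that level, and the resulting sensitivities are used to rank the variables. The function and model names are hypothetical.

```python
import random

def cdf_sensitivities(samples, responses, y_level, mu, sigma):
    """Estimate dF_Y(y)/dmu_i and dF_Y(y)/dsigma_i for independent normal
    inputs: the derivative of the probability integral is the sample
    average of the log-density gradient, taken over the subset of samples
    whose response does not exceed y_level (the score-function estimator)."""
    n, k = len(samples), len(mu)
    d_mu, d_sigma = [0.0] * k, [0.0] * k
    for x, y in zip(samples, responses):
        if y <= y_level:  # sample lies in the domain of integration
            for i in range(k):
                z = (x[i] - mu[i]) / sigma[i]
                d_mu[i] += z / sigma[i]                  # d ln f / d mu_i
                d_sigma[i] += (z * z - 1.0) / sigma[i]   # d ln f / d sigma_i
    # scale by sigma_i so the sensitivities are dimensionless and comparable
    s_mu = [sigma[i] * d_mu[i] / n for i in range(k)]
    s_sigma = [sigma[i] * d_sigma[i] / n for i in range(k)]
    return s_mu, s_sigma

# toy nonlinear model with three standard-normal inputs; x0 dominates
random.seed(1)
mu, sigma = [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]
samples = [[random.gauss(m, s) for m, s in zip(mu, sigma)] for _ in range(5000)]
responses = [5.0 * x[0] + x[1] ** 2 + 0.1 * x[2] for x in samples]
y50 = sorted(responses)[len(responses) // 2]  # CDF = 0.5 response level
s_mu, s_sigma = cdf_sensitivities(samples, responses, y50, mu, sigma)
ranking = sorted(range(3), key=lambda i: -abs(s_mu[i]))
print(ranking)  # x0 should rank first
```

Repeating the estimate at several `y_level` values (different response percentiles) gives the sensitivity profile over the whole CDF that the paper exploits, rather than a single mean-centered number.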
Conclusion (English)
The CDF-based method was used to conduct a sensitivity analysis on a computer model that performs a TPA of the proposed HLW repository at Yucca Mountain, Nevada, USA. A description of the method was first presented, and several key properties and performance measures of the CDF-based method were investigated. One feature of this method is that it allows sensitivity to be calculated at any designated level of the response CDF and is not limited to the mean value. One thousand TPA code realizations were used to demonstrate the applicability of the method for conducting sensitivity analysis on a large and complex problem. Two sensitivity measures, Sμ (CDF sensitivity with respect to the mean) and Sσ (CDF sensitivity with respect to the standard deviation), were discussed. These measures provide a useful analysis tool for exploring a repository design by identifying the influential input variables and by investigating the potential impact of the uncertainties in the input variables on the performance CDF. The results of the TPA basecase show that the top influential variables may differ at different performance CDF levels. For example, a variable that is very sensitive at CDF=0.1 may be insensitive at CDF=0.9. In general, the ranking of random variables at the tail CDF levels can differ from the ranking obtained around the mean, which is what other sensitivity analysis methods typically use. Based on the average of the Sσ sensitivities taken over the entire CDF, the method identified the four most influential variables ((1) ARDSAVNp, (2) ARDSAVPu, (3) ARDSAVAm, and (4) SbArWt%) for a 50,000-year TPI. Performance variability was expected to be most sensitive to these four variables. When this was verified through the reduction of variance in the performance obtained by fixing the variables at their mean values, the performance standard deviation dropped to 6% after fixing the top four variables and to 2% after fixing the top 10 variables.
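The variance-reduction check described above can be illustrated with a small sketch, assuming a hypothetical stand-in model in place of a TPA realization: fix the top-ranked variable at its mean, re-evaluate the model on the same samples, and compare the output standard deviations. A large drop confirms that the fixed variable drives the output spread, which is the verification logic used for the TPA basecase.

```python
import random
import statistics

def model(x):
    # hypothetical stand-in for a TPA realization; x0 is the dominant input
    return 10.0 * x[0] + x[1] + 0.5 * x[2]

random.seed(2)
n, k = 2000, 3
base = [[random.gauss(0.0, 1.0) for _ in range(k)] for _ in range(n)]
std_full = statistics.pstdev(model(x) for x in base)

# fix the top-ranked variable (x0 here) at its mean and re-evaluate
fixed = [[0.0] + x[1:] for x in base]
std_fixed = statistics.pstdev(model(x) for x in fixed)

ratio = std_fixed / std_full
print(round(ratio, 2))  # small ratio = large variance reduction
```

Repeating this after fixing the top four, then the top ten, ranked variables reproduces the kind of 6% and 2% residual-spread figures reported for the TPA basecase, under the stated toy-model assumption.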
The TPA sensitivities were computed from 1000 LHS samples. However, the method can be applied with regular Monte Carlo or other random sampling methods, and a much smaller number of samples may be sufficient as long as the calculated sensitivity is statistically significant. In fact, it was found that for the TPA basecase, 100 LHS samples were sufficient to identify the same top five variables. This suggests that, in general, an integrated LHS-sensitivity analysis approach can be designed to adaptively determine and minimize the number of samples needed for sensitivity analysis. It should be cautioned that the TPA analysis is presented in this paper to demonstrate how the most influential variables may be efficiently and effectively identified in a large and complex problem; it is not intended, at this stage, to make suggestions about which variables may be most influential for the proposed repository at YM. Future efforts will focus on determining the combined influence of two or more statistically correlated parameters and on developing other useful sampling-based sensitivity measures.
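For readers unfamiliar with why LHS can achieve comparable rankings with fewer samples than plain Monte Carlo: each variable's range is divided into equal-probability strata, exactly one point is drawn per stratum, and the strata are shuffled independently per variable, so even a small sample covers every variable's full range. A minimal sketch on the unit hypercube (the helper name is our own):

```python
import random

def lhs_uniform(n_samples, n_vars, rng):
    """Latin hypercube sample on [0,1)^n_vars: split each variable's range
    into n_samples equal strata, draw one point per stratum, and shuffle
    the strata independently for each variable."""
    cols = []
    for _ in range(n_vars):
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)  # decorrelate strata order across variables
        cols.append(strata)
    return [list(row) for row in zip(*cols)]  # rows are sample points

rng = random.Random(0)
X = lhs_uniform(10, 2, rng)
# by construction, each variable has exactly one point in each tenth of [0, 1)
for j in range(2):
    bins = sorted(int(x[j] * 10) for x in X)
    print(bins == list(range(10)))
```

To sample non-uniform parameter distributions, each uniform coordinate would be pushed through the corresponding inverse CDF, which preserves the one-point-per-stratum property in probability space.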