Download ISI English article No. 26028
Article title

Parameter uncertainty effects on variance-based sensitivity analysis
Article code | Publication year | Number of pages of the English article
26028 | 2014 | 8 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Reliability Engineering & System Safety, Volume 94, Issue 2, February 2009, Pages 596–603

Keywords
Variance-based sensitivity analysis, Parameter uncertainty, Variance decomposition, Linear-in-parameter model
Article preview: Parameter uncertainty effects on variance-based sensitivity analysis

English abstract

In the past several years there has been considerable commercial and academic interest in methods for variance-based sensitivity analysis. The industrial focus is motivated by the importance of attributing variance contributions to input factors. A more complete understanding of these relationships enables companies to achieve goals related to quality, safety and asset utilization. In a number of applications, it is possible to distinguish between two types of input variables: regressive variables and model parameters. Regressive variables are those that can be influenced by process design or by a control strategy. With model parameters, there are typically no opportunities to directly influence their variability. In this paper, we propose a new method to perform sensitivity analysis through a partitioning of the input variables into these two groupings: regressive variables and model parameters. A sequential analysis is proposed, where a sensitivity analysis is first performed with respect to the regressive variables. In the second step, the uncertainty effects arising from the model parameters are included. This strategy can be quite useful in understanding process variability and in developing strategies to reduce overall variability. When this method is used for nonlinear models which are linear in the parameters, analytical solutions can be utilized. In the more general case of models that are nonlinear in both the regressive variables and the parameters, either first-order approximations or numerically intensive methods must be used.
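
As a rough illustration of the sequential strategy described above (a minimal sketch, not the paper's own algorithm), the snippet below evaluates a toy model that is linear in its parameters twice: once with the parameters held at assumed nominal values, and once with the parameters also sampled from an assumed covariance. The regressor map H(X), the nominal values theta_nom and the covariance Sigma_theta are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def H(x):
    """Polynomial regressor vector H(X) of a toy model that is linear in the parameters."""
    return np.stack([x[:, 0], x[:, 1] ** 2, x[:, 0] * x[:, 1]], axis=1)

n = 200_000
X = rng.uniform(-1.0, 1.0, size=(n, 2))       # regressive variables (assumed ranges)
theta_nom = np.array([1.0, 0.5, -0.3])        # assumed nominal (expected) parameter values
Sigma_theta = np.diag([0.05, 0.02, 0.01])     # assumed parameter covariance

# Step 1: output variance with the parameters fixed at their nominal values
var_fixed = (H(X) @ theta_nom).var()

# Step 2: re-evaluate with parameter uncertainty included
Theta = rng.multivariate_normal(theta_nom, Sigma_theta, size=n)
var_uncertain = np.einsum("ij,ij->i", H(X), Theta).var()

print(f"variance, parameters at nominal values: {var_fixed:.4f}")
print(f"variance, parameters uncertain:         {var_uncertain:.4f}")
```

The gap between the two numbers is the additional output variability attributable to parameter uncertainty, which is the quantity the second step of the proposed analysis characterizes.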

English introduction

Sensitivity analysis (SA) methods are well known in the engineering literature. They have been used across various disciplines such as weather forecasting, chemical engineering and economics. Readers are referred to [1], [2], [3] and [4] and the references contained therein for recent reviews of these topics and a number of extensions and modifications. The objective of SA is to ascertain how the model output depends on its input factors. Three general classes of SA techniques can be defined [4]:

• factor screening methods (e.g. one-at-a-time (OAT) experiments, factorial designs);
• local methods (e.g. differential or nominal value analysis);
• global methods (e.g. Monte Carlo analysis, FAST and Sobol's method).

Screening methods are typically qualitative tools providing only a ranking of the importance of the regressors/parameters. Screening methods can usually be further characterized as retaining properties of either local or global methods. One advantage of the screening methods is that they are computationally efficient. Their disadvantage is that they tend to address only a specific point, or local region, in the regressor/parameter space. Mostly based on partial derivatives, local methods usually embody a univariate assessment framework among the regressors/parameters. As such, they are valid only in a small range taken about the nominal values of the regressors/parameters. Global methods apportion the total variance of the output to each input factor and the interactions among the input factors. All input factors are allowed to vary simultaneously over their ranges, taking into account the shapes of their probability density functions. Global methods are far more computationally demanding and involve various methods of sampling the input factor space (e.g. random sampling, quasi-random sampling, Latin hypercube sampling (LHS), etc.) [5]. A review of global SA methods, including Monte Carlo based regression–correlation measures and variance-based methods, can be found in [3], [6] and [7].

Consider a system described by a uniresponse model of the form

Y = f(X, Θ),     (1)

where Y is the value of the output or response variable, and [X, Θ] are the input factors. f is the model that maps the input factors to the output. The input factors have been partitioned into two sets: X is an m×1 vector of regressive variables and Θ is a p×1 vector of model parameters. The regressive variables would typically correspond to the input settings of the model. In chemical engineering, these might correspond to material flow rates, temperatures, pressures, etc. The parameters are also inputs, but they are fundamentally different from the regressive inputs. In chemical engineering, these inputs include the physicochemical parameters of the model, such as those related to reaction kinetics and thermodynamic equilibria. Most often the physicochemical parameters are estimated from experimental data or they are obtained from standard correlations. In either case, there is always uncertainty in these values. A variance-based SA for the model in Eq. (1) that accounts for uncertainties in both the regressors, X, and the parameters, Θ, can be conducted in a straightforward fashion using correlation ratio-based techniques [8] and [9], FAST [10], [11] and [12], or Sobol' techniques [13] and variants on these methods. (In this paper we will utilize the latter two approaches.) Unfortunately, the full SA may not always provide entirely satisfactory results.
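
For readers unfamiliar with the variance-based techniques cited above, the sketch below implements a standard Monte Carlo pick-freeze estimator of first-order Sobol' indices for a model Y = f(X, Θ) treated as a single vector of input factors. It is generic illustrative code, not an implementation from this paper, and the Ishigami test function and uniform sampling ranges are common benchmark choices rather than anything used by the authors.

```python
import numpy as np

def first_order_sobol(f, sampler, n=100_000, rng=None):
    """Monte Carlo estimate of first-order Sobol' indices via the pick-freeze scheme.

    f       : vectorized model mapping an (n, d) array of input factors to n outputs
    sampler : draws an (n, d) sample of the input factors from their joint density
    """
    rng = rng or np.random.default_rng(0)
    A, B = sampler(n, rng), sampler(n, rng)          # two independent input samples
    yA, yB = f(A), f(B)
    var_y = np.var(np.concatenate([yA, yB]))         # total output variance
    s = np.empty(A.shape[1])
    for i in range(A.shape[1]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # "freeze" factor i at its value from B
        s[i] = np.mean(yB * (f(ABi) - yA)) / var_y   # Saltelli-type first-order estimator
    return s

# Ishigami benchmark (analytical first-order indices are roughly 0.31, 0.44, 0.00)
def ishigami(z):
    return np.sin(z[:, 0]) + 7.0 * np.sin(z[:, 1]) ** 2 + 0.1 * z[:, 2] ** 4 * np.sin(z[:, 0])

uniform_sampler = lambda n, rng: rng.uniform(-np.pi, np.pi, size=(n, 3))
print(first_order_sobol(ishigami, uniform_sampler))
```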
In some cases, understanding and quantifying the regressor uncertainty may have a higher priority than the parameter uncertainty. For example, in controller performance assessment [14], a variance decomposition of important process variables is undertaken with respect to measured or unmeasured variables. The results can be directly used to assist in output variance reduction. After the preliminary SA is completed, the impact of parameter uncertainty effects on these results might be considered.

In this paper, the SA is approached in a sequential fashion. First, the sensitivity with respect to the regressor variables (the X's) is determined using the nominal or expected values for the parameters. We call this regressive sensitivity analysis (RSA). The analysis with respect to the parameters is then undertaken using the results from the first step to determine the effects of the parameter uncertainty. This approach relies on a fundamental variance decomposition identity [15, p. 55] (see Eq. (14)). Considerable simplification arises when a model is linear in the parameters, or can be adequately approximated as such. Polynomial models are quite simple, typically involving a bilinear product of the form f(X, Θ) = H^T(X)Θ, where H(X) is a polynomial in the regressive variables. If the model is linear in its parameters, the effects of the parameter uncertainties on the RSA can be deduced using the RSA and the first two moments of the distribution of the parameter uncertainties. When the model cannot be partitioned in this fashion, computationally intensive methods, such as Monte Carlo techniques, must be applied.

The layout of the paper is as follows. In Section 2, the global variance-based SA is introduced. In Section 3, an approach for incorporating the parameter uncertainty effects on the RSA for general nonlinear models is outlined. In most cases, numerically intensive methods are required to account for these uncertainties. However, for models that are linear in the parameters, analytical solutions are readily obtained. Some simple methods to cope with the more general case are proposed. In Section 4, two examples are used to illustrate the methodology. This is followed by the conclusions and a discussion of areas for future work.
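
Eq. (14) is not reproduced in this preview, but the fundamental variance decomposition identity referred to above is presumably the law of total variance. Under that assumption, the sketch below shows how conditioning on the parameters and then specializing to a linear-in-parameter model f(X, Θ) = H^T(X)Θ, with parameter mean μ_Θ and covariance Σ_Θ, yields the variance at the nominal parameters plus a trace correction:

```latex
% Law of total variance, conditioning on the parameters \Theta:
\operatorname{Var}(Y)
  = \mathbb{E}_{\Theta}\!\bigl[\operatorname{Var}_{X}(Y \mid \Theta)\bigr]
  + \operatorname{Var}_{\Theta}\!\bigl(\mathbb{E}_{X}[Y \mid \Theta]\bigr)

% For a linear-in-parameter model Y = H(X)^{\mathsf T}\Theta, the conditional
% variance is a quadratic form in \Theta:
\operatorname{Var}_{X}(Y \mid \Theta)
  = \Theta^{\mathsf T}\,\operatorname{Cov}_{X}\!\bigl(H(X)\bigr)\,\Theta

% Averaging over \Theta with mean \mu_\Theta and covariance \Sigma_\Theta gives
% the variance at the nominal parameters plus a trace correction:
\mathbb{E}_{\Theta}\!\bigl[\operatorname{Var}_{X}(Y \mid \Theta)\bigr]
  = \mu_\Theta^{\mathsf T}\,\operatorname{Cov}_{X}\!\bigl(H(X)\bigr)\,\mu_\Theta
  + \operatorname{tr}\!\bigl(\operatorname{Cov}_{X}\!\bigl(H(X)\bigr)\,\Sigma_\Theta\bigr)
```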

English conclusion

Methods to include parameter uncertainty effects in the RSA have been considered in this paper. Our approach is fundamentally different from the full variance-based SA. We are interested in performing the analysis with respect to the original set of input (regressor) variables, and then accounting for uncertainties in the model parameters. For models that are nonlinear in both the regressors and the parameters, Monte Carlo type methods are generally required to undertake the analysis. When models are nonlinear in the regressor terms and linear in the parameters, explicit expressions for the parameter uncertainty effects have been derived. Modifications to both Sobol's and FAST methods have been developed to find the conditional partial variance matrices M_{i1,i2,…,is} described in Eq. (21). The effects of parameter uncertainties on the sensitivity indices are described by the uncertain parameter partial variance (UPPV), which is the combination of the constant parameter partial variance (CPPV), given the expected values of the parameters, and the second-order statistical properties that describe the parameter uncertainty. The difference between the CPPV and the UPPV can be analytically expressed as the term tr(M_{i1,i2,…,is} Σ_Θ). Our approach can be used for SA of many empirical models, since many of these are linear in the parameters. A number of special cases have been considered, and an extension of this method to approximate the effect of parameter uncertainties in models that are nonlinear in the parameters has been proposed.

Several related issues require further investigation: (i) in most applications there are some correlations among the input factors, requiring the use of more flexible sampling methods, and (ii) further development of methods to account for nonlinearities in the parameters is needed. A simple modification has been proposed in this paper to enable the results to be used on nonlinear models. However, it is known in the statistics literature [28] that the impact of nonlinearities can significantly affect the statistical analysis of parameter uncertainty. Sophisticated mathematical methods to determine when this might be an issue have been developed, and it would be interesting to adapt their development to our approach.
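
The algebraic core of the CPPV-to-UPPV correction quoted above is the standard identity E_Θ[Θ^T M Θ] = μ_Θ^T M μ_Θ + tr(M Σ_Θ) for a symmetric matrix M. The short check below verifies this identity numerically; the matrix M, the mean μ_Θ and the covariance Σ_Θ are arbitrary illustrative values, not quantities taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative quantities (not from the paper): a symmetric positive
# semi-definite matrix M, a parameter mean mu, and a parameter covariance Sigma.
A = rng.standard_normal((3, 3))
M = A @ A.T
mu = np.array([1.0, -0.5, 0.25])
Sigma = np.diag([0.04, 0.09, 0.01])

# Monte Carlo estimate of E_Theta[Theta^T M Theta]
Theta = rng.multivariate_normal(mu, Sigma, size=500_000)
mc_estimate = np.einsum("ij,jk,ik->i", Theta, M, Theta).mean()

# Closed form: quadratic form at the mean plus the trace correction tr(M Sigma)
closed_form = mu @ M @ mu + np.trace(M @ Sigma)

print(f"Monte Carlo : {mc_estimate:.4f}")
print(f"Closed form : {closed_form:.4f}")
```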