An approach based on Hotelling's test for multicriteria stochastic simulation optimization
|Article code||Publication year||English paper||Persian translation||Word count|
|9758||2000||15-page PDF||available to order||not calculated|
Publisher : Elsevier - Science Direct
Journal : Simulation Practice and Theory, Volume 8, Issue 5, 15 December 2000, Pages 341–355
In a stochastic simulation context, iterative optimization methods that compare, at each step of the optimization procedure, two different values of the objective function require statistical tests to evaluate and compare the simulation results properly. However, when the objective function to be optimized is a multicriteria function involving several performance measures, classical statistical procedures, which do not take the correlation between the performance measures into account, may reject acceptable solutions. To avoid this, we propose an efficient and rigorous statistical procedure already used in multicriteria contexts: Hotelling's T² procedure. This paper shows that this procedure is very well adapted to the simultaneous comparison of several criteria in a stochastic simulation–optimization context.
A stochastic simulation model is a model with random input variables. Consequently, its performance measures are also random variables, which may be evaluated through their means μ. Each run of the simulation model, however, gives only an estimate of the true value of μ; to evaluate the precision of this estimate, a confidence interval on μ must be computed. Frequently, a simulation model is associated with a measure of system performance, f(z), which has to be optimized. Since iterative methods require only point-by-point evaluations of the objective function, and no derivatives, they are often used in simulation–optimization problems where the simulation model is complex and cannot be expressed analytically in terms of the model variables. Popular iterative methods are based:

- either on direct search methods, such as the Hooke and Jeeves method, where a comparison is made, at each new evaluation of the objective function, between the current optimum (i.e., the best value obtained up to that iteration) and the challenger point (i.e., the new value of the objective function at that point);
- or on random optimization techniques such as simulated annealing, tabu search, or evolutionary strategies. These methods likewise require a comparison, at each step of the optimization procedure, between a reference value and a challenger value.

In stochastic simulation, properly comparing two solutions requires statistical tests such as the comparison of two sample means. This is appropriate when comparing one criterion over two samples to determine whether the two samples belong to the same population, i.e., whether the observed criterion has the same mean for both samples, assuming both samples have the same variance. But when the objective function is a multicriteria function, this approach breaks down. The aims of this paper are:

1. to highlight the theoretical problems caused by the simultaneous comparison of several criteria in a stochastic simulation–optimization environment;
2. to propose two solutions to these problems. The first solution has already been discussed in earlier work; the second, more general one is presented in this paper. We compare this new solution with the classical procedures using an example based on a manufacturing system model.
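The classical per-criterion comparison described above can be sketched as follows. This is a minimal illustration, not code from the paper: the function name `compare_per_criterion` and the data layout (one row per simulation replication, one column per performance criterion) are assumptions. It shows the weakness the paper addresses: each criterion is tested with its own two-sample t-test, so the correlation between criteria is ignored.

```python
import numpy as np
from scipy.stats import ttest_ind

def compare_per_criterion(incumbent, challenger):
    """Classical procedure: one two-sample t-test per criterion.

    incumbent, challenger: (n_i, p) arrays with one row per simulation
    replication and one column per performance criterion.  Equal variances
    are assumed, as in the classical procedure described above.
    Returns the p per-criterion p-values; testing each criterion separately
    ignores the correlation between the criteria.
    """
    p = incumbent.shape[1]
    return np.array([ttest_ind(incumbent[:, j], challenger[:, j]).pvalue
                     for j in range(p)])
```

A solution would then be rejected as soon as any single criterion looks significantly worse, even when the solution is acceptable overall, which is exactly the behavior the multivariate test of this paper avoids.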
Conclusion
In this paper, we have pointed out the drawbacks of the classical statistical procedures used to compare two samples in stochastic simulation–optimization. We have shown that a major drawback is that, in a multicriteria context, the classical procedures often lead to the rejection of acceptable solutions. To cope with these drawbacks, we propose an efficient and rigorous statistical procedure already used in multicriteria contexts, and this paper shows that it is appropriate when several criteria must be compared simultaneously in stochastic simulation–optimization. Hotelling's T² procedure, an extension of the well-known Student's t-test to the multicriteria case, avoids rejecting a solution when some criteria are significantly improved while no other criterion is significantly degraded. This procedure therefore accepts more solutions than the classical statistical tests and thus improves the quality of the optimization procedure. Moreover, Hotelling's T² procedure takes the correlation between all performance criteria into account without assigning weights to these variables (such weights are used when the global objective function is expressed as a weighted sum of all the performance criteria). The procedure does require the assumptions of normality and equal within-sample variability; nevertheless, it is robust, particularly for large sample sizes.
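The two-sample form of Hotelling's T² test discussed in the conclusion can be sketched as follows. This is a standard textbook formulation, not the paper's own code: the function name `hotelling_t2_two_sample` and the significance level are assumptions. The statistic pools the two sample covariance matrices, so the correlation between criteria enters through the inverse of the pooled covariance, and the T² value is converted to an F statistic for the test.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2_two_sample(x, y, alpha=0.05):
    """Two-sample Hotelling's T² test: do the mean vectors of x and y differ?

    x, y: (n_i, p) arrays with one row per simulation replication and one
    column per performance criterion.  Assumes multivariate normality and
    equal covariance matrices in the two samples, as stated above.
    """
    n1, p = x.shape
    n2 = y.shape[0]
    d = x.mean(axis=0) - y.mean(axis=0)            # difference of mean vectors
    # Pooled within-sample covariance matrix (carries the correlations).
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False)
                + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s_pooled, d)
    # T² relates to an F distribution with (p, n1 + n2 - p - 1) d.o.f.
    f_stat = (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p) * t2
    p_value = 1.0 - f.cdf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value, p_value < alpha
```

Because a single statistic summarizes all p criteria at once, no per-criterion weights are needed, which mirrors the property highlighted in the conclusion.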