Download English ISI Article No. 25660

English Title
An optimization algorithm based on stochastic sensitivity analysis for noisy objective landscapes
Article Code: 25660
Publication Year: 2003
Pages: 7-page PDF
Source

Publisher: Elsevier - ScienceDirect

Journal: Reliability Engineering & System Safety, Volume 79, Issue 2, February 2003, Pages 245–252

Keywords
Optimization algorithm, Stochastic sensitivity, Gradient method, Traveling salesman problem

Abstract

A function minimization algorithm that updates solutions based on approximated derivative information is proposed. The algorithm generates sample points by adding Gaussian white noise to the current solution and approximates derivatives through stochastic sensitivity analysis. Unlike standard trust region methods, which compute gradients from n or more sample points (where n is the number of variables), the proposed algorithm allows the number of sample points M to be smaller than n. Furthermore, it ignores small amounts of noise within a trust region. This paper addresses two questions: how badly does the derivative approximation degrade when the number of sample points is small, and can the algorithm find good solutions with inexact derivative information when the objective landscape is noisy? Intensive numerical experiments with quadratic functions show that the algorithm can approximate derivatives when M is about n/10 or more. Experiments with a formulation of the traveling salesman problem show that the algorithm can find reasonably good solutions for noisy objective landscapes despite inexact derivative information.
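For orientation, the first-order relation behind such an estimator can be written out explicitly. The display below is a reconstruction from the abstract's description (unit-variance Gaussian perturbations, M sample points), not the paper's own notation:

    f(x + \varepsilon_i) - f(x) \approx \nabla f(x)^{\top} \varepsilon_i,
        \qquad \varepsilon_i \sim \mathcal{N}(0, I_n), \quad i = 1, \dots, M,

and, since \mathbb{E}[\varepsilon \varepsilon^{\top}] = I_n,

    \nabla f(x) \approx \frac{1}{M} \sum_{i=1}^{M}
        \bigl( f(x + \varepsilon_i) - f(x) \bigr) \, \varepsilon_i .

Written this way, one estimate needs only M + 1 function evaluations regardless of n, which is what makes M < n possible; the price is that the estimate becomes noisier as M shrinks.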

Introduction

Optimization problems that seek the minimization (or, equivalently, maximization) of an objective function are of practical importance in many areas. Once a task is modeled as an optimization problem, general optimization techniques such as linear programming and gradient methods become applicable. Difficulties arise, however, when the objective function is non-differentiable or when it is defined as a procedure. The example problem considered in this paper, which involves such a function, is a parametric local search for the traveling salesman problem (TSP), a representative combinatorial optimization problem, in which an objective function of a parameter vector is defined by a heuristic procedure. Another example, studied by the authors [1] but not covered in this paper, is a model involving a step function. In both cases, the associated objective landscapes are noisy in the sense that they contain many non-differentiable points and many local minima.

In this paper, an unconstrained function minimization algorithm that updates solutions based on derivative information approximated from sample points is proposed for such noisy landscapes. The assumption of this study is that the number of sample points may be less than the dimension n of the objective function. Note that standard trust region methods require n or more sample points (see [2], [3], [4] and [5] and references therein). For example, direct search methods that approximate derivatives by linear interpolation maintain n+1 non-degenerate points within a trust region, and search methods that use quadratic polynomial interpolation require (n+1)(n+2)/2 non-degenerate points.

To evaluate the practicality of the proposed algorithm, the following questions are helpful:

• How badly does the derivative approximation degrade when the number of sample points is small?
• Can the algorithm find good solutions with inexact derivative information when the objective landscape is noisy?

This paper aims to answer these questions through numerical experiments. The proposed optimization algorithm assumes that the objective function is scaled such that the area covered by a Gaussian distribution of unit variance can serve as a trust region, and it uses stochastic sensitivity analysis to approximate derivatives. The method injects Gaussian white noise into each variable of the current solution and approximates the first derivatives from the injected noise (a minimal sketch of this idea is given after this section). A mathematical framework for the derivative approximation based on stochastic sensitivity analysis is also presented.

The paper is organized as follows. In Section 2, the optimization algorithm is described and the mathematical framework of the derivative approximation is presented. Section 3 reports numerical experiments that address the first question. In Section 4, the algorithm is applied to the TSP with a noisy objective landscape to answer the second question. Finally, Section 5 summarizes the paper.
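As a concrete illustration of the noise-injection idea, here is a minimal Python sketch. The helper name estimate_gradient and its exact form are assumptions made for illustration; the paper's SNR algorithm embeds this kind of estimate in a trust-region update and may differ in detail.

    import numpy as np

    def estimate_gradient(f, x, M, rng):
        # Evaluate the objective at the current solution (baseline).
        f0 = f(x)
        # Inject unit-variance Gaussian white noise into each variable:
        # M sample points, each a perturbed copy of x.
        eps = rng.standard_normal((M, x.size))
        # Observed changes of the objective at the perturbed points.
        df = np.array([f(x + e) for e in eps]) - f0
        # Since E[eps eps^T] = I, correlating the observed changes with
        # the injected noise estimates the first derivatives, even for M < n.
        return eps.T @ df / M

A single estimate costs M + 1 objective evaluations, so the sampling budget is decoupled from the dimension n, which is the property the paper exploits.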

Conclusion

A new function minimization algorithm called SNR was proposed and evaluated on quadratic functions and on an objective function for a new TSP formulation based on parametric local search. A derivative approximation method based on Taylor series expansion was also presented. The proposed algorithm is classified as a trust region method in which derivatives are approximated from sample points around the current solution and the trust region is defined by a Gaussian distribution of unit variance. A unique property of the algorithm is that it allows the number of sample points to be smaller than the number of variables. Intensive numerical experiments have shown that SNR has the following characteristics:

• It can approximate derivatives with fewer sample points than the dimension of the objective function.
• It can converge to reasonably good solutions with inexact derivative information even when the objective landscape is noisy.
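In the spirit of the paper's quadratic experiments, the toy check below compares the estimate from the sketch above against the analytic gradient of f(x) = x·x for several sampling budgets M. The setup (n = 100, the cosine-similarity measure) is an illustrative assumption, not the paper's protocol.

    import numpy as np

    # Reuses estimate_gradient() from the sketch above.
    rng = np.random.default_rng(0)
    n = 100
    f = lambda x: float(x @ x)          # analytic gradient: 2x
    x = rng.standard_normal(n)

    for M in (n, n // 2, n // 10):
        g = estimate_gradient(f, x, M, rng)
        cos = g @ (2 * x) / (np.linalg.norm(g) * np.linalg.norm(2 * x))
        print(M, round(cos, 3))         # alignment with the true gradient

How quickly this alignment decays as M shrinks, and whether M of about n/10 still yields a usable descent direction, is precisely the paper's first experimental question.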