Download English-language ISI article No. 25326
Article title

A note on weak sharp minima in multicriteria linear programming
Article code: 25326
Year of publication: 2012
Length: 8 pages (PDF, in English)
Source

Publisher: Elsevier - Science Direct

Journal: Applied Mathematics Letters, Volume 25, Issue 7, July 2012, Pages 1071–1076

Keywords

Multicriteria linear programming problem; Weak sharp minima; Residual function
Article preview

Abstract

In this note, the set of weak Pareto solutions of a multicriteria linear programming problem (MCLP, for short) is proved to be a set of weak sharp minima for another residual function of MCLP, i.e., the minimum of the natural residual functions of finitely many scalarization problems of MCLP, which is less than the natural residual function of MCLP. This can be viewed as a slight improvement of a result due to Deng and Yang. Some examples are given to illustrate these results.

Introduction

Let $\mathbb{R}^p$ be the $p$-dimensional Euclidean space. Denote by $\mathbb{R}^p_+ = \{x = (x_1, \ldots, x_p) \mid x_i \geq 0,\ i = 1, \ldots, p\}$ the nonnegative orthant and by $\triangle_p = \{\xi = (\xi_1, \ldots, \xi_p) \in \mathbb{R}^p_+ \mid \sum_{i=1}^{p} \xi_i = 1\}$ the unit simplex, where $p$ is a given positive integer. Throughout this paper we use the max-norm on $\mathbb{R}^p$, i.e., $\|x - y\| = \max_{1 \leq i \leq p} |x_i - y_i|$, where $x = (x_1, \ldots, x_p)$ and $y = (y_1, \ldots, y_p) \in \mathbb{R}^p$. For $A \subseteq \mathbb{R}^p$ and $x \in \mathbb{R}^p$, denote by $d(x, A)$ the distance from $x$ to $A$, i.e., $d(x, A) = \inf_{a \in A} \|x - a\|$.

Consider the multicriteria linear programming problem (MCLP, for short)
$$\min\ Cx \quad \text{subject to } x \in P,$$
where $Cx = (\langle c_1, x \rangle, \ldots, \langle c_n, x \rangle)$, $c_i = (c_{i1}, \ldots, c_{im}) \in \mathbb{R}^m$ ($i = 1, \ldots, n$), and $P \subseteq \mathbb{R}^m$ is a nonempty polyhedral convex set. A point $x^* \in P$ is called a weak Pareto solution of MCLP if
$$Cx - Cx^* \notin -\operatorname{int} \mathbb{R}^n_+ \quad \text{for all } x \in P,$$
or equivalently,
$$Cx - Cx^* \in \mathbb{R}^n \setminus (-\operatorname{int} \mathbb{R}^n_+) \quad \text{for all } x \in P.$$
Denote by $E_w$ the set of weak Pareto solutions of MCLP; throughout, we assume that $E_w$ is nonempty. It is clear that $d(Cx, CE_w) = 0$ and $x \in P$ if and only if $x \in E_w$, so $d(Cx, CE_w)$ serves as a natural residual function for MCLP. We say that $E_w$ is a set of weak sharp minima for the function $d(Cx, CE_w)$ if there exists $\tau > 0$ such that
$$d(x, E_w) \leq \tau\, d(Cx, CE_w) \quad \text{for all } x \in P.$$

The concept of a sharp minimum for real-valued functions was introduced in [1]. Weak sharp minima for real-valued functions, as a generalization of sharp minima, were introduced and investigated by Ferris [2]. Weak sharp minima play important roles in mathematical programming: they are closely related to error bounds in convex programming, the sensitivity analysis of optimization problems, and the convergence analysis of certain algorithms (see, for example, [3], [4], [5], [6] and [7]). In [8], Deng and Yang studied the existence of weak sharp minima in MCLP and proved that weak sharp minimality holds for the natural residual function $d(Cx, CE_w)$. It is well known that, among solution approaches for MCLP, scalarization is one of the most analyzed topics, at least from the computational point of view. By the well-known structure of MCLP [9], there exist finitely many vectors in the unit simplex such that $E_w$ is the union of the solution sets of the scalarization problems of MCLP associated with these vectors. Motivated by this, in this note we prove that $E_w$ is a set of weak sharp minima for another residual function of MCLP, namely the minimum of the natural residual functions of these finitely many scalarization problems, which is shown to be less than the natural residual function $d(Cx, CE_w)$. This can be viewed as a slight improvement of a result due to Deng and Yang [8]. We give some examples to illustrate these results.
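To make the definitions above concrete, consider the following toy instance; it is not one of the paper's examples, and the scalarized residual $r_\xi$ written below is only one plausible reading of the residual function described in the abstract.

Let $m = 1$, $n = 2$, $P = [0, \infty) \subseteq \mathbb{R}$, $c_1 = 1$ and $c_2 = 2$, so that $Cx = (x, 2x)$. A point $x^* > 0$ is not weakly Pareto optimal, since any $x \in [0, x^*)$ strictly decreases both criteria, while $x^* = 0$ admits no feasible improvement; hence $E_w = \{0\}$ and $CE_w = \{(0, 0)\}$. Under the max-norm, the natural residual is $d(Cx, CE_w) = \max(|x|, |2x|) = 2x$ for $x \in P$, and $d(x, E_w) = x$, so $E_w$ is a set of weak sharp minima for the natural residual with $\tau = 1/2$. For the scalarization viewpoint, the single simplex vector $\xi = (1/2, 1/2) \in \triangle_2$ already suffices: $\min_{x \in P} \langle \xi, Cx \rangle = \min_{x \geq 0} \tfrac{3}{2}x$ has optimal value $0$ and solution set $\{0\} = E_w$. Taking the natural residual of this scalarized problem, $r_\xi(x) = \tfrac{3}{2}x$, one gets $r_\xi(x) \leq 2x = d(Cx, CE_w)$ for all $x \in P$, with strict inequality for $x > 0$, while the weak sharp inequality $d(x, E_w) = x \leq \tau\, r_\xi(x)$ still holds with $\tau = 2/3$. More generally, whenever $y$ belongs to the solution set of the scalarization associated with some $\xi^k \in \triangle_n$, one has $\langle \xi^k, Cx \rangle - \langle \xi^k, Cy \rangle = \langle \xi^k, Cx - Cy \rangle \leq \|Cx - Cy\|$, which indicates why a residual built as the minimum of such scalarized residuals cannot exceed $d(Cx, CE_w)$ (again, a sketch under the assumed form of the scalarized residual, not the paper's argument).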