Download English ISI article no. 25680
Translated article title

Weighted twin support vector regression

English title
A weighted twin support vector regression
Article code: 25680
Publication year: 2012
Length: 10 pages (PDF)
Source

Publisher : Elsevier - Science Direct (الزویر - ساینس دایرکت)

Journal : Knowledge-Based Systems, Volume 33, September 2012, Pages 92–101

Translated keywords
- Support vector regression - Twin support vector regression - Up- and down-bound functions - Weighted coefficient - Weighted TSVR
English keywords
SVR, TSVR, Up- and down-bound functions, Weighted coefficient, Weighted TSVR
Article preview

English abstract

Twin support vector regression (TSVR) is a relatively new regression algorithm that aims to find ϵ-insensitive up- and down-bound functions for the training points. To do so, one solves a pair of smaller-sized quadratic programming problems (QPPs) rather than the single large QPP of classical SVR. However, TSVR assigns the same penalty to every sample, even though samples at different positions have different effects on the bound functions. We therefore propose a weighted TSVR in this paper, in which samples at different positions receive different penalties. The final regressor avoids the over-fitting problem to a certain extent and yields good generalization ability. Numerical experiments on one artificial dataset and nine benchmark datasets demonstrate the feasibility and validity of the proposed algorithm.
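As a rough illustration of the idea, the following Python sketch fits linear up- and down-bound functions by solving two small QPs, with an optional per-sample weight vector d standing in for the position-dependent penalties. It follows the standard TSVR primal as commonly stated, with CVXPY used as a generic solver; the paper's exact weighted formulation, its dual QPPs, and the kernel (nonlinear) case are not reproduced here, so treat this as a sketch under those assumptions rather than the authors' implementation.

```python
# Minimal sketch of a TSVR-style fit with per-sample penalty weights.
# The weight vector `d` is a hypothetical stand-in for the paper's
# position-dependent penalties; d = ones reduces to a plain TSVR-style fit.
import numpy as np
import cvxpy as cp

def weighted_tsvr_fit(X, y, eps1=0.1, eps2=0.1, C1=1.0, C2=1.0, d=None):
    """Fit linear up- and down-bound functions by solving two small QPs."""
    n, m = X.shape
    e = np.ones(n)
    if d is None:
        d = e  # uniform penalties

    # Down-bound function f1(x) = w1 @ x + b1 (lies about eps1 below y).
    w1, b1 = cp.Variable(m), cp.Variable()
    xi = cp.Variable(n, nonneg=True)
    f1 = X @ w1 + b1
    prob1 = cp.Problem(
        cp.Minimize(0.5 * cp.sum_squares(y - eps1 * e - f1) + C1 * d @ xi),
        [y - f1 >= eps1 * e - xi])
    prob1.solve()

    # Up-bound function f2(x) = w2 @ x + b2 (lies about eps2 above y).
    w2, b2 = cp.Variable(m), cp.Variable()
    eta = cp.Variable(n, nonneg=True)
    f2 = X @ w2 + b2
    prob2 = cp.Problem(
        cp.Minimize(0.5 * cp.sum_squares(y + eps2 * e - f2) + C2 * d @ eta),
        [f2 - y >= eps2 * e - eta])
    prob2.solve()

    # Final regressor: mean of the two bound functions.
    return (w1.value + w2.value) / 2, (b1.value + b2.value) / 2
```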

English introduction

Support vector machine (SVM), motivated by the Vapnik–Chervonenkis (VC) dimension theory and statistical learning theory [15], is a promising technique; the large body of work exploiting it has made it state of the art and one of the most widely used classifiers. Compared with other machine learning approaches such as artificial neural networks [14], SVM has several advantages. First, SVM solves a QPP, ensuring that once an optimal solution is obtained, it is the unique (global) solution. Second, SVM derives its sparse and robust solution by maximizing the margin between the two classes. Third, SVM implements the structural risk minimization principle rather than the empirical risk minimization principle, which minimizes an upper bound on the generalization error. SVM has been successfully applied in various areas ranging from remote sensing image classification [11] and text classification [18] to business prediction [10].

However, one of the main challenges for the standard SVM is its high computational complexity, which is O(n³), where n is the total number of training samples. To improve the computational speed of SVM, Jayadeva et al. [5] proposed the twin support vector machine (TSVM) for binary classification in the spirit of the proximal SVM [2], [4] and [3]. TSVM generates two nonparallel hyper-planes by solving two smaller-sized QPPs such that each hyper-plane is closer to one class and as far as possible from the other. Solving two smaller-sized QPPs rather than a single large one makes the learning speed of TSVM approximately four times faster than that of the standard SVM (roughly, 2·(n/2)³ = n³/4). At present, TSVM has become a popular method because of its low computational complexity, and many variants have been proposed by Peng [12], Kumar and Gopal [7], Jayadeva et al. [6], and Khemchandani et al. [9].

The above algorithms address classification problems. For the regression problem, Peng [13] proposed an efficient TSVR. In TSVR, the same penalty is given to every sample; however, since samples lie at different positions, it is more reasonable to give them different penalties. Inspired by the above studies, we introduce two weighted coefficients σ1 and σ2 [1], [19] and [16] into the TSVR and propose a weighted TSVR in this paper. By dividing the whole plane into different parts, we assign different penalties to the samples depending on their positions. The effectiveness of the proposed algorithm is demonstrated by numerical experiments on one artificial dataset and nine benchmark datasets. In the artificial experiment, we preliminarily determine the range of the penalty parameter σ, which helps in choosing σ in the subsequent benchmark experiments, and we also investigate the distributions of the samples. The experimental results on nine benchmark datasets show that the weighted TSVR achieves significantly better performance than SVR and TSVR.

The paper is organized as follows. Section 2 outlines SVR and TSVR. A weighted TSVR, covering both the linear and nonlinear cases, is proposed in Section 3. Section 4 reports experiments on one artificial dataset and nine benchmark datasets to investigate the effectiveness of the weighted TSVR. The last section concludes the paper.
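To make the experimental setup concrete, here is a toy usage of the weighted_tsvr_fit sketch given after the abstract on a small artificial dataset. The data-generating function, noise level, and parameter values below are illustrative assumptions, not the paper's actual artificial dataset or parameter grid, and only the linear case is exercised.

```python
import numpy as np

# Hypothetical artificial dataset: a noisy linear target (the paper's actual
# artificial dataset is not reproduced here).
rng = np.random.default_rng(0)
X = rng.uniform(-4.0, 4.0, size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.2, size=200)

# Uniform-weight fit, i.e. the TSVR-style baseline of the sketch above.
w, b = weighted_tsvr_fit(X, y, eps1=0.1, eps2=0.1, C1=1.0, C2=1.0)
y_hat = X @ w + b                      # final regressor f(x) = (f1 + f2) / 2
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))

# Position-dependent penalties could then be derived from the residuals of
# this preliminary fit (see the region_weights sketch after the conclusion).
```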

English conclusion

A weighted TSVR is proposed in this paper to deal with the over-fitting problem. We first divide the whole plane into three regions and then give different penalties to the samples depending on their positions: larger penalties to the samples located in region 3 and smaller penalties to the samples located in region 2. The resulting weighted TSVR yields lower prediction error than SVR and TSVR. Experimental results on nine benchmark datasets show that our weighted TSVR far outperforms SVR and slightly outperforms TSVR, in both the linear and nonlinear cases. Moreover, the running time does not increase much, although two new penalty parameters σ1 and σ2 are introduced into the proposed algorithm. However, like TSVR, the proposed weighted TSVR loses sparsity; finding a sparse algorithm is left for future work.
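As a final illustration, the sketch below shows one hypothetical way to turn the three-region idea into per-sample penalty weights that could be passed as d to the weighted_tsvr_fit sketch above. The region boundaries (an ϵ-band around a preliminary fit plus an outer band) and the default weight of 1 for region 1 are assumptions of this sketch; the paper's actual partition of the plane and its weighting rule via σ1 and σ2 may differ, with only "larger penalties in region 3, smaller in region 2" taken from the text above.

```python
import numpy as np

def region_weights(residuals, eps=0.1, band=0.5, sigma1=0.5, sigma2=2.0):
    """Map residuals of a preliminary fit to per-sample penalty weights.

    Hypothetical three-region rule: region 1 (inside the eps-band) keeps the
    default weight 1, region 2 (between eps and band) gets the smaller weight
    sigma1, and region 3 (beyond band) gets the larger weight sigma2.
    """
    r = np.abs(np.asarray(residuals, dtype=float))
    d = np.ones_like(r)                    # region 1: default weight
    d[(r > eps) & (r <= band)] = sigma1    # region 2: smaller penalty
    d[r > band] = sigma2                   # region 3: larger penalty
    return d

# Example: refit with weights derived from the residuals of a uniform fit.
# d = region_weights(y - y_hat); w, b = weighted_tsvr_fit(X, y, d=d)
```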