Download ISI English Article No. 25133
Persian Translation of the Article Title

Reducing the number of samples to accelerate multikernel semiparametric support vector regression

English Title
Reducing samples for accelerating multikernel semiparametric support vector regression
Article Code: 25133
Publication Year: 2010
Length: 7-page PDF (English)
Source

Publisher: Elsevier - Science Direct

Journal : Expert Systems with Applications, Volume 37, Issue 6, June 2010, Pages 4519–4525

Persian Translation of Keywords
Support vector regression - Multiple kernel learning
English Keywords
Support vector regression, Multiple kernel learning
Article Preview

English Abstract

In this paper, a reducing-samples strategy, rather than the classical ν-support vector regression (ν-SVR), i.e. single-kernel ν-SVR, is used to select training samples for the admissible functions, so as to curtail the computational complexity. The proposed multikernel learning algorithm, reducing-samples-based multikernel semiparametric support vector regression (RS-MSSVR), has an advantage over single-kernel support vector regression (the classical ε-SVR) in regression accuracy. Meanwhile, compared with multikernel semiparametric support vector regression (MSSVR), the algorithm is also favorable in computational complexity while achieving comparable generalization performance. Finally, the efficacy and feasibility of RS-MSSVR are corroborated by experiments on synthetic and real-world benchmark data sets.
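The multikernel part of the model described above can be sketched in a few lines. The sketch below is illustrative only: it uses an equal-weight sum of two RBF kernels on synthetic sinc data and a ridge-regularized least-squares fit as a cheap stand-in for the SVR quadratic program; the paper itself learns the kernel combination and the admissible functions, which are not reproduced here.

```python
import numpy as np

def rbf(X, Z, gamma):
    # Gaussian kernel matrix: k(x, z) = exp(-gamma * ||x - z||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, (120, 1))
y = np.sinc(X).ravel() + 0.05 * rng.normal(size=120)

# Multikernel part: a fixed equal-weight sum of two RBF widths
# (an assumption; the paper learns the combination).
K = 0.5 * rbf(X, X, 0.5) + 0.5 * rbf(X, X, 5.0)

# Ridge-regularized fit as a stand-in for the SVR quadratic program;
# alpha plays the role of the support-vector coefficients.
lam = 1e-2
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
rmse = np.sqrt(np.mean((K @ alpha - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

Mixing kernel widths lets one combined model capture both the smooth and the sharply varying regions of the target, which is the motivation for multikernel learning in the abstract.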

English Conclusion

In the real world, many systems exhibit different data trends in different regions. In this situation, the commonly used single-kernel learning algorithm sometimes does not achieve satisfactory results, so it is necessary to develop multikernel learning algorithms. Nguyen and Tay (2008) proposed a multikernel semiparametric support vector regression to cope with systems of complicated structure, i.e., different data trends in different regions. Compared with the single-kernel learning algorithm, namely the classical ε-SVR, their multikernel learning algorithm has an advantage in regression accuracy. However, determining the admissible functions is computationally expensive: O(N³) when the classical ν-SVR is used to select the training samples that construct the admissible functions. To this end, we use the reducing-samples strategy instead of the classical ν-SVR, lowering the computational burden from O(N³) to O(N²). This claim is supported by experiments on synthetic and real-world benchmark data sets. Although our experiments are all performed on small data sets, because the active-set method is implemented in the MATLAB environment, the method can readily be extended to large data sets if professional QP-solving software is used. In addition, if the admissible function is replaced by the bias, the multikernel semiparametric support vector regression reduces to the classical support vector regression; that is, classical support vector regression is a special case of multikernel semiparametric support vector regression.
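The special-case observation in the last sentence can be checked numerically. The sketch below writes the semiparametric decision function f(x) = Σᵢ αᵢ k(xᵢ, x) + Σⱼ βⱼ φⱼ(x) and shows that with the single admissible function φ(x) = 1 it coincides with the classical single-kernel form k(x)ᵀα + b. The RBF kernel, the random coefficients, and the helper name `semiparametric_f` are illustrative assumptions, not the paper's code.

```python
import numpy as np

def semiparametric_f(x, Xtr, alpha, beta, phis, gamma=1.0):
    # f(x) = sum_i alpha_i k(x_i, x) + sum_j beta_j phi_j(x)
    k = np.exp(-gamma * ((Xtr - x) ** 2).sum(axis=1))
    return k @ alpha + sum(b * phi(x) for b, phi in zip(beta, phis))

rng = np.random.default_rng(1)
Xtr = rng.normal(size=(10, 1))    # arbitrary "support vectors"
alpha = rng.normal(size=10)       # arbitrary dual coefficients
b = 0.7                           # classical bias term
x = np.array([0.3])

# Semiparametric form with the single admissible function phi(x) = 1 ...
f_semi = semiparametric_f(x, Xtr, alpha, [b], [lambda x: 1.0])

# ... equals the classical single-kernel decision function k(x)^T alpha + b.
k = np.exp(-1.0 * ((Xtr - x) ** 2).sum(axis=1))
f_classic = k @ alpha + b

print(np.isclose(f_semi, f_classic))  # prints True
```

With φ(x) = 1 the parametric term β·φ(x) is exactly the scalar bias, which is why classical SVR falls out as a special case.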