Download English ISI Article No. 24941
Article Title

Regularized least squares support vector regression for the simultaneous learning of a function and its derivatives
Article Code: 24941
Publication Year: 2008
Length: 13 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Information Sciences, Volume 178, Issue 17, 1 September 2008, Pages 3402–3414

Keywords
Support vector machines, Regularized least squares, Machine learning, Function approximation

English Abstract

In this paper, we propose a regularized least squares support vector machine approach for simultaneously approximating a function and its derivatives. The proposed algorithm is simple and fast, as no quadratic programming solver needs to be employed: effectively, only the solution of a structured system of linear equations is needed.
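
To make the abstract's central claim concrete, the following is a minimal Python sketch of a standard least squares SVM regressor in the Suykens style, where the dual problem collapses to a single structured linear system and no quadratic programming solver is required. This is an illustrative baseline, not the paper's derivative-augmented formulation, and the kernel choice and parameter values (sigma, gamma) are assumptions.

    import numpy as np

    def rbf_kernel(X, Z, sigma=1.0):
        # Gaussian (RBF) kernel matrix between 1-D sample vectors X and Z.
        return np.exp(-(X[:, None] - Z[None, :]) ** 2 / (2.0 * sigma ** 2))

    def lssvr_fit(x, y, gamma=100.0, sigma=1.0):
        # Standard LS-SVM regression: the dual reduces to one linear system
        #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y],
        # so only a structured system of linear equations is solved.
        n = len(x)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(x, x, sigma) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]  # bias b and dual weights alpha

    def lssvr_predict(x_new, x_train, b, alpha, sigma=1.0):
        return rbf_kernel(x_new, x_train, sigma) @ alpha + b

    # Usage: recover sin(x) from noisy samples.
    x = np.linspace(0.0, 2.0 * np.pi, 40)
    y = np.sin(x) + 0.05 * np.random.randn(40)
    b, alpha = lssvr_fit(x, y)
    print(lssvr_predict(np.array([np.pi / 2.0]), x, b, alpha))  # close to 1.0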

English Introduction

The last decade has witnessed the evolution of Support Vector Machines (SVMs) as a powerful paradigm for pattern classification and regression [2], [4] and [23]. Support vector machine techniques have been successfully applied to a wide variety of areas, e.g. computer vision [7] and image retrieval [26]. Support Vector Regression (SVR) attempts to fit a regressor through a given set of data samples, where the points may lie in the pattern space or in a higher-dimensional feature space. SVR involves the solution of a quadratic minimization problem with linear inequality constraints, which also minimizes the structural complexity.

Estimation of a function along with its derivatives, or partial derivatives, is an important problem with diverse applications [9] and [15]. Derivatives of estimated static relations are often used for linearization in control and in extended Kalman filtering [3]. A wide variety of real problems, especially in linear circuits, are concerned with nonparametric estimation of a function and its derivative [18]. The automatic estimation of derivatives can be employed to construct asymptotic local confidence intervals for the nonparametric estimate of the regression function and its first derivatives [16]. Other applications include optimal control, optimization processes, computer graphics and image processing. For instance, in volumetric modelling, or when modelling shapes, a technique that can estimate the shape from measurements of the height as well as the gradient or curvature at a few points would be able to find a better representation of the shape. In many cases, the curvature or gradient may be available at a different set of points than the one at which the height of the surface is available; an example is modelling a terrain from remote sensing data as well as ground data.

Although function approximation methods have been widely applied to modelling device characteristics, as in [14], such modelling is limited unless it can also estimate the derivatives or partial derivatives, which carry important information about nonlinearity in the device's behaviour. Information about derivatives or partial derivatives helps determine the region of operation of a nonlinear device, and using samples of such information can make the representation of the device characteristics more accurate and compact. Additionally, a technique that uses such information can benefit by requiring fewer samples to obtain a representation with a desired accuracy. Further, Lamers and Kok [10] used variances of the first-order derivatives as a measure of nonlinearity in the training data. It is often easier to obtain derivative information than information about the functional value; for instance, it is easier to determine the velocity of a moving vehicle than its absolute position.

Recently, Lázaro et al. [11] and [12] proposed an SVM based approach for the simultaneous learning of a function and its derivatives. This approach employs an iterative re-weighted least squares (IRWLS) procedure, which has earlier been applied in the regular SVM literature for classification and regression [17]. In this paper, the problem of simultaneous learning of a function and its derivative is formulated in a regularized least squares support vector machine framework. The main advantage of the proposed approach is that it obtains the solution by inverting a single positive definite matrix instead of solving a quadratic programming problem (QPP).

The algorithm is simple and fast, as the solution is obtained by solving a structured system of linear equations instead of a single QPP of large size. The paper is organized as follows: Section 2 briefly reviews the approach of Lázaro et al. Section 3 describes regularized least squares support vector regression for the simultaneous learning of a function and its derivatives. Section 4 presents the experimental results, while Section 5 is devoted to concluding remarks.
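
To convey the flavour of the idea, the following hypothetical Python sketch fits a kernel expansion f(x) = sum_i alpha_i k(x, x_i) + b jointly to samples of a function and its first derivative by solving one regularized linear system. It is a simplified illustration under an assumed ridge regularizer, not the exact positive definite system derived in the paper, and the kernel and parameter values (sigma, lam) are made-up assumptions.

    import numpy as np

    def rbf(X, Z, sigma):
        # Gaussian (RBF) kernel matrix between 1-D sample vectors X and Z.
        return np.exp(-(X[:, None] - Z[None, :]) ** 2 / (2.0 * sigma ** 2))

    def rbf_dx(X, Z, sigma):
        # Derivative of the RBF kernel with respect to its first argument.
        return -(X[:, None] - Z[None, :]) / sigma ** 2 * rbf(X, Z, sigma)

    def fit_function_and_derivative(x, y, y_dash, lam=1e-3, sigma=0.5):
        # Model f(x) = sum_i alpha_i k(x, x_i) + b, fitted to value targets y
        # and derivative targets y_dash by stacking value rows [K, 1] and
        # derivative rows [dK/dx, 0] into one ridge-regularized linear system.
        n = len(x)
        Phi = np.block([[rbf(x, x, sigma), np.ones((n, 1))],
                        [rbf_dx(x, x, sigma), np.zeros((n, 1))]])
        t = np.concatenate([y, y_dash])
        theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n + 1), Phi.T @ t)
        return theta[:n], theta[n]  # expansion weights alpha, bias b

    # Usage: learn sin and its derivative cos from joint samples.
    x = np.linspace(0.0, 2.0 * np.pi, 25)
    alpha, b = fit_function_and_derivative(x, np.sin(x), np.cos(x))
    x_test = np.array([1.0])
    print(rbf(x_test, x, 0.5) @ alpha + b)  # approx sin(1) = 0.841
    print(rbf_dx(x_test, x, 0.5) @ alpha)   # approx cos(1) = 0.540

Because both kinds of observations enter one least squares objective, a single linear solve yields a model whose value and derivative predictions are consistent with each other, which mirrors the computational advantage the paper claims over QP-based SVR.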