ISI Article Download No. 25871
Article Title

Generalized recurrent neural network for ϵ-insensitive support vector regression

Article ID: 25871 | Year of publication: 2012 | Length: 8 pages (PDF)
Source

Publisher : Elsevier - Science Direct (الزویر - ساینس دایرکت)

Journal : Mathematics and Computers in Simulation, Volume 86, December 2012, Pages 2–9

Keywords
ϵ-Insensitive support vector regression, Generalized recurrent neural network, Global convergence

Abstract

In this paper, a generalized recurrent neural network is proposed for solving ϵ-insensitive support vector regression (ϵ-ISVR). The ϵ-ISVR problem is first formulated as a convex non-smooth programming problem, and a generalized recurrent neural network with lower model complexity is then designed for training the support vector machine. Furthermore, simulation results are given to demonstrate the effectiveness and performance of the proposed neural network.
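The paper's own network model is not reproduced in this preview, but the objective it trains can be illustrated concretely. The sketch below fits a linear model by Euler-discretized subgradient descent on a regularized ϵ-insensitive loss; the function name, step-size schedule, regularization weight, and toy data are all illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def fit_eps_svr(X, y, eps=0.1, lam=0.01, iters=5000):
    """Linear eps-insensitive SVR via subgradient descent (illustrative sketch).

    Minimizes 0.5*lam*||w||^2 + mean(max(0, |y - Xw - b| - eps)).
    This is a discretized subgradient flow, NOT the paper's generalized
    recurrent neural network.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for t in range(iters):
        r = y - X @ w - b                      # residuals
        # Subgradient of the eps-insensitive loss w.r.t. the model output:
        # zero inside the eps-tube, -sign(r) outside it.
        g = np.where(np.abs(r) > eps, -np.sign(r), 0.0)
        step = 0.5 / np.sqrt(t + 1.0)          # diminishing step size
        w -= step * (lam * w + X.T @ g / n)
        b -= step * g.mean()
    return w, b

# Toy data lying exactly on y = 2x + 1: the fitted line should end up
# (approximately) inside the eps-tube around the true line.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = 2.0 * X[:, 0] + 1.0
w, b = fit_eps_svr(X, y)
```

The non-smoothness the abstract refers to is visible in the `np.where` line: the loss has a kink at the tube boundary, so only a subgradient (not a gradient) exists there, which is why the paper resorts to non-smooth analysis.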

Introduction

Support vector machines (SVMs) are powerful tools for data classification and regression. In recent years, many fast algorithms for SVMs have been developed [2]. Mangasarian [14] proposed the finite Newton algorithm for SVM learning. Keerthi and DeCoste [5] introduced a modified finite Newton algorithm to speed up the finite Newton method for the fast solution of large-scale linear SVMs. More recently, as an approach implementable in both software and hardware, recurrent neural networks for solving linear and nonlinear optimization problems, together with their engineering applications, have been widely developed [6], [9], [13], [15] and [17]. Compared with traditional numerical optimization algorithms, these neural networks offer fast convergence for real-time solutions. In 1986, Tank and Hopfield [15] proposed the first recurrent neural network for solving linear programming problems. In 1988, Kennedy and Chua [6] introduced the dynamical canonical nonlinear programming circuit (NPC) for optimization, which uses a finite penalty parameter and therefore generates approximate optimal solutions. Wang and Xia [17] proposed a primal-dual neural network for solving linear assignment problems. To obtain optimal solutions of non-smooth optimization problems, Forti et al. [4] proposed and investigated the generalized NPC (G-NPC), which can be regarded as a natural extension of the NPC. To reduce model complexity, one-layer recurrent neural networks have been constructed for solving linear and nonlinear programming problems [10] and [12]. This paper is concerned with a generalized recurrent neural network for ϵ-insensitive support vector regression. The global convergence of the proposed network is guaranteed by a Lyapunov-like method.
Compared with existing neural networks for support vector regression (SVR) learning, the proposed neural network has lower model complexity yet remains efficient for SVR learning.
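The general idea surveyed above — a recurrent network whose continuous-time dynamics settle at the optimizer of a constrained program — can be sketched with a projection-type model in the spirit of the networks cited, simulated on a small box-constrained quadratic program. The dynamics, parameters, and problem data below are illustrative assumptions, not the paper's model.

```python
import numpy as np

def projection_rnn_qp(Q, c, lo, hi, h=0.1, alpha=0.2, iters=500):
    """Simulate a projection-type recurrent network for
        min 0.5*x'Qx + c'x  subject to  lo <= x <= hi.

    Continuous-time model (an illustrative sketch, not the paper's network):
        dx/dt = -x + P(x - alpha*(Qx + c)),
    where P projects onto the box; integrated here by Euler's method.
    An equilibrium of these dynamics is a KKT point of the QP.
    """
    x = np.zeros_like(c, dtype=float)
    for _ in range(iters):
        grad = Q @ x + c
        proj = np.clip(x - alpha * grad, lo, hi)   # projection onto the box
        x = x + h * (proj - x)                     # Euler step of the dynamics
    return x

# Separable toy QP: minimize x1^2 - x1 + x2^2 - 4*x2 over [0, 1]^2.
# The unconstrained minimizer is (0.5, 2), so the box solution is (0.5, 1).
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-1.0, -4.0])
x = projection_rnn_qp(Q, c, 0.0, 1.0)
```

A Lyapunov-like argument of the kind the paper invokes would show the state trajectory's distance to the solution set decreasing along these dynamics; here the projection keeps every step feasible with respect to the box, and the state converges to the constrained optimizer.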

Conclusion

In this paper, based on non-smooth analysis and the gradient method, a generalized recurrent neural network with a discontinuous activation function has been proposed for support vector regression training. Simulation results show that the proposed network is efficient for this task.