In many real-world domains, unknown systems exhibit different data trends in different regions, i.e., some parts vary steeply while others vary smoothly. If we use a conventional kernel learning algorithm, namely the single-kernel linear programming support vector regression (LP-SVR), to identify these systems, the identification results are usually poor. Hence, we exploit the nonlinear mappings induced by kernel functions as admissible functions to construct a novel multikernel semiparametric predictor, called MSLP-SVR, to improve regression effectiveness. Experimental results on synthetic and real-world data sets corroborate the efficacy and validity of the proposed MSLP-SVR. Moreover, it compares favorably with another multikernel linear programming support vector algorithm. Although MSLP-SVR is proposed for regression, it can also be extended to classification problems.
In many real-life fields, we often encounter unknown systems to identify, and these systems sometimes exhibit different data trends in different regions, i.e., some parts vary steeply while others vary smoothly. In this context, if we use traditional kernel learning algorithms such as the single-kernel LP-SVR to identify these systems, the fitting effectiveness is commonly unsatisfactory. Two candidate approaches can mitigate this difficulty. The first is the multikernel trick, which has become a hot topic in the kernel learning domain: it uses kernel functions of different types, or of the same type with different parameters, to fit the different data trends in the different regions of the unknown system. The second applies when additional knowledge about the unknown system is available in advance; in that case it is a very good choice to identify the system by virtue of this prior knowledge. For example, Smola et al. (1998) used additional knowledge to construct admissible functions, the so-called semiparametric technique, to improve learning effectiveness. In general, however, we have no additional knowledge about the system to be identified, and it is then very hard to construct the explicit admissible functions that are usually beneficial for modeling. Hence, in this letter, we exploit the nonlinear mappings induced by kernel functions as the admissible functions to improve the regression accuracy of the conventional LP-SVR. The resulting predictor, named MSLP-SVR, combines the semiparametric technique with the multikernel trick, and improves the fitting effectiveness of the conventional LP-SVR at comparable computational complexity.
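To make the multikernel idea above concrete, the following is a minimal sketch of an LP-SVR whose expansion mixes several kernels, solved as a single linear program. This is a generic illustration, not the authors' exact MSLP-SVR formulation: the Gaussian kernels, their widths, and all function names here are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import linprog

def rbf(X, Z, gamma):
    """Gaussian kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def multikernel_lp_svr(X, y, gammas=(0.5, 5.0), C=10.0, eps=0.05):
    """Illustrative multikernel LP-SVR (not the letter's exact MSLP-SVR).

    Model: f(x) = sum_k K_k(x) @ (a+_k - a-_k) + b, with an L1 penalty on
    the coefficients and epsilon-insensitive slacks, posed as one LP.
    """
    n = len(X)
    # Stack one kernel block per width: a wide kernel for smooth regions,
    # a narrow one for steep regions.
    K = np.hstack([rbf(X, X, g) for g in gammas])      # shape (n, n*len(gammas))
    m = K.shape[1]
    # Variables: [a+ (m), a- (m), b+ , b-, xi+ (n), xi- (n)], all >= 0.
    c = np.concatenate([np.ones(2 * m), [0.0, 0.0], C * np.ones(2 * n)])
    ones = np.ones((n, 1))
    # f(x_i) - y_i <= eps + xi+_i  and  y_i - f(x_i) <= eps + xi-_i
    A_up = np.hstack([K, -K, ones, -ones, -np.eye(n), np.zeros((n, n))])
    A_lo = np.hstack([-K, K, -ones, ones, np.zeros((n, n)), -np.eye(n)])
    A_ub = np.vstack([A_up, A_lo])
    b_ub = np.concatenate([y + eps, eps - y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * len(c), method="highs")
    z = res.x
    coef = z[:m] - z[m:2 * m]          # mixed-kernel expansion coefficients
    bias = z[2 * m] - z[2 * m + 1]
    return coef, bias

def mk_predict(Xtr, coef, bias, gammas, Xte):
    K = np.hstack([rbf(Xte, Xtr, g) for g in gammas])
    return K @ coef + bias
```

The L1 objective tends to drive most coefficients to zero, which is why LP-SVR variants typically yield sparse expansions; the two kernel blocks let the program assign the narrow kernel to steep regions and the wide one to smooth regions.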
The experimental results on the synthetic and real-world data sets show the effectiveness of MSLP-SVR. Moreover, compared with another multikernel linear programming learning algorithm, the proposed MSLP-SVR is superior in the number of support vectors, the computational complexity, and the regression accuracy. Although we propose MSLP-SVR for the regression problem, it can similarly be extended to the classification realm. In addition, linear programming learning algorithms yield a solvable linear program even with non-Mercer kernels (Lu & Sun, 2009), so MSLP-SVR may be extended with non-Mercer hybrid kernels, which is our future work.