Conference proceedings
Ensembles of Support Vector Machines for Regression Problems
Record in:
Proceedings of the International Joint Conference on Neural Networks, v. 3, p. 2381-2386, 2002.
2-s2.0-0036083003
Authors
Lima C.A.M.
Coelho A.L.V.
Von Zuben F.J.
Institution
Abstract
Support vector machines (SVMs) tackle classification and regression problems by non-linearly mapping input data into high-dimensional feature spaces, wherein a linear decision surface is designed. Even though the high potential of these techniques has been demonstrated, their applicability has been hampered by the need to choose, a priori, the kernel function that realizes the non-linear mapping, which can be a complex and ineffective process. In this paper, we advocate that applying neural ensemble theory to SVMs should alleviate this performance bottleneck, because different networks with distinct kernel functions, such as polynomials or radial basis functions, may be created and properly combined into the same neural structure. Ensembles of SVMs thus promote the automatic configuration and tuning of SVMs, and their generalization capability is assessed here by means of function regression experiments.
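The abstract describes combining SVM regressors trained with distinct kernels into a single ensemble. The sketch below illustrates that general idea using scikit-learn's SVR and a uniform linear combination of member predictions; the sinc toy problem, all hyperparameters, and the uniform combiner are illustrative assumptions, not details taken from the paper, whose combination scheme may differ.

import numpy as np
from sklearn.svm import SVR

# Toy 1-D regression target: a noisy sinc function.
rng = np.random.default_rng(0)
X = np.linspace(-5, 5, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + rng.normal(scale=0.05, size=X.shape[0])

# Ensemble members: regression SVMs with distinct kernel functions.
# Hyperparameters are illustrative, not reported values from the paper.
members = [
    SVR(kernel="rbf", C=10.0, gamma=0.5, epsilon=0.01),
    SVR(kernel="poly", degree=3, C=10.0, epsilon=0.01),
    SVR(kernel="sigmoid", C=1.0, gamma=0.1, epsilon=0.01),
]
for m in members:
    m.fit(X, y)

# Combine member outputs by a linear combination (uniform weights here).
preds = np.column_stack([m.predict(X) for m in members])
weights = np.full(len(members), 1.0 / len(members))
ensemble_pred = preds @ weights

# Compare training-set RMSE of each member against the ensemble.
for m, p in zip(members, preds.T):
    print(m.kernel, "RMSE:", np.sqrt(np.mean((p - y) ** 2)))
print("ensemble RMSE:", np.sqrt(np.mean((ensemble_pred - y) ** 2)))

Uniform averaging is only the simplest combiner; the ensemble literature cited by the paper (e.g., optimal linear combinations, mixtures of experts) replaces the fixed weights with ones estimated from data.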