dc.contributor: Universidade Federal de São Carlos (UFSCar)
dc.contributor: Universidade do Estado de Mato Grosso (UNEMAT)
dc.contributor: Universidade Estadual de Campinas (UNICAMP)
dc.contributor: Universidade Estadual Paulista (Unesp)
dc.date.accessioned: 2015-11-03T15:29:57Z
dc.date.available: 2015-11-03T15:29:57Z
dc.date.created: 2015-11-03T15:29:57Z
dc.date.issued: 2014-01-01
dc.identifier: 2014 27th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI). New York: IEEE, p. 259-265, 2014.
dc.identifier: http://hdl.handle.net/11449/130175
dc.identifier: 10.1109/SIBGRAPI.2014.36
dc.identifier: WOS:000352613900034
dc.description.abstract: In the pattern recognition research field, Support Vector Machines (SVM) have been an effective tool for classification purposes, being successfully employed in many applications. The SVM input data are transformed into a high-dimensional space using kernel functions, where linear separation is more likely. However, there are some computational drawbacks associated with SVM. One of them is the computational burden required to find the most adequate kernel-mapping parameters for each non-linearly separable input data space, which directly affects the performance of SVM. This paper introduces the Polynomial Powers of Sigmoid for SVM kernel mapping and shows their advantages over well-known kernel functions using real and synthetic datasets.
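
The sketch below only illustrates the general mechanism the abstract refers to: a custom kernel function is supplied to an SVM so that the decision boundary becomes linear in the induced feature space. The sigmoid_power_kernel shown here is a hypothetical stand-in (a tanh response raised to an integer power), not the PPS or PPS-Radial kernels defined in the paper; scikit-learn, make_moons, and all parameter values are assumptions chosen for illustration only.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def sigmoid_power_kernel(X, Y, alpha=1.0, c=0.0, degree=3):
    # Hypothetical kernel for illustration only: a tanh (sigmoid) response
    # raised to an integer power. The PPS / PPS-Radial kernels evaluated in
    # the paper follow the formulation given in the full text, not this one.
    return np.tanh(alpha * X @ Y.T + c) ** degree

# Synthetic non-linearly separable data, standing in for the real and
# synthetic datasets mentioned in the abstract.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVC accepts a callable kernel that returns the Gram matrix between X and Y.
clf = SVC(kernel=sigmoid_power_kernel, C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

Passing a callable to the kernel argument lets any Gram-matrix-producing function be plugged into the same solver, which is how alternative kernels such as the PPS variants can be compared against the built-in RBF, polynomial, and sigmoid kernels.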
dc.language: eng
dc.publisher: IEEE
dc.relation: 2014 27th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI)
dc.rights: Open access
dc.source: Web of Science
dc.subject: Machine learning
dc.subject: Kernel functions
dc.subject: Polynomial powers of sigmoid
dc.subject: PPS-Radial
dc.subject: Support vector machines
dc.title: Learning kernels for support vector machines with polynomial powers of sigmoid
dc.type: Conference proceedings