dc.creatorTomassi, Diego
dc.creatorMarx, Nicolás
dc.creatorBeauseroy, Pierre
dc.date2016-09
dc.date2016-11-22
dc.date2016-11-22T16:32:35Z
dc.identifierhttp://sedici.unlp.edu.ar/handle/10915/56980
dc.identifierhttp://45jaiio.sadio.org.ar/sites/default/files/ASAI-13_0.pdf
dc.identifierissn:2451-7585
dc.descriptionDimensionality reduction using feature extraction and selection approaches is a common stage of many regression and classification tasks. In recent years there have been significant efforts to reduce the dimension of the feature space without losing information that is relevant for prediction. This objective can be cast into a conditional independence condition between the response or class labels and the transformed features. Building on this, in this work we use measures of statistical dependence to estimate a lower-dimensional linear subspace of the features that retains the sufficient information. Unlike likelihood-based and many moment-based methods, the proposed approach is semi-parametric and does not require model assumptions on the data. A regularized version to achieve simultaneous variable selection is presented too. Experiments with simulated data show that the performance of the proposed method compares favorably to well-known linear dimension reduction techniques.
dc.descriptionSociedad Argentina de Informática e Investigación Operativa (SADIO)
dc.formatapplication/pdf
dc.format142-149
dc.languageen
dc.rightshttp://creativecommons.org/licenses/by-sa/3.0/
dc.rightsCreative Commons Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0)
dc.subjectComputer Science
dc.subjectdimension reduction
dc.subjectvariable selection
dc.subjectdependence measures
dc.subjectsupervised learning
dc.titleFeature extraction and selection using statistical dependence criteria
dc.typeConference object
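
The abstract above describes estimating a lower-dimensional linear subspace of the features by maximizing a statistical dependence measure between the projected features and the response. The snippet below is only an illustrative sketch of that general idea, not the authors' implementation: it assumes a Gaussian-kernel HSIC as the dependence measure and a crude random search over orthonormal projections; the function names (hsic, hsic_projection) and all parameter choices are hypothetical.

import numpy as np

def _gaussian_gram(Z, sigma):
    # Gram matrix of a Gaussian kernel from pairwise squared distances.
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2.
    n = X.shape[0]
    K = _gaussian_gram(X, sigma_x)
    L = _gaussian_gram(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_projection(X, y, d=1, n_restarts=50, seed=0):
    # Random search over orthonormal projections W (p x d), keeping the
    # one that maximizes the dependence HSIC(X W, y).
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    Y = y.reshape(-1, 1).astype(float)
    best_W, best_val = None, -np.inf
    for _ in range(n_restarts):
        W, _ = np.linalg.qr(rng.standard_normal((p, d)))  # random orthonormal basis
        val = hsic(X @ W, Y)
        if val > best_val:
            best_W, best_val = W, val
    return best_W, best_val

# Toy usage: y depends on X only through the direction (1, 1, 0, 0, 0).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = (X[:, 0] + X[:, 1]) ** 2 + 0.1 * rng.standard_normal(200)
W, value = hsic_projection(X, y, d=1)

A gradient-based or regularized (sparsity-inducing) optimizer, as mentioned in the abstract for simultaneous variable selection, would replace the naive random search in practice.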