dc.date.accessioned: 2017-04-27T18:50:10Z
dc.date.available: 2017-04-27T18:50:10Z
dc.date.created: 2017-04-27T18:50:10Z
dc.date.issued: 2012
dc.identifier: 1559-9612
dc.identifier: http://hdl.handle.net/10533/197042
dc.identifier: D08I1060
dc.identifier: WOS:000305321700005
dc.description.abstract: In this article, we report a new method for gender classification from frontal face images using feature selection based on mutual information and the fusion of features extracted from intensity, shape, and texture, and from three different spatial scales. We compare the results of three mutual information measures: minimum-redundancy maximum-relevance (mRMR), normalized mutual information feature selection (NMIFS), and conditional mutual information feature selection (CMIFS). We also show that fusing the features extracted by six different methods significantly improves gender classification relative to previously published results, yielding a 99.13% classification rate on the FERET database.
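The selection step named in the abstract can be illustrated with a short sketch. Below is a minimal, illustrative Python implementation of the greedy mRMR criterion (relevance to the class label minus average redundancy with already-selected features); it assumes features have been discretized, and the helper name mrmr_select and the use of scikit-learn's mutual_info_score are this sketch's own choices, not the paper's code. NMIFS and CMIFS follow the same greedy loop but replace the raw redundancy term with a normalized or conditional mutual information measure.

```python
# Minimal greedy mRMR sketch (illustrative; not the authors' implementation).
# Assumes X holds discretized features, one column per feature.
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, k):
    """Greedily pick k feature indices, maximizing relevance I(f; y)
    minus the mean redundancy I(f; f_s) over already-selected features f_s."""
    n_features = X.shape[1]
    # Relevance of each feature to the class label.
    relevance = np.array([mutual_info_score(X[:, j], y) for j in range(n_features)])
    selected = [int(np.argmax(relevance))]  # start with the most relevant feature
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Average redundancy with the features chosen so far.
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, s]) for s in selected])
            score = relevance[j] - redundancy  # mRMR criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

In the fusion setting described above, the same selection would be run over the concatenated feature vectors from the different extraction methods and spatial scales before classification.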
dc.language: ENG
dc.publisher: TAYLOR & FRANCIS INC
dc.relation: https://doi.org/10.1080/15599612.2012.663463
dc.relation: 10.1080/15599612.2012.663463
dc.relation: info:eu-repo/grantAgreement/Fondef/D08I1060
dc.relation: info:eu-repo/semantics/dataset/hdl.handle.net/10533/93477
dc.relation: instname: Conicyt
dc.relation: reponame: Repositorio Digital RI2.0
dc.rights: info:eu-repo/semantics/openAccess
dc.title: Gender classification from face images using mutual information and feature fusion
dc.type: Article