dc.contributor: Universidade Estadual Paulista (UNESP)
dc.creator: Marana, Aparecido Nilceu
dc.creator: da, L.
dc.creator: Velastin, S. A.
dc.creator: Lotufo, R. A.
dc.date: 2014-05-27T11:18:10Z
dc.date: 2016-10-25T18:14:15Z
dc.date: 1997-01-01
dc.date.accessioned: 2017-04-06T00:48:48Z
dc.date.available: 2017-04-06T00:48:48Z
dc.identifier: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings, v. 4, p. 2773-2775.
dc.identifier: 0736-7791
dc.identifier: http://hdl.handle.net/11449/64992
dc.identifier: http://acervodigital.unesp.br/handle/11449/64992
dc.identifier: 10.1109/ICASSP.1997.595364
dc.identifier: 2-s2.0-0030701424
dc.identifier: http://dx.doi.org/10.1109/ICASSP.1997.595364
dc.identifier.uri: http://repositorioslatinoamericanos.uchile.cl/handle/2250/886759
dc.description: This paper presents a technique for oriented texture classification based on the Hough transform and Kohonen's neural network model. In this technique, oriented texture features are extracted from the Hough space by means of two distinct strategies: the first operates on a non-uniformly sampled Hough space, while the second concentrates on the peaks produced in the Hough space. The technique gives good results for the classification of oriented textures, a common phenomenon in nature underlying an important class of images. Experimental results are presented to demonstrate the performance of the new technique in comparison with an implemented technique based on Gabor filters.
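A minimal illustrative sketch of the peak-based strategy described in the abstract is given below, assuming standard tooling (NumPy and scikit-image); it is not the authors' implementation, and all function names, parameters, and the SOM training schedule are assumptions made for illustration. Dominant orientations are taken from the strongest peaks of the Hough space of an edge map, assembled into a small feature vector, and the vectors are then organized with a toy Kohonen self-organizing map.

    # Sketch only: peak-based Hough features + a toy Kohonen SOM.
    # Assumes NumPy and scikit-image; names and parameters are illustrative.
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import hough_line, hough_line_peaks


    def hough_peak_features(image, num_peaks=8):
        """Build a feature vector from the strongest Hough-space peaks."""
        edges = canny(image)                        # binary edge map
        hspace, angles, dists = hough_line(edges)   # Hough accumulator
        accum, peak_angles, _ = hough_line_peaks(
            hspace, angles, dists, num_peaks=num_peaks)
        feats = np.zeros(2 * num_peaks)
        if len(accum):
            feats[:len(accum)] = accum / accum.max()   # normalized peak strengths
            feats[num_peaks:num_peaks + len(peak_angles)] = peak_angles  # angles (rad)
        return feats


    class KohonenSOM:
        """Tiny self-organizing map trained on the Hough-peak feature vectors."""

        def __init__(self, rows, cols, dim, seed=0):
            rng = np.random.default_rng(seed)
            self.w = rng.normal(size=(rows, cols, dim))   # unit weight vectors
            self.grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                             indexing="ij"), axis=-1)

        def bmu(self, x):
            """Grid coordinates of the best-matching unit for sample x."""
            d = np.linalg.norm(self.w - x, axis=-1)
            return np.unravel_index(np.argmin(d), d.shape)

        def train(self, data, epochs=100, lr0=0.5, sigma0=2.0):
            """Classic Kohonen updates with decaying rate and neighborhood."""
            for t in range(epochs):
                lr = lr0 * np.exp(-t / epochs)
                sigma = sigma0 * np.exp(-t / epochs)
                for x in data:
                    b = np.array(self.bmu(x))
                    dist2 = np.sum((self.grid - b) ** 2, axis=-1)
                    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
                    self.w += lr * h * (x - self.w)

With feature vectors computed for a set of texture patches, a small map such as KohonenSOM(4, 4, 16) trained on them would place patches with similar dominant orientations onto nearby units; class labels could then be assigned by labelling each unit with the majority class of the training patches it wins.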
dc.language: eng
dc.relation: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: Mathematical transformations
dc.subject: Neural networks
dc.subject: Hough transform
dc.subject: Kohonen's self-organizing map
dc.subject: Oriented texture classification
dc.subject: Feature extraction
dc.title: Oriented texture classification based on self-organizing neural network and Hough transform
dc.type: Other

