dc.creator | Sussner, P | |
dc.creator | Esmi, EL | |
dc.date | 2011 | |
dc.date | MAY 15 | |
dc.date | 2014-08-01T18:41:50Z | |
dc.date | 2015-11-26T16:28:45Z | |
dc.date.accessioned | 2018-03-28T23:09:48Z | |
dc.date.available | 2018-03-28T23:09:48Z | |
dc.identifier | Information Sciences. Elsevier Science Inc, v. 181, n. 10, p. 1929-1950, 2011. | |
dc.identifier | 0020-0255 | |
dc.identifier | WOS:000288833300011 | |
dc.identifier | 10.1016/j.ins.2010.03.016 | |
dc.identifier | http://www.repositorio.unicamp.br/jspui/handle/REPOSIP/82220 | |
dc.identifier.uri | http://repositorioslatinoamericanos.uchile.cl/handle/2250/1269553 | |
dc.description | A morphological neural network is generally defined as a type of artificial neural network that performs an elementary operation of mathematical morphology at every node, possibly followed by the application of an activation function. The underlying framework of mathematical morphology can be found in lattice theory. With the advent of granular computing, lattice-based neurocomputing models such as morphological neural networks and fuzzy lattice neurocomputing models are becoming increasingly important since many information granules such as fuzzy sets and their extensions, intervals, and rough sets are lattice ordered. In this paper, we present the lattice-theoretical background and the learning algorithms for morphological perceptrons with competitive learning which arise by incorporating a winner-take-all output layer into the original morphological perceptron model. Several well-known classification problems that are available on the internet are used to compare our new model with a range of classifiers such as conventional multi-layer perceptrons, fuzzy lattice neurocomputing models, k-nearest neighbors, and decision trees. (C) 2010 Elsevier Inc. All rights reserved. | |
dc.description | 181 | |
dc.description | 10 | |
dc.description | SI | |
dc.description | 1929 | |
dc.description | 1950 | |
dc.language | en | |
dc.publisher | Elsevier Science Inc | |
dc.publisher | New York | |
dc.publisher | USA | |
dc.relation | Information Sciences | |
dc.relation | Inf. Sci. | |
dc.rights | closed | |
dc.rights | http://www.elsevier.com/about/open-access/open-access-policies/article-posting-policy | |
dc.source | Web of Science | |
dc.subject | Computational intelligence | |
dc.subject | Lattice theory | |
dc.subject | Mathematical morphology | |
dc.subject | Minimax algebra | |
dc.subject | Morphological neural network | |
dc.subject | Morphological perceptron | |
dc.subject | Competitive neuron | |
dc.subject | Pattern recognition | |
dc.subject | Classification | |
dc.subject | Weight Neural-networks | |
dc.subject | Fuzzy Mathematical Morphologies | |
dc.subject | Associative Memories | |
dc.subject | Gray-scale | |
dc.subject | Fuzzy lattice reasoning (FLR) | |
dc.subject | Set-theory | |
dc.title | Morphological perceptrons with competitive learning: Lattice-theoretical framework and constructive learning algorithm | |
dc.type | Journal articles | |