dc.creatorCholaquidis, A.
dc.creatorFraiman, R.
dc.creatorSued, Raquel Mariela
dc.date.accessioned2021-11-26T13:41:26Z
dc.date.accessioned2022-10-15T15:24:10Z
dc.date.available2021-11-26T13:41:26Z
dc.date.available2022-10-15T15:24:10Z
dc.date.created2021-11-26T13:41:26Z
dc.date.issued2020-12
dc.identifierCholaquidis, A.; Fraiman, R.; Sued, Raquel Mariela; On semi-supervised learning; Springer; Test; 29; 4; 12-2020; 914-937
dc.identifier1133-0686
dc.identifierhttp://hdl.handle.net/11336/147485
dc.identifierCONICET Digital
dc.identifierCONICET
dc.identifier.urihttps://repositorioslatinoamericanos.uchile.cl/handle/2250/4402426
dc.description.abstractMajor efforts have been made, mostly in the machine learning literature, to construct good predictors combining unlabelled and labelled data. These methods are known as semi-supervised. They address the problem of how to take advantage, if possible, of a huge amount of unlabelled data to perform classification when only few labelled data are available. This is not always feasible: it depends on whether the labels can be inferred from the distribution of the unlabelled data. Nevertheless, several algorithms have been proposed recently. In this work, we present a new method that, under almost necessary conditions, asymptotically attains the performance of the best theoretical rule as the size of the unlabelled sample goes to infinity, even if the size of the labelled sample remains fixed. Its performance and computational time are assessed through simulations and on the well-known “Isolet” real dataset of phonemes, where a strong dependence on the choice of the initial training sample is shown. The main focus of this work is to elucidate when and why semi-supervised learning works in the asymptotic regime described above. The set of necessary assumptions, although reasonable, shows that semi-supervised methods only attain consistency for very well-conditioned problems.
dc.languageeng
dc.publisherSpringer
dc.relationinfo:eu-repo/semantics/altIdentifier/doi/http://dx.doi.org/10.1007/s11749-019-00690-2
dc.relationinfo:eu-repo/semantics/altIdentifier/url/https://link.springer.com/article/10.1007%2Fs11749-019-00690-2
dc.relationinfo:eu-repo/semantics/altIdentifier/url/https://arxiv.org/abs/1805.09180
dc.rightshttps://creativecommons.org/licenses/by-nc-sa/2.5/ar/
dc.rightsinfo:eu-repo/semantics/openAccess
dc.subjectCONSISTENCY
dc.subjectSEMI-SUPERVISED LEARNING
dc.subjectSMALL TRAINING SAMPLE
dc.titleOn semi-supervised learning
dc.typeinfo:eu-repo/semantics/article
dc.typeinfo:ar-repo/semantics/artículo
dc.typeinfo:eu-repo/semantics/publishedVersion