dc.creator: Iparraguirre-Villanueva, Orlando
dc.creator: Guevara-Ponce, Victor
dc.creator: Ruiz-Alvarado, Daniel
dc.creator: Beltozar-Clemente, Saul
dc.creator: Sierra-Liñan, Fernando
dc.creator: Zapata-Paulini, Joselyn
dc.creator: Cabanillas-Carbonell, Michael
dc.date.accessioned: 2023-11-30T16:15:15Z
dc.date.accessioned: 2024-08-06T20:49:44Z
dc.date.available: 2023-11-30T16:15:15Z
dc.date.available: 2024-08-06T20:49:44Z
dc.date.created: 2023-11-30T16:15:15Z
dc.date.issued: 2023
dc.identifier: https://hdl.handle.net/20.500.13067/2830
dc.identifier: https://doi.org/10.11591/ijeecs.v29.i3.pp1758-1768
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/9538881
dc.description.abstract: Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and prediction tasks, question answering, and classification systems because of their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from an input corpus; a model is developed to find the best way to extract words from the context. The model is trained on the novel "La Ciudad y los perros", which comprises 128,600 words. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants, word importance and context, and the results were evaluated in terms of the semantic proximity of the generated text to the given context.
dc.language: eng
dc.publisher: Indonesian Journal of Electrical Engineering and Computer Science
dc.rights: https://creativecommons.org/licenses/by/4.0/
dc.rights: info:eu-repo/semantics/openAccess
dc.source: 29 (volume)
dc.source: 3 (issue)
dc.source: 1758 (first page)
dc.source: 1768 (last page)
dc.subject: Dropout
dc.subject: Prediction
dc.subject: Recurrent neural network
dc.subject: Text
dc.subject: Long short-term memory
dc.title: Text prediction recurrent neural networks using long short-term memory-dropout
dc.type: info:eu-repo/semantics/article
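
To make the approach described in the abstract concrete, below is a minimal sketch of an LSTM-dropout next-word prediction model in Keras. This is not the authors' code: the vocabulary size, sequence length, layer widths, and dropout rate are illustrative assumptions, and the dummy arrays stand in for the tokenized corpus and its 38.88%/61.12% train/test split.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense

VOCAB_SIZE = 10_000  # assumed vocabulary size, not taken from the paper
SEQ_LEN = 20         # assumed context window in words

model = Sequential([
    Embedding(VOCAB_SIZE, 128),               # word index -> dense vector
    LSTM(256),                                # learns long-term dependencies
    Dropout(0.2),                             # dropout regularization on the LSTM output
    Dense(VOCAB_SIZE, activation="softmax"),  # probability distribution over the next word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# X: integer word sequences of shape (n, SEQ_LEN); y: next-word ids of shape (n,).
# Random data stands in here for the tokenized corpus.
X = np.random.randint(0, VOCAB_SIZE, size=(64, SEQ_LEN))
y = np.random.randint(0, VOCAB_SIZE, size=(64,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)

After training, text would be generated by repeatedly feeding the last SEQ_LEN words of a seed context to the model and sampling or taking the argmax of the predicted distribution; the paper evaluates such output by its semantic proximity to the given context.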

