dc.creator | Iparraguirre-Villanueva, Orlando | |
dc.creator | Guevara-Ponce, Victor | |
dc.creator | Ruiz-Alvarado, Daniel | |
dc.creator | Beltozar-Clemente, Saul | |
dc.creator | Sierra-Liñan, Fernando | |
dc.creator | Zapata-Paulini, Joselyn | |
dc.creator | Cabanillas-Carbonell, Michael | |
dc.date.accessioned | 2023-11-30T16:15:15Z | |
dc.date.accessioned | 2024-08-06T20:49:44Z | |
dc.date.available | 2023-11-30T16:15:15Z | |
dc.date.available | 2024-08-06T20:49:44Z | |
dc.date.created | 2023-11-30T16:15:15Z | |
dc.date.issued | 2023 | |
dc.identifier | https://hdl.handle.net/20.500.13067/2830 | |
dc.identifier | https://doi.org/10.11591/ijeecs.v29.i3.pp1758-1768 | |
dc.identifier.uri | https://repositorioslatinoamericanos.uchile.cl/handle/2250/9538881 | |
dc.description.abstract | Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and/or prediction, question answering, and classification tasks due to their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from a corpus given as
input; a model is developed to find the best way to extract words from the context. To train the model, the novel "La Ciudad y los perros", which comprises 128,600 words, is used as input data. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants: word importance and context. The results were evaluated in terms of the semantic proximity of the generated text to the given context. | |
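The abstract's core technique, an LSTM cell whose hidden state is regularized with dropout before predicting the next word, can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the vocabulary size, hidden size, dropout rate, and the toy token sequence are all arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W, U, b stack the input/forget/output/candidate gates."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])         # input gate
    f = sigmoid(z[n:2 * n])    # forget gate
    o = sigmoid(z[2 * n:3 * n])  # output gate
    g = np.tanh(z[3 * n:])     # candidate cell update
    c_new = f * c + i * g      # cell state carries long-term dependencies
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def dropout(h, rate, training=True):
    """Inverted dropout: zero units at random, scale the rest so the
    expected activation is unchanged; identity at inference time."""
    if not training or rate == 0.0:
        return h
    mask = (rng.random(h.shape) >= rate) / (1.0 - rate)
    return h * mask

# Toy demo: feed a short sequence of one-hot "words" through the cell,
# apply dropout to the hidden state, and project to vocabulary logits.
vocab, hidden = 6, 8
W = rng.normal(0, 0.1, (4 * hidden, vocab))
U = rng.normal(0, 0.1, (4 * hidden, hidden))
b = np.zeros(4 * hidden)
V = rng.normal(0, 0.1, (vocab, hidden))   # output projection to vocabulary

h, c = np.zeros(hidden), np.zeros(hidden)
for t in [0, 3, 1]:                       # arbitrary token indices
    x = np.eye(vocab)[t]
    h, c = lstm_step(x, h, c, W, U, b)
    h = dropout(h, rate=0.2, training=True)

logits = V @ h
next_word = int(np.argmax(logits))        # predicted next token index
```

In a full model the weights would be trained by backpropagation over the corpus; the sketch only shows the forward pass and how dropout sits between the recurrent state and the output layer.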
dc.language | eng | |
dc.publisher | Indonesian Journal of Electrical Engineering and Computer Science | |
dc.rights | https://creativecommons.org/licenses/by/4.0/ | |
dc.rights | info:eu-repo/semantics/openAccess | |
dc.source | 29 | |
dc.source | 3 | |
dc.source | 1758 | |
dc.source | 1768 | |
dc.subject | Dropout | |
dc.subject | Prediction | |
dc.subject | Recurrent neural network | |
dc.subject | Text | |
dc.subject | Long short-term memory | |
dc.title | Text prediction recurrent neural networks using long short-term memory-dropout | |
dc.type | info:eu-repo/semantics/article | |