dc.creator: Khattak, Muhammad Irfan
dc.creator: Saleem, Nasir
dc.creator: Gao, Jiechao
dc.creator: Verdú, Elena (1)
dc.creator: Parra Fuente, Javier (1)
dc.date.accessioned: 2022-12-09T13:28:08Z
dc.date.accessioned: 2023-03-07T19:39:50Z
dc.date.available: 2022-12-09T13:28:08Z
dc.date.available: 2023-03-07T19:39:50Z
dc.date.created: 2022-12-09T13:28:08Z
dc.identifier: 00457906
dc.identifier: https://reunir.unir.net/handle/123456789/13885
dc.identifier: https://doi.org/10.1016/j.compeleceng.2022.107887
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/5908140
dc.description.abstract: A speech enhancement algorithm improves the perceptual aspects of speech degraded by noise. We propose a phase-aware deep neural network (DNN) that uses regularized sparse features for speech enhancement. A regularized sparse decomposition is applied to the noisy speech, and the resulting sparse features are combined with robust acoustic features to train the DNN. Two time-frequency masks, the ideal ratio mask (IRM) and the ideal binary mask (IBM), are estimated. An intelligibility improvement filter is applied as a post-processor to further improve intelligibility. During waveform reconstruction, the estimated phase is used for better quality. The results show that the proposed algorithm achieves better speech intelligibility and quality, with less residual noise and speech distortion. On the TIMIT and LibriSpeech databases, the proposed algorithm improved intelligibility and quality by 14.61% and 42.11%, respectively, over the noisy speech.
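The two oracle masks named in the abstract have standard textbook definitions. A minimal sketch of how they are computed from per-bin magnitude spectrograms of clean speech and noise follows; the array shapes, the SNR criterion of 0 dB, and the small epsilon are illustrative assumptions, and the paper's regularized sparse features, DNN training, and post-processing filter are not reproduced here.

```python
import numpy as np

def ideal_ratio_mask(speech_mag, noise_mag):
    """IRM: per time-frequency bin, the square root of the ratio of
    speech energy to total (speech + noise) energy, in [0, 1]."""
    eps = 1e-12  # assumed guard against division by zero
    s2, n2 = speech_mag ** 2, noise_mag ** 2
    return np.sqrt(s2 / (s2 + n2 + eps))

def ideal_binary_mask(speech_mag, noise_mag, lc_db=0.0):
    """IBM: 1 where the local SNR exceeds the criterion lc_db, else 0.
    lc_db = 0 dB is a common (assumed) local criterion."""
    eps = 1e-12
    snr_db = 20.0 * np.log10((speech_mag + eps) / (noise_mag + eps))
    return (snr_db > lc_db).astype(np.float32)
```

In a mask-based enhancer such as the one the abstract describes, the DNN is trained to predict these oracle targets from noisy input features, and the predicted mask is then multiplied with the noisy magnitude spectrogram before waveform reconstruction with the estimated phase.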
dc.language: eng
dc.publisher: Computers and Electrical Engineering
dc.relation: vol. 100, nº 107887
dc.relation: https://www.sciencedirect.com/science/article/pii/S0045790622001756?via%3Dihub
dc.rights: openAccess
dc.subject: DNN
dc.subject: intelligibility
dc.subject: phase estimation
dc.subject: sparseness
dc.subject: speech enhancement
dc.subject: speech quality
dc.subject: Scopus
dc.subject: JCR
dc.title: Regularized sparse features for noisy speech enhancement using deep neural networks
dc.type: Indexed Journal Article


This item belongs to the following institution