Article
Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks
Registered in:
Candia-García C., Forero M.G., Herrera-Rivera S. (2019) Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks. In: Vera-Rodriguez R., Fierrez J., Morales A. (eds) Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. CIARP 2018. Lecture Notes in Computer Science, vol 11401. Springer, Cham
ISSN: 0302-9743
Author
Forero Vargas, Manuel Guillermo
Herrera-Rivera, Sergio
Candia-García, Cristian
Institution
Abstract
When modeling phenomena that cannot be studied by deterministic analytical approaches, one of the main tasks is to generate random variates. The widely used techniques, such as the inverse transformation, convolution, and acceptance-rejection methods, involve a significant amount of statistical work and do not provide satisfactory results when the data do not conform to known probability density functions. This study proposes an alternative nonparametric method for generating random variates that combines kernel density estimation (KDE) and radial basis function based neural networks (RBFBNNs). We evaluated the method's performance using Poisson, triangular, and exponential probability density distributions and assessed its utility for unknown distributions. The results show that the model's effectiveness depends substantially on selecting an appropriate bandwidth value for KDE and on having a certain minimum number of data points to train the algorithm. The proposed method achieved an R² value between 0.91 and 0.99 for the analyzed distributions.
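The following is a minimal sketch of the KDE step only, illustrating how random variates can be drawn from a density estimated nonparametrically from data. It assumes scikit-learn's KernelDensity and an arbitrary bandwidth; it does not reproduce the paper's RBF-network component or its bandwidth-selection procedure.

```python
# Illustrative sketch: draw random variates from a kernel density estimate.
# Assumptions: scikit-learn's KernelDensity, a Gaussian kernel, and a
# hand-picked bandwidth of 0.5. The RBF-network stage of the paper's
# method is not shown here.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(seed=0)

# Hypothetical observed data: exponential draws standing in for a sample
# whose true density is assumed unknown.
data = rng.exponential(scale=2.0, size=500).reshape(-1, 1)

# Fit the KDE. As the abstract notes, the bandwidth strongly affects the
# quality of the estimate; 0.5 is only an example value.
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(data)

# Generate new random variates from the estimated density.
samples = kde.sample(n_samples=1000, random_state=0)
print(samples[:5].ravel())
```

In practice, the bandwidth would be chosen by cross-validation or a rule of thumb rather than fixed by hand, consistent with the abstract's observation that performance depends on this choice.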