Journal articles
Sensitivity, Prediction Uncertainty, and Detection Limit for Artificial Neural Network Calibrations
Date
2016-08
Record in:
Allegrini, Franco; Olivieri, Alejandro Cesar; Sensitivity, Prediction Uncertainty, and Detection Limit for Artificial Neural Network Calibrations; American Chemical Society; Analytical Chemistry; 88; 15; 8-2016; 7807-7812
0003-2700
CONICET Digital
CONICET
Author
Allegrini, Franco
Olivieri, Alejandro Cesar
Abstract
With the proliferation of multivariate calibration methods based on artificial neural networks, expressions for the estimation of figures of merit such as sensitivity, prediction uncertainty, and detection limit are urgently needed. This would bring nonlinear multivariate calibration methodologies to the same status as their linear counterparts in terms of comparability. Currently, only the average prediction error or the ratio of performance to deviation for a test sample set is employed to characterize and promote neural network calibrations. It is clear that additional information is required. We report for the first time expressions that allow one to easily compute three relevant figures: (1) the sensitivity, which turns out to be sample-dependent, as expected, (2) the prediction uncertainty, and (3) the detection limit. The approach resembles that employed for linear multivariate calibration, i.e., partial least-squares regression, specifically adapted to neural network calibration scenarios. As usual, both simulated and real (near-infrared) spectral data sets serve to illustrate the proposal.
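The abstract's analogy with linear calibration can be illustrated with a minimal sketch. This is not the authors' exact expressions: it assumes that, for a trained network y = f(x), a local linearization yields a sample-dependent vector b(x) = ∇f(x) playing the role of the regression vector in partial least-squares, so that sensitivity, prediction uncertainty, and detection limit follow by analogy. All network weights, the noise level `sigma_x`, and the 3.3 detection-limit factor below are illustrative assumptions.

```python
import numpy as np

def ann_predict(x, W1, b1, W2, b2):
    """Tiny one-hidden-layer network with tanh activation (illustrative)."""
    h = np.tanh(W1 @ x + b1)
    return float(W2 @ h + b2)

def local_gradient(x, predict, eps=1e-6):
    """Central finite-difference gradient of the network output w.r.t. x.

    This gradient acts as a sample-dependent regression vector b(x)
    under the local-linearization assumption.
    """
    g = np.zeros_like(x)
    for j in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[j] += eps
        xm[j] -= eps
        g[j] = (predict(xp) - predict(xm)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
n_wl, n_hidden = 50, 5                   # wavelengths, hidden neurons
W1 = rng.normal(scale=0.1, size=(n_hidden, n_wl))
b1 = rng.normal(scale=0.1, size=n_hidden)
W2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

x = rng.normal(size=n_wl)                # a hypothetical test "spectrum"
predict = lambda v: ann_predict(v, W1, b1, W2, b2)

b_x = local_gradient(x, predict)         # sample-dependent vector b(x)
sen = 1.0 / np.linalg.norm(b_x)          # sensitivity at this sample (PLS analogy)
sigma_x = 0.001                          # assumed i.i.d. spectral noise level
u_pred = sigma_x * np.linalg.norm(b_x)   # noise propagated to the prediction
lod = 3.3 * u_pred                       # detection-limit estimate (assumed factor)
```

Because b(x) changes from sample to sample, so do the sensitivity and the propagated uncertainty, which is consistent with the abstract's remark that the sensitivity of a neural network calibration is sample-dependent.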