Trentin, E. (2016). Soft-Constrained Nonparametric Density Estimation with Artificial Neural Networks. In Artificial Neural Networks in Pattern Recognition. ANNPR 2016 (pp. 68–79). Cham: Springer. https://doi.org/10.1007/978-3-319-46182-3_6
Soft-Constrained Nonparametric Density Estimation with Artificial Neural Networks
Trentin, Edmondo
2016-01-01
Abstract
The estimation of probability density functions (pdfs) from unlabeled data samples is a relevant, and still open, issue in pattern recognition and machine learning. Statistical parametric and nonparametric approaches present severe drawbacks. Only a few instances of neural networks for pdf estimation are found in the literature, due to the intrinsic difficulty of unsupervised learning under the necessary integral-equals-one constraint; these neural networks, in turn, suffer from serious limitations of their own. The paper introduces a soft-constrained algorithm for training a multilayer perceptron (MLP) to estimate pdfs empirically. A variant of the Metropolis-Hastings algorithm, exploiting the probabilistic nature of the MLP itself, is used to satisfy numerically the constraint on the integral of the function learned by the MLP. The preliminary outcomes of a simulation on data drawn from a mixture of Fisher-Tippett pdfs are reported and compared graphically with the estimates yielded by statistical techniques, showing the viability of the approach.
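The abstract describes the method only at a high level. The following is a minimal, illustrative sketch of the general idea of soft-constrained MLP density estimation, not the paper's algorithm: it fits a small MLP to Parzen-window targets computed on a one-dimensional sample and adds a squared penalty pushing the estimated integral toward one. The Gaussian-mixture data (a stand-in for the Fisher-Tippett mixture), the bandwidth `h`, the penalty weight `lam`, and the grid-based integral estimate are all assumptions; in particular, the paper uses a Metropolis-Hastings variant for the integral step, which is replaced here by plain quadrature.

```python
# Hypothetical sketch of soft-constrained MLP density estimation (PyTorch).
# NOT the algorithm from the paper: targets, penalty form, and the
# quadrature-based integral estimate are illustrative choices.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic 1-D data from a two-component Gaussian mixture (a stand-in for
# the Fisher-Tippett mixture used in the paper's simulation).
n = 2000
comp = torch.randint(0, 2, (n,))
x = torch.where(comp == 0,
                torch.randn(n) * 0.5 + 0.0,
                torch.randn(n) * 0.8 + 3.0).unsqueeze(1)

# Small MLP with a softplus output so the estimated density is non-negative.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1), nn.Softplus())

# Unsupervised per-point targets from a Parzen-window (Gaussian kernel)
# estimate, one simple way to turn unlabeled data into regression targets.
h = 0.3  # kernel bandwidth (assumed)
diff = (x - x.T) / h
parzen = torch.exp(-0.5 * diff ** 2).mean(dim=1, keepdim=True) \
         / (h * (2 * torch.pi) ** 0.5)

# Fixed grid for a crude quadrature estimate of the integral of the MLP
# output; the paper instead uses a Metropolis-Hastings variant here.
grid = torch.linspace(x.min().item() - 3.0, x.max().item() + 3.0,
                      512).unsqueeze(1)
dx = (grid[1] - grid[0]).item()

lam = 10.0  # weight of the soft unit-integral penalty (assumed)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for epoch in range(1000):
    opt.zero_grad()
    fit = ((net(x) - parzen) ** 2).mean()        # data-fitting term
    integral = net(grid).sum() * dx              # estimate of the integral
    loss = fit + lam * (integral - 1.0) ** 2     # soft constraint: integral ≈ 1
    loss.backward()
    opt.step()

with torch.no_grad():
    print(f"estimated integral after training: {(net(grid).sum() * dx).item():.3f}")
```

In this sketch the unit-integral requirement is enforced only softly, through the penalty term, so the learned function approximates a pdf without an explicit normalization step; how the integral is estimated and how strongly it is penalized are the design choices that the paper addresses with its Metropolis-Hastings variant.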
File | Type | License | Size | Format
---|---|---|---|---
Trentin2016_Chapter_Soft-ConstrainedNonparametricD.pdf | Publisher's PDF | NOT PUBLIC - private/restricted access | 746.1 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/1007285