Freno, A., Trentin, E., & Gori, M. (2009). A hybrid random field model for scalable statistical learning. Neural Networks, 22(5-6), 603-613. doi:10.1016/j.neunet.2009.06.017
A hybrid random field model for scalable statistical learning
Trentin, Edmondo; Gori, Marco
2009-01-01
Abstract
This paper introduces hybrid random fields, a class of probabilistic graphical models designed to allow efficient structure learning in high-dimensional domains. Hybrid random fields, together with the learning algorithm we develop for them, are especially useful as a pseudo-likelihood estimation technique (rather than a technique for estimating strict joint probability distributions). To assess the generality of the proposed model, we prove that the class of pseudo-likelihood distributions representable by hybrid random fields strictly includes the class of joint probability distributions representable by Bayesian networks. Building on this result, we develop a scalable algorithm for learning the structure of hybrid random fields, which we call ‘Markov Blanket Merging’. On the one hand, we characterize the complexity properties of Markov Blanket Merging both theoretically and experimentally, using a series of synthetic benchmarks. On the other hand, we evaluate the accuracy of hybrid random fields (as learned via Markov Blanket Merging) by comparing them to various alternative statistical models on a number of pattern classification and link-prediction tasks. The results show that learning hybrid random fields with the Markov Blanket Merging algorithm not only significantly reduces the computational cost of structure learning with respect to several alternatives, but also yields models that are highly accurate compared to those alternatives.
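As a point of reference for the pseudo-likelihood objective mentioned in the abstract (this is the standard definition, not a formula quoted from the paper): for variables $X_1, \ldots, X_n$ with joint configuration $x = (x_1, \ldots, x_n)$, the pseudo-likelihood replaces the joint distribution with a product of full conditionals, each of which reduces to conditioning on the Markov blanket $\mathrm{mb}(X_i)$ of the respective variable:

$$ P^{*}(x) \;=\; \prod_{i=1}^{n} P\big(x_i \mid x_1, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n\big) \;=\; \prod_{i=1}^{n} P\big(x_i \mid \mathrm{mb}(x_i)\big) $$

Roughly speaking, in a hybrid random field each conditional $P(x_i \mid \mathrm{mb}(x_i))$ is represented by a local Bayesian network over $X_i$ and its Markov blanket, which is what allows structure learning to scale to high-dimensional domains.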
File | Size | Format
---|---|---
12-FrenoTrentinGori.pdf (Post-print; license: public with copyright; access restricted, copy available on request) | 2.03 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/24191
Warning: the data displayed here have not been validated by the University.