Structured data in the form of labeled graphs (with variable order and topology) may be thought of as the outcomes of a random graph generating process, characterized by an underlying probabilistic law. The paper formalizes the notions of generalized random graph (GRG) and of probability density function (pdf) for GRGs. A "universal" learning machine, combining the encoding module of a recursive neural network with a radial basis function network, is then introduced for estimating the unknown pdf from an unsupervised sample of GRGs. A maximum likelihood training algorithm is presented, constrained so that the resulting model satisfies the axioms of probability. Techniques for preventing the model from converging to degenerate solutions are proposed, along with variants of the algorithm suited to graph classification and graph clustering. The major properties of the machine are discussed. The approach is validated empirically through experiments on the estimation of pdfs for synthetic and real-life GRGs, on the classification of images from the Caltech Benchmark dataset and of molecules from the Mutagenesis dataset, and on the clustering of images from the LabelMe dataset.
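The abstract's core idea — a graph encoder feeding a radial basis function network whose output is constrained to be a valid pdf — can be illustrated with a minimal sketch. This is not the authors' implementation; the recursive encoder here is replaced by simple neighborhood averaging, and the probability axioms are enforced by softmax mixing weights over normalized Gaussian kernels. All names (`encode_graph`, `RBFDensity`) are hypothetical.

```python
import numpy as np

def encode_graph(node_feats, adj, steps=2):
    """Toy stand-in for a recursive encoding module: repeatedly average
    each node with its neighbors, then mean-pool to a fixed-size code."""
    h = node_feats.astype(float)
    deg = adj.sum(axis=1, keepdims=True) + 1.0  # +1 counts the node itself
    for _ in range(steps):
        h = (h + adj @ h) / deg
    return h.mean(axis=0)

class RBFDensity:
    """Mixture of isotropic Gaussian kernels over the code space.
    Softmax weights are nonnegative and sum to 1, and each kernel
    integrates to 1, so pdf() satisfies the axioms of probability."""
    def __init__(self, centers, widths, logits):
        self.c = np.asarray(centers, float)   # (K, d) kernel centers
        self.s = np.asarray(widths, float)    # (K,) kernel widths
        w = np.exp(logits - np.max(logits))
        self.w = w / w.sum()                  # constrained mixing weights

    def pdf(self, x):
        d = self.c.shape[1]
        sq = ((x - self.c) ** 2).sum(axis=1)
        norm = (2.0 * np.pi * self.s ** 2) ** (d / 2.0)
        return float(self.w @ (np.exp(-sq / (2.0 * self.s ** 2)) / norm))

# Usage: encode a 3-node path graph, then evaluate the density at its code.
feats = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]])
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
code = encode_graph(feats, adj)
model = RBFDensity(centers=[[0.4, 0.6], [0.6, 0.4]],
                   widths=[0.5, 0.7], logits=[0.0, 1.0])
density = model.pdf(code)
```

In the paper, the constrained maximum likelihood training would fit the centers, widths, and mixing weights to an unsupervised sample of encoded GRGs; here they are fixed by hand purely to show the normalization constraint.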
|Title:||Recursive neural networks for density estimation over generalized random graphs|
|Citation:||Bongini, M., Rigutini, L., & Trentin, E. (2018). Recursive neural networks for density estimation over generalized random graphs. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS.|
|Appears in type:||1.1 Journal article|