The Vapnik–Chervonenkis dimension of graph and recursive neural networks

Scarselli, Franco; Tsoi, Ah Chung; Hagenbuchner, Markus
2018-01-01

Abstract

The Vapnik-Chervonenkis dimension (VC-dim) characterizes the sample complexity of a classification model and is often used as an indicator of the generalization capability of a learning method. The VC-dim has been studied for common feed-forward neural networks, but it has yet to be studied for Graph Neural Networks (GNNs) and Recursive Neural Networks (RecNNs). This paper provides upper bounds on the order of growth of the VC-dim of GNNs and RecNNs, a class of neural network models capable of processing inputs given as graphs. A graph is a data structure that generalizes the representational power of vectors and sequences through its ability to represent dependencies or relationships between feature vectors. It was shown previously that the ability of recurrent neural networks to process sequences increases the VC-dim compared to that of feed-forward neural networks, which are limited to processing vectors. Since graphs are a more general structure than sequences, the question arises of how this affects the VC-dim of GNNs and RecNNs. A main finding of this paper is that the upper bounds on the VC-dim of GNNs and RecNNs are comparable to the upper bounds for recurrent neural networks. The result also suggests that the generalization capability of such models increases with the number of connected nodes.
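As background for why an upper bound on the VC-dim matters, a classical VC-type generalization bound (a standard textbook form, not a result specific to this paper) states that, with probability at least $1-\delta$ over a training sample of size $m$, every classifier $h$ from a hypothesis class of VC-dimension $d$ satisfies

\[
R(h) \;\le\; \hat{R}_m(h) + \sqrt{\frac{d\left(\ln\frac{2m}{d} + 1\right) + \ln\frac{4}{\delta}}{m}},
\]

where $R(h)$ is the expected risk and $\hat{R}_m(h)$ the empirical risk on the sample. The gap between the two shrinks as $m$ grows relative to $d$, which is why upper bounds on the order of growth of the VC-dim of GNNs and RecNNs translate into statements about their generalization capability.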
2018
Scarselli, F., Tsoi, A.C., Hagenbuchner, M. (2018). The Vapnik–Chervonenkis dimension of graph and recursive neural networks. Neural Networks, 108, 248-259. https://doi.org/10.1016/j.neunet.2018.08.010
Files in this record:
File: 1-s2.0-S0893608018302363-main.pdf (not available)
Type: publisher's PDF
License: NOT PUBLIC - private/restricted access
Size: 773.4 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1066559