Graph neural network

A graph neural network (GNN) is a class of artificial neural networks for processing data that can be represented as graphs.[1][2][3] GNNs were popularized by their use in supervised learning on the properties of molecules.[4]

Since their inception, several variants of the message passing neural network (MPNN) framework have been proposed.[5][6][7][8] These models scale GNNs to larger graphs and apply them to domains such as social networks, citation networks, and online communities.[9]
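As an illustration of the message passing idea, the following is a minimal sketch of a single layer in NumPy. The function name, the mean aggregator, and the ReLU combination are illustrative assumptions, not any specific published architecture:

```python
import numpy as np

def message_passing_layer(adj, h, w_self, w_neigh):
    """One GNN layer: each node aggregates its neighbors' features
    (mean aggregation here) and combines them with its own state.

    adj     : (n, n) binary adjacency matrix
    h       : (n, d_in) node feature matrix
    w_self  : (d_in, d_out) weight applied to a node's own features
    w_neigh : (d_in, d_out) weight applied to the aggregated messages
    """
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)  # avoid divide-by-zero
    messages = adj @ h / deg                          # mean over neighbors
    return np.maximum(0.0, h @ w_self + messages @ w_neigh)  # ReLU

# Toy example: a 4-node path graph with random 8-dimensional features.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
h = rng.normal(size=(4, 8))
h = message_passing_layer(adj, h, rng.normal(size=(8, 16)), rng.normal(size=(8, 16)))
print(h.shape)  # (4, 16)
```

Stacking several such layers lets information propagate across multi-hop neighborhoods.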

Message-passing GNNs cannot distinguish graphs that the Weisfeiler–Lehman graph isomorphism test cannot distinguish,[10] so any such GNN model is at most as powerful as this test.[11] Researchers are attempting to unify GNNs with other "geometric deep learning" models[12] to better understand how and why these models work.
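The test in question is color refinement (1-WL), which can be sketched in a few lines; graphs are given as adjacency lists, and the relabeling scheme below is one simple implementation choice among many:

```python
def wl_colors(adj_list, rounds=3):
    """1-WL color refinement: repeatedly re-color each node by the pair
    (own color, sorted multiset of neighbor colors). Graphs whose final
    color histograms differ are non-isomorphic; identical histograms are
    inconclusive, since the test is incomplete."""
    colors = {v: 0 for v in adj_list}  # start with a uniform coloring
    for _ in range(rounds):
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj_list[v])))
            for v in adj_list
        }
        # Relabel signatures with small integers so colors stay compact.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj_list}
    return colors

# A triangle and a 3-node path get different color histograms.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(sorted(wl_colors(triangle).values()))  # [0, 0, 0]
print(sorted(wl_colors(path).values()))      # [0, 0, 1] -- differs
```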

When no graph structure is known in advance, one can be heuristically induced, for example as a k-nearest-neighbor graph over the data points.
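For example, a k-nearest-neighbor graph over raw feature vectors can be built as follows; Euclidean distance, the value of k, and the symmetrization step are all heuristic assumptions:

```python
import numpy as np

def knn_graph(x, k=3):
    """Induce a graph from points: connect each point to its k nearest
    neighbors under Euclidean distance, then symmetrize so the result
    describes an undirected graph."""
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)            # no self-loops
    nearest = np.argsort(dists, axis=1)[:, :k]
    adj = np.zeros_like(dists)
    rows = np.repeat(np.arange(len(x)), k)
    adj[rows, nearest.ravel()] = 1.0
    return np.maximum(adj, adj.T)              # symmetrize

points = np.random.default_rng(1).normal(size=(10, 2))
print(knn_graph(points, k=3).sum(axis=1))  # per-node degrees (>= 3 after symmetrizing)
```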

GNNs can be understood as a generalization of convolutional neural networks (CNNs): a CNN can be viewed as a GNN operating on the regular grid graph formed by an image's pixels.
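This connection can be made concrete: mean-aggregation message passing on the 4-neighbor grid graph of an image acts like an unweighted blur filter. A real convolution additionally learns a distinct weight for each relative position, which this sketch deliberately omits:

```python
import numpy as np

def grid_adjacency(height, width):
    """4-neighbor grid graph over pixels; node i = row * width + col."""
    n = height * width
    adj = np.zeros((n, n))
    for r in range(height):
        for c in range(width):
            i = r * width + c
            if r + 1 < height:                     # edge to pixel below
                adj[i, i + width] = adj[i + width, i] = 1
            if c + 1 < width:                      # edge to pixel on the right
                adj[i, i + 1] = adj[i + 1, i] = 1
    return adj

# Mean-aggregation message passing on this graph is a box blur:
image = np.arange(16.0).reshape(4, 4)
adj = grid_adjacency(4, 4)
deg = adj.sum(axis=1, keepdims=True)
blurred = (adj @ image.reshape(-1, 1)) / deg   # each pixel := mean of neighbors
print(blurred.reshape(4, 4))
```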

References

  1. ^ Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele (2009). "The Graph Neural Network Model". IEEE Transactions on Neural Networks. 20 (1): 61–80. doi:10.1109/TNN.2008.2005605. ISSN 1941-0093. PMID 19068426. S2CID 206756462.
  2. ^ Sanchez-Lengeling, Benjamin; Reif, Emily; Pearce, Adam; Wiltschko, Alex (2021-09-02). "A Gentle Introduction to Graph Neural Networks". Distill. 6 (9): e33. doi:10.23915/distill.00033. ISSN 2476-0757.
  3. ^ Daigavane, Ameya; Ravindran, Balaraman; Aggarwal, Gaurav (2021-09-02). "Understanding Convolutions on Graphs". Distill. 6 (9): e32. doi:10.23915/distill.00032. ISSN 2476-0757.
  4. ^ Gilmer, Justin; Schoenholz, Samuel S.; Riley, Patrick F.; Vinyals, Oriol; Dahl, George E. (2017-07-17). "Neural Message Passing for Quantum Chemistry". International Conference on Machine Learning. PMLR: 1263–1272. arXiv:1704.01212.
  5. ^ Kipf, Thomas N.; Welling, Max (2017). "Semi-Supervised Classification with Graph Convolutional Networks". International Conference on Learning Representations. 5. arXiv:1609.02907.
  6. ^ Defferrard, Michaël; Bresson, Xavier; Vandergheynst, Pierre (2017-02-05). "Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering". Neural Information Processing Systems. 30. arXiv:1606.09375.
  7. ^ Hamilton, William; Ying, Rex; Leskovec, Jure (2017). "Inductive Representation Learning on Large Graphs" (PDF). Neural Information Processing Systems. 31. arXiv:1706.02216 – via Stanford.
  8. ^ Veličković, Petar; Cucurull, Guillem; Casanova, Arantxa; Romero, Adriana; Liò, Pietro; Bengio, Yoshua (2018-02-04). "Graph Attention Networks". International Conference on Learning Representations. 6. arXiv:1710.10903.
  9. ^ "Stanford Large Network Dataset Collection". snap.stanford.edu. Retrieved 2021-07-05.
  10. ^ Douglas, B. L. (2011-01-27). "The Weisfeiler–Lehman Method and Graph Isomorphism Testing". arXiv:1101.5211 [math.CO].
  11. ^ Xu, Keyulu; Hu, Weihua; Leskovec, Jure; Jegelka, Stefanie (2019-02-22). "How Powerful are Graph Neural Networks?". International Conference on Learning Representations. 7. arXiv:1810.00826.
  12. ^ Bronstein, Michael M.; Bruna, Joan; LeCun, Yann; Szlam, Arthur; Vandergheynst, Pierre (2017). "Geometric Deep Learning: Going beyond Euclidean data". IEEE Signal Processing Magazine. 34 (4): 18–42. arXiv:1611.08097. Bibcode:2017ISPM...34...18B. doi:10.1109/MSP.2017.2693418. ISSN 1053-5888. S2CID 15195762.