. 2024 Jun 14;25(12):6583.
doi: 10.3390/ijms25126583.

Composite Graph Neural Networks for Molecular Property Prediction


Pietro Bongini et al. Int J Mol Sci.

Abstract

Graph Neural Networks have proven to be very valuable models for the solution of a wide variety of problems on molecular graphs, as well as in many other research fields involving graph-structured data. Molecules are heterogeneous graphs composed of atoms of different species. Composite graph neural networks process heterogeneous graphs with multiple state-updating networks, each one dedicated to a particular node type. This approach extracts information from a graph more efficiently than standard graph neural networks, which distinguish node types through a one-hot encoded type vector. We carried out extensive experimentation on eight molecular graph datasets and on a large number of both classification and regression tasks. The results we obtained clearly show that composite graph neural networks are far more efficient in this setting than standard graph neural networks.
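The core idea above can be sketched in a few lines: instead of appending a one-hot type vector to every node, a composite GNN keeps a separate state-updating network per node type. The following is a minimal NumPy sketch under stated assumptions (the weight shapes, mean aggregation, and tanh update are illustrative choices, not the paper's exact architecture; all names are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM = 8
ATOM_TYPES = ["C", "N", "O"]  # node types; one update network per type

# The "composite" part: a separate weight matrix per node type.
# Each update network maps [own state | aggregated neighbor states] -> new state.
weights = {t: rng.normal(0, 0.1, size=(2 * STATE_DIM, STATE_DIM)) for t in ATOM_TYPES}

def message_passing(states, types, adjacency, n_iters=3):
    """Update every node's state from its previous state and its neighbors'
    previous states, using the update network of the node's own type."""
    for _ in range(n_iters):
        new_states = np.zeros_like(states)
        for i, t in enumerate(types):
            neighbors = np.nonzero(adjacency[i])[0]
            agg = states[neighbors].mean(axis=0) if len(neighbors) else np.zeros(STATE_DIM)
            x = np.concatenate([states[i], agg])
            new_states[i] = np.tanh(x @ weights[t])  # type-specific update
        states = new_states
    return states

# Toy molecule: three atoms with C-N and N-O bonds.
types = ["C", "N", "O"]
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])
states = rng.normal(size=(3, STATE_DIM))
final = message_passing(states, types, adj)
print(final.shape)  # (3, 8): one updated state per atom
```

A standard GNN would instead use a single shared weight matrix and concatenate a one-hot type vector to each node's input, which is the baseline the paper compares against.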

Keywords: artificial intelligence; composite graph neural networks; deep learning; graph neural networks; molecular graphs; molecular property prediction; open graph benchmark.


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1
The graph neural network model places a copy of the state-updating network over every node of each input graph. After a fixed number of "message-passing" iterations, in which the state of every node is updated based on its previous state, the previous states of its neighbors, and the node and edge labels, the output is calculated based on the final state. The state-updating network copies share their weights and can be seen as twin building blocks that compose the adaptive architecture of the GNN, replicating the structure of the input graph.
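The readout step the caption describes, in which an output is calculated from each node's final state, can be sketched as follows (a hedged illustration only: the linear readout and sum pooling are assumptions, not the paper's exact output network).

```python
import numpy as np

rng = np.random.default_rng(1)
STATE_DIM = 8

# Final node states, e.g. for 3 atoms after message passing has converged.
final_states = rng.normal(size=(3, STATE_DIM))

# Illustrative linear readout shared across nodes.
w_out = rng.normal(0, 0.1, size=(STATE_DIM, 1))

node_outputs = final_states @ w_out   # one output per node
graph_output = node_outputs.sum()     # pooled graph-level prediction (e.g. a molecular property)
print(node_outputs.shape)  # (3, 1)
```

For graph-level tasks such as molecular property prediction, the node outputs are pooled into a single value; for node-level tasks, the per-node outputs are used directly.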

