Sci Rep. 2023 Oct 8;13(1):16966.
doi: 10.1038/s41598-023-44224-1.

Co-embedding of edges and nodes with deep graph convolutional neural networks

Yuchen Zhou et al. Sci Rep. 2023.

Abstract

Graph neural networks (GNNs) have significant advantages in dealing with non-Euclidean data and have been widely used in various fields. However, most existing GNN models face two main challenges: (1) Most GNN models built on the message-passing framework have a shallow structure, which hampers their ability to efficiently transmit information between distant nodes. To address this, we propose a novel message-passing framework that enables the construction of GNN models with deep architectures akin to convolutional neural networks (CNNs), potentially comprising dozens or even hundreds of layers. (2) Existing models often treat the learning of edge features and node features as separate tasks. To overcome this limitation, we develop a deep graph convolutional neural network learning framework capable of acquiring edge embeddings and node embeddings simultaneously; the learned multi-dimensional edge feature matrix is used to construct multi-channel filters that capture node features more accurately. To address these challenges, we propose Co-embedding of Edges and Nodes with Deep Graph Convolutional Neural Networks (CEN-DGCNN). Our approach introduces a novel message-passing framework that fully integrates and utilizes both node features and multi-dimensional edge features. Based on this framework, we develop a deep graph convolutional neural network model that prevents over-smoothing and obtains non-local structural features and refined high-order features of nodes by extracting long-distance dependencies between nodes and utilizing multi-dimensional edge features. Moreover, we propose a novel graph convolutional layer that learns node embeddings and multi-dimensional edge embeddings simultaneously; the layer updates the multi-dimensional edge embeddings across layers based on node features and an attention mechanism, enabling efficient utilization and fusion of both node and edge features. Additionally, we propose a multi-dimensional edge feature encoding method based on directed edges, and use the resulting multi-dimensional edge feature matrix to construct a multi-channel filter that filters node information. Finally, extensive experiments show that CEN-DGCNN outperforms a large number of graph neural network baselines, demonstrating the effectiveness of the proposed method.
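To make the layer design described above concrete, the following is a minimal, self-contained PyTorch sketch of a graph convolution layer that co-embeds nodes and multi-dimensional edge features. It is not the authors' CEN-DGCNN implementation: the class name (CoEmbeddingConv), the specific edge-update, attention, and gating formulas, and the residual connection are illustrative assumptions consistent with the abstract's description.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoEmbeddingConv(nn.Module):
    """Illustrative layer: edge embeddings filter node messages, node states update edge embeddings."""

    def __init__(self, node_dim: int, edge_dim: int):
        super().__init__()
        self.node_lin = nn.Linear(node_dim, node_dim)
        # Edge update mixes the two endpoint node states with the previous edge state.
        self.edge_lin = nn.Linear(2 * node_dim + edge_dim, edge_dim)
        # Per-edge attention score computed from the updated edge embedding.
        self.attn = nn.Linear(edge_dim, 1)
        # Projects the edge embedding to per-channel gates (a "multi-channel filter").
        self.gate_lin = nn.Linear(edge_dim, node_dim)

    def forward(self, x, edge_index, edge_attr):
        # x:          [N, node_dim]   node embeddings
        # edge_index: [2, E]          directed edges (source, destination)
        # edge_attr:  [E, edge_dim]   multi-dimensional edge embeddings
        src, dst = edge_index

        # 1) Update edge embeddings from endpoint node features and the previous edge state.
        edge_attr = torch.tanh(self.edge_lin(torch.cat([x[src], x[dst], edge_attr], dim=-1)))

        # 2) Attention weight per edge, normalized over each destination node's incoming edges.
        score = self.attn(edge_attr).squeeze(-1)                  # [E]
        alpha = torch.exp(score - score.max())                    # numerically stable
        denom = torch.zeros(x.size(0), dtype=x.dtype, device=x.device)
        denom = denom.index_add(0, dst, alpha) + 1e-9
        alpha = alpha / denom[dst]                                 # softmax over in-edges

        # 3) Edge embeddings act as channel-wise filters on the transformed source messages.
        gate = torch.sigmoid(self.gate_lin(edge_attr))             # [E, node_dim]
        msg = alpha.unsqueeze(-1) * gate * self.node_lin(x[src])   # [E, node_dim]
        out = torch.zeros_like(x).index_add(0, dst, msg)

        # A residual connection helps deep stacks of such layers resist over-smoothing.
        return F.relu(out + x), edge_attr

Stacking many such layers, each returning updated node and edge embeddings, gives the kind of deep, residual, edge-aware message-passing architecture that the abstract describes.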


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(a) The edge feature representation and node feature representation in an ordinary GCN; (b) the edge feature representation and node feature representation used in our proposed CEN-DGCNN.
Figure 2
The overall architecture of Co-embedding of Edges and Nodes with Deep Graph Convolutional Neural Network (CEN-DGCNN).
Figure 3
The novel graph neural network message-passing framework proposed in this work.
Figure 4
Edge feature updates and node feature updates in CEN-DGCNN.
Figure 5
MAD values of different layers of CEN-DGCNN on 5 datasets.
Figure 6
MAD values of different layers of a regular GCN on 5 datasets.
Figure 7
t-SNE visualization of node representations learned by CEN-DGCNN.
Figure 8
Node feature visualization results of CEN-DGCNN with various edge feature encoding methods on the Cora dataset.
Figure 9
Node feature visualization results of CEN-DGCNN with different message-passing frameworks and edge feature construction methods on the Citeseer dataset.
Figure 10
CEN-DGCNN model analysis. (a) Running time of the CEN-DGCNN model under different-dimensional edge feature constructions. (b) Attention weight distribution on the Citeseer dataset. (c) Influence of the hyperparameter λ on the performance of CEN-DGCNN.
