Graph Pyramid Pooling Transformer for Graph-level Representation Learning

Graph neural networks (GNNs) are a powerful technique for graph representation learning and excel at capturing relationships between nearby nodes. However, their restricted receptive field makes it challenging to capture dependencies between long-range nodes, and simply increasing the depth and width of a GNN is inadequate to expand that receptive field: it may instead lead to issues such as over-smoothing and over-squashing. This paper proposes the Graph Pyramid Pooling Transformer (GPPTrans), a novel architecture that leverages pyramid pooling to reduce node sequence length and extract multi-scale information. The graph pyramid pooling module consists of graph pooling operations with different pooling ratios, which capture hierarchical features at multiple scales. By combining basic GNN layers with graph pyramid pooling transformers, our hybrid structure effectively captures both short-range and long-range dependencies. Extensive experiments demonstrate that GPPTrans outperforms previous GNNs and graph transformer methods, achieving state-of-the-art results on various graph-level tasks.
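The abstract does not specify the pooling operator or attention details, so the following is only a minimal sketch of what a graph pyramid pooling block could look like, assuming score-based top-k pooling and a standard transformer encoder layer; the class names (`TopKPool`, `GraphPyramidPoolBlock`), the pooling ratios, and the mean readout are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of a graph pyramid pooling block (not the paper's code).
import torch
import torch.nn as nn


class TopKPool(nn.Module):
    """Score nodes with a learned projection and keep the top ceil(ratio * N)."""

    def __init__(self, dim: int, ratio: float):
        super().__init__()
        self.ratio = ratio
        self.score = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features for a single graph
        n_keep = max(1, int(self.ratio * x.size(0)))
        s = self.score(x).squeeze(-1)            # (N,) node scores
        idx = torch.topk(s, n_keep).indices      # indices of kept nodes
        # Gate kept features by their sigmoid scores so scoring stays trainable.
        return x[idx] * torch.sigmoid(s[idx]).unsqueeze(-1)


class GraphPyramidPoolBlock(nn.Module):
    """Pool the node sequence at several ratios, run self-attention over the
    concatenated multi-scale sequence, and read out a graph-level vector."""

    def __init__(self, dim: int, ratios=(0.5, 0.25, 0.125), heads: int = 4):
        super().__init__()
        self.pools = nn.ModuleList(TopKPool(dim, r) for r in ratios)
        self.encoder = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Build a shorter multi-scale token sequence from the full node set:
        # attention now costs O(M^2) with M << N instead of O(N^2).
        pyramid = torch.cat([pool(x) for pool in self.pools], dim=0)
        z = self.encoder(pyramid.unsqueeze(0)).squeeze(0)
        return z.mean(dim=0)  # simple mean readout over pooled tokens


if __name__ == "__main__":
    x = torch.randn(100, 64)           # 100 nodes, 64-dim features
    block = GraphPyramidPoolBlock(64)
    print(block(x).shape)              # -> torch.Size([64])
```

Under these assumptions, the pyramid of pooling ratios is what produces the hierarchical, multi-scale features the abstract describes, while the shortened token sequence is what makes global self-attention over the graph affordable; in the full hybrid model, such a block would sit alongside ordinary message-passing GNN layers that handle the short-range structure.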