CMU-CS-25-109
Computer Science Department
School of Computer Science, Carnegie Mellon University



Enhancing GNNs with Encoding, Rewiring, and Attention

Tongzhou Liao

M.S. Thesis

April 2025

CMU-CS-25-109.pdf


Keywords: Machine learning, deep learning, graph neural networks, graph encoding, graph rewiring, attention mechanism

Graph Neural Networks (GNNs) have become important tools for machine learning on graph-structured data. In this thesis, we explore the synergistic combination of graph encoding, graph rewiring, and graph attention by introducing Graph Attention with Stochastic Structures (GRASS), a novel GNN architecture. GRASS uses relative random walk probabilities (RRWP) encoding and a novel decomposed variant (D-RRWP) to efficiently capture structural information. It rewires the input graph by superimposing a random regular graph to enhance long-range information propagation, and it employs a novel additive attention mechanism tailored to graph-structured data. Our empirical evaluations demonstrate that GRASS achieves state-of-the-art performance on multiple benchmark datasets, including a 20.3% reduction in mean absolute error on the ZINC dataset.
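The rewiring step described in the abstract can be sketched as follows. This is an illustrative approximation, not the thesis's implementation: the function name, the choice of degree, and the use of networkx are assumptions made here for clarity.

```python
import networkx as nx


def rewire_with_random_regular(g: nx.Graph, degree: int = 3, seed: int = 0) -> nx.Graph:
    """Superimpose the edges of a random regular graph onto g.

    Illustrative sketch of graph rewiring for long-range propagation;
    the degree and seed parameters are hypothetical choices.
    Note: networkx requires degree * number_of_nodes to be even.
    """
    n = g.number_of_nodes()
    rrg = nx.random_regular_graph(degree, n, seed=seed)
    rewired = g.copy()
    # Map the random regular graph's integer nodes (0..n-1) onto g's node labels.
    labels = list(g.nodes())
    rewired.add_edges_from((labels[u], labels[v]) for u, v in rrg.edges())
    return rewired


# Example: a path graph gains random shortcut edges, shrinking its diameter.
g = nx.path_graph(8)
rewired = rewire_with_random_regular(g, degree=3, seed=42)
```

Superimposing a sparse random regular graph adds only O(n) edges while, with high probability, making the combined graph an expander, which is one way to mitigate over-squashing on long paths.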

43 pages

Thesis Committee:
Barnabás Póczos (Chair)
Tianqi Chen

Srinivasan Seshan, Head, Computer Science Department
Martial Hebert, Dean, School of Computer Science

