Preventing Representational Rank Collapse in MPNNs by Splitting the Computational Graph

Abstract

The ability of message-passing neural networks (MPNNs) to fit complex functions over graphs is limited, as most graph convolutions amplify the same signal across all feature channels, a phenomenon known as rank collapse, with over-smoothing as a special case. Most approaches to mitigating over-smoothing extend common message-passing schemes, e.g., the graph convolutional network, with residual connections, gating mechanisms, normalization, or regularization techniques. In contrast, our work tackles the cause of this issue directly by modifying the message-passing scheme to exchange different types of messages over multi-relational graphs. We identify a sufficient condition that ensures linearly independent node representations. As one instantiation, we show that operating on multiple directed acyclic graphs always satisfies our condition, and we propose to obtain these graphs by defining a strict partial ordering of the nodes. We conduct comprehensive experiments that confirm the benefits of operating on multi-relational graphs for obtaining more informative node representations.
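The construction described in the abstract can be sketched as follows: a strict (here, total) ordering of the nodes splits the edge set into an "up" relation and a "down" relation, each of which is necessarily acyclic, and a message-passing step then aggregates each relation with its own weight matrix. This is a minimal illustrative sketch, not the paper's implementation; the function names, the toy graph, and the simple sum aggregation are assumptions.

```python
import numpy as np

def split_into_dags(edges, order):
    # Split a directed edge set into two relations using a strict
    # ordering of the nodes: edges going "up" the order and edges
    # going "down". Each relation is a DAG, since a cycle would
    # require both an up- and a down-edge within one relation.
    up = [(u, v) for (u, v) in edges if order[u] < order[v]]
    down = [(u, v) for (u, v) in edges if order[u] > order[v]]
    return up, down

def relational_step(X, relations, weights):
    # One multi-relational message-passing step: each relation
    # aggregates neighbour features with its own weight matrix,
    # so different relations can amplify different signals.
    out = np.zeros_like(X)
    for edges, W in zip(relations, weights):
        agg = np.zeros_like(X)
        for u, v in edges:
            agg[v] += X[u]  # message from u to v along this relation
        out += agg @ W
    return out

# Toy example: a directed 4-cycle, with node index as the ordering.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
order = {i: i for i in range(4)}
up, down = split_into_dags(edges, order)
# up = [(0, 1), (1, 2), (2, 3)], down = [(3, 0)] -- both acyclic

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                       # node features
weights = [rng.standard_normal((8, 8)) for _ in (up, down)]
H = relational_step(X, [up, down], weights)           # shape (4, 8)
```

Because each relation uses a separate weight matrix, the combined update is not a single convolution amplifying one dominant signal, which is the mechanism the paper exploits to avoid rank collapse.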

Authors
  • Roth, Andreas
  • Bause, Franka
  • Kriege, Nils M.
  • Liebig, Thomas
Shortfacts
Category
Paper in Conference Proceedings or in Workshop Proceedings (Paper)
Event Title
Learning on Graphs Conference (LoG) 2024
Divisions
Data Mining and Machine Learning
Event Location
Virtual
Event Type
Conference
Event Dates
26.11.-29.11.2024
Date
26 November 2024