
Influence Propagation: How Effects Spread Through Network Connections

Fraud is contagious: accounts that transact with fraudulent accounts are at higher risk. Churn is contagious: customers whose friends left are more likely to leave. GNNs capture this propagation naturally through message passing.


TL;DR

  • Influence propagation describes how effects (fraud risk, churn, product adoption) spread through network connections. A node's outcome depends not just on its features but on its neighbors' states.
  • GNN message passing is a computational model of influence propagation. Each layer propagates information one hop. The learned weights determine how much influence each connection type carries.
  • Three propagation patterns: risk contagion (fraud, default), behavioral contagion (churn, adoption), and information diffusion (product awareness, trend spread).
  • Propagation decays with distance: 1-hop neighbors have strong influence, 2-hop moderate, 3-hop weak. This maps directly to GNN layer depth, explaining why 2-3 layers usually suffice.
  • Applications: fraud ring detection (risk propagation), viral marketing (adoption propagation), churn prevention (behavioral contagion), and epidemic modeling (disease spread).

Influence propagation describes how effects spread through connections in a network. A fraudulent account increases the fraud risk of its transaction partners. A customer who churns raises the churn probability of their social connections. A product adopted by an influencer drives adoption among their followers. GNNs capture these propagation dynamics naturally through message passing.

Propagation patterns

Risk contagion

In financial networks, risk propagates through transaction relationships. An account flagged for fraud increases suspicion on all accounts it transacted with. Those accounts, in turn, increase suspicion on their counterparties. This is how fraud rings are uncovered: one detected node leads to the discovery of connected fraudulent nodes.

A 2-layer GNN on a transaction graph naturally captures this: layer 1 propagates direct fraud signals, layer 2 propagates indirect signals through shared counterparties.
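This two-hop dynamic can be sketched without any GNN machinery. The toy graph, account names, decay factor, and the max-based update rule below are all illustrative assumptions, not a PyG API; a trained GNN would learn the update instead of hard-coding it:

```python
from collections import defaultdict

# Assumed toy transaction graph: undirected edges between accounts.
# Account "a" has been flagged for fraud.
edges = [("a", "b"), ("b", "c"), ("c", "d")]
flagged = {"a": 1.0}

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def propagate(scores, decay=0.5):
    """One synchronous message-passing round: each node absorbs a
    decayed fraction of the highest risk among its neighbors."""
    new = dict(scores)
    for node in neighbors:
        incoming = max((scores.get(n, 0.0) for n in neighbors[node]), default=0.0)
        new[node] = max(scores.get(node, 0.0), decay * incoming)
    return new

scores = {n: flagged.get(n, 0.0) for n in neighbors}
for _ in range(2):  # two rounds, analogous to two GNN layers (2-hop reach)
    scores = propagate(scores)
```

After two rounds, the direct counterparty `b` carries half the flagged risk, the 2-hop account `c` a quarter, and the 3-hop account `d` remains untouched, mirroring why layer count bounds how far a fraud signal can travel.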

Behavioral contagion

Human behavior is influenced by social connections. When a customer's friends leave a platform, the customer is more likely to leave. When a user's connections adopt a feature, the user is more likely to adopt it. Research shows that churn probability increases by 5-15% for each churned 1-hop connection in a social graph.

Information diffusion

New products, trends, and information spread through networks following cascade patterns. An influential node (high degree, central position) can trigger adoption across a large portion of the network. GNNs model this by learning which structural positions amplify or dampen propagation.
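A standard way to reason about these cascades is the independent cascade model: each newly activated node gets one chance to activate each follower with some probability. The follower graph and activation probability below are assumptions for illustration:

```python
import random

# Assumed toy follower graph: edges point from influencer to follower.
follows = {
    "influencer": ["u1", "u2", "u3"],
    "u1": ["u4"],
    "u2": ["u4", "u5"],
    "u3": [], "u4": [], "u5": [],
}

def independent_cascade(graph, seeds, p=0.4, rng=None):
    """Simulate one cascade: every newly activated node gets a single
    chance to activate each of its followers with probability p."""
    rng = rng or random.Random(0)
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for node in frontier:
            for follower in graph.get(node, []):
                if follower not in active and rng.random() < p:
                    active.add(follower)
                    nxt.append(follower)
        frontier = nxt
    return active

# Average cascade size over many runs estimates the seed's influence.
runs = [len(independent_cascade(follows, {"influencer"}, rng=random.Random(i)))
        for i in range(1000)]
avg = sum(runs) / len(runs)
```

Averaging over repeated simulations is how a node's expected reach is usually estimated; a GNN learns a comparable notion of reach directly from observed adoption data.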

Decay with distance

Influence decays with graph distance. A 1-hop neighbor's state has strong influence. A 2-hop neighbor has moderate influence. By 3-4 hops, the signal is typically negligible. This decay pattern has important implications:

  • GNN depth: 2-3 layers capture most propagation signal. Deeper models add noise from distant, irrelevant nodes (over-smoothing).
  • Risk assessment: A customer 1 hop from a fraudster needs immediate attention. A customer 3 hops away needs monitoring, not action.
  • Viral marketing: Target 1-hop connections of early adopters for maximum cascade effect.
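The decay pattern above can be made concrete with hop distances: compute BFS distance from a flagged node, then attenuate per hop. The graph, node names, and per-hop decay factor are illustrative assumptions:

```python
from collections import deque

# Assumed toy graph around a flagged fraudster "f0".
adj = {
    "f0": ["c1"],
    "c1": ["f0", "c2"],
    "c2": ["c1", "c3"],
    "c3": ["c2"],
}

def hop_distances(adj, source):
    """BFS distances (in hops) from the source node."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

decay = 0.5  # assumed per-hop attenuation
dist = hop_distances(adj, "f0")
risk = {node: decay ** d for node, d in dist.items()}
# 1 hop -> 0.5, 2 hops -> 0.25, 3 hops -> 0.125
```

The exponential falloff is why the 1-hop customer warrants immediate action while the 3-hop customer only warrants monitoring, and why stacking more than 2-3 GNN layers mostly adds near-zero signal from distant nodes.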

Influence-aware predictions

Understanding influence propagation enables several enterprise predictions:

  • Fraud ring scoring: Score each node by the cumulative fraud risk propagated from its neighborhood. High-risk nodes in clusters of flagged accounts are prioritized for investigation.
  • Churn cascade prediction: Predict not just whether a customer will churn, but whether their churn will trigger a cascade among connected customers.
  • Influence maximization: Identify the K nodes whose adoption would maximize total network adoption. This is the viral marketing problem.
  • Intervention targeting: Identify nodes where intervening (retention offer, fraud alert) would have the largest propagation effect, preventing cascades.
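The first of these, fraud ring scoring, can be sketched as a neighborhood aggregate: score each account by the fraction of its direct counterparties that are already flagged. The graph and flagged set below are assumed example data, and a production system would use propagated (multi-hop) risk rather than this 1-hop ratio:

```python
from collections import defaultdict

# Assumed toy transaction graph; accounts "a" and "b" are flagged.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")]
flagged = {"a", "b"}

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def ring_score(node):
    """Fraction of a node's direct counterparties that are flagged.
    High scores mark accounts sitting inside a suspected ring."""
    nbrs = neighbors[node]
    return sum(n in flagged for n in nbrs) / len(nbrs) if nbrs else 0.0

# Investigation queue: most ring-embedded accounts first.
ranked = sorted(neighbors, key=ring_score, reverse=True)
```

Here the unflagged account `c` tops the queue because two of its three counterparties are flagged, which is exactly the "one detected node leads to connected fraudulent nodes" dynamic from the risk contagion section.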

Measuring propagation strength

Not all edges carry equal influence. The strength of propagation depends on:

  • Edge type: Financial transactions propagate fraud risk more strongly than shared-merchant connections.
  • Edge weight: High-value transactions carry more risk than micro-transactions.
  • Recency: Recent connections propagate influence more strongly than old ones.
  • Node degree: Influence from a low-degree node (exclusive relationship) is stronger than from a high-degree hub.

GNN attention mechanisms (GAT, graph transformers) learn these weights automatically from data, assigning higher attention to edges that carry more predictive signal.
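The effect of attention can be illustrated with a hand-crafted stand-in: score each edge from its amount and recency, then softmax-normalize the scores across the node's edges. The edge data and the `edge_logit` scoring rule are hypothetical; in a real GAT the logit is a learned function, not a formula chosen by hand:

```python
import math

# Assumed edges incident to one account: (counterparty, amount, days_since).
edges = [
    ("merchant_hub", 20.0, 400),   # small, old, high-degree counterparty
    ("peer_x", 5000.0, 3),         # large, recent transfer
    ("peer_y", 800.0, 30),
]

def edge_logit(amount, days_since):
    """Hand-crafted stand-in for a learned attention score:
    higher for large, recent transactions."""
    return math.log1p(amount) - 0.01 * days_since

# Numerically stable softmax over the node's incident edges.
logits = [edge_logit(a, d) for _, a, d in edges]
z = max(logits)
exps = [math.exp(l - z) for l in logits]
total = sum(exps)
attention = {name: e / total for (name, _, _), e in zip(edges, exps)}
```

The recent high-value transfer to `peer_x` dominates the attention distribution, while the stale micro-transaction to the hub is nearly ignored; this is the behavior a trained attention layer learns when those edge properties carry the predictive signal.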

Frequently asked questions

What is influence propagation in graphs?

Influence propagation describes how effects spread through network connections: a fraudulent account increases fraud risk for its transaction partners, a churned customer increases churn risk for their social connections, and an influencer's adoption of a product drives adoption among their followers. GNNs capture this through message passing.

How do GNNs model influence propagation?

Message passing is influence propagation. When a GNN aggregates neighbor features, it naturally models how properties at one node affect predictions at connected nodes. A 2-layer GNN propagates influence across 2 hops. The learned weights determine how much influence each connection carries.

What is the difference between influence propagation and message passing?

Message passing is the computational mechanism. Influence propagation is the real-world phenomenon it models. Message passing moves information through the graph during training. Influence propagation is the domain concept that certain effects (risk, behavior, adoption) spread through connections in the actual network.

Learn more about graph ML

PyTorch Geometric is the open-source foundation for graph neural networks. Explore more layers, concepts, and production patterns.