A dynamic graph is a graph that evolves as nodes and edges are added or removed over time. The structure itself changes: new users join a social network, products are added to a catalog, fraudulent accounts are created and suspended. A static graph is a frozen snapshot. A dynamic graph is a living system.
This distinction matters because real-world enterprise systems never stop changing. A fraud detection model trained on January's graph structure may miss new fraud patterns that emerge in February when new account types appear and new transaction pathways form.
## Two approaches to dynamic graphs
### Snapshot-based (discrete-time)
Divide time into fixed windows and create one static graph per window:
```python
import torch
from torch_geometric.data import Data

# Create weekly snapshots of a social network
snapshots = []
for week in range(52):
    # Get edges and features active during this week (assumed helpers)
    edges = get_edges_for_week(week)
    features = get_node_features_for_week(week)
    snapshot = Data(x=features, edge_index=edges)
    snapshots.append(snapshot)

# Process each snapshot with a GNN, then model the sequence:
# GNN output per snapshot -> RNN/Transformer across time
embeddings = [gnn(snap.x, snap.edge_index) for snap in snapshots]
temporal_output = sequence_model(torch.stack(embeddings))
```

Snapshot approach: one static graph per time window, processed by a GNN; a sequence model then captures evolution across windows.
### Event-based (continuous-time)
Process each structural change as it arrives:
- Node addition: initialize the new node's embedding (zero, random, or from features)
- Edge addition: update the representations of both endpoints
- Node/edge deletion: remove from the graph, propagate updates to affected neighbors
Event-based processing is more complex, but it captures exact timing and avoids the information loss of discretization. TGN (Temporal Graph Network) is the primary event-based architecture in PyG.
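The event-update pattern above can be illustrated with a toy sketch (this is not PyG's TGN implementation; the `EventGraph` class and its exponential-moving-average update rule are hypothetical simplifications): each arriving event mutates per-node state directly.

```python
from collections import defaultdict

class EventGraph:
    """Toy continuous-time graph: per-node embeddings updated event by event."""

    def __init__(self, dim=4):
        self.dim = dim
        self.emb = {}                      # node id -> embedding vector
        self.neighbors = defaultdict(set)  # adjacency map

    def add_node(self, node, features=None):
        # Node addition: initialize from features if available, else zeros
        self.emb[node] = list(features) if features else [0.0] * self.dim

    def add_edge(self, u, v, timestamp):
        # Edge addition: update the representations of both endpoints
        self.neighbors[u].add(v)
        self.neighbors[v].add(u)
        for a, b in ((u, v), (v, u)):
            # Pull each endpoint slightly toward the other's embedding
            self.emb[a] = [0.9 * x + 0.1 * y
                           for x, y in zip(self.emb[a], self.emb[b])]

    def remove_node(self, node):
        # Node deletion: drop the node and detach it from all neighbors
        for nbr in self.neighbors.pop(node, set()):
            self.neighbors[nbr].discard(node)
        self.emb.pop(node, None)

g = EventGraph(dim=2)
g.add_node("alice", [1.0, 0.0])
g.add_node("bob", [0.0, 1.0])
g.add_edge("alice", "bob", timestamp=1)
# alice's embedding has moved toward bob's
```

A real event-based model (such as TGN) would replace the moving-average rule with learned message, aggregation, and memory-update functions, but the control flow is the same: one update per structural change.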
## Enterprise example: evolving product catalog
An e-commerce platform adds 1,000 new products per week and discontinues 200. The user-product interaction graph is inherently dynamic:
- Cold start: new products have no interaction history. Dynamic GNNs can initialize their embeddings from product features and propagate information from similar existing products.
- Trend detection: a product suddenly getting 10x more interactions than last week signals a trend. Comparing snapshots reveals this velocity.
- Decay: discontinued products should fade from recommendations. In a static graph, they persist forever. In a dynamic graph, the removal is explicit.
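The cold-start bullet can be made concrete: initialize a new product's embedding by averaging the embeddings of its most feature-similar existing products. This is a minimal sketch under assumed data layouts (`cosine`, `cold_start_embedding`, and the catalog structure are illustrative, not a library API):

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def cold_start_embedding(new_features, catalog, k=2):
    """Initialize a new product's embedding from its k most similar products.

    catalog: {product_id: (feature_vector, embedding)} for existing products.
    """
    ranked = sorted(catalog.values(),
                    key=lambda fe: cosine(new_features, fe[0]),
                    reverse=True)[:k]
    dim = len(ranked[0][1])
    # Average the embeddings of the top-k feature-similar products
    return [sum(e[i] for _, e in ranked) / len(ranked) for i in range(dim)]

catalog = {
    "p1": ([1.0, 0.0], [0.2, 0.8]),
    "p2": ([0.9, 0.1], [0.4, 0.6]),
    "p3": ([0.0, 1.0], [0.9, 0.1]),
}
# The new product's features resemble p1 and p2, so its initial
# embedding is the average of theirs
emb = cold_start_embedding([1.0, 0.05], catalog, k=2)
```

A GNN then refines this initialization as real interactions arrive and message passing propagates information from actual neighbors.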
```python
# Handling new products in a dynamic graph
def update_graph(graph, new_products, new_interactions, removed_products):
    # Add new product nodes with feature-based initialization
    for product in new_products:
        graph.add_node(product.id, features=product.feature_vector)

    # Add new interaction edges
    for interaction in new_interactions:
        graph.add_edge(interaction.user_id, interaction.product_id,
                       timestamp=interaction.time, type=interaction.action)

    # Remove discontinued products
    for product_id in removed_products:
        graph.remove_node(product_id)
        # Neighbors automatically lose this connection

    # Re-run the GNN on the updated subgraph (incremental)
    affected_nodes = get_affected_neighborhood(new_products + new_interactions)
    updated_embeddings = gnn.forward_incremental(graph, affected_nodes)
    return updated_embeddings
```

Incremental graph updates: only affected neighborhoods are recomputed, not the entire graph.
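The `get_affected_neighborhood` helper is a placeholder; one plausible implementation is a k-hop breadth-first expansion from the touched nodes (a sketch over a plain adjacency-dict graph, not a PyG API):

```python
from collections import deque

def affected_neighborhood(adj, seed_nodes, hops=2):
    """Return all nodes within `hops` edges of any seed node.

    adj: {node: set(neighbors)} adjacency map.
    A 2-layer GNN's output depends on each node's 2-hop neighborhood,
    so `hops` should match the number of message-passing layers.
    """
    seen = set(seed_nodes)
    frontier = deque((n, 0) for n in seed_nodes)
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen

adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
# 2 hops from "a" reaches b and c, but not d
affected = affected_neighborhood(adj, ["a"], hops=2)
```

Recomputing embeddings only for this set keeps the per-update cost proportional to the size of the change, not the size of the graph.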
## Challenges of dynamic graphs
- Catastrophic forgetting: as the graph evolves, the model may forget patterns from earlier time periods. Combining memory modules with periodic retraining mitigates this.
- Cold start: new nodes have no neighborhood. Feature-based initialization and few-shot propagation help bootstrap embeddings for new entities.
- Computational cost: recomputing all embeddings after every change is expensive. Incremental updates that only recompute affected neighborhoods make dynamic GNNs practical at scale.
- Evaluation: standard train/test splits do not work. You must evaluate on future time periods using only past data for training, matching the real deployment scenario.
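The evaluation point can be made concrete with a chronological split: train on all interactions before a cutoff, test on everything after it (a minimal sketch; the `(timestamp, user, item)` tuple layout is an assumption):

```python
def temporal_split(interactions, train_fraction=0.8):
    """Split timestamped interactions chronologically, never randomly.

    interactions: list of (timestamp, user, item) tuples.
    Returns (train, test) where every train timestamp precedes every
    test timestamp -- matching the real deployment scenario, where the
    model only ever sees the past.
    """
    ordered = sorted(interactions, key=lambda rec: rec[0])
    cutoff = int(len(ordered) * train_fraction)
    return ordered[:cutoff], ordered[cutoff:]

events = [(5, "u1", "p2"), (1, "u1", "p1"), (3, "u2", "p1"),
          (9, "u3", "p3"), (7, "u2", "p2")]
# Train on the four earliest events; hold out the latest for testing
train, test = temporal_split(events, train_fraction=0.8)
```

A random split would leak future structure into training (the model would see edges that did not yet exist at prediction time), inflating metrics that collapse in production.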