240715_Thuy_Labseminar[SeedGNN: Graph Neural Network for Supervised Seeded Graph Matching].pptx
thanhdowork
About This Presentation
SeedGNN: Graph Neural Network for Supervised Seeded Graph Matching
Size: 2.13 MB
Language: en
Added: Jul 23, 2024
Slides: 18 pages
Slide Content
SeedGNN: Graph Neural Network for Supervised Seeded Graph Matching
Van Thuy Hoang, Network Science Lab, Dept. of Artificial Intelligence, The Catholic University of Korea
E-mail: [email protected]
2024-07-15, ICML 2023
Problem: Graph matching Graph matching techniques have been applied to the following applications: Matching between movie segments and synopsis paragraphs. https://openaccess.thecvf.com/content_ICCV_2019/papers/Xiong_A_Graph-Based_Framework_to_Bridge_Movies_and_Synopses_ICCV_2019_paper.pdf
Problem: Graph matching Graph matching techniques have been applied to the following applications: image correspondence. https://openaccess.thecvf.com/content_ICCV_2019/papers/Xiong_A_Graph-Based_Framework_to_Bridge_Movies_and_Synopses_ICCV_2019_paper.pdf
Illustration of GNN-based graph similarity learning Problem: methods for graph similarity prediction mainly employ GNNs to learn graph representations and feed the learned representations into CNNs to predict similarity scores, treating the task as a classification or regression problem. Fully connected layers are often added for similarity score prediction in an end-to-end learning framework. Bai Y., Xu D., Gu K., Wu X., Marinovic A., Ro C., Sun Y., Wang W. (2019). Neural maximum common subgraph detection with guided subgraph extraction. https://openreview.net/pdf?id=BJgcwh4FwS
Problem: seeded graph matching Goal: find the mapping between the unmatched nodes of G1 and G2, using a set of pre-matched node pairs (seeds). Applications: computer vision, social network de-anonymization, computational biology, natural language processing, and more.
GNNs for seeded graph matching Problem: SeedGNN is a supervised-learning approach that learns how to use seeds for matching.
Architecture Two components: the convolution module aggregates neighborhood information around each node pair; the percolation module filters out noisy information. Combining these two modules allows adaptive feature selection.
Architecture Convolution Module Given two graphs, an injective mapping π : V1 → V2, and a training set of seed pairs. Flatten(·) denotes the reshape operation that converts an n1 × n2 × d matrix to an n1n2 × d matrix, where the (i, j, :)-th entry of the input is the ((i−1)n2 + j, :)-th row of the output. The seed relationship is encoded this way and given as input.
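The Flatten(·) reshape described above can be sketched in NumPy (an illustrative sketch, not the authors' code; note the slide uses 1-based indexing, while NumPy is 0-based):

```python
import numpy as np

# Sketch of Flatten(.): an n1 x n2 x d tensor becomes an (n1*n2) x d matrix.
# Entry (i, j, :) of the input maps to row i*n2 + j of the output
# (0-indexed here; the slide's (i-1)*n2 + j uses 1-based indices).
def flatten(x):
    n1, n2, d = x.shape
    return x.reshape(n1 * n2, d)

x = np.arange(2 * 3 * 4).reshape(2, 3, 4)   # n1=2, n2=3, d=4
y = flatten(x)                               # shape (6, 4)
assert np.array_equal(y[1 * 3 + 2], x[1, 2]) # row i*n2 + j == entry (i, j, :)
```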
Architecture Convolution Module Taking the seed encoding vector s1 as input, the first layer counts the 1-hop witnesses for each node pair. Likewise, the l-th layer of SeedGNN computes l-hop witness-like information h_l.
Architecture Convolution Module For computation, the update function ϕ_l is implemented as a K-layer neural network; each of its K layers is a learned transformation of the previous layer's output.
Architecture Percolation Module The percolation module matches high-confidence node pairs at each layer and propagates the matched entries, which correspond to true (resp. fake) pairs, to later layers. A similarity matrix is obtained in the l-th layer by mapping the node-pair representations m_l to a 1-dimensional vector.
Architecture Convolution Module The final matching result is obtained by applying the Hungarian algorithm to the similarity matrix.
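Turning a similarity matrix into a hard one-to-one matching with the Hungarian algorithm can be sketched with SciPy (an illustrative sketch; the similarity values below are made up):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Example similarity matrix between nodes of G1 (rows) and G2 (columns).
sim = np.array([[0.9, 0.1, 0.0],
                [0.2, 0.8, 0.1],
                [0.0, 0.3, 0.7]])

# Hungarian algorithm: choose one entry per row/column maximizing total similarity.
rows, cols = linear_sum_assignment(sim, maximize=True)
# rows[k] in G1 is matched to cols[k] in G2; here node i maps to node i.
```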
Architecture Convolution Module The witness-like information and new seeds are computed by each layer, and a loss function is defined over the layers' matching outputs.
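One plausible form of such a loss, sketched here as an assumption (binary cross-entropy between each layer's similarity matrix and the ground-truth correspondence, summed over layers; see the paper for the exact definition):

```python
import numpy as np

# Hedged sketch of a per-layer matching loss (assumed form, not the paper's
# exact formula): binary cross-entropy between a layer's similarity matrix
# and the ground-truth correspondence matrix.
def layer_loss(sim, truth, eps=1e-9):
    # sim, truth: n1 x n2 arrays; truth[i, j] = 1 iff pi(i) = j.
    sim = np.clip(sim, eps, 1 - eps)
    return -np.mean(truth * np.log(sim) + (1 - truth) * np.log(1 - sim))

truth = np.eye(2)  # ground truth: node i in G1 <-> node i in G2
per_layer_sims = [np.array([[0.8, 0.2], [0.3, 0.7]]),   # layer 1 (made-up values)
                  np.array([[0.95, 0.05], [0.1, 0.9]])] # layer 2
total = sum(layer_loss(s, truth) for s in per_layer_sims)  # summed over layers
```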
Experimental results Performance comparison Observation: the iterative 2-hop algorithm performs best on sparse graphs (p = 0.01). SeedGNN is capable of automatically choosing the right features to match different types of graphs.
Experiments Comparison of GNN methods on the SHREC'16 dataset. Provided with only a very small fraction of seeds (θ = 0.01), SeedGNN significantly outperforms the semi-supervised methods.
Discussion and Conclusion Problem: graph matching Findings: the convolution module aggregates neighborhood information around each node pair; the percolation module filters out noisy information; combining these two modules allows adaptive feature selection. Limitations: SeedGNN only uses topological structure (no node or edge attributes). Compared to non-learning methods, SeedGNN requires an additional training process and has higher computational complexity.