[NS][Lab_Seminar_240722]Face Clustering via Graph Convolutional Networks with Confidence Edges.pptx


About This Presentation

Face Clustering via Graph Convolutional Networks with Confidence Edges


Slide Content

Face Clustering via Graph Convolutional Networks with Confidence Edges. Tien-Bach-Thanh Do, Network Science Lab, Dept. of Artificial Intelligence, The Catholic University of Korea. E-mail: osfa19730@catholic.ac.kr. 2024/07/22. Yang Wu et al., ICCV 2023.

Introduction: Previous work
Face clustering methods are divided into 2 categories:
Unsupervised: K-Means [14], DBSCAN [4], FaceMap [35]. Limitation: they perform poorly due to limited capacity, and the original features are not clustering-oriented.
Supervised: mainly based on GCNs [11], which build face graphs by treating images as nodes and linking them based on their deep features extracted from a CNN. Recent works build the graphs with kNN, where each node is connected to its top-k neighbors. Limitation: they do not consider the association between labels and similarities, so images of different identities may still have high similarity; the graphs contain many undesirable edges whose endpoints have incompatible labels and similarities, which introduces harmful information.
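As a reference point for the kNN construction mentioned above, here is a minimal sketch (not the paper's code; the function name and k value are illustrative) of building a face graph where nodes are images and each node is linked to its top-k neighbors by cosine similarity of CNN features:

```python
import numpy as np

def build_knn_graph(features: np.ndarray, k: int = 10):
    """Return (num_nodes, k) arrays of neighbor indices and similarities."""
    # L2-normalize so that dot products equal cosine similarities
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ feats.T                      # pairwise cosine similarity
    np.fill_diagonal(sims, -np.inf)             # exclude self-loops
    nbr_idx = np.argsort(-sims, axis=1)[:, :k]  # top-k neighbors per node
    nbr_sim = np.take_along_axis(sims, nbr_idx, axis=1)
    return nbr_idx, nbr_sim

# Example: 1000 random 256-d "face features"
idx, sim = build_knn_graph(np.random.randn(1000, 256), k=10)
```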

Introduction: Solution
Combine labels with similarities to adjust the face graphs, and propose confidence edges to denote highly consistent edges.
Given G = (V, E), the adjacency matrix A = (a_ij), and a similarity threshold: H is the set of high-similarity edges and L = E \ H the set of low-similarity edges; S is the set of edges whose endpoints have the same label and D is the set of the others.
This yields 4 edge types: HS, LS, HD, LD, where HS and LD form the confidence edges (labels and similarities agree).
Local information fusion is proposed to mine the local information of the nodes and provide a more precise similarity metric.
Unsupervised neighbor determination adjusts the face graphs and increases the ratio of confidence edges.
A GCN takes the graphs as input to increase the similarity between positive pairs and outputs features beneficial for clustering.
The proposed confidence-GCN gives a higher level of attention to non-confidence edges, narrowing the gap between labels and similarities.
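A small sketch of the edge typing described above, splitting edges into HS/LS/HD/LD and collecting HS and LD as confidence edges. The threshold tau and variable names are illustrative assumptions, not values from the paper:

```python
import numpy as np

def classify_edges(edges, sims, labels, tau=0.7):
    """edges: list of (i, j); sims: similarity per edge; labels: node labels."""
    types = {}
    for (i, j), s in zip(edges, sims):
        high = s >= tau                   # H if above threshold, else L
        same = labels[i] == labels[j]     # S if endpoints share a label, else D
        types[(i, j)] = ("H" if high else "L") + ("S" if same else "D")
    # HS and LD edges are consistent between similarity and labels -> confidence edges
    confidence = [e for e, t in types.items() if t in ("HS", "LD")]
    return types, confidence

# Example on a toy graph of 4 nodes
types, conf = classify_edges([(0, 1), (0, 2), (1, 3)],
                             np.array([0.9, 0.3, 0.8]),
                             np.array([0, 0, 1, 2]))
```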

Method Outline

Method: Local Information Fusion (LIF)
The idea comes from [27], which uses the Jaccard index to improve cosine similarity. Based on that, LIF is proposed to derive a more discriminative similarity metric. For v_i ∊ V, N_i is the k-nearest-neighbor sequence of v_i, sorted in descending order of cosine similarity.

Method: Local Information Fusion (LIF)
(The slide's formulas fuse the cosine and Jaccard similarities; d denotes the similarity differences.)
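The exact LIF formula is not reproduced in the slide text, so the sketch below is only an assumption in the spirit of [27]: Jaccard similarity over kNN neighbor sets is blended with cosine similarity, with a hypothetical weight lam:

```python
import numpy as np

def jaccard_from_knn(nbr_idx):
    """Jaccard similarity between nodes based on overlap of their kNN sets."""
    sets = [set(row) for row in nbr_idx]
    n = len(sets)
    jac = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            union = len(sets[i] | sets[j])
            jac[i, j] = len(sets[i] & sets[j]) / union if union else 0.0
    return jac

def lif_similarity(cos_sim, nbr_idx, lam=0.5):
    """Assumed fusion: weighted sum of cosine and kNN-Jaccard similarity."""
    return (1 - lam) * cos_sim + lam * jaccard_from_knn(nbr_idx)
```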

Method: Unsupervised Neighbor Determination (UND)
Because each node has different requirements, a graph built from a fixed k or a fixed threshold is unsatisfactory. UND is proposed: for any image, the region with the fastest decrease in similarity is an ideal neighborhood boundary. All nodes after the boundary are less similar to v_i in both feature and structure, so cutting there removes numerous non-confidence edges.
Let S_i be the corresponding similarity sequence. Information entropy increases as the edges are linked one by one starting from v_i1; each linked edge contributes to the entropy, where p_ij is the transition probability from v_i to v_ij. The boundary is drawn where the amount of added information drops sharply.

Method: Unsupervised Neighbor Determination (UND)
The Z-score of the first-order difference sequence is adopted to represent the falling speed. To avoid the influence of abnormal points, the mean value of an interval is used to approximate the Z-score of its midpoint.
(The slide defines: the first-order difference sequence; the Z-score of v_ij relative to the tail; the mean and standard deviation; the length of the interval; and the boundary detection rule.)
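A hedged sketch of the UND idea as described on these two slides: neighbors are added in similarity order, the entropy gain of each new edge is tracked (here p_ij is taken as the normalized similarity, an assumption), and the boundary is placed where the first-order difference drops sharply according to a Z-score against the remaining interval. The window size, threshold, and exact entropy/Z-score forms are illustrative, not the paper's equations:

```python
import numpy as np

def und_boundary(sim_seq, window=5, z_thresh=2.0):
    """sim_seq: similarities of v_i's neighbors in descending order (assumed nonnegative)."""
    p = np.asarray(sim_seq, dtype=float)
    p = p / p.sum()                              # assumed transition probabilities
    gains = -p * np.log(p + 1e-12)               # per-edge entropy contribution
    diffs = np.diff(gains)                       # first-order difference sequence
    for t in range(len(diffs) - window):
        tail = diffs[t + 1 : t + 1 + window]     # interval after the candidate point
        mu, sigma = tail.mean(), tail.std() + 1e-12
        z = (diffs[t] - mu) / sigma              # Z-score relative to the tail
        if abs(z) > z_thresh:                    # sharp drop -> neighborhood boundary
            return t + 1                         # keep neighbors up to this index
    return len(sim_seq)                          # no sharp drop: keep all neighbors
```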

Method: The Confidence-GCN Model (con-GCN)
Let A and F be the adjacency and feature matrices of G. The proposed con-GCN consists of 2 GCN layers. A new loss function guided by confidence edges is proposed, based on variants of the Hinge loss: for each edge type there is a set of edges and a corresponding loss term defined on the cosine similarity between the enhanced output features.
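A rough PyTorch sketch of this idea, not the authors' implementation: a 2-layer GCN produces enhanced features, and a hinge-style pairwise loss pushes cosine similarity up for same-label edges and down for different-label edges; the margin and the per-edge weights (which would be set higher for non-confidence edges) are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConGCN(nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, out_dim)

    def forward(self, A, X):
        # A: row-normalized adjacency (N, N), X: node features (N, in_dim)
        h = F.relu(self.w1(A @ X))
        return self.w2(A @ h)

def edge_hinge_loss(feats, edges, same_label, weights, margin=0.4):
    """Hinge-style loss on cosine similarity of edge endpoints (assumed form)."""
    z = F.normalize(feats, dim=1)
    i, j = edges[:, 0], edges[:, 1]
    cos = (z[i] * z[j]).sum(dim=1)
    pos = F.relu(margin - cos)        # same-label edges should exceed the margin
    neg = F.relu(cos - (1 - margin))  # different-label edges should stay below it
    per_edge = torch.where(same_label, pos, neg)
    return (weights * per_edge).mean()
```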

Method: The Confidence-GCN Model (con-GCN)
During training, the loss ensures that nodes with the same labels obtain higher cosine similarity. During inference, the trained GCN model produces output features for all nodes, and Infomap [19] is applied to cluster them.

Experiments: Datasets, Metrics
3 datasets:
MS-Celeb-1M: large-scale face dataset with 5.8M images of 86K identities
MSMT17: person re-identification dataset with 4,101 classes and 126,441 images
DeepFashion: clothes dataset; train set of 25,752 images over 3,997 classes, test set of 26,960 images over 3,984 categories
Metrics: Pairwise F-score (FP), BCubed F-score (FB), Confidence ratio (CR), Average similarity ratio (ASR)
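For reference, a small sketch of the Pairwise F-score (FP) listed above, using its standard definition as precision/recall over all same-cluster versus same-identity pairs; the helper name is illustrative:

```python
from itertools import combinations

def pairwise_f_score(pred_labels, gt_labels):
    tp = fp = fn = 0
    for a, b in combinations(range(len(gt_labels)), 2):
        same_pred = pred_labels[a] == pred_labels[b]
        same_gt = gt_labels[a] == gt_labels[b]
        tp += same_pred and same_gt
        fp += same_pred and not same_gt
        fn += same_gt and not same_pred
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```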

Experiments Comparison

Experiments Comparison

Experiments Comparison

Experiments Ablation - Study on Graph Construction Method

Experiments Ablation - Study on confidence-GCN model

Experiments Ablation - Study on graph embedding

Conclusion
Propose a new concept, confidence edges, for selecting better graphs for clustering.
The confidence-edge-oriented graph construction method contains 2 closely related modules:
Local information fusion: mines neighborhood information and reconstructs the similarity metric
Unsupervised neighbor determination: links edges for each node using statistical information of the neighbor sequences, obtaining an informative and confident graph
Achieves SOTA on face clustering and ReID.