Densely Connected Convolutional Networks


About This Presentation

A class presentation for the Soft Computing course, June 2018


Slide Content

DENSELY CONNECTED CONVOLUTIONAL NETWORKS
Prof. Mohammad-R. Akbarzadeh-T
Ferdowsi University of Mashhad
A Presentation by:
• Hosein Mohebbi
• M.-Sajad Abavisani

CVPR 2017 BEST PAPER AWARD
Gao Huang, Cornell University (h-index: 12)
Zhuang Liu, Tsinghua University (h-index: 5)
Laurens van der Maaten, Facebook AI Research (h-index: 29)
Kilian Weinberger, Associate Professor, Cornell University (h-index: 41)


The main culprit: vanishing/exploding gradients, not overfitting. Largely addressed by normalized initialization and intermediate normalization layers.

RESNET

STOCHASTIC DEPTH
Deep network during testing, but shallower network during training.
$$H_\ell = \mathrm{ReLU}\big(b_\ell\, f_\ell(H_{\ell-1}) + H_{\ell-1}\big), \qquad b_\ell \in \{0, 1\}$$
where $b_\ell$ is a Bernoulli random variable that randomly drops the residual branch $f_\ell$ during training.
They all share a key characteristic:
They create short paths from early layers to later layers
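For concreteness, here is a minimal PyTorch sketch of a stochastic-depth residual block (not from the slides; the class name StochasticDepthBlock and the survival_prob default are illustrative):

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """Residual block whose transform f_l is randomly dropped during
    training: H_l = ReLU(b_l * f_l(H_{l-1}) + H_{l-1}), b_l ~ Bernoulli(p)."""

    def __init__(self, channels: int, survival_prob: float = 0.8):
        super().__init__()
        self.survival_prob = survival_prob
        self.f = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # b_l = 0: skip the residual branch, keep only the identity path.
            if torch.rand(()).item() >= self.survival_prob:
                return x
            return torch.relu(self.f(x) + x)
        # At test time every block is active; the branch is scaled by its
        # survival probability to match the training-time expectation.
        return torch.relu(self.survival_prob * self.f(x) + x)
```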

DENSE
An $L$-layer dense block has $\frac{L(L+1)}{2}$ direct connections: layer $\ell$ receives input from all $\ell$ preceding feature maps, so the total is $1 + 2 + \dots + L = L(L+1)/2$.

DENSE
The growth rate regulates how much new information each
layer contributes to the global state.
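A worked channel count, following the paper's notation (here $k_0$ denotes the number of channels in the block's input): if each layer produces $k$ feature maps, the $\ell$-th layer receives

$$k_0 + k \times (\ell - 1)$$

input feature maps. With $k_0 = 16$ and growth rate $k = 12$, for example, the 6th layer of a block sees $16 + 12 \times 5 = 76$ channels.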

Traditional convolutional feed-forward networks:
$$x_\ell = H_\ell(x_{\ell-1})$$
ResNets:
$$x_\ell = H_\ell(x_{\ell-1}) + x_{\ell-1}$$
DenseNets:
$$x_\ell = H_\ell([x_0, x_1, \dots, x_{\ell-1}])$$
where $[x_0, x_1, \dots, x_{\ell-1}]$ refers to the concatenation of the feature maps produced in layers $0, \dots, \ell-1$.
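A minimal PyTorch sketch of this connectivity, assuming the paper's BN-ReLU-Conv(3×3) composite function for $H_\ell$ (the class name DenseBlock is illustrative):

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense connectivity: x_l = H_l([x_0, x_1, ..., x_{l-1}]), where
    H_l is the BN-ReLU-Conv(3x3) composite function from the paper."""

    def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        for l in range(num_layers):
            # Layer l sees in_channels + l * growth_rate input channels
            # and contributes growth_rate new feature maps.
            c_in = in_channels + l * growth_rate
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(c_in),
                nn.ReLU(inplace=True),
                nn.Conv2d(c_in, growth_rate, kernel_size=3, padding=1, bias=False),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]  # x_0: the block's input
        for layer in self.layers:
            # Concatenate everything produced so far along the channel axis.
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)
```

For example, DenseBlock(in_channels=16, growth_rate=12, num_layers=6) maps a (N, 16, H, W) input to (N, 88, H, W), since 16 + 6 × 12 = 88.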


DENSENET

DENSENET WITH BOTTLENECK LAYER
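The bottleneck variant (DenseNet-B) replaces each $H_\ell$ with BN-ReLU-Conv(1×1)-BN-ReLU-Conv(3×3), where the 1×1 convolution outputs $4k$ feature maps. A minimal sketch under the same PyTorch assumptions (bottleneck_layer is an illustrative name):

```python
import torch.nn as nn

def bottleneck_layer(in_channels: int, growth_rate: int) -> nn.Sequential:
    """DenseNet-B version of H_l: BN-ReLU-Conv(1x1)-BN-ReLU-Conv(3x3).
    The 1x1 convolution first reduces the concatenated input to 4k
    feature maps, so the 3x3 convolution stays cheap no matter how
    many layers precede it in the block."""
    inter = 4 * growth_rate  # 4k, as in the paper
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(inplace=True),
        nn.Conv2d(in_channels, inter, kernel_size=1, bias=False),
        nn.BatchNorm2d(inter),
        nn.ReLU(inplace=True),
        nn.Conv2d(inter, growth_rate, kernel_size=3, padding=1, bias=False),
    )
```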

DENSENET
Compression in the transition layers
Pooling reduces feature-map sizes between blocks
Feature-map sizes match within each block (so concatenation is well-defined)
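A minimal sketch of such a transition layer with the paper's compression factor $\theta$ (DenseNet-C uses $\theta = 0.5$); the ReLU between BN and the 1×1 convolution follows common implementations rather than the slides:

```python
import torch.nn as nn

def transition_layer(in_channels: int, theta: float = 0.5) -> nn.Sequential:
    """Transition between dense blocks: a 1x1 convolution compresses the
    channel count by the factor theta (0 < theta <= 1), then 2x2 average
    pooling halves the spatial resolution for the next block."""
    out_channels = int(theta * in_channels)
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(inplace=True),
        nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
        nn.AvgPool2d(kernel_size=2, stride=2),
    )
```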

DENSE

Direct access to the loss gradients: implicit deep supervision with a single classifier
Reduces overfitting on tasks with smaller training-set sizes


ResNet connectivity vs. DenseNet connectivity: #parameters (k: growth rate)

Standard connectivity:

Remember feature visualization: feature visualization of a convolutional net trained on ImageNet, from [Zeiler & Fergus 2013].

“Collective Knowledge”

Feature reuse
Information flow from the first to the last layers of each block
Compression in the transition layers
Concentration on high-level features for the final classification

CIFAR-10

DENSENET ON IMAGENET

IMAGENET

REFERENCES
Kaiming He, et al. "Deep residual learning for image recognition." CVPR 2016.
Chen-Yu Lee, et al. "Deeply-supervised nets." AISTATS 2015.
Gao Huang, et al. "Deep networks with stochastic depth." ECCV 2016.
CS231n: Convolutional Neural Networks for Visual Recognition.
Gao Huang, Zhuang Liu, Kilian Q. Weinberger, and Laurens van der Maaten. "Densely connected convolutional networks." CVPR 2017.
Geoff Pleiss, et al. "Memory-Efficient Implementation of DenseNets." arXiv preprint arXiv:1707.06990 (2017).