Deep Learning – Recurrent Neural Network and Convolutional Neural Network
Ms. Pradnya Saval
Feedforward networks
Feedforward networks, also called deep feedforward networks or multilayer perceptrons (MLPs), are the basic deep learning models. They are called feedforward because information flows through the function being evaluated, through the intermediate computations, and finally to the output. There are no feedback connections in which outputs of the model are fed back into itself, so the outputs are independent of one another.
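A minimal sketch of such a network in PyTorch (the layer sizes and the 784-dimensional input are illustrative assumptions, not taken from the slides):

```python
import torch
import torch.nn as nn

# A small multilayer perceptron: information flows strictly forward,
# input -> hidden layers -> output, with no feedback connections.
mlp = nn.Sequential(
    nn.Linear(784, 128),  # input layer (e.g., a flattened 28x28 image)
    nn.ReLU(),
    nn.Linear(128, 64),   # hidden layer
    nn.ReLU(),
    nn.Linear(64, 10),    # output layer (e.g., 10 class scores)
)

x = torch.randn(1, 784)   # one random input vector
y = mlp(x)                # forward pass only; no state is carried between calls
print(y.shape)            # torch.Size([1, 10])
```

Because the forward pass keeps no state, two calls to mlp on different inputs produce two completely independent outputs.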
Problems with feedforward networks
Example: reading a book. We cannot predict the next word in a sentence with a feedforward network, because it has no memory of earlier inputs; each output depends only on the current input.
Recurrent Neural Networks
When feedforward neural networks are extended to include feedback connections, they are called recurrent neural networks (RNNs).
Example
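A minimal sketch of this idea in PyTorch (the sizes, and random vectors standing in for real word embeddings, are assumptions for illustration):

```python
import torch
import torch.nn as nn

# Toy next-word-style prediction: the recurrent hidden state carries
# context from earlier time steps, which a feedforward network cannot do.
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
head = nn.Linear(16, 8)            # maps the hidden state to a prediction

seq = torch.randn(1, 5, 8)         # batch of 1, a sequence of 5 "words", 8 features each
outputs, h_n = rnn(seq)            # outputs holds the hidden state at every time step
prediction = head(outputs[:, -1])  # predict the next item from the last hidden state
print(prediction.shape)            # torch.Size([1, 8])
```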
Mathematical Representation of RNN
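The equations on this slide did not survive the export; a standard formulation of the recurrence, reconstructed here, is:

```latex
h_t = \tanh(W_{hh} h_{t-1} + W_{xh} x_t + b_h) \\
y_t = W_{hy} h_t + b_y
```

Here x_t is the input at time step t, h_t the hidden state carried forward from step to step, and y_t the output; the weight matrices W_hh, W_xh, W_hy and biases b_h, b_y are shared across all time steps.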
Problems using RNN
Training an RNN with backpropagation through time is difficult: gradients propagated across many time steps tend to vanish or explode, so standard RNNs struggle to learn long-range dependencies.
Convolutional Neural Network (CNN)
A Convolutional Neural Network (ConvNet/CNN) is a deep learning algorithm which can take in an input image, assign importance (learnable weights and biases) to various aspects/objects in the image, and differentiate one from the other.
Architecture of CNN
Working of CNN
Layers:
- Convolution
- ReLU layer (activation function)
- Pooling
- Fully connected
Convolution Layer
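A minimal sketch of the operation itself in NumPy (image and kernel values are made up; strictly, this computes cross-correlation, which is what CNN libraries actually implement):

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide the kernel over the image (stride 1, no padding),
    taking the sum of elementwise products at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.random.rand(6, 6)      # toy 6x6 grayscale image
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]])   # simple vertical-edge detector
feature_map = convolve2d(image, kernel)
print(feature_map.shape)          # (4, 4)
```

Each learned kernel produces one feature map; stacking many kernels gives the convolution layer its output channels.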
Activation Functions

Name                          Formula                                       Range
Sigmoid (logistic function)   sigmoid(a) = 1 / (1 + e^(-a))                 (0, 1)
Tanh (hyperbolic tangent)     tanh(a) = (e^a - e^(-a)) / (e^a + e^(-a))     (-1, 1)
ReLU (rectified linear unit)  relu(a) = max(0, a)                           [0, ∞)
Softmax                       softmax(a)_i = e^(a_i) / Σ_j e^(a_j)          (0, 1)

(The Graph column of the original table showed a plot for each function; for softmax the curve is different every time, since it depends on the whole input vector.)
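For concreteness, a sketch of the same four functions in NumPy (a vector input is assumed for softmax):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))   # squashes any input into (0, 1)

def tanh(a):
    return np.tanh(a)                  # squashes any input into (-1, 1)

def relu(a):
    return np.maximum(0.0, a)          # zero for negative inputs, identity otherwise

def softmax(a):
    e = np.exp(a - np.max(a))          # subtracting the max avoids overflow
    return e / e.sum()                 # outputs in (0, 1), summing to 1

a = np.array([-2.0, 0.0, 3.0])
print(sigmoid(a), tanh(a), relu(a), softmax(a), sep="\n")
```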
ReLU Layer (Activation Function)
The activation function of a neuron defines the output of that neuron given a set of inputs. ReLU layers work far better in practice because the network trains a lot faster (thanks to ReLU's computational efficiency) without a significant loss in accuracy. Example: Climate.
ReLU (rectified linear unit)
The more positive a neuron's input, the more activated it is.
Pooling
Its function is to progressively reduce the spatial size of the representation, which reduces the number of parameters and the amount of computation in the network. Types: average pooling and max pooling. The pooling layer operates on each feature map independently. The most common approach is max pooling.
Max Pooling
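A minimal NumPy sketch of max pooling with a 2x2 window and stride 2 (the usual defaults; the feature-map values are illustrative):

```python
import numpy as np

def max_pool2d(feature_map, size=2, stride=2):
    # Keep only the maximum of each size x size window, halving the
    # spatial dimensions while preserving the strongest activations.
    h, w = feature_map.shape
    out_h, out_w = (h - size) // stride + 1, (w - size) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = feature_map[i*stride:i*stride+size, j*stride:j*stride+size]
            out[i, j] = window.max()
    return out

fm = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 feature map
print(max_pool2d(fm))                          # 2x2 grid of window maxima
```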
Fully Connected Layer
Fully connected layers form the last few layers in the network. Their input is the output of the final pooling or convolutional layer, which is flattened before being fed in.
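A short PyTorch sketch of that flattening step (the shapes are illustrative assumptions):

```python
import torch
import torch.nn as nn

pooled = torch.randn(1, 32, 7, 7)  # e.g., 32 feature maps of 7x7 from the final pooling layer
flat = torch.flatten(pooled, 1)    # flatten everything except the batch dim -> (1, 1568)
fc = nn.Linear(32 * 7 * 7, 10)     # fully connected layer producing 10 class scores
scores = fc(flat)
print(flat.shape, scores.shape)    # torch.Size([1, 1568]) torch.Size([1, 10])
```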
Projects of RNN and CNN
Demonstration of CNN
Flower classification using CNN. Dataset: Kaggle, https://www.kaggle.com/alxmamaev/flowers-recognition
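A minimal sketch of how such a classifier could be wired up in PyTorch (the dataset has five flower classes; the architecture, the 64x64 input size, and the dummy batch below are assumptions for illustration, not the original demonstration code):

```python
import torch
import torch.nn as nn

# Small CNN: two convolution + ReLU + max-pooling stages, then a
# flattened fully connected layer scoring the 5 flower classes.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 5),  # assumes 64x64 inputs -> 16x16 maps after two poolings
)

x = torch.randn(4, 3, 64, 64)    # dummy batch of four 64x64 RGB images
logits = model(x)
print(logits.shape)              # torch.Size([4, 5])
```

Real training would load the Kaggle images (e.g., with torchvision's ImageFolder), apply a cross-entropy loss, and optimize with SGD or Adam.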