Artificial Neural Networks

MajdDassan · 265 views · 20 slides · May 26, 2024

Slide Content

Artificial Neural Networks
Fundamentals of Computer Vision – Week 6
Assist. Prof. Özge Öztimur Karadağ
ALKÜ – Department of Computer Engineering
Alanya

Inspired from Biology
•The brain has a huge number of neurons (about 10 billion neurons and roughly 60 trillion interconnections)
•The brain is a very complex, nonlinear, parallel computer
•What is a neuron?
•Dendrites accept inputs from other neurons
•The axon transmits impulses to other neurons
•Synapses are the structures where impulses are transferred from one neuron to another, passing electrical signals between neurons

Neural Networks
•Biological neural networks
•Biological organisms
•Human and animal brains
•High complexity and parallelism
•Artificial neural networks
•Motivated by biological neural networks
•Much simpler and more primitive than biological networks
•Implemented on general-purpose digital computers or using specialized hardware

Artificial Neural Networks (ANN)
•Computing systems inspired by the biological neural networks that constitute animal brains.
•An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain.
•Each connection, like the synapses in a biological brain, can transmit a signal to other neurons.
•An ANN is a massively parallel distributed processor that is well suited to storing knowledge
•An ANN is similar to a biological NN in the following respects:
•Knowledge is acquired through a learning process
•Knowledge is encoded in the connections between neurons

ANN Properties
•Nonlinearity
•Input to output mapping (supervised learning)
•Adaptivity
•Fault tolerance
•Possible VLSI implementation
•Neurobiological analogy

Neuron Model
•Neuron model elements:
•A set of synapses, i.e. inputs with respective weights. (Notation: signal x_j at input j of neuron k has weight w_kj)
•An adder for the summation of weighted inputs. These two operations compute the weighted sum of inputs.
•A non-linear activation function that limits the output of the neuron to an interval such as [0,1]
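These two stages (weighted summation, then a limiting activation) can be sketched in a few lines of Python; the sigmoid used here is just one possible choice of activation, and the input values are made up:

```python
import math

def neuron_output(x, w, b=0.0):
    """Weighted sum of inputs followed by a sigmoid activation.

    x : list of input signals x_j
    w : list of synaptic weights w_kj
    b : optional bias term
    """
    v = sum(wj * xj for wj, xj in zip(w, x)) + b  # adder: weighted sum of inputs
    return 1.0 / (1.0 + math.exp(-v))             # activation squashes v into (0, 1)

print(neuron_output([1.0, 2.0, 3.0], [0.5, -0.25, 0.1]))
```

The bias term is not on the slide's element list but is commonly included as a weight on a fixed input of 1.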

ANN
•Activation functions:
•Threshold
•Linear

ANN
•Activation functions:
•Sigmoid
•Rectified Linear Unit (ReLU)
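The four activation functions named on these two slides can be written directly in Python (a minimal sketch; the convention that the threshold unit fires at v ≥ 0 is an assumption):

```python
import math

def threshold(v):
    # Heaviside step: fires (outputs 1) once the input crosses 0
    return 1.0 if v >= 0 else 0.0

def linear(v):
    # identity: passes the weighted sum through unchanged
    return v

def sigmoid(v):
    # smooth squashing of any real v into the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-v))

def relu(v):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise
    return max(0.0, v)
```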

ANN
•An artificial neuron receives a signal, processes it, and can signal the neurons connected to it.
•Neuron model summary:
•The "signal" at a connection is a real number, and the output of each neuron is computed by some non-linear function of the sum of its inputs.
•The connections are called edges. Neurons and edges typically have a weight that adjusts as learning proceeds.
•The weight increases or decreases the strength of the signal at a connection.
•Neurons may have a threshold such that a signal is sent only if the aggregate signal crosses that threshold.
•ANN Architecture
•Typically, neurons are aggregated into layers. Different layers may perform different transformations on their inputs. Signals travel from the first layer (the input layer) to the last layer (the output layer), possibly after traversing the layers multiple times.

ANN
•Architectures:
•Single-Layer Networks
•Has a single neuron layer (the output layer)
•The input layer does not count, since it performs no processing
•Network inputs are connected to neuron inputs
•Neuron outputs are also the network outputs
•No feedback from outputs to inputs

ANN
•Architectures:
•Multilayer Networks
•Multilayer networks have one or more hidden layers, in addition to the input and output layers
•Outputs from the n-th layer are inputs to the (n+1)-th layer
•Connectedness:
•A network is fully connected when each neuron in a layer is connected to all neurons in the next layer
•If some connections are missing, the network is partially connected
•An example: a network with one hidden layer of four neurons
•The network has four input neurons
•There are two output neurons
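A forward pass through the example network (four inputs, a hidden layer of four neurons, two outputs, fully connected) might look like this in Python; the random weights, zero biases, and sigmoid activations are illustrative assumptions:

```python
import math
import random

def dense(x, W, b):
    """One fully connected layer: every input feeds every neuron (sigmoid units)."""
    return [1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(row, x)) + bi)))
            for row, bi in zip(W, b)]

random.seed(0)
# 4 inputs -> hidden layer of 4 neurons -> 2 output neurons
W1 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(4)]
b1 = [0.0] * 4
W2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b2 = [0.0] * 2

x = [0.5, -1.0, 2.0, 0.1]          # network inputs (made up)
hidden = dense(x, W1, b1)          # outputs of the n-th layer...
output = dense(hidden, W2, b2)     # ...are inputs to the (n+1)-th layer
print(len(hidden), len(output))    # 4 2
```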

ANN
•Learning: Learning is the adaptation of the network to better handle a
task by considering sample observations.
•Learning involves adjusting the weights (and optional thresholds) of the
network to improve the accuracy of the result.
•This is done by minimizing the observed errors. Learning is complete when
examining additional observations does not usefully reduce the error rate.
•Learning rate: the size of the corrective steps that the model takes to adjust
for errors in each observation.
•A cost function is evaluated periodically during learning. As long as its output
continues to decline, learning continues.
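The role of the learning rate can be illustrated with plain gradient descent on a single linear neuron and a squared-error cost (a toy sketch; the input, target, and step count are made up):

```python
# Gradient descent on one linear neuron with squared-error cost E = (y - t)^2 / 2.
# The learning rate lr sets the size of each corrective step.
x, target = [1.0, 2.0, 3.0], 1.0
w = [0.0, 0.0, 0.0]
lr = 0.01                                   # learning rate

for _ in range(200):
    y = sum(wj * xj for wj, xj in zip(w, x))          # current prediction
    err = y - target                                  # observed error
    w = [wj - lr * err * xj for wj, xj in zip(w, x)]  # step against the gradient

print(round(sum(wj * xj for wj, xj in zip(w, x)), 3))  # close to the target 1.0
```

A larger learning rate takes bigger steps (and can overshoot); a smaller one converges more slowly.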

ANN
•Backpropagation:
•Backpropagation is a method used to adjust the connection weights to
compensate for each error found during learning. The error amount is
effectively divided among the connections. Technically, backprop calculates
the gradient (the derivative) of the cost function associated with a given state
with respect to the weights.
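A minimal sketch of backpropagation on a tiny 1-1-1 network (one input, one hidden sigmoid neuron, one sigmoid output; all values invented) shows how the chain rule carries the error gradient backwards through the connections:

```python
import math

# Squared-error cost E = (y - t)^2 / 2 on a 1-1-1 sigmoid network.
sig = lambda v: 1.0 / (1.0 + math.exp(-v))

x, t = 1.0, 0.0                 # single input and target (made up)
w1, w2, lr = 0.5, 0.5, 0.5      # initial weights and learning rate

for _ in range(1000):
    h = sig(w1 * x)                       # forward pass
    y = sig(w2 * h)
    d_y = (y - t) * y * (1 - y)           # local gradient at the output neuron
    d_h = d_y * w2 * h * (1 - h)          # error divided back to the hidden neuron
    w2 -= lr * d_y * h                    # gradient steps on each connection
    w1 -= lr * d_h * x

print(round(sig(w2 * sig(w1 * x)), 3))    # output driven towards the target 0
```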

Example
•Let’s say we have a dataset of N rows, 3 features and 1 target variable
(i.e. binary 1/0):
•Just like in every other machine learning use case, we are going to
train a model to predict the target using the features row by row. Let’s
start with the first row:

Example…
•"Training a model" means searching for the best parameters in a mathematical
formula that minimize the error of our predictions.
•E.g., in regression models (such as linear regression) you have to find the best weights
•Usually, the weights are randomly initialized and then adjusted as learning
proceeds. Here I'll just set them all to 1:
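With every weight set to 1, the neuron's weighted sum for a row is simply the sum of its features. A sketch, assuming a hypothetical first row [1.0, 2.0, 3.0] (the slide's actual numbers are not in the text):

```python
row = [1.0, 2.0, 3.0]        # hypothetical first row of the 3-feature dataset
weights = [1.0, 1.0, 1.0]    # all weights initialized to 1, as above

weighted_sum = sum(w * x for w, x in zip(weights, row))
print(weighted_sum)          # 6.0
```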

Example…
•Different from a linear model:
•The activation function defines the output of that node.

Example…
•Training: compare the output with the target, calculate the error, optimize
the weights, and reiterate the whole process again and again.
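That compare / calculate-error / optimize / reiterate loop can be sketched end to end on a made-up dataset with 3 features and a binary target, using a single sigmoid neuron for simplicity (all data values are invented for illustration):

```python
import math

# Made-up dataset: N rows, 3 features, binary 1/0 target (illustrative only).
X = [[1.0, 0.5, 0.2], [0.1, 0.9, 0.7], [0.8, 0.2, 0.1], [0.2, 0.8, 0.9]]
y = [1, 0, 1, 0]

w = [1.0, 1.0, 1.0]          # weights initialized to 1, as in the example
b, lr = 0.0, 0.5
sig = lambda v: 1.0 / (1.0 + math.exp(-v))

for _ in range(2000):                                         # reiterate...
    for xi, ti in zip(X, y):
        out = sig(sum(wj * xj for wj, xj in zip(w, xi)) + b)  # compare with target
        err = out - ti                                        # calculate the error
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]     # optimize the weights
        b -= lr * err

preds = [round(sig(sum(wj * xj for wj, xj in zip(w, xi)) + b)) for xi in X]
print(preds)    # predictions should match the targets [1, 0, 1, 0]
```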

Example…
•Training

Example
•Image Classification using SIFT features.
•Classify images in the CIFAR-10 image dataset by an ANN using SIFT features.

References
•Mauro Di Pietro, Deep Learning with Python: Neural Networks, https://towardsdatascience.com/deep-learning-with-python-neural-networks-complete-tutorial-6b53c0b06af0
•Prof. Sven Lončarić, Lecture Notes on Neural Networks.