• A biological neural network is a massively large and complex parallel computing network.
• It is because of this massive parallel computing network that the nervous system helps human beings to perform actions or take decisions at a speed, and with such ease, that the fastest supercomputer of the world would be afraid of. For example:
✓ Superb flying catches taken by the fielders in the cricket ground.
✓ Swimming in the pool.
The fascinating capability of the biological neural network has inspired the
inception of the artificial neural network (ANN).
An ANN is made of artificial neurons and is a machine designed to model the
functioning of the nervous system.
The biological form of the neuron is replicated in an electronic or digital form
of neuron.
[Figure: Cytoarchitectural map of the cerebral cortex]
• Dendrites: They have irregular surfaces and receive signals from neighbouring neurons.
• Soma: It is the main body of the neuron, which accumulates the signals coming from the different dendrites. It fires when a sufficient amount of signal is accumulated.
• Axon: It has a smoother surface, fewer branches, and greater length. It is the last part of the neuron; once the neuron fires, it receives the signal from the soma and passes it on to the neighbouring neurons through the axon terminals.
Structural Organization of Levels in the Brain (from the highest level down):
Central Nervous System
Interregional circuits
Local circuits
Neurons
Dendritic trees
Neural microcircuits
Synapses
Molecules
A neural microcircuit refers to an assembly
of synapses organized into patterns of
connectivity to produce a functional
operation of interest.
• An artificial neural network (ANN) is a machine learning approach that models the human brain and consists of a number of artificial neurons.
• The brain is a highly complex, non-linear, and parallel computer.
• Neurons in ANNs tend to have fewer connections than biological neurons.
• Each neuron in an ANN receives a number of inputs.
• An activation function is applied to these inputs, which results in the activation level of the neuron (the output value of the neuron).
• Knowledge about the learning task is given in the form of examples called training examples.
Plasticity permits the developing nervous system to adapt to its surrounding environment.
In its most general form, a neural network is a machine that is designed
to model the way in which the brain performs a particular task or function
of interest; the network is usually implemented by using electronic
components or is simulated in software on a digital computer.
A neural network is a massively parallel distributed processor made up of
simple processing units, which has a natural propensity for storing
experiential knowledge and making it available for use.
It resembles the brain in two respects:
(a) Knowledge is acquired by the network from its environment through a
learning process.
(b) Interneuron connection strengths, known as synaptic weights, are used to
store the acquired knowledge.
Non-Linear Model of a Neuron
[Figure: inputs x_1, x_2, …, x_m are multiplied by synaptic weights w_k1, w_k2, …, w_km and combined at a summing junction; together with the bias b_k this produces the induced field v_k, to which the activation function φ(·) is applied to give the output y_k.]
The model is described by v_k = Σ_{j=1}^{m} w_kj x_j + b_k and y_k = φ(v_k).
Affine Transformation produced by the presence of a bias: the bias b_k applies an affine transformation to the linear combiner output u_k, giving the induced field v_k = u_k + b_k.
Another Non-Linear Model of a Neuron: equivalently, the bias is treated as a synaptic weight w_k0 = b_k driven by a fixed input x_0 = +1.
Neuron Model
• The choice of activation function determines the neuron model.
Examples:
➢ Threshold function:
  φ(v) = 1 if v ≥ 0
  φ(v) = 0 if v < 0
➢ Piecewise Linear Function
• The piecewise linear function is viewed as an approximation to a non-linear amplifier: φ(v) = 1 if v ≥ +1/2, φ(v) = v + 1/2 if +1/2 > v > −1/2, and φ(v) = 0 if v ≤ −1/2.
• It reduces to a threshold function if the amplification factor of the linear region is made infinitely large.
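As an illustrative sketch (not code from the slides), the two activation functions above can be written in Python; centring the linear region at v = 0 and adding an `amplification` parameter for the slope are assumptions made for the demo:

```python
# Sketch of the two activation functions discussed above (illustrative only).

def threshold(v):
    """Threshold (Heaviside) function: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def piecewise_linear(v, amplification=1.0):
    """Piecewise linear function: linear with slope `amplification`
    around v = 0, clipped to the range [0, 1]."""
    out = amplification * v + 0.5  # linear region centred at v = 0
    return max(0.0, min(1.0, out))

print(threshold(0.3))                            # 1
print(threshold(-0.1))                           # 0
print(piecewise_linear(0.0))                     # 0.5, inside the linear region
print(piecewise_linear(100.0))                   # 1.0, saturated
print(piecewise_linear(0.001, amplification=1e6))  # 1.0: a huge slope makes it act like the threshold
```

A very large `amplification` saturates the function for almost any non-zero input, which is the sense in which the piecewise linear function reduces to the threshold function.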
NETWORK ARCHITECTURE
There are three different classes of network architectures:
✓Single-layer feed-forward network
✓Multi-layer feed-forward network
✓Recurrent network
The manner in which the neurons of a neural network are structured is intimately
linked with the learning algorithm used to train the network.
Single Layer Feed-Forward Neural Network
[Figure: Feedforward network with a single layer of neurons — an input layer of source nodes projecting onto an output layer of neurons.]
In a layered neural network the neurons are organized in the form of layers. In the simplest form of a
layered network, we have an input layer of source nodes that projects onto an output layer of neurons,
but not vice versa. This is a feedforward or acyclic network.
Multi Layer Feed-Forward Neural Network
• The MFFNN is a more general network architecture, where there are hidden
layers between the input and output layers.
• Hidden nodes do not directly receive inputs from, nor send outputs to, the
external environment.
• MFFNNs overcome the limitation of single-layer NNs.
• They can handle non-linearly separable learning tasks.
[Figure: Feedforward network with one hidden layer and one output layer — signals flow from the input layer, through the hidden layer, to the output layer.]
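As a sketch of why hidden layers matter, the following illustrative Python snippet (not from the slides; the weights are hand-picked, not learned) shows a one-hidden-layer feedforward network computing XOR, a non-linearly separable task a single-layer network cannot solve:

```python
# Minimal sketch: a feedforward network with one hidden layer computing XOR.
# Weights are hand-picked for illustration, not obtained by training.

def step(v):
    return 1 if v >= 0 else 0

def forward(x, layers):
    """Propagate input x through a list of layers.
    Each layer is a list of (weights, bias) tuples, one per neuron."""
    for layer in layers:
        x = [step(sum(w * xi for w, xi in zip(weights, x)) + bias)
             for weights, bias in layer]
    return x

# Hidden layer: an OR neuron and an AND neuron; output neuron: OR AND NOT AND = XOR.
xor_net = [
    [([1, 1], -0.5), ([1, 1], -1.5)],   # hidden layer
    [([1, -1], -0.5)],                  # output layer
]

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, forward(x, xor_net))  # prints the XOR of the two inputs
```

The hidden layer re-represents the inputs (as OR and AND features) so that the final layer faces a linearly separable problem.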
Deep Learning
• In a multi-layer neural network, as we keep increasing the number of
hidden layers, the computation becomes very expensive.
• Going beyond two to three layers becomes quite difficult computationally.
Such intense computation is handled by graphics processing units (GPUs).
• When the number of layers is at most two to three, it is called a shallow
neural network.
• When the number of layers increases to more than three, it is termed a
deep neural network.
Recurrent Neural Network
• A recurrent neural network distinguishes itself from a feedforward neural network in that it has at
least one feedback loop.
[Figures: a recurrent neural network with no hidden neurons, and a recurrent neural network with hidden neurons.]
McCulloch-Pitts Model of a Neuron
• The McCulloch-Pitts neuron model is one of the earliest ANN models; it has only two
types of inputs: excitatory and inhibitory.
• The excitatory inputs have weights of positive magnitude and the inhibitory
inputs have weights of negative magnitude.
• The inputs of the McCulloch-Pitts model can be either 0 or 1.
• It has a threshold function as its activation function, and the output is 1 if the input
is greater than or equal to a given threshold, else 0.
• The McCulloch-Pitts neuron model can be used to design logical operations. For
that purpose, the connection weights need to be correctly decided along with
the threshold function.
Situation | X1 | X2 | Y_sum | Y_out
    1     |  0 |  0 |   0   |   0
    2     |  0 |  1 |   1   |   1
    3     |  1 |  0 |   1   |   1
    4     |  1 |  1 |   2   |   1

[Figure: McCulloch-Pitts neuron with inputs X1 and X2, each with weight 1, producing Y_sum and the output Y_out.]

Y_sum = Σ_{i=1}^{2} x_i w_i
Y_out = f(Y_sum) = 1 if Y_sum ≥ 1, and 0 if Y_sum < 1.
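The truth table above (an OR gate with both weights equal to 1 and threshold 1) can be sketched in Python; this is an illustrative implementation, not code from the slides:

```python
# Sketch of a McCulloch-Pitts neuron implementing the OR truth table above:
# binary inputs, unit excitatory weights, threshold theta = 1.

def mcculloch_pitts(inputs, weights, theta):
    """Fire (output 1) iff the weighted sum reaches the threshold."""
    y_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_sum >= theta else 0

# OR gate: weights w1 = w2 = 1, threshold 1 (matches the table's four situations).
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, mcculloch_pitts((x1, x2), (1, 1), theta=1))  # 0, 1, 1, 1
```

Other gates follow by changing the parameters, e.g. AND uses the same weights with theta = 2.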
A perceptron neuron consists of a linear combiner followed by a hard limiter (signum
activation function).
The decision boundary, a hyperplane, is defined by Σ_{i=1}^{m} w_i x_i + b = 0.
For the perceptron to function properly, the two
classes C1 and C2 must be linearly separable.
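As an illustrative sketch (not from the slides), the perceptron's error-correction training can be run on a linearly separable task such as AND; the learning rate and epoch count are assumptions made for the demo:

```python
# Sketch of the perceptron: a linear combiner followed by a hard limiter,
# trained with the error-correction rule on a linearly separable task (AND).

def predict(x, w, b):
    """Hard limiter applied to the linear combiner output."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

def train_perceptron(samples, lr=0.1, epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(x, w, b)
            # Error-correction rule: w <- w + lr * e * x, b <- b + lr * e
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]  # AND is linearly separable
w, b = train_perceptron(data)
print([predict(x, w, b) for x, _ in data])  # [0, 0, 0, 1]
```

Because the classes are linearly separable, the perceptron convergence theorem guarantees this procedure finds a separating hyperplane in a finite number of updates; on XOR it would never converge.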
Major Aspects in ANN
•The number of layers in the network
•The direction of signal flow
•The number of nodes in each layer
•The value of weights attached with each interconnection between neurons
MULTI LAYER PERCEPTRON NEURAL NETWORK
• The multi-layer perceptron neural network is an important class of feedforward neural network that consists
of an input layer, hidden layers, and an output layer.
• The input signal propagates through the network in a forward direction, on a layer-by-layer basis; such a
network is referred to as a multi-layer perceptron neural network.
•It is a generalization of the single layer perceptron.
• Multi-layer perceptrons have been successfully applied to solve difficult and diverse problems by training them
in a supervised manner with a highly popular algorithm known as the error back-propagation (BP) algorithm.
• The BP algorithm is based on the error-correction learning rule and may be viewed as a generalization of the
least-mean-square (LMS) algorithm.
• The BP algorithm consists of two phases: a forward pass and a backward pass.
• In the forward pass the weights are fixed; in the backward pass the weights are adjusted in accordance
with an error-correction rule.
• The error signal is propagated backward through the network, against the direction of the synaptic
connections, hence the name error back-propagation algorithm.
• The synaptic weights are adjusted to make the actual response of the network move closer to the desired
response in a statistical sense.
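The two phases can be sketched in Python on a tiny sigmoid network; this is an illustrative toy (the 2-2-1 network size, the fixed starting weights, and the learning rate are all assumptions), showing one forward pass with weights fixed and one error-correcting backward pass:

```python
import math

# Minimal sketch of the two BP phases on a 2-input, 2-hidden, 1-output
# sigmoid network: one forward pass, then one backward (weight-update) pass.

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def forward(x, w_h, b_h, w_o, b_o):
    """Forward pass: weights are fixed; activations are computed layer by layer."""
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w_h, b_h)]
    y = sigmoid(sum(w * hi for w, hi in zip(w_o, h)) + b_o)
    return h, y

def backward_step(x, target, w_h, b_h, w_o, b_o, lr=0.5):
    """Backward pass: propagate the error signal backward through the network
    and adjust the weights by gradient descent on the squared error."""
    h, y = forward(x, w_h, b_h, w_o, b_o)
    delta_o = (y - target) * y * (1 - y)        # output-layer local gradient
    delta_h = [delta_o * w * hi * (1 - hi)      # hidden-layer local gradients
               for w, hi in zip(w_o, h)]
    new_w_o = [w - lr * delta_o * hi for w, hi in zip(w_o, h)]
    new_b_o = b_o - lr * delta_o
    new_w_h = [[w - lr * d * xi for w, xi in zip(ws, x)]
               for ws, d in zip(w_h, delta_h)]
    new_b_h = [b - lr * d for b, d in zip(b_h, delta_h)]
    return new_w_h, new_b_h, new_w_o, new_b_o

x, target = [1.0, 0.0], 1.0
w_h, b_h, w_o, b_o = [[0.1, -0.2], [0.3, 0.4]], [0.0, 0.0], [0.2, -0.1], 0.0
_, y_before = forward(x, w_h, b_h, w_o, b_o)
w_h, b_h, w_o, b_o = backward_step(x, target, w_h, b_h, w_o, b_o)
_, y_after = forward(x, w_h, b_h, w_o, b_o)
print(abs(target - y_after) < abs(target - y_before))  # True: one update shrinks the error
```

Repeating the two phases over many training examples is what moves the network's actual response toward the desired response.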
Radial Basis Function Architecture
[Figure: Radial basis function neural network — inputs x_1, x_2, …, x_m feed a hidden layer of radial basis units, which connect to the output y through weights w_11, …, w_m1.]
• It consists of one hidden layer with an RBF activation function.
• It consists of an output layer with a linear activation function.
The output is y = w_11 φ(‖x − t_1‖) + … + w_m1 φ(‖x − t_m‖), where ‖x − t‖ is the distance of x = (x_1, …, x_m) from the center t.
● Here we require weights w_i from the hidden layer to the output layer
only.
● The weights w_i can be determined with the help of any of the standard
iterative methods described earlier for neural networks.
● However, since the approximating function given below is linear w.r.t.
w_i, it can be directly calculated using the matrix methods of linear
least squares, without having to explicitly determine w_i iteratively.
● It should be noted that the approximating function f(X) is differentiable
with respect to w_i:
  Y = f(X) = Σ_{i=1}^{N} w_i φ(‖X − t_i‖)
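The direct (non-iterative) computation of the weights can be sketched in Python; this toy example (a Gaussian basis function and centers placed at the two training inputs are assumptions made for the demo) reduces the least-squares problem to an exact 2×2 solve:

```python
import math

# Sketch of the point above: with fixed centers t_i, the RBF output is linear
# in the weights w_i, so they can be solved for directly instead of iteratively.
# Here the centers coincide with the 2 training inputs, giving a square linear
# system (an exact-interpolation special case of linear least squares).

def gaussian(r, width=1.0):
    """Gaussian radial basis function of the distance r."""
    return math.exp(-(r / width) ** 2)

centers = [0.0, 2.0]
samples = [(0.0, 1.0), (2.0, -1.0)]   # (input x, target y) pairs

# Design matrix Phi[j][i] = phi(|x_j - t_i|)
phi = [[gaussian(abs(x - t)) for t in centers] for x, _ in samples]
targets = [y for _, y in samples]

# Solve the 2x2 system Phi w = targets directly (Cramer's rule).
det = phi[0][0] * phi[1][1] - phi[0][1] * phi[1][0]
w = [(targets[0] * phi[1][1] - phi[0][1] * targets[1]) / det,
     (phi[0][0] * targets[1] - targets[0] * phi[1][0]) / det]

def rbf(x):
    """Linear output layer over the radial basis activations."""
    return sum(wi * gaussian(abs(x - t)) for wi, t in zip(w, centers))

print(rbf(0.0), rbf(2.0))  # reproduces the targets 1.0 and -1.0 (up to rounding)
```

With more training points than centers the system is overdetermined, and the same linearity lets the weights be found by ordinary linear least squares.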
Comparison of RBFNN and FFNN
RBF NN                                                   | FF NN
---------------------------------------------------------|---------------------------------------------------------
Non-linear layered feed-forward network.                 | Non-linear layered feed-forward network.
The hidden layer is non-linear; the output layer         | The hidden and output layers are usually non-linear.
is linear.                                               |
One single hidden layer.                                 | May have more hidden layers.
The neuron model of the hidden neurons is different      | The hidden and output neurons share a common
from that of the output nodes.                           | neuron model.
The activation function of each hidden neuron computes   | The activation function of each hidden neuron computes
the Euclidean distance between the input vector and      | the inner product of the input vector and the synaptic
the center of that unit.                                 | weight vector of that neuron.