NeuralNetworksPresentationPart1important.pdf

mogeyo5634 8 views 41 slides Sep 15, 2025
NEURAL NETWORK

Introduction
• Machine learning mimics the human form of learning.
• Human learning, and indeed every action of a human being, is controlled by the nervous system.
• The nervous system coordinates the different actions by transmitting signals to and from different parts of the body.
• The nervous system is constituted of a special type of cell called a neuron, or nerve cell, which has a special structure allowing it to receive signals from and send signals to other neurons.
• This structure essentially forms a network of neurons, or a neural network.

• The biological neural network is a massively large and complex parallel computing network.
• It is because of this massive parallel computing network that the nervous system helps human beings perform actions or take decisions at a speed, and with such ease, that the fastest supercomputers in the world cannot match. For example:
✓ Superb flying catches taken by fielders on a cricket ground.
✓ Swimming in a pool.
The fascinating capability of the biological neural network has inspired the inception of the artificial neural network (ANN).
An ANN is made of artificial neurons and is a machine designed to model the functioning of the nervous system.
The biological form of the neuron is replicated in an electronic or digital form of neuron.

Human Brain
Figure: Stimulus → Receptors → Neural Net → Effectors → Response
• The human brain may be viewed as a three-stage system. At the centre of the system is the brain, which receives information and makes appropriate decisions.
• The receptors convert stimuli from the external environment into electrical impulses that convey information to the brain.
• The effectors convert electrical impulses generated by the neural net into discernible responses as system outputs.
• The structural constituents of the brain are neurons, and they are massively connected with each other.
• A neuron is able to receive, process, and transmit information in the form of chemical and electrical signals.
• It is estimated that there are approximately 10 billion neurons in the human cortex and 60 trillion synapses or connections.
• Synapses are elementary structural and functional units that mediate the interactions between neurons.
• A synapse converts a presynaptic electrical signal into a chemical signal and then back into a postsynaptic electrical signal.

Structure of a Biological Neuron

Cytoarchitectural map of the cerebral cortex
• Dendrites: have an irregular surface and receive signals from neighbouring neurons.
• Soma: the main body of the neuron, which accumulates the signals coming from the different dendrites. It fires when a sufficient amount of signal has accumulated.
• Axon: has a smoother surface, fewer branches, and greater length. It is the last part of the neuron; once the neuron fires, it receives the signal from the soma and passes it on to neighbouring neurons through the axon terminals.

Structural Organization of Levels in the Brain:
Central Nervous System → Interregional circuits → Local circuits → Neurons → Dendritic Trees → Neural Microcircuits → Synapses → Molecules
A neural microcircuit refers to an assembly of synapses organized into patterns of connectivity to produce a functional operation of interest.

• An artificial neural network (ANN) is a machine learning approach that models the human brain and consists of a number of artificial neurons.
• The brain is a highly complex, non-linear, parallel computer.
• Neurons in ANNs tend to have fewer connections than biological neurons.
• Each neuron in an ANN receives a number of inputs.
• An activation function is applied to these inputs, which results in the activation level of the neuron (the output value of the neuron).
• Knowledge about the learning task is given in the form of examples called training examples.
Plasticity permits the developing nervous system to adapt to its surrounding environment.

In its most general form, a neural network is a machine that is designed to model the way in which the brain performs a particular task or function of interest; the network is usually implemented by using electronic components or is simulated in software on a digital computer.
A neural network is a massively parallel distributed processor made up of simple processing units, which has a natural propensity for storing experiential knowledge and making it available for use.
It resembles the brain in two respects:
(a) Knowledge is acquired by the network from its environment through a learning process.
(b) Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.

• The procedure used to perform the learning process is called a learning algorithm, whose function is to modify the synaptic weights of the network in an orderly fashion so as to attain the desired design objective.
• It is also possible for a neural network to modify its own topology, which is motivated by the fact that neurons in the human brain can die and new synaptic connections can grow.
• Neural networks are also referred to in the literature as neurocomputers, connectionist networks, and parallel distributed processors.
• A neural network derives its computing power first from its massively parallel distributed structure, and second from its ability to learn and thereafter generalize.
• Generalization refers to the neural network producing reasonable outputs for inputs not encountered during training (learning).

• An artificial neural network is specified by:
➢ a neuron model: the information processing unit of the NN,
➢ an architecture: a set of neurons and links connecting the neurons, where each link has a weight,
➢ a learning algorithm: used for training the NN by modifying the weights in order to model a particular learning task correctly on the training examples.
• The aim is to obtain a NN that is trained and generalizes well.
• It should behave correctly on new instances of the learning task.

NEURON
• The neuron is the basic information processing unit of a NN. It consists of:
1. A set of links, describing the neuron inputs, with weights W1, W2, …, Wm.
2. An adder function (linear combiner) for computing the weighted sum of the inputs (real numbers):
uk = Σ (j = 1 to m) wkj · xj    (1)
3. An activation function φ for limiting the amplitude of the neuron output. Here b denotes the bias:
yk = φ(vk), where vk = uk + bk    (2)
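Equations (1) and (2) can be sketched in a few lines of Python. This is a minimal illustration written for this text, not code from the slides; the threshold default and the example weights are our own choices.

```python
def neuron_output(x, w, b, phi=None):
    """Single artificial neuron: linear combiner u = sum(w_j * x_j),
    induced local field v = u + b, output y = phi(v)."""
    if phi is None:
        phi = lambda v: 1.0 if v >= 0 else 0.0  # threshold activation by default
    u = sum(wj * xj for wj, xj in zip(w, x))    # adder (linear combiner), Eq. (1)
    v = u + b                                   # induced local field
    return phi(v)                               # Eq. (2)

# With weights (0.5, 0.5) and bias -0.6, the neuron fires only when both inputs are 1
print(neuron_output([1, 1], [0.5, 0.5], -0.6))  # 1.0
print(neuron_output([0, 1], [0.5, 0.5], -0.6))  # 0.0
```

Note how the bias simply shifts the induced local field, implementing the affine transformation shown in the following figure.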

Non-Linear Model of a Neuron
Figure: inputs x1, x2, …, xm are scaled by synaptic weights wk1, wk2, …, wkm and combined at a summing junction, together with the bias bk, to give the induced field vk; the activation function φ(·) then produces the output yk.

Affine Transformation produced by the presence of a bias

Another Non Linear model of a Neuron

Neuron Model
• The choice of activation function determines the neuron model.
Examples:
➢ Threshold function:
φ(v) = 1 if v ≥ 0, and φ(v) = 0 if v < 0.
➢ Piecewise linear function:
• A piecewise linear function can be viewed as an approximation to a non-linear amplifier.
• It reduces to a threshold function if the amplification factor of the linear region is made infinitely large.

➢ Sigmoid function:
φ(v) = 1 / (1 + exp(−a·v))
➢ Gaussian function:
φ(v) = (1 / (√(2π)·σ)) · exp(−v² / (2σ²))
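The activation functions above can be written out directly. The following is a small sketch for illustration (the saturation points of the piecewise linear variant are our own convention):

```python
import math

def threshold(v):
    """Outputs 1 for v >= 0, else 0."""
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    """Linear (slope 1) around the origin, saturating at 0 and 1."""
    return max(0.0, min(1.0, v + 0.5))

def sigmoid(v, a=1.0):
    """phi(v) = 1 / (1 + exp(-a*v)); the parameter a controls the slope."""
    return 1.0 / (1.0 + math.exp(-a * v))

def gaussian(v, sigma=1.0):
    """phi(v) = exp(-v^2 / (2*sigma^2)) / (sqrt(2*pi)*sigma)."""
    return math.exp(-v * v / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

print(threshold(0.2), sigmoid(0.0), round(gaussian(0.0), 4))
```

As the slope parameter a grows, the sigmoid approaches the threshold function, mirroring the remark about the piecewise linear function above.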

NETWORK ARCHITECTURE
There are three different classes of network architectures:
✓Single-layer feed-forward network
✓Multi-layer feed-forward network
✓Recurrent network
The manner in which the neurons of a neural network are structured is intimately
linked with the learning algorithm used to train the network.

Single Layer Feed-Forward Neural Network
Figure: feedforward network with a single layer of neurons, consisting of an input layer of source nodes and an output layer of neurons.
In a layered neural network the neurons are organized in the form of layers. In the simplest form of a layered network, we have an input layer of source nodes that projects onto an output layer of neurons, but not vice versa. This is a feedforward, or acyclic, network.

Multi Layer Feed-Forward Neural Network
• The MFFNN is a more general network architecture, in which there are hidden layers between the input and output layers.
• Hidden nodes do not directly receive inputs from, nor send outputs to, the external environment.
• MFFNNs overcome the limitation of single-layer NNs.
• They can handle learning tasks that are not linearly separable.
Figure: feedforward network with one hidden layer and one output layer.

Deep Learning
• In a multi-layer neural network, as we keep increasing the number of hidden layers, the computation becomes very expensive.
• Going beyond two to three layers becomes quite difficult computationally. Such intense computation is handled by graphics processing units (GPUs).
• When the number of layers is at most two to three, the network is called a shallow neural network.
• When the number of layers increases to more than three, it is termed a deep neural network.

Recurrent Neural Network
• A recurrent neural network distinguishes itself from a feedforward neural network in that it has at least one feedback loop.
Figures: a recurrent neural network with no hidden neurons, and a recurrent neural network with hidden neurons.

McCulloch-Pitts Model of a Neuron
• The McCulloch-Pitts neuron model is one of the earliest ANN models; it has only two types of inputs: excitatory and inhibitory.
• Excitatory inputs have weights of positive magnitude, and inhibitory inputs have weights of negative magnitude.
• The inputs of the McCulloch-Pitts model can only be 0 or 1.
• It has a threshold function as its activation function, and the output is 1 if the weighted sum of the inputs is greater than or equal to a given threshold, else 0.
• The McCulloch-Pitts neuron model can be used to design logical operations. For that purpose, the connection weights need to be correctly decided, along with the threshold.

Example
• John carries an umbrella if it is sunny or if it is raining. There are four given situations, and we need to decide when John will carry the umbrella.
The situations are as follows:
• Situation 1: It is not raining, nor is it sunny.
• Situation 2: It is not raining, but it is sunny.
• Situation 3: It is raining, and it is not sunny.
• Situation 4: It is raining, and it is sunny.
• To analyze the situations using the McCulloch-Pitts neuron model, we can consider the input signals as follows:
• X1 = It is raining
• X2 = It is sunny

Situation | X1 | X2 | Ysum | Yout
    1     |  0 |  0 |  0   |  0
    2     |  0 |  1 |  1   |  1
    3     |  1 |  0 |  1   |  1
    4     |  1 |  1 |  2   |  1
Figure: inputs X1 and X2, each with weight 1, feed the neuron computing Ysum and Yout.
Ysum = Σ (i = 1 to 2) wi · xi
Yout = f(Ysum) = 1 if Ysum ≥ 1, and 0 if Ysum < 1

• From the truth table we can conclude that when Yout = 1, John needs to carry an umbrella.
• Hence in situations 2, 3, and 4 John needs to carry an umbrella.
• This is really an implementation of the logical OR function using the McCulloch-Pitts neuron model.
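The umbrella example can be reproduced in a few lines of Python; this is a minimal sketch of the McCulloch-Pitts neuron, not code from the slides:

```python
def mcculloch_pitts(inputs, weights, theta):
    """McCulloch-Pitts neuron: binary inputs, fixed weights; fires
    (outputs 1) when the weighted sum reaches the threshold theta."""
    y_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_sum >= theta else 0

# The umbrella example: w1 = w2 = 1 and theta = 1 implement logical OR
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", mcculloch_pitts((x1, x2), (1, 1), 1))
```

Raising the threshold to 2 with the same weights turns the same neuron into a logical AND, which illustrates how the choice of weights and threshold determines the logical operation.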

Perceptron
• The perceptron is the simplest form of a neural network, used for the classification of patterns that are linearly separable.
• It consists of a single neuron with adjustable weights and bias.
• It is built around the McCulloch-Pitts model of a neuron.
• The perceptron algorithm is used to adjust the free parameters of this neural network.
• Rosenblatt proved that if the patterns are drawn from two linearly separable classes, then the perceptron algorithm converges and positions the decision surface in the form of a hyperplane between the two classes.
• It is limited to performing pattern classification with only two classes.

A neuron consists of a linear combiner followed by a hard limiter (signum
activation function).

The decision boundary, a hyperplane, is defined by Σ (i = 1 to m) wi · xi + b = 0. For the perceptron to function properly, the two classes C1 and C2 must be linearly separable.
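Rosenblatt's error-correction rule can be sketched as follows. This is an illustrative implementation written for this text; the learning rate, epoch limit, and the AND-gate training data are our own choices (AND, like OR, is linearly separable, so convergence is guaranteed):

```python
def perceptron_train(samples, lr=0.1, epochs=100):
    """Perceptron learning rule: for each misclassified sample,
    adjust w <- w + lr*(d - y)*x and b <- b + lr*(d - y)."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        errors = 0
        for x, d in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            e = d - y
            if e != 0:
                errors += 1
                w = [wi + lr * e * xi for wi, xi in zip(w, x)]
                b += lr * e
        if errors == 0:  # converged: the hyperplane separates the two classes
            break
    return w, b

def perceptron_predict(x, w, b):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

# AND is linearly separable, so training converges
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(samples)
print([perceptron_predict(x, w, b) for x, _ in samples])  # [0, 0, 0, 1]
```

The learned w and b define exactly the hyperplane Σ wi·xi + b = 0 mentioned above.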

Major Aspects in ANN
•The number of layers in the network
•The direction of signal flow
•The number of nodes in each layer
• The values of the weights attached to each interconnection between neurons

MULTI LAYER PERCEPTRON NEURAL NETWORK
• The multi-layer perceptron neural network is an important class of feed-forward neural network that consists of an input layer, hidden layers, and an output layer.
• The input signal propagates through the network in a forward direction on a layer-by-layer basis, which is why it is referred to as a multi-layer perceptron neural network.
• It is a generalization of the single-layer perceptron.
• Multi-layer perceptrons have been successfully applied to solve difficult and diverse problems by training them in a supervised manner with a highly popular algorithm known as the error back-propagation (BP) algorithm.
• The BP algorithm is based on the error-correction learning rule and may be viewed as a generalization of the least-mean-square algorithm.
• The BP algorithm consists of two phases: a forward pass and a backward pass.
• In the forward pass the weights are fixed, and in the backward pass the weights are adjusted in accordance with an error-correction rule.
• The error signal is propagated backward through the network, against the direction of the synaptic connections, hence the name error back-propagation algorithm.
• The synaptic weights are adjusted to make the actual response of the network move closer to the desired response in a statistical sense.
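The two phases of the BP algorithm can be sketched for a tiny network. This is a minimal illustration written for this text, not code from the slides: one hidden layer, sigmoid activations everywhere, and the XOR task as our own example (XOR is not linearly separable, so a hidden layer is essential); the learning rate, seed, and epoch count are arbitrary choices.

```python
import math, random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def train_bp(data, hidden=2, epochs=2000, lr=0.5, seed=1):
    """Error back-propagation for a 2-input, one-hidden-layer, one-output
    network. Returns the trained predictor and the per-epoch squared error."""
    rnd = random.Random(seed)
    w1 = [[rnd.uniform(-1, 1) for _ in range(2)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rnd.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    errs = []
    for _ in range(epochs):
        sq = 0.0
        for x, d in data:
            # forward pass: weights fixed, signal flows layer by layer
            h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
                 for j in range(hidden)]
            y = sigmoid(sum(w2[j] * h[j] for j in range(hidden)) + b2)
            sq += (d - y) ** 2
            # backward pass: error signal propagated against the connections
            do = (d - y) * y * (1 - y)                     # output local gradient
            dh = [do * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):
                w2[j] += lr * do * h[j]
                for i in range(2):
                    w1[j][i] += lr * dh[j] * x[i]
                b1[j] += lr * dh[j]
            b2 += lr * do
        errs.append(sq)

    def predict(x):
        h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
             for j in range(hidden)]
        return sigmoid(sum(w2[j] * h[j] for j in range(hidden)) + b2)

    return predict, errs

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
predict, errs = train_bp(data)
print(round(errs[0], 3), "->", round(errs[-1], 3))
```

The weight updates use the local gradients do and dh, which is the error-correction rule described above applied layer by layer.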

Back Propagation Algorithm

Radial Basis Function Neural Network
• A function is said to be a radial basis function (RBF) if its output depends on the distance of the input from a given stored vector.
➢ The RBF neural network is a three-layered feed-forward network.
➢ In such RBF networks, the hidden layer uses neurons with RBFs as activation functions.
➢ The outputs of all these hidden neurons are combined linearly at the output node.
• It has a faster learning speed and requires fewer iterations compared to an MLP trained with the back-propagation rule using the sigmoid activation function.
• These networks have a wide variety of applications, such as:
➢ function approximation,
➢ time-series prediction,
➢ control and regression,
➢ pattern classification tasks performing complex (non-linear) operations.

Radial Basis Function Architecture
Figure: radial basis function neural network with inputs x1, x2, …, xm, a hidden layer of RBF units, weights w11, …, wm1, and output y.
• It consists of one hidden layer with RBF activation functions.
• It consists of an output layer with a linear activation function:
y = w1 · φ(||x − t1||) + … + wm1 · φ(||x − tm1||)
where ||x − t|| denotes the distance of x = (x1, …, xm) from the centre t.

● Here we require weights wi from the hidden layer to the output layer only.
● The weights wi can be determined with the help of any of the standard iterative methods described earlier for neural networks.
● However, since the approximating function given below is linear w.r.t. wi, the weights can be calculated directly using the matrix methods of linear least squares, without having to determine wi iteratively.
● It should be noted that the approximating function f(X) is differentiable with respect to wi:
Y = f(X) = Σ (i = 1 to N) wi · φ(||X − ti||)
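The forward pass of this approximating function can be sketched directly. This is a small illustration written for this text; the Gaussian basis function and the width parameter sigma are our own choices:

```python
import math

def rbf_output(x, centers, weights, sigma=1.0):
    """RBF network forward pass: each hidden unit applies a Gaussian
    phi(r) = exp(-r^2 / (2*sigma^2)) to the Euclidean distance
    r = ||x - t_i||; the output node combines the hidden activations
    linearly: f(x) = sum_i w_i * phi(||x - t_i||)."""
    out = 0.0
    for t, w in zip(centers, weights):
        r = math.sqrt(sum((xj - tj) ** 2 for xj, tj in zip(x, t)))
        out += w * math.exp(-r * r / (2 * sigma ** 2))
    return out

# With a single centre located at the query point, the output is just the weight
print(rbf_output((0.0, 0.0), [(0.0, 0.0)], [2.5]))  # 2.5
```

Because the output is linear in the weights, fitting the wi to training targets reduces to a linear least-squares problem, as noted above.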

Comparison of RBFNN and FFNN
RBF NN | FF NN
Non-linear layered feed-forward network. | Non-linear layered feed-forward network.
The hidden layer of an RBF is non-linear; the output layer is linear. | The hidden and output layers of an FFNN are usually non-linear.
One single hidden layer. | May have more hidden layers.
The neuron model of the hidden neurons is different from that of the output nodes. | Hidden and output neurons share a common neuron model.
The activation function of each hidden neuron computes the Euclidean distance between the input vector and the centre of that unit. | The activation function of each hidden neuron computes the inner product of the input vector and the synaptic weight vector of that neuron.

THANK YOU