Back propagation

Nasurudeen Ahamed · 132 views · 15 slides · Apr 21, 2020

About This Presentation

Artificial Neural Networks (ANN)


Slide Content

Back propagation
N. Nasurudeen Ahamed,
Assistant Professor,
CSE

Introduction - Back Propagation
•Backpropagation is a supervised learning technique for training a neural network.
•It calculates the gradient of the cost function in a neural network.
•The gradient is used by gradient descent optimization algorithms to adjust the weights of the neurons.
•Also known as backward propagation of errors, as the error is calculated and distributed back through the network's layers.
•Goal of backpropagation: optimize the weights.

Back Propagation
•Activation Function: a decision-making function.
•Its main purpose is to convert the input signal of a node in an ANN into an output signal.

Back Propagation
•Variants of Activation Function:
•Linear Function
•Sigmoid Function
•Tanh Function (Hyperbolic Tangent function)
•ReLU Function (Rectified Linear Unit)
•Softmax Function
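The activation-function variants listed above can be sketched in plain Python as follows (minimal one-input versions; softmax, which acts on a whole vector, takes a list):

```python
import math

def linear(x):
    # Identity function: output equals input.
    return x

def sigmoid(x):
    # Squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1).
    return math.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positives through, zeroes out negatives.
    return max(0.0, x)

def softmax(xs):
    # Normalizes a vector of scores into a probability distribution.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `sigmoid(0.0)` is `0.5`, and the entries of `softmax([1.0, 2.0, 3.0])` sum to 1.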

Back Propagation
•Bias: the bias node is considered a "pseudo input" to each neuron in the hidden and output layers.
•It is used to overcome the problems associated with situations where the values of an input pattern are zero. If an input pattern had zero values, the neural network could not be trained without a bias node.
•The bias (threshold) activation function was proposed first.
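The zero-input problem above can be demonstrated with a tiny sketch (the weights and bias value are illustrative assumptions, not values from the slides):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# With an all-zero input pattern, a neuron WITHOUT a bias always produces
# the same output no matter what its weights are, so there is no learning
# signal to distinguish one weight setting from another. A bias term fixes this.
inputs = [0.0, 0.0]
weights = [0.4, -0.7]   # illustrative weights
bias = 0.35             # illustrative bias

net_no_bias = sum(w * i for w, i in zip(weights, inputs))
net_with_bias = net_no_bias + bias

out_no_bias = sigmoid(net_no_bias)      # always sigmoid(0) = 0.5
out_with_bias = sigmoid(net_with_bias)  # depends on the trainable bias
```

Changing the weights leaves `out_no_bias` stuck at 0.5, while `out_with_bias` can still be adjusted through the bias.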

Back Propagation
•Goal: optimize the weights so that the neural network can learn how to correctly map arbitrary inputs to outputs.

Back Propagation
•Forward Pass: inputs are 0.05 and 0.10.

Back Propagation
•How we calculate the total net input: multiply each input by its weight, sum the products, and add the bias.
•Apply the activation function to the net input to get the neuron's output.
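The forward pass for a single hidden neuron can be sketched as follows. The inputs 0.05 and 0.10 come from the slide; the weights and bias below are illustrative assumptions, not values given in the slides:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Slide inputs
i1, i2 = 0.05, 0.10

# Illustrative (assumed) weights and bias for one hidden neuron
w1, w2 = 0.15, 0.20
b1 = 0.35

# Total net input: weighted sum of inputs plus bias
net_h1 = w1 * i1 + w2 * i2 + b1

# Apply the activation function (sigmoid) to get the neuron's output
out_h1 = sigmoid(net_h1)
```

With these assumed values, `net_h1` is 0.3775 and `out_h1` is roughly 0.593.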

Back Propagation
•Calculating the Total Error: compute the squared error for each output neuron and sum them to get the total error.
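The total-error calculation above can be sketched as follows; the target and output values are illustrative placeholders, not values from the slides:

```python
# Illustrative (assumed) network outputs and their target values
outputs = [0.75, 0.77]
targets = [0.01, 0.99]

# Squared error for each output neuron: E = 1/2 * (target - output)^2
errors = [0.5 * (t - o) ** 2 for t, o in zip(targets, outputs)]

# Total error is the sum over all output neurons
e_total = sum(errors)
```

The 1/2 factor is a common convention that cancels neatly when the squared term is differentiated during the backward pass.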

Back Propagation
•Backwards Pass: our goal is to minimize the error for each output neuron and for the network as a whole.
•How much does a change in w5 affect the total error?
•This quantity is the "gradient with respect to w5".
•To decrease the error, subtract this value (scaled by the learning rate) from the current weight.
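The gradient with respect to w5 and the subsequent weight update can be sketched with the chain rule. All numeric values here (hidden output, output activation, target, w5, learning rate) are illustrative assumptions, not values from the slides; the output neuron is assumed to use a sigmoid activation:

```python
# Assumed illustrative values
out_h1 = 0.5933   # output of the hidden neuron that w5 connects from
out_o1 = 0.7514   # sigmoid activation of the output neuron
target = 0.01     # desired output
w5 = 0.40         # current weight
lr = 0.5          # learning rate

# Chain rule: dE/dw5 = dE/dout * dout/dnet * dnet/dw5
dE_dout = out_o1 - target            # derivative of 1/2*(target - out)^2
dout_dnet = out_o1 * (1 - out_o1)    # derivative of the sigmoid
dnet_dw5 = out_h1                    # net = w5 * out_h1 + ...
grad_w5 = dE_dout * dout_dnet * dnet_dw5

# Gradient descent update: subtract to decrease the error
w5_new = w5 - lr * grad_w5
```

Because the output overshoots the target here, the gradient is positive and the update shrinks w5.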

Back Propagation
•Next, we will continue the backwards pass by calculating the new values for w1, w2, w3, and w4.

Back Propagation
•Advantages:
•It is simple, fast, and easy to program.
•Apart from the number of inputs, it has no parameters to tune.
•No prior knowledge about the network is needed.
•It is flexible.
•It is a standard approach and works efficiently.
•It does not require the user to learn special functions.

Back Propagation
•Disadvantages:
•Backpropagation can be sensitive to noisy data and irregularities.
•Its performance is highly reliant on the input data.
•It needs excessive time for training.
•It needs a matrix-based method for backpropagation instead of a mini-batch approach.

Back Propagation
•Applications:
•A neural network can be trained to enunciate each letter of a word or sentence.
•It is used in the field of speech recognition.
•It is used in the fields of character and face recognition.