Neural Networks Basic Concepts and Deep Learning

rahuljain582793 44 views 57 slides Mar 01, 2025


Slide Content

Course Outcomes
After completion of this course, students will be able to:
Understand machine-learning concepts.
Understand and implement classification concepts.
Understand and analyse the different regression algorithms.
Apply the concept of unsupervised learning.
Apply the concepts of artificial neural networks.

Topics
Biological Neurons and Biological Neural Networks
Artificial Neural Networks
Perceptron
Activation Functions
Applications of Artificial Neural Networks (ANNs)
Neural Network
Types of Artificial Neural Networks
Feedforward Neural Networks (FNN) / Multi-Layer Perceptron (MLP)
Convolutional Neural Networks (CNN)
Recurrent Neural Networks (RNN)
Transformer Neural Networks
Autoencoders
Generative Adversarial Networks (GANs)
Competitive Neural Networks

Biological Neurons and
Biological Neural
Networks

Biological Neurons
These are the nerve cells in your brain and nervous system.
Each neuron has dendrites (which receive signals), a cell body (which processes the signals), and an axon (which sends signals to other neurons).
Neurons communicate with each other through electrical impulses, forming complex networks to help us think, move, feel, and learn.

Biological Neural Networks
A biological neural network is a collection of interconnected neurons in your brain.
These networks are responsible for everything you do, like recognizing faces, remembering things, or solving problems.
The neurons are connected by synapses. When you learn something new, the connections between certain neurons strengthen, which helps you remember or perform tasks better.


Artificial Neural
Networks and
Perceptron

ANN
ANNs are computer systems designed to mimic how biological neurons work, but they are made up of math, not cells.
An artificial neuron takes in information, processes it, and sends an output, much like how a biological neuron works.
When many artificial neurons are connected together, they form an artificial neural network, which can learn to do things like recognizing objects in pictures, predicting outcomes, or playing video games.

ANN
The term "artificial neural network" refers to a biologically inspired sub-field of artificial intelligence modeled after the brain.
An artificial neural network is usually a computational network based on the biological neural networks that form the structure of the human brain.
Just as a human brain has neurons interconnected with each other, artificial neural networks also have neurons that are linked to each other in the various layers of the network. These neurons are known as nodes.

The given figure illustrates the typical diagram of a biological neural network, and the typical artificial neural network looks something like the given figure. The parts correspond as follows:

Biological Neural Network    Artificial Neural Network
Dendrites                    Inputs
Cell nucleus                 Nodes
Synapse                      Weights
Axon                         Output

Perceptron
A 'Perceptron' is the basic building block, or single node, of a neural network, inspired by the neurons that are found in the brain.
It operates by taking in a set of inputs, calculating a weighted sum, adding a bias term, and then applying an activation function to this sum to produce an output.

The inner
working of a
perceptron is
as follows:
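These steps can be sketched in a few lines of Python (the weights and bias below are illustrative values, chosen so the perceptron computes a logical AND; they are not from the original slides):

```python
def perceptron(inputs, weights, bias):
    """Weighted sum of the inputs plus a bias, passed through a step activation."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if weighted_sum >= 0 else 0  # step activation: fire (1) or not (0)

# Example: with these weights and bias, the perceptron behaves like logical AND
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # 1
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # 0
```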

Perceptron Learning
Perceptron learning refers to how a perceptron adjusts its weights to improve accuracy.
When the perceptron makes a wrong prediction, it learns by changing the weights to get closer to the correct answer next time.
Over time, through repeated adjustments, the perceptron learns to make better predictions.
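The adjustment described above is the classic perceptron learning rule: whenever a prediction is wrong, each weight is nudged toward the correct answer. A minimal sketch (the learning rate, epoch count, and OR-function training data are illustrative assumptions):

```python
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Classic perceptron learning rule: nudge the weights toward the
    correct answer whenever a prediction is wrong."""
    n = len(samples[0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            s = sum(xi * wi for xi, wi in zip(x, weights)) + bias
            pred = 1 if s >= 0 else 0
            error = target - pred  # 0 if correct, +1 or -1 if wrong
            weights = [wi + lr * error * xi for wi, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Learn the logical OR function from its four input/output pairs
w, b = train_perceptron([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 1, 1, 1])
```

Because OR is linearly separable, the repeated adjustments converge to weights that classify all four cases correctly.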

Activation Function

Activation Function
Activation: In biological neurons, activation is the firing rate of the neuron, which happens when the impulses are strong enough to reach the threshold. In artificial neural networks, a mathematical function known as an activation function maps the input to the output and executes activations.

Activation Functions
The purpose of an activation function is to introduce non-linearity into the model, allowing the network to learn and represent complex patterns in the data.
The activation function decides whether a neuron should be activated or not by calculating the weighted sum and further adding a bias to it.

Linear Function
Equation: A linear function has an equation similar to that of a straight line, i.e. y = x.
No matter how many layers we have, if all are linear in nature, the final activation function of the last layer is nothing but just a linear function of the input of the first layer.
Range: -inf to +inf
Uses: The linear activation function is used in just one place, i.e. the output layer.
Issues: If we differentiate a linear function, the result no longer depends on the input x; the gradient becomes constant, so it will not introduce any ground-breaking behavior to our algorithm.
For example: Calculating the price of a house is a regression problem. A house price may take any large or small value, so we can apply linear activation at the output layer. Even in this case the neural net must have a non-linear function at the hidden layers.

ReLU Function
It stands for Rectified Linear Unit. It is the most widely used activation function, chiefly implemented in the hidden layers of a neural network.
Equation: A(x) = max(0, x). It gives an output of x if x is positive and 0 otherwise.
Value range: [0, inf)
Nature: non-linear, which means we can easily backpropagate the errors and have multiple layers of neurons being activated by the ReLU function.
Uses: ReLU is less computationally expensive than tanh and sigmoid because it involves simpler mathematical operations. At any time only a few neurons are activated, making the network sparse, which makes it efficient and easy to compute.
In simple words, ReLU learns much faster than the sigmoid and tanh functions.

Tanh Function
The activation that almost always works better than the sigmoid function is the tanh function, also known as the hyperbolic tangent function. It is actually a mathematically shifted version of the sigmoid function. Both are similar and can be derived from each other.
Equation:
f(x) = tanh(x) = 2/(1 + e^(-2x)) - 1
OR
tanh(x) = 2 * sigmoid(2x) - 1
Value range: -1 to +1
Nature: non-linear
Uses: Usually used in the hidden layers of a neural network, as its values lie between -1 and 1; hence the mean for the hidden layer comes out to be 0 or very close to it, which helps center the data by bringing the mean close to 0. This makes learning for the next layer much easier.

Sigmoid Function
It is a function which is plotted as an 'S'-shaped graph.
Equation: A = 1/(1 + e^(-x))
Nature: Non-linear. Notice that for x values between -2 and 2, the y values are very steep. This means small changes in x bring about large changes in the value of y.
Value range: 0 to 1
Uses: Usually used in the output layer of a binary classification, where the result is either 0 or 1. As the value of the sigmoid function lies between 0 and 1 only, the result can easily be predicted to be 1 if the value is greater than 0.5 and 0 otherwise.

Softmax Function
The softmax function is also a type of sigmoid function but is handy when we are trying to handle multi-class classification problems.
Nature: non-linear
Uses: Usually used when trying to handle multiple classes. The softmax function is commonly found in the output layer of image classification problems. The softmax function squeezes the output for each class between 0 and 1 and divides by the sum of the outputs.
Output: The softmax function is ideally used in the output layer of the classifier, where we are actually trying to attain the probabilities to define the class of each input.
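The four non-linear activations above can be sketched in plain Python (these are the standard textbook formulas; subtracting the maximum inside softmax is a common numerical-stability trick, not part of the mathematical definition):

```python
import math

def relu(x):
    """max(0, x): cheap, non-linear, the usual choice for hidden layers."""
    return max(0.0, x)

def sigmoid(x):
    """Squashes any input into (0, 1): used for binary-classification outputs."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Squashes into (-1, 1); the shifted sigmoid described above."""
    return 2.0 * sigmoid(2.0 * x) - 1.0

def softmax(xs):
    """Turns a vector of scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(xs)) for x in xs]  # subtract max for stability
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-3.0), relu(2.0))          # 0.0 2.0
print(sigmoid(0.0))                   # 0.5
probs = softmax([2.0, 1.0, 0.1])      # three class scores -> probabilities
```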

Applications of Artificial
Neural Networks
(ANNs)

Applications of
Artificial
Neural
Networks
(ANNs):
Image Recognition (e.g., face detection,
object classification)
Speech Recognition (e.g., virtual assistants
like Siri and Alexa)
Natural Language Processing (NLP) (e.g.,
language translation, text generation)
Medical Diagnosis (e.g., detecting diseases
from medical images or records)
Financial Predictions (e.g., stock market
forecasting, fraud detection)
Autonomous Vehicles (e.g., self-driving cars,
traffic sign recognition)
Recommender Systems (e.g., Netflix,
Amazon, YouTube recommendations)
Robotics (e.g., robot vision, control systems)
Customer Support Chatbots(e.g.,
automating responses to queries)
Game AI (e.g., AI playing video games or
board games like Go)
Time Series Forecasting (e.g., weather
prediction, sales forecasting)
Anomaly Detection (e.g., cybersecurity,
equipment failure detection)
Art Generation (e.g., creating artwork, music
composition)
Social Media Monitoring (e.g., sentiment
analysis, spam detection)
Personalized Marketing (e.g., targeted
advertising, customer behavior prediction)

Neural Network

A perceptron is a single-layer neural network, and a multi-layer perceptron is called a neural network.


Neural Network
This neural network, or artificial neural network, has multiple hidden layers that make it a multilayer neural network, and it is feed-forward because information flows in one direction through the network, from input to output. In this network there are the following layers:
Input Layer
Hidden Layer
Output Layer
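The input → hidden → output flow can be sketched as a forward pass through two fully connected layers (the layer sizes, weights, and biases below are arbitrary illustrative numbers, not from the slides):

```python
import math

def dense(inputs, weights, biases, activation):
    """One fully connected layer: a weighted sum plus bias per neuron,
    then the activation function."""
    return [activation(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

def relu(x):
    return max(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

x = [0.5, -0.2]                                            # input layer (2 features)
h = dense(x, [[0.1, 0.8], [0.4, -0.6]], [0.0, 0.1], relu)  # hidden layer (2 neurons)
y = dense(h, [[1.2, -0.7]], [0.05], sigmoid)               # output layer (1 neuron)
print(y)  # a single probability-like value between 0 and 1
```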

Neural Network
The basic rule of thumb is: if you really don't know what activation function to use, then simply use ReLU, as it is a general activation function for hidden layers and is used in most cases these days.
If your output is for binary classification, then the sigmoid function is a very natural choice for the output layer.
If your output is for multi-class classification, then softmax is very useful for predicting the probabilities of each class.

Types of Artificial
Neural Networks

Types of
Artificial
Neural
Networks
Feedforward Neural Networks (FNN) / Multi-Layer Perceptron (MLP)
Convolutional Neural Networks (CNN)
Recurrent Neural Networks (RNN)
Transformer Neural Networks
Autoencoders
Generative Adversarial Networks (GANs)
Competitive Neural Networks

Feedforward Neural Networks (FNN) / Multi-Layer Perceptron (MLP)
These are the simplest neural networks, where information flows in one direction: from the input to the output.
Think of it like a funnel: you give some input at the top (like numbers), and the network processes the input layer by layer until it reaches a final decision at the output (like yes/no, or classifying an image).
Example: You give it an image, and it tells you whether it's a cat or a dog.

Convolutional Neural Networks (CNN)
CNNs are special neural networks designed for image data.
They have layers that scan the image piece by piece to find patterns like edges, colors, or shapes, which helps the network understand the content of the image.
Example: CNNs are used in applications like facial recognition or identifying objects in photos.
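The "scan the image piece by piece" idea is a 2D convolution. A minimal sketch, assuming a single channel, no padding, and a stride of 1 (the image and edge-detector kernel below are illustrative toy values):

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image; each output value measures how
    strongly the patch under the kernel matches the kernel's pattern."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A tiny image with a dark-to-bright vertical boundary in the middle
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
# A vertical-edge detector: responds where brightness changes left to right
kernel = [[-1, 1],
          [-1, 1]]
print(convolve2d(image, kernel))  # [[0, 2, 0], [0, 2, 0]]
```

The strongest responses line up with the edge column, which is exactly the kind of pattern a CNN's learned filters pick out.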

Recurrent Neural Networks (RNN)
RNNs are used when you deal with sequences of data (like sentences, time series, or speech).
These networks remember what they processed earlier in the sequence, allowing them to make decisions based on both current input and past inputs (like having a short-term memory).
Example: RNNs can be used to predict the next word in a sentence or recognize spoken words in speech.
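The short-term memory described above can be sketched as a single recurrent step: the new hidden state mixes the current input with the previous hidden state. The scalar weights here are illustrative; real RNNs use weight matrices over vectors:

```python
import math

def rnn_step(x, h_prev, w_x, w_h, b):
    """One recurrent step: the new hidden state depends on both the current
    input x and the previous hidden state h_prev (the network's memory)."""
    return math.tanh(w_x * x + w_h * h_prev + b)

# Process a sequence one element at a time, carrying the hidden state forward
h = 0.0
for x in [1.0, 0.5, -0.3]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
print(h)  # final hidden state summarizes the whole sequence
```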

Transformer Neural Networks
Transformers are powerful networks for processing language or sequential data.
Unlike RNNs, they look at the whole sentence at once instead of one word at a time, which makes them faster and more accurate.
They use something called self-attention to focus on important parts of the input sequence.
Example: GPT-3 is a transformer, used for generating text, answering questions, or translating languages.
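Self-attention can be sketched as scaled dot-product attention: each position scores every other position, softmaxes the scores, and takes a weighted mix of the values. Here the same toy token vectors serve as queries, keys, and values; a real transformer first applies learned projection matrices:

```python
import math

def attention(queries, keys, values):
    """Scaled dot-product attention over a whole sequence at once."""
    d = len(queries[0])
    out = []
    for q in queries:
        # Score this query against every key, scaled by sqrt(dimension)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        # Softmax the scores into attention weights
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Weighted mix of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Three toy token vectors attending to each other
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
mixed = attention(x, x, x)  # each output row blends all three tokens
```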

Autoencoders
Autoencoders are used for tasks like compressing data or finding patterns.
They take input data, reduce it to a simpler version (compression), and then try to rebuild it back to its original form.
Example: Autoencoders are used to compress images into smaller files or to clean noisy data.
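The compress-then-rebuild idea can be sketched with a linear encoder and decoder. The hand-picked weights below simply keep the first two components of a four-number input; a trained autoencoder would instead learn weights that minimize the reconstruction error:

```python
def encode(x, w_enc):
    """Encoder: project the input down to a smaller code."""
    return [sum(xi * wi for xi, wi in zip(x, row)) for row in w_enc]

def decode(code, w_dec):
    """Decoder: try to reconstruct the original input from the code."""
    return [sum(ci * wi for ci, wi in zip(code, row)) for row in w_dec]

x = [1.0, 2.0, 3.0, 4.0]
# Toy weights: the 4-number input is squeezed into a 2-number code
w_enc = [[1, 0, 0, 0], [0, 1, 0, 0]]
w_dec = [[1, 0], [0, 1], [0, 0], [0, 0]]
x_hat = decode(encode(x, w_enc), w_dec)
print(x_hat)  # [1.0, 2.0, 0.0, 0.0]: an imperfect rebuild from the small code
```

The reconstruction loses the components the code could not hold, which is exactly the trade-off a real autoencoder learns to minimize.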

Generative Adversarial Networks (GANs)
GANs involve two neural networks working against each other. One tries to create fake data (like fake images), and the other tries to detect which data is real and which is fake.
Over time, the generator network gets better at making realistic fake data, while the discriminator gets better at spotting fakes.
Example: GANs are used to create realistic fake images, like generating pictures of people who don't exist.

Competitive Neural Networks
In competitive networks, neurons compete with each other, and only the most active one is "activated."
These networks are often used in clustering tasks where similar data points are grouped together.
Example: Self-Organizing Maps (SOMs) are a type of competitive network used to cluster data into groups, like finding similar patterns in large datasets.
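The winner-take-all competition can be sketched as follows: each neuron holds a prototype vector, the neuron whose prototype is closest to a sample wins, and only the winner learns by moving its prototype toward the sample (the learning rate and data points are illustrative):

```python
def winner(sample, prototypes):
    """Neurons compete: the prototype closest to the sample wins."""
    dists = [sum((s - p) ** 2 for s, p in zip(sample, proto))
             for proto in prototypes]
    return dists.index(min(dists))

def update(sample, prototypes, lr=0.5):
    """Only the winning neuron learns: its prototype moves toward the sample."""
    w = winner(sample, prototypes)
    prototypes[w] = [p + lr * (s - p) for s, p in zip(sample, prototypes[w])]
    return prototypes

protos = [[0.0, 0.0], [5.0, 5.0]]    # two competing neurons (cluster centers)
protos = update([4.0, 6.0], protos)  # the sample pulls the nearer prototype
print(protos[1])                     # [4.5, 5.5]: moved halfway to the sample
```

Repeating this over many samples makes each prototype settle into the middle of one cluster, which is the basic mechanism behind Self-Organizing Maps.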

The types of neural network compare as follows on purpose, structure, best use, and how each works:

Feedforward Neural Networks (FNN) / Multi-Layer Perceptron (MLP)
Purpose: Basic neural network for general tasks like classification and regression.
Structure: Data moves in one direction, layer by layer (input → hidden → output).
Best for: Classifying simple data like numbers or basic images.
How it works: Takes input, processes it through layers, and gives a prediction (e.g., "cat" or "dog").

Convolutional Neural Networks (CNN)
Purpose: Specialized for image and video data.
Structure: Uses layers that look at parts of an image (like scanning) to detect patterns (edges, shapes).
Best for: Recognizing objects in images, like face detection or medical image analysis.
How it works: Scans parts of the image to learn what it contains (e.g., looks for edges, colors, textures).

Recurrent Neural Networks (RNN)
Purpose: Processes sequences of data, where order matters.
Structure: Includes loops to remember past data (short-term memory).
Best for: Time-series data, speech recognition, text prediction (e.g., what comes next).
How it works: Remembers previous data (like words in a sentence or past events in time) to make a decision.

Transformer Neural Networks
Purpose: Fast and efficient for language and sequential tasks.
Structure: Processes whole sequences at once (no need for memory loops).
Best for: Natural language processing (NLP) like translation, text generation.
How it works: Looks at all words in a sentence together and figures out their relationships using "self-attention."

Autoencoders
Purpose: Compresses data, finds patterns, or reduces noise.
Structure: Has an encoder to shrink the input and a decoder to reconstruct it.
Best for: Data compression, noise reduction (like removing background noise).
How it works: Compresses the input into a smaller form and then tries to rebuild it to match the original.

Generative Adversarial Networks (GANs)
Purpose: Generates new, realistic data like images or videos.
Structure: Has two networks: a generator (creates fake data) and a discriminator (detects fake data).
Best for: Creating realistic images, videos, or even music (like deepfakes).
How it works: The generator makes fake data, and the discriminator tries to catch the fakes, making the generator improve over time.

Competitive Neural Networks
Purpose: Groups similar data into clusters without supervision.
Structure: Neurons compete to be the most active; only one "wins" and is activated.
Best for: Clustering similar patterns in data, like organizing large datasets into groups.
How it works: Neurons compete, and the "winner" learns from the data, helping to group similar data points together.