Unit-7: Representation and Description


About This Presentation

BSc CSIT and Computer Engineering, Image Processing unit 11

Slide Content

Representation and
Description

Introduction
• After an image has been segmented into regions, the resulting aggregate of segmented pixels is usually represented and described in a form suitable for further processing.
• Representing a region involves two choices:
  I. Represent the region in terms of its external characteristics (its boundary), or
  II. Represent the region in terms of its internal characteristics (the pixels comprising the region).
• An external representation is chosen when the primary focus is on shape characteristics.
• An internal representation is chosen when the primary focus is on regional properties, such as color and texture.
• The next step is to describe the region based on the chosen representation.
• For example: a region may be represented by its boundary, and its boundary described by features such as length, orientation, etc.
• Features should be insensitive to translation, rotation, and scaling.
• Both boundary and regional descriptors are often used together.

Types
• Representations can be classified into the following two types:
  1. Chain Codes
  2. Signatures

Chain Codes
• Chain codes are used to represent a boundary by a connected sequence of straight-line segments of specified length and direction.
• Typically, this representation is based on 4- or 8-connectivity of the segments.
• The direction of each segment is coded by using a numbering scheme, as shown in the figure below.
• A boundary code formed as a sequence of such directional numbers is referred to as a Freeman chain code.
Figure: Directional numbers for (a) 4-directional chain code, and (b) 8-directional chain code

Chain Codes
• The method generally is unacceptable for two principal reasons:
  1. The resulting chain of codes tends to be quite long, and
  2. Any small disturbances cause changes in the code that may not be related to the shape of the boundary.
• An approach frequently used to circumvent the problem just discussed is to resample the boundary by selecting a larger grid spacing, as shown in the figure aside.
Figure: (a) Digital boundary with resampling grid superimposed. (b) Result of resampling. (c) 4-directional chain code. (d) 8-directional chain code.

Chain code
• The chain code of a boundary depends on the starting point. However, the code can be normalized with respect to the starting point by treating it as a circular sequence of direction numbers and redefining the starting point so that the resulting sequence of numbers forms an integer of minimum magnitude.
• We can also normalize for rotation by using the first difference of the chain code instead of the code itself. The difference is obtained by counting the number of direction changes (in a counterclockwise direction) that separate two adjacent elements of the code.
• For example: the first difference of the 4-direction chain code 10103322 is 3133030.
• If we treat the code as a circular sequence to normalize with respect to the starting point, then the first element of the difference is computed by using the transition between the last and first components of the chain.
• Here the result is 33133030.
• The first difference of smallest magnitude (i.e., obtained by treating the resulting array as a circular array and rotating it cyclically until the resulting numerical pattern is the smallest possible number) is known as the shape number of the contour (a small sketch follows below).
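A minimal Python sketch of the two normalizations just described (circular first difference and shape number), assuming the 4-directional Freeman code is already available as a list of digits; the example code 10103322 from the bullets above is used as a check:

```python
def first_difference(chain, directions=4):
    """Circular first difference: counterclockwise direction changes between
    consecutive code elements (last-to-first transition included)."""
    return [(chain[i] - chain[i - 1]) % directions for i in range(len(chain))]

def shape_number(chain, directions=4):
    """First difference of smallest magnitude, found by rotating the circular
    difference until the smallest numerical pattern appears."""
    diff = first_difference(chain, directions)
    rotations = [diff[i:] + diff[:i] for i in range(len(diff))]
    return min(rotations)   # lexicographic min = smallest number for single digits

code = [1, 0, 1, 0, 3, 3, 2, 2]
print(first_difference(code))   # [3, 3, 1, 3, 3, 0, 3, 0]  -> 33133030, as above
print(shape_number(code))       # [0, 3, 0, 3, 3, 1, 3, 3]
```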

Chain code
Figure: Chain code, first difference, and shape number

Signatures
• A signature is a 1-D functional representation of a boundary and may be generated in various ways.
• One of the simplest methods is to plot the distance from the centroid to the boundary as a function of angle, as illustrated in the figure below.

Signatures
• Signatures generated by the approach just described are invariant to translation, but they do depend on rotation and scaling.
• Normalization with respect to rotation can be achieved by finding a way to select the same starting point to generate the signature, regardless of the shape's orientation.
• Changes in the size of a shape change the amplitude of the signature. One way to normalize for this is to scale all functions so that they always span the same range of values, say, [0, 1].
• Whatever the method used, keep in mind that the basic idea is to remove dependency on size while preserving the fundamental shape of the waveforms; a short sketch follows below.
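A rough numpy sketch of the centroid-to-boundary signature described above, assuming the boundary is available as an array of (x, y) points; the samples are simply sorted by their angle around the centroid, and dividing by the maximum distance is used here as one way to remove the size dependency mentioned in the bullets:

```python
import numpy as np

def signature(boundary_xy):
    """Distance from the centroid to each boundary point, as a function of the
    angle of that point (a 1-D representation of the boundary)."""
    pts = np.asarray(boundary_xy, dtype=float)
    centroid = pts.mean(axis=0)
    d = pts - centroid
    r = np.hypot(d[:, 0], d[:, 1])         # r(theta): centroid-to-boundary distance
    theta = np.arctan2(d[:, 1], d[:, 0])   # angle of each boundary point
    order = np.argsort(theta)
    r = r[order] / r.max()                 # divide by the max to remove size dependency
    return theta[order], r

# Example: a circle centred at (5, 5); its signature is (nearly) constant.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.c_[5 + 3 * np.cos(t), 5 + 3 * np.sin(t)]
theta, r = signature(circle)
```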

Signatures
Figure: Effect of noise on signatures for two different objects.

Descriptors
• Approaches to describe the boundary of a region.
• Types:
  ▪ Boundary Descriptors
  ▪ Regional Descriptors
  ▪ Relational Descriptors

Some simple Descriptors
• The length of a boundary is one of its simplest descriptors.
• The number of pixels along a boundary gives a rough approximation of its length.
• The diameter of a boundary B is defined as
      Diam(B) = max_{i,j} [ D(p_i, p_j) ]
  where D is a distance measure and p_i and p_j are points on the boundary (a brute-force sketch follows below).
• The line segment connecting the two extreme points that comprise the diameter is called the major axis of the boundary, and the line perpendicular to the major axis is called the minor axis.
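A brute-force sketch of Diam(B), using the Euclidean distance as the distance measure D (an assumption; any metric works). It is quadratic in the number of boundary points, so it is only meant for short boundaries:

```python
import numpy as np

def diameter(boundary_xy):
    """Largest pairwise distance between boundary points; the two extreme
    points returned with it define the major axis."""
    pts = np.asarray(boundary_xy, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]      # all pairwise differences
    dist = np.sqrt((diff ** 2).sum(axis=-1))      # D(p_i, p_j) for all i, j
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    return dist[i, j], pts[i], pts[j]
```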

Shape Number
• The shape number of a boundary, generally based on 4-directional chain codes, is defined as the first difference of smallest magnitude.
• The order n of a shape number is defined as the number of digits in its representation.


Figure: Shape numbers of order 4, 6 and 8, along with their chain-code representations, first differences, and corresponding shape numbers.

Example: suppose n = 18 is specified for the boundary. To obtain the shape number of this order, the following steps are done:
1. Find the basic rectangle.
2. The closest rectangle of order 18 is a 3 x 6 rectangle; sub-divide the rectangle.
3. Align the chain code directions with the resulting grid.
4. Obtain the chain code and use its first difference to compute the shape number.

Fourier Descriptors
• The idea behind Fourier descriptors is to traverse the pixels belonging to a boundary, starting from an arbitrary point, and record their coordinates.
• Each value in the resulting list of coordinate pairs (x_0, y_0), (x_1, y_1), ..., (x_{K-1}, y_{K-1}) is then interpreted as a complex number x_k + j y_k, for k = 0, 1, ..., K-1.
• The discrete Fourier transform (DFT) of this list of complex numbers is the Fourier descriptor of the boundary.
• The inverse DFT restores the original boundary.
• The following figure shows a K-point digital boundary in the xy-plane and the first two coordinate pairs (x_0, y_0) and (x_1, y_1).
Figure: Fourier descriptor of a boundary (real axis: x, imaginary axis: y)

Steps for Fourier Descriptors
• Define the arbitrary starting point (x_0, y_0); the coordinate pairs (x_0, y_0), (x_1, y_1), ..., (x_{K-1}, y_{K-1}) are encountered in traversing the boundary in the counterclockwise direction.
• These coordinates can be expressed in the form x(k) = x_k and y(k) = y_k. The boundary can be represented as the sequence of coordinates:
      s(k) = [x(k), y(k)],   k = 0, 1, 2, ..., K-1.
  Each coordinate pair can be treated as a complex number so that:
      s(k) = x(k) + j y(k).
• The discrete Fourier transform of the 1-D sequence s(k) can be written as:
      a(u) = (1/K) Σ_{k=0}^{K-1} s(k) e^{-j2πuk/K},   u = 0, 1, 2, ..., K-1
• The complex coefficients a(u) are called the Fourier descriptors of the boundary (a short numpy sketch follows below).
• The inverse Fourier transform of these coefficients is:
      s(k) = Σ_{u=0}^{K-1} a(u) e^{j2πuk/K},   k = 0, 1, 2, ..., K-1
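A small numpy sketch of the two formulas above. np.fft.fft/ifft place the 1/K factor on the inverse, so it is moved explicitly to match the forward transform as written here; the truncated reconstruction at the end is not on the slide, but it is the usual way Fourier descriptors are used (low-order coefficients capture the gross shape):

```python
import numpy as np

def fourier_descriptors(boundary_xy):
    """a(u) = (1/K) * sum_k s(k) e^{-j2*pi*u*k/K}, with s(k) = x(k) + j*y(k)."""
    pts = np.asarray(boundary_xy, dtype=float)
    s = pts[:, 0] + 1j * pts[:, 1]          # boundary as a complex sequence
    return np.fft.fft(s) / len(s)

def reconstruct_boundary(a, num_coeffs=None):
    """s(k) = sum_u a(u) e^{+j2*pi*u*k/K}; optionally keep only the
    num_coeffs lowest-frequency descriptors for a smoothed boundary."""
    a = np.array(a, dtype=complex)
    K = len(a)
    if num_coeffs is not None and num_coeffs < K:
        half = max(num_coeffs // 2, 1)
        a[half:K - half] = 0.0              # zero the high-frequency terms
    s = np.fft.ifft(a) * K
    return np.c_[s.real, s.imag]
```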

Pattern Recognition

Human Perception
• Humans have developed highly sophisticated skills for sensing their environment and taking actions according to what they observe. Examples:
  ➢ Recognizing a face,
  ➢ Understanding spoken words,
  ➢ Reading handwriting,
  ➢ Distinguishing fresh food from its smell, etc.
• We would like to give similar capabilities to machines.

What is pattern?
• A pattern is an entity, vaguely defined, that could be given a name, e.g.,
  ➢ Fingerprint image
  ➢ Handwritten word
  ➢ Human face
  ➢ Speech signal
  ➢ DNA sequence
  ➢ ........
• A pattern is an arrangement of descriptors. The name feature is often used to denote a descriptor.

What is pattern recognition?
• A pattern class is a family of patterns that share some common properties. Pattern classes are denoted by w_1, w_2, w_3, ..., w_W, where W is the number of classes.
• Pattern recognition / pattern classification is the study of how machines can
  • observe the environment,
  • learn to distinguish patterns of interest,
  • make sound and reasonable decisions about the categories of the patterns.
• Pattern recognition by machines involves techniques for assigning patterns to their respective classes – automatically and with as little human intervention as possible.
• Pattern vectors are represented by bold lowercase letters, such as x, y, and z, and take the form of a column vector; equivalently,
      x = (x_1, x_2, x_3, ..., x_n)^T
• where each component x_i represents the i-th descriptor and n is the total number of such descriptors associated with the pattern.

Goal of Pattern Recognition
• To assign a class to each image (or object within an image) based on a numerical representation of the image's (or object's) properties.
Techniques:
• Pattern classification techniques are usually classified into two main groups:
  1. Statistical, and
  2. Structural (or syntactical)

Pattern Recognition Application

An Example
• Problem: sorting incoming fish on a conveyor belt according to species.
• Assume that we have only two kinds of fish:
  • sea bass,
  • salmon
Figure: Picture taken from a camera

An Example: Decision Process
• What kind of information can distinguish one species from the other?
  ➢ Length, width, number and shape of fins, tail shape, etc.
• What can cause problems during sensing?
  ➢ Lighting conditions, position of fish on the conveyor belt, camera noise, etc.
• What are the steps in the process?
  ➢ Capture image → isolate fish → take measurements → make decision

An Example : Selecting Features
• Assume a fisherman told us that a sea bass is generally longer than a salmon.
• We can use length as a feature and decide between sea bass and salmon according to a threshold on length.
• How can we choose this threshold?

An Example : Selecting Features

An Example : Selecting Features
• Even though sea bass is longer than salmon on the average, there are many examples of fish where this observation does not hold.
• Try another feature: average lightness of the fish scales.

An Example : Selecting Features

An Example: Cost of Error
• We should also consider the costs of different errors we make in our decisions.
• For example, if the fish packing company knows that:
  • Customers who buy salmon will object vigorously if they see sea bass in their cans.
  • Customers who buy sea bass will not be unhappy if they occasionally see some expensive salmon in their cans.
• How does this knowledge affect our decision?

An Example: Cost of Error
• Assume we also observed that sea bass are typically wider than salmon.
• We can use two features in our decision:
  ➢ Lightness: x_1
  ➢ Width: x_2
• Each fish image is now represented as a point (feature vector) x = [x_1, x_2]^T in a two-dimensional feature space.

An Example: Multiple Features

An Example: Multiple Features
• Does adding more features always improve the results?
  ➢ Avoid unreliable features.
  ➢ Be careful about correlation with existing features.
  ➢ Be careful about measurement costs.
  ➢ Be careful about noise in the measurements.
• Is there some curse for working in very high dimensions?

Pattern Recognitions Systems

Pattern Recognitions Systems
• Data acquisition and sensing
  • Measurements of physical variables.
  • Important issues: bandwidth, resolution, sensitivity, distortion, SNR, latency, etc.
• Pre-processing
  • Removal of noise in data
  • Isolation of patterns of interest from the background
• Feature extraction
  • Finding a new representation in terms of features

Pattern Recognitions Systems
• Model learning and estimation
  • Learning a mapping between features and pattern groups and categories
• Classification
  • Using features and learned models to assign a pattern to a category.
• Post-processing
  • Evaluation of confidence in decisions
  • Exploitation of context to improve performance
  • Combination of experts

The Design Cycle
Data Collection
• Collecting training and testing data
• How can we know when we have an adequately large and representative set of samples?
Figure 8: The Design Cycle (Collect Data → Select Features → Select Model → Train Classifier → Evaluate Classifier)

The Design Cycle
Feature Selection
• Domain dependence and prior information
• Computational cost and feasibility
• Discriminative features
  • Similar values for similar patterns
  • Different values for different patterns
• Invariant features with respect to translation, rotation and scale
• Robust features with respect to occlusion, distortion, deformation, and variations in environment

The Design Cycle
Model Selection
• Domain dependence and prior information
• Definition of design criteria
• Parametric vs. non-parametric models
• Handling of missing features
• Computational complexity
• Types of models: templates, decision-theoretic or statistical, syntactical or structural, neural, and hybrid
• How can we know how close we are to the true model underlying the patterns?

The Design Cycle
Training
• How can we learn the rule from data?
• Supervised learning:
  • A teacher provides a category label or cost for each pattern in the training set.
• Unsupervised learning:
  • The system forms clusters or natural groupings of the input patterns.
• Reinforcement learning:
  • No desired category is given, but the teacher provides feedback to the system, such as whether a decision is right or wrong.

The Design Cycle
Evaluation
• How can we estimate the performance with training samples?
• How can we predict the performance with future data?
• Problems of over-fitting and generalization.

Statistical Pattern Classification Techniques
• We have many statistical pattern classification techniques. In this section, we discuss three basic techniques:
  • the minimum distance classifier,
  • the k-nearest neighbors (KNN) classifier, and
  • the maximum likelihood (or Bayesian) classifier.
• We know that objects' properties can be represented using feature vectors that are projected onto a feature space.
• Feature vectors associated with objects from the same class will appear close together as clusters in the feature space.
• The job of a statistical pattern classification technique is to find a discrimination curve (or a hypersurface, in the case of an n-dimensional feature space) that can tell the clusters apart.

Classifier concept
Figure: Discrimination functions for a three-class classifier in a 2D feature space

Statistical Pattern Classification Techniques
Figure: Diagram of a Statistical pattern Classifier

Statistical Pattern Classification Techniques
• A statistical pattern classifier has n inputs (the features of the object to be classified, encoded into a feature vector x = (x_1, x_2, x_3, ..., x_n)^T) and one output (the class to which the object belongs, C(x), represented by one of the symbols w_1, w_2, w_3, ..., w_W, where W is the number of classes). The symbols w_k are class identifiers.
• Classifiers make comparisons among the representations of the unknown object and the known classes. These comparisons provide the information needed to make a decision about which class to assign to the unknown pattern.
• Mathematically, the job of the classifier is to apply a series of decision rules that divide the feature space into W disjoint subsets K_w, w = 1, 2, ..., W, each of which includes the feature vectors x for which d(x) = w, where d(.) is the decision rule.

Minimum Distance Classifier
• The minimum distance classifier (also known as the nearest-class-mean classifier) works by computing a distance metric between an unknown feature vector and the centroids (i.e., mean vectors) of each class:
      d_j(x) = ||x - m_j||
• where d_j is the distance between class j and the unknown feature vector x, and m_j is the mean vector for class j, defined as
      m_j = (1/N_j) Σ_{x ∈ w_j} x
• where N_j is the number of pattern vectors from class w_j (a sketch follows after the figure below).

Minimum Distance Classifier
Figure: Example of two classes and their mean vectors.
Figure: Example of three classes with relatively complex structure.
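A minimal numpy sketch of the nearest-class-mean rule above; the toy data and class labels are made up for illustration:

```python
import numpy as np

class MinimumDistanceClassifier:
    """Assign x to the class whose mean vector m_j is closest,
    i.e. the class j minimizing d_j(x) = ||x - m_j||."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # m_j = (1/N_j) * sum of the training vectors belonging to class j
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # distance from every sample to every class mean; pick the smallest
        d = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]

# Hypothetical 2-D feature vectors (e.g., lightness and width) for two classes.
X = np.array([[1.0, 1.2], [1.1, 0.9], [3.0, 3.1], [2.9, 3.3]])
y = np.array([0, 0, 1, 1])
clf = MinimumDistanceClassifier().fit(X, y)
print(clf.predict(np.array([[1.0, 1.0], [3.2, 3.0]])))   # -> [0 1]
```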

K-Nearest Neighbors Classifier
• The k-nearest neighbors (KNN) classifier works by computing the distance between an unknown pattern's feature vector x and the k closest points to it in the feature space, and then assigning the unknown pattern to the class to which the majority of those k sample points belong (a small sketch follows after the figure below).
• The main advantages of this approach are its simplicity (e.g., no assumptions need to be made about the probability distribution of each class) and versatility (e.g., it handles overlapping classes or classes with complex structure well).
• Its main disadvantage is the computational cost involved in computing distances between the unknown sample and many stored points in the feature space.

K-Nearest Neighbors Classifier
Figure: (a) Example of a KNN classifier (k = 1) for a five-class classifier in a 2D feature space. (b) Minimum distance classifier results for the same data set.
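A short sketch of the KNN rule described above; the toy training points are hypothetical, and ties in the vote are resolved by Counter's ordering:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify one unknown feature vector x by majority vote among the
    k nearest stored training samples (Euclidean distance)."""
    dist = np.linalg.norm(X_train - x, axis=1)   # distance to every stored point
    nearest = np.argsort(dist)[:k]               # indices of the k closest points
    votes = Counter(y_train[nearest].tolist())
    return votes.most_common(1)[0][0]

# Hypothetical 2-D training data for two classes.
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
                    [1.0, 1.0], [0.9, 1.2], [1.1, 0.8]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.15, 0.2]), k=3))   # -> 0
```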

Bayesian Classifier
• The Bayesian classifier is based on the idea that a classification decision can be made from the probability distributions of the training samples for each class; that is, an unknown object is assigned to the class to which it is more likely to belong based on the observed features.
• The mathematical calculations performed by a Bayesian classifier require three probability distributions:
  • The a priori (or prior) probability for each class w_k, denoted by P(w_k).
  • The unconditional distribution of the feature vector representing the measured pattern x, denoted by p(x).
  • The class-conditional distribution, that is, the probability of x given class w_k, denoted by p(x|w_k).
• Applying Bayes' rule, these three distributions are then used to compute the a posteriori probability that a pattern x comes from class w_k, represented as p(w_k|x), as follows (a small sketch follows below):
      p(w_k|x) = p(x|w_k) P(w_k) / p(x) = p(x|w_k) P(w_k) / Σ_{i=1}^{W} p(x|w_i) P(w_i)
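A small numpy sketch of the Bayes rule above. The slide does not say how the class-conditional densities p(x|w_k) are obtained; fitting a Gaussian to each class's training data is assumed here purely for illustration:

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate Gaussian density, used here as the model for p(x|w_k)."""
    d = len(mean)
    diff = x - mean
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff)

def bayes_posteriors(x, class_data, priors):
    """p(w_k|x) = p(x|w_k) P(w_k) / sum_i p(x|w_i) P(w_i)."""
    likelihoods = {k: gaussian_pdf(x, data.mean(axis=0), np.cov(data, rowvar=False))
                   for k, data in class_data.items()}
    evidence = sum(likelihoods[k] * priors[k] for k in class_data)   # p(x)
    return {k: likelihoods[k] * priors[k] / evidence for k in class_data}

# Toy training samples for two classes and equal priors (illustrative only).
class_data = {
    "w1": np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.2]]),
    "w2": np.array([[3.0, 3.0], [3.1, 2.8], [2.8, 3.2]]),
}
priors = {"w1": 0.5, "w2": 0.5}
print(bayes_posteriors(np.array([1.1, 1.0]), class_data, priors))   # w1 dominates
```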

Overview of Neural Network in image processing
Connectionism
A NN is a highly interconnected structure in which the state of one neuron affects the potential of the large number of other neurons to which it is connected, according to the weights of the connections.

Not Programming but Training
A NN is trained rather than programmed to perform the given task, since it is difficult to separate the hardware and software in the structure. We program not the solution of a task but the ability to learn to solve the task.

Distributed Memory
A NN presents a distributed memory, so that change (adaptation) of synapses can take place everywhere in the structure of the network.

Learning and Adaptation
NNs are capable of adapting themselves (the synaptic connections between units) to special environmental conditions by changing their structure or connection strengths.

Non-Linear Functionality
Every new state of a neuron is a nonlinear function of the input pattern created by the firing (nonlinear) activity of the other neurons.

Robustness of Associativity
NN states are characterized by high robustness, or insensitivity to noisy and fuzzy input data, owing to the use of a highly redundant distributed structure.

Brain Computer: What is it?
• The human brain contains a massively interconnected net of 10^10 - 10^11 neurons (nerve cells).
• The biological neuron is the simple "arithmetic computing" element.

Biological Prototypes and Artificial Neurons
Figure: The schematic model of a biological neuron (soma, dendrites, axon, and synapses; dendrites and axons connect to other neurons).
1. Soma or cell body - a large, round central body in which almost all the logical functions of the neuron are realized.
2. The axon (output) - a nerve fibre attached to the soma which can serve as a final output channel of the neuron. An axon is usually highly branched.
3. The dendrites (inputs) - represent a highly branching tree of fibres. These long, irregularly shaped nerve fibres (processes) are attached to the soma.
4. Synapses - specialized contacts on a neuron which are the termination points for the axons from other neurons.

Artificial Neural Networks
Artificial Neural Networks provide...
- A new computing paradigm
- A technique for developing trainable classifiers, memories, dimension-reducing mappings, etc.
- A tool to study brain function

Vision, AI and ANNs
• 1940s: beginning of Artificial Neural Networks
  • McCulloch & Pitts, 1943 (the threshold neuron: weighted sum Σ w_i x_i compared with a threshold θ)
  • Perceptron learning rule (Rosenblatt, 1962)
  • Backpropagation
  • Hopfield networks (1982)
  • Kohonen self-organizing maps

Vision, AI and ANNs
• 1950s: beginning of computer vision
  • Aim: give machines the same or better vision capability as ours
  • Drive: AI, robotics applications and factory automation
  • Initially: a passive, feedforward, layered and hierarchical process that was just going to provide input to higher reasoning processes (from AI)
  • But soon: it was realized that this could not handle real images
• 1980s: Active vision: make the system more robust by allowing the vision to adapt with the ongoing recognition/interpretation

Warren McCulloch and Walter Pitts (1943)
• A McCulloch-Pitts neuron operates on a discrete time-scale, t = 0, 1, 2, 3, ..., with the time tick equal to one refractory period.
• At each time step, an input or output is on or off (1 or 0, respectively).
• Each connection or synapse from the output of one neuron to the input of another has an attached weight.
Figure: A McCulloch-Pitts neuron with inputs x_1(t), ..., x_n(t), weights w_1, ..., w_n, threshold θ, and output y(t+1) on the axon.

Excitatory and Inhibitory Synapses
• We call a synapse excitatory if w_i > 0, and inhibitory if w_i < 0.
• We also associate a threshold θ with each neuron.
• A neuron fires (i.e., has value 1 on its output line) at time t+1 if and only if the weighted sum of its inputs at time t reaches or passes θ:
      y(t+1) = 1  if and only if  Σ w_i x_i(t) ≥ θ

From Logical Neurons to Finite Automata
• With suitable weights and thresholds, McCulloch-Pitts neurons realize the basic logic gates:
  • AND: weights 1, 1 and threshold 1.5
  • OR: weights 1, 1 and threshold 0.5
  • NOT: weight -1 and threshold 0
• A Boolean net of such neurons (inputs X, Y, state Q) behaves as a finite automaton (Brains, Machines, and Mathematics, 2nd Edition, 1987); a sketch of the firing rule follows below.
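A tiny sketch of the McCulloch-Pitts firing rule from the previous slide, using the AND/OR/NOT weights and thresholds listed above:

```python
def mp_neuron(inputs, weights, theta):
    """Fire (output 1) iff the weighted sum of the 0/1 inputs reaches theta."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

# Gate weights and thresholds as given on the slide.
AND = lambda x, y: mp_neuron([x, y], [1, 1], 1.5)
OR  = lambda x, y: mp_neuron([x, y], [1, 1], 0.5)
NOT = lambda x:    mp_neuron([x], [-1], 0)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", AND(x, y), "OR:", OR(x, y))
print("NOT 0 ->", NOT(0), "  NOT 1 ->", NOT(1))
```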

Hopfield Networks
• A paper by John Hopfield in 1982 was the catalyst in attracting the attention of many physicists to "Neural Networks".
• In a network of McCulloch-Pitts neurons whose output is 1 iff Σ_j w_ij s_j ≥ θ_i and is otherwise 0, neurons are updated synchronously: every neuron processes its inputs at each time step to determine a new output.

Hopfield Networks
• A Hopfield net (Hopfield 1982) is a net of such units subject to the asynchronous rule for updating one neuron at a time:
  "Pick a unit i at random. If Σ_j w_ij s_j ≥ θ_i, turn it on. Otherwise turn it off."
• Moreover, Hopfield assumes symmetric weights: w_ij = w_ji (a small sketch follows below).
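A small numpy sketch of the asynchronous update rule above. The slide only specifies the update rule and the symmetric weights; the Hebbian-style weight construction used to store a pattern is an extra assumption for the demo:

```python
import numpy as np

def hopfield_step(w, s, theta, rng):
    """One asynchronous update: pick a unit i at random and turn it on
    iff sum_j w[i, j] * s[j] >= theta[i], otherwise turn it off."""
    i = rng.integers(len(s))
    s[i] = 1 if w[i] @ s >= theta[i] else 0
    return s

def store_pattern(p):
    """Hebbian-style symmetric weights for one binary pattern (assumed here)."""
    v = 2 * np.asarray(p) - 1            # map {0, 1} -> {-1, +1}
    w = np.outer(v, v).astype(float)
    np.fill_diagonal(w, 0.0)             # no self-connections
    return w                             # symmetric: w[i, j] == w[j, i]

rng = np.random.default_rng(0)
pattern = [1, 0, 1, 0]
w = store_pattern(pattern)
theta = np.zeros(len(pattern))
s = np.array([1, 1, 1, 0])               # noisy version of the stored pattern
for _ in range(20):
    s = hopfield_step(w, s, theta, rng)
print(s)                                 # settles back to [1 0 1 0] once unit 1 updates
```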

Multi-layer Perceptron Classifier


Classifiers
• 1-stage approach
• 2-stage approach

Example: face recognition
• Here, using the 2-stage approach:

SOM: Self-Organizing Map
• Provides a way of representing multidimensional data in much lower dimensional spaces, usually one or two dimensions.
• Vector quantization: reducing the dimensionality of vectors (a compact sketch follows below).
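A compact numpy sketch of a 1-D self-organizing map in the spirit of the description above; the map size, learning-rate and neighbourhood schedules are illustrative assumptions:

```python
import numpy as np

def train_som_1d(data, n_units=10, epochs=100, lr0=0.5, sigma0=2.0, seed=0):
    """Each map unit holds a weight (codebook) vector. For every sample, the
    best-matching unit and its neighbours on the 1-D map are pulled toward
    the sample, arranging the high-dimensional data along a line."""
    rng = np.random.default_rng(seed)
    weights = rng.random((n_units, data.shape[1]))
    positions = np.arange(n_units)                  # 1-D map coordinates
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)             # decaying learning rate
        sigma = max(sigma0 * (1 - epoch / epochs), 0.5)
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((positions - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights    # codebook vectors: a vector-quantized summary of the data
```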

Neuro-Fuzzy Systems
Neural Networks
Neural networks are good at recognizing patterns, but they are not good at explaining how they reach their decisions.
Fuzzy Systems
Fuzzy logic systems, which can reason with imprecise information, are good at explaining their decisions, but they cannot automatically acquire the rules they use to make those decisions.
Need an Intelligent Hybrid System!!