chapter-1and2gonzalezandwoods-160204072955.pdf
Uploaded by MahaboobPasha37, Aug 01, 2024 (114 slides)

About This Presentation

Introduction to digital image processing


Slide Content

Chapter-1 and 2
Subject: FIP (181102)
Prof. Asodariya Bhavesh
ECD, SSASIT, Surat

Digital Image Processing, 3rd edition, by
Gonzalez and Woods

Optics and Human Vision
The physics of light
http://commons.wikimedia.org/wiki/File:Eye-diagram_bg.svg

Light
Light consists of particles known as photons,
which act as 'waves'.
Two fundamental properties:
Amplitude
Wavelength
Frequency is the inverse of wavelength.
Relationship between wavelength (λ) and frequency (f): f = c / λ
where c = speed of light = 299,792,458 m/s
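The relation above is easy to check numerically; a minimal sketch (the helper name `frequency_hz` is ours, not from the slides):

```python
# Sketch of f = c / lambda for light; the helper name is hypothetical.
C = 299_792_458  # speed of light in m/s (from the slide)

def frequency_hz(wavelength_m):
    """Return the frequency of light with the given wavelength (meters)."""
    return C / wavelength_m

# Green light near 550 nm has a frequency of roughly 5.45e14 Hz.
f_green = frequency_hz(550e-9)
```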

What is Digital Image Processing?
Digital image processing focuses on two major tasks
Improvement of pictorial information for human
interpretation
Processing of image data for storage, transmission and
representation for autonomous machine perception
There is some argument about where image processing ends
and fields such as image analysis and computer vision
start

What is DIP? (cont…)
The continuum from image processing to computer
vision can be broken up into low-, mid- and high-level
processes
Low Level Process
Input: Image
Output: Image
Examples: Noise removal, image sharpening
Mid Level Process
Input: Image
Output: Attributes
Examples: Object recognition, segmentation
High Level Process
Input: Attributes
Output: Understanding
Examples: Scene understanding, autonomous navigation
In this course we will stop here

History of Digital Image Processing
Early 1920s: One of the first applications of digital
imaging was in the newspaper industry
The Bartlane cable picture
transmission service
Images were transferred by submarine cable between
London and New York
Pictures were coded for cable transfer and reconstructed
at the receiving end on a telegraph printer
Early digital image
Images taken from Gonzalez & Woods, Digital Image Processing (2002)

History of DIP (cont…)
Mid to late 1920s: Improvements to the Bartlane
system resulted in higher quality images
New reproduction
processes based
on photographic
techniques
Increased number
of tones in
reproduced images
Improved digital image
Early 15-tone digital image

History of DIP (cont…)
1960s: Improvements in computing technology and
the onset of the space race led to a surge of work in
digital image processing
1964: Computers used to
improve the quality of
images of the moon taken
by the Ranger 7 probe
Such techniques were used
in other space missions
including the Apollo landings
A picture of the moon taken
by the Ranger 7 probe
minutes before landing

History of DIP (cont…)
1970s: Digital image processing begins to be used in
medical applications
1979: Sir Godfrey N.
Hounsfield & Prof. Allan M.
Cormack share the Nobel
Prize in medicine for the
invention of tomography,
the technology behind
Computerised Axial
Tomography (CAT) scans
Typical head slice CAT
image

Key Stages in Digital Image Processing
Problem Domain
Image Acquisition
Image Enhancement
Image Restoration
Morphological Processing
Segmentation
Representation & Description
Object Recognition
Colour Image Processing
Image Compression

Key Stages in Digital Image Processing (cont…)
The deck then highlights each stage of this diagram in turn:
Image Acquisition, Image Enhancement, Image Restoration,
Morphological Processing, Segmentation, Object Recognition,
Representation & Description, Image Compression, and
Colour Image Processing.

Visible Spectrum
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

Light
Diagram of a light wave.

Conventional Coordinate for Image Representation

Digital Image Types : Intensity Image
Intensity image or monochrome image
each pixel corresponds to light intensity
normally represented in gray scale (gray
level).











[Figure: 4×4 grids of example gray-scale pixel values]

Digital Image Types : RGB Image
Color image or RGB image:
each pixel contains a vector
representing red, green and
blue components.
RGB components

Image Types : Binary Image
Binary image or black and white image
Each pixel contains one bit:
1 represents white
0 represents black

1 1 1 1
1 1 1 1
0 0 0 0
0 0 0 0
Binary data
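A binary image like the one above is typically produced by thresholding an intensity image; a minimal sketch assuming NumPy (the threshold of 128 and the sample values are our example, not from the slides):

```python
import numpy as np

# Threshold a small gray-scale patch into a binary image:
# 1 represents white, 0 represents black (as on the slide).
gray = np.array([[200, 210, 198, 205],
                 [190, 220, 215, 201],
                 [ 10,  20,  15,   5],
                 [  8,  12,   3,   9]], dtype=np.uint8)

binary = (gray > 128).astype(np.uint8)  # bright pixels -> 1, dark -> 0
```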

Image Types : Index Image
Index image
Each pixel contains an index number
pointing to a color in a color table

2 5 6
7 4 6
9 4 1
Index values

Color Table
Index No. | Red component | Green component | Blue component
1 | 0.1 | 0.5 | 0.3
2 | 1.0 | 0.0 | 0.0
3 | 0.0 | 1.0 | 0.0
4 | 0.5 | 0.5 | 0.5
5 | 0.2 | 0.8 | 0.9
… | … | … | …

Cross Section of the Human Eye

Human Eye

Anatomy of the Human Eye
Source:
http://webvision.med.utah.edu/

Human Visual System
Human vision
Cornea acts as a protective lens that roughly focuses
incoming light
Iris controls the amount of light that enters the eye
The lens sharply focuses incoming light onto the retina
It absorbs both infra-red and ultra-violet light which can damage
the lens
The retina is covered by photoreceptors (light
sensors) which measure light

Photoreceptors
Rods
Approximately 100-150 million rods
Non-uniform distribution across the retina
Sensitive to low light levels (scotopic vision)
Lower resolution
Cones
Approximately 6-7 million cones
Sensitive to higher light levels (photopic vision)
High resolution
Detect color by the use of 3 different kinds of cones, each of
which is sensitive to red, green, or blue frequencies
Red (L cone): 564-580 nm wavelengths (65% of all cones)
Green (M cone): 534-545 nm wavelengths (30% of all cones)
Blue (S cone): 420-440 nm wavelengths (5% of all cones)

Cone (LMS) and Rod (R) responses
http://en.wikipedia.org/wiki/File:Cone-response.svg

Photoreceptor density across retina

Comparison between rods and cones
Rods | Cones
Used for night vision | Used for day vision
Loss causes night blindness | Loss causes legal blindness
Low spatial resolution with higher noise | High spatial resolution with lower noise
Not present in fovea | Concentrated in fovea
Slower time response to light | Quicker time response to light
One type of photosensitive pigment | Three types of photosensitive pigment
Emphasis on motion detection | Emphasis on detecting fine detail

Color and Human Perception
Chromatic light
has a color component
Achromatic light
has no color component
has only one property: intensity

Image Formation in the Human Eye
(Picture from Microsoft Encarta 2000)

Brightness Adaptation
Actual light intensity is (basically)
log-compressed for perception.
Human vision can see light
between the glare limit and
scotopic threshold, but not all
levels at the same time.
The eye adjusts to an average
value (the red dot) and can
simultaneously see all light in a
smaller range surrounding the
adaptation level.
Light appears black at the bottom
of the instantaneous range and
white at the top of that range.

Weber Ratio ∆I/I


Human Visual Perception
Light intensity:
The lowest (darkest) perceptible intensity is the scotopic
threshold
The highest (brightest) perceptible intensity is the glare limit
The difference between these two levels is on the order of 10^10
We can't discriminate all these intensities at the same time! We
adjust to an average value of light intensities and then discriminate
around the average.
Log compression:
Experimental results show that the perceived amount of light and
the actual amount of light in a scene are generally related
logarithmically.
The human visual system perceives brightness as the logarithm of the
actual light intensity and interprets the image accordingly.
Consider, for example, a bright light source that is approximately
6 times brighter than another. The eye will perceive the brighter light
as approximately twice the brightness of the darker.

Brightness Adaptation and Mach Banding
When viewing any scene:
The eye rapidly scans across the field of view while
coming to momentary rest at each point of particular
interest.
At each of these points the eye adapts to the average
brightness of the local region surrounding the point of
interest.
This phenomenon is known as local brightness
adaptation.
Mach banding is a visual effect that results, in part, from local
brightness adaptation.
The eye overshoots/undershoots at edges where the
brightness changes rapidly. This causes 'false perception' of
the intensities.
Examples follow….

Brightness Adaptation and Mach Banding

Brightness Adaptation (Hermann Grid)

Optical illusion

Simultaneous Contrast
Simultaneous contrast refers to the way in which two
adjacent intensities (or colors) affect each other.
Example: Note that a blank sheet of paper may appear
white when placed on a desktop but may appear black
when used to shield the eyes against the sun.
Figure 2.9 is a common way of illustrating that the
perceived intensity of a region is dependent upon the
contrast of the region with its local background.
The four inner squares are of identical intensity but are
contextualized by the four surrounding squares
The perceived intensity of the inner squares varies from bright
on the left to dark on the right.

Simultaneous Contrast

Image Sensing and Acquisition
Single sensor
Line sensor
Array sensor

Image Sensors : Single Sensor

Image Sensors : Line Sensor
Fingerprint sweep sensor
Computerized Axial Tomography

Image Sensors : Array Sensor
Charge-Coupled Device (CCD)
Used to convert a continuous
image into a digital image
Contains an array of light sensors
Converts photons into electric charges
accumulated in each sensor unit
CCD KAF-3200E from Kodak
(2184 x 1472 pixels,
pixel size 6.8 × 6.8 microns)

Image Sensor: Inside Charge-Coupled Device
[Figure: photosites feed vertical transport registers (with gates); a horizontal transport register passes charge through the output gate to the amplifier, which produces the output]

Image Sensor: How CCD works
[Figure: image pixels (a–i) are shifted vertically, row by row, into the horizontal transport register, then shifted horizontally to the output]

Digital Image Acquisition Process

Generating a Digital Image

Image Sampling and Quantization
Image sampling: discretize an image in the spatial domain
Spatial resolution / image resolution: pixel size or number of pixels

How to choose the spatial resolution
= Sampling locations
Original image
Sampled image
With undersampling, we lose some image detail!
Spatial resolution

How to choose the spatial resolution : Nyquist Rate
Original image
= Sampling locations
Minimum
Period
Spatial resolution
(sampling rate)
Sampled image
No detail is lost!
Nyquist Rate:
The sampling period must be less than or equal to
half of the minimum period of the image,
or the sampling frequency must be greater than or
equal to twice the maximum frequency.
(Here the minimum period is 2 mm and the sampling period is 1 mm.)

x1(t) = sin(2πt),  f = 1
x2(t) = sin(12πt), f = 6
Sampling rate:
5 samples/sec
Aliased Frequency
Two different frequencies but the same sampled results!
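This aliasing example can be reproduced numerically; a sketch assuming NumPy, sampling both sinusoids at 5 samples/sec:

```python
import numpy as np

# x1(t) = sin(2*pi*t) is a 1 Hz sinusoid; x2(t) = sin(12*pi*t) is 6 Hz.
# Sampled at 5 samples/sec (below the Nyquist rate for 6 Hz),
# the 6 Hz signal aliases onto the 1 Hz signal: 6 - 5 = 1 Hz.
fs = 5                        # sampling rate, samples/sec
t = np.arange(0, 2, 1 / fs)   # sampling instants over 2 seconds

x1 = np.sin(2 * np.pi * 1 * t)
x2 = np.sin(2 * np.pi * 6 * t)
# x1 and x2 produce identical sample sequences.
```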

Effect of Spatial Resolution
256x256 pixels
64x64 pixels
128x128 pixels
32x32 pixels

Spatial Resolution
It is a measure of the smallest discernible detail in an
image
Can be stated in line pairs per unit distance, and
dots(pixels) per unit distance
Dots per unit distance commonly used in printing
and publishing industry (dots per inch)
Newspapers are printed with a resolution of 75 dpi,
magazines at 133 dpi, and glossy brochures at 175
dpi
Examples follow…
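The dpi figures above translate directly into pixel counts; a small sketch (the 6 × 4 inch print size and the helper name are our example, not from the slides):

```python
# Pixel dimensions required to print at a given resolution in
# dots (pixels) per inch.  Helper name is hypothetical.
def pixels_needed(width_in, height_in, dpi):
    """Return (width, height) in pixels for a print of the given size."""
    return round(width_in * dpi), round(height_in * dpi)

# A 6 x 4 inch photo at newspaper (75 dpi) vs magazine (133 dpi) quality:
newsprint = pixels_needed(6, 4, 75)    # (450, 300)
magazine = pixels_needed(6, 4, 133)    # (798, 532)
```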

Effect of Spatial Resolution

Effect of Spatial Resolution

Can we increase spatial resolution by interpolation ?
Downsampling is an irreversible process.

Image Quantization
Image quantization:
discretize continuous pixel values into discrete numbers
Color resolution / color depth / levels:
- No. of colors or gray levels, or
- No. of bits representing each pixel value
- The number of colors or gray levels Nc is given by Nc = 2^b,
where b = no. of bits
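The relation Nc = 2^b is easy to tabulate; a minimal sketch:

```python
# Number of gray levels representable with b bits per pixel: Nc = 2**b.
def gray_levels(b):
    return 2 ** b

# 1 bit -> 2 levels (binary image); 8 bits -> 256 levels (typical
# monochrome image); 24 bits -> ~16.7 million colors (RGB image).
```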

Quantization function
[Figure: staircase quantization function mapping light intensity (darkest → brightest) to quantization levels 0, 1, 2, …, Nc−2, Nc−1]

Intensity Resolution
It refers to the smallest discernible change in
intensity level
The number of intensity levels is usually an integer
power of two
The number of bits used to quantize intensity is also
referred to as the intensity resolution
Which intensity resolution is good for human
perception: 8-bit, 16-bit, or 32-bit?

Effect of Quantization Levels or Intensity resolution
256 levels 128 levels
32 levels  64 levels

Effect of Quantization Levels (cont.)
16 levels 8 levels
2 levels  4 levels
In this image,
it is easy to see
false contours.

How to select the suitable size and pixel depth of images
Low detail image Medium detail image High detail image
Lena image Cameraman image
To satisfy the human viewer:
1. For images of the same size, the low detail image may need more pixel depth.
2. As the image size increases, fewer gray levels may be needed.
The word "suitable" is subjective: it depends on the "subject".

Isopreference Curve
Curves tend to become more vertical as the detail in
the image increases
For an image with a large amount of detail, only a few
intensity levels may be needed

Image Interpolation
Used in image resizing (zooming and shrinking),
rotating, and geometric corrections
Interpolation is the process of using known data to
estimate values at unknown locations
Nearest Neighbor Interpolation
It assigns to each new location the intensity of its nearest
neighbor in the original image
Produces undesirable artifacts, such as severe distortion of
straight edges
Bilinear Interpolation
We use the four nearest neighbors to estimate the
intensity:
V(x, y) = ax + by + cxy + d

Image Interpolation
Need to solve four equations
Better results than nearest neighbor interpolation, with a
modest increase in computational burden
Bicubic Interpolation
Involves sixteen neighbors to estimate the intensity:
V(x, y) = Σi Σj aij x^i y^j   (i, j = 0 to 3)
Need to solve sixteen equations
Gives better results than the other methods, but is
more complex
Used in Adobe Photoshop and Corel Photo-Paint
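The two simpler schemes above can be sketched as follows, assuming NumPy and `img[y, x]` indexing (the function names are ours; edge handling is simplified by clamping):

```python
import numpy as np

def nearest(img, x, y):
    """Nearest-neighbor interpolation: take the closest original pixel."""
    return img[int(round(y)), int(round(x))]

def bilinear(img, x, y):
    """Bilinear interpolation from the four nearest neighbors,
    equivalent to fitting v(x, y) = a*x + b*y + c*x*y + d locally."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)   # clamp at the image border
    y1 = min(y0 + 1, img.shape[0] - 1)
    dx, dy = x - x0, y - y0
    top = img[y0, x0] * (1 - dx) + img[y0, x1] * dx
    bot = img[y1, x0] * (1 - dx) + img[y1, x1] * dx
    return top * (1 - dy) + bot * dy

img = np.array([[0.0, 10.0],
                [20.0, 30.0]])
center = bilinear(img, 0.5, 0.5)   # average of all four pixels
```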

Basic Relationship of Pixels
x
y
(0,0)
Conventional indexing method
(x-1,y-1)  (x,y-1)  (x+1,y-1)
(x-1,y)    (x,y)    (x+1,y)
(x-1,y+1)  (x,y+1)  (x+1,y+1)

Neighbors of a Pixel
           (x,y-1)
(x-1,y)    p       (x+1,y)
           (x,y+1)
4-neighbors of p:
N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }
The neighborhood relation is used to tell which pixels are adjacent;
it is useful for analyzing regions.
Note: q ∈ N4(p) implies p ∈ N4(q)
The 4-neighborhood relation considers only vertical and
horizontal neighbors.

Neighbors of a Pixel (cont.)
(x-1,y-1)  (x,y-1)  (x+1,y-1)
(x-1,y)    p        (x+1,y)
(x-1,y+1)  (x,y+1)  (x+1,y+1)
8-neighbors of p:
N8(p) = { (x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1) }
The 8-neighborhood relation considers all neighboring pixels.

Neighbors of a Pixel (cont.)
(x-1,y-1)       (x+1,y-1)
           p
(x-1,y+1)       (x+1,y+1)
Diagonal neighbors of p:
ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }
The diagonal-neighborhood relation considers only diagonal
neighbors.
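The three neighborhoods can be written down directly; a minimal sketch with pixels as (x, y) tuples (bounds checking against the image size is omitted):

```python
def n4(p):
    """4-neighbors of p: vertical and horizontal neighbors only."""
    x, y = p
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x - 1, y - 1), (x + 1, y - 1), (x - 1, y + 1), (x + 1, y + 1)}

def n8(p):
    """8-neighbors of p: union of 4-neighbors and diagonal neighbors."""
    return n4(p) | nd(p)
```

Note that the symmetry property from the slides holds by construction: q ∈ N4(p) implies p ∈ N4(q).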

Connectivity
Connectivity is adapted from the neighborhood relation. Two pixels are connected
if they are in the same class (i.e. the same color or the same range of intensity)
and they are neighbors of one another.
For p and q from the same class:
4-connectivity: p and q are 4-connected if q ∈ N4(p)
8-connectivity: p and q are 8-connected if q ∈ N8(p)
mixed-connectivity (m-connectivity):
p and q are m-connected if q ∈ N4(p), or
q ∈ ND(p) and N4(p) ∩ N4(q) = ∅

Adjacency
A pixel p is adjacent to pixel q if they are connected.
Two image subsets S1 and S2 are adjacent if some pixel
in S1 is adjacent to some pixel in S2.
[Figure: two adjacent regions S1 and S2]
We can define type of adjacency: 4-adjacency, 8-adjacency
or m-adjacency depending on type of connectivity.

Path
A path from pixel p at (x,y) to pixel q at (s,t) is a sequence
of distinct pixels:
(x0,y0), (x1,y1), (x2,y2), …, (xn,yn)
such that
(x0,y0) = (x,y) and (xn,yn) = (s,t)
and
(xi,yi) is adjacent to (xi-1,yi-1), for i = 1,…,n
We can define type of path: 4-path, 8-path or m-path
depending on type of adjacency.
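Whether such a path exists between two pixels of the same class can be checked with a breadth-first search; a sketch using 4-adjacency (the set-of-pixels representation and the function name are ours):

```python
from collections import deque

def has_4path(pixels, p, q):
    """True if a sequence of 4-adjacent pixels within `pixels`
    (a set of (x, y) tuples of one class) connects p to q."""
    if p not in pixels or q not in pixels:
        return False
    seen, frontier = {p}, deque([p])
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == q:
            return True
        for nb in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if nb in pixels and nb not in seen:
                seen.add(nb)
                frontier.append(nb)
    return False
```

Replacing the four offsets with all eight neighbors would test for an 8-path instead.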

Path (cont.)
[Figure: the same pixel configuration from p to q, traced three ways]
An 8-path from p to q
results in some ambiguity
An m-path from p to q
resolves this ambiguity
8-path          m-path

Distance
For pixels p, q, and z with coordinates (x,y), (s,t) and (u,v),
D is a distance function or metric if
D(p,q) ≥ 0   (D(p,q) = 0 if and only if p = q)
D(p,q) = D(q,p)
D(p,z) ≤ D(p,q) + D(q,z)
Example: Euclidean distance
De(p,q) = √((x−s)² + (y−t)²)

Distance (cont.)
The D4 distance (city-block distance) is defined as
D4(p,q) = |x−s| + |y−t|
Pixels with D4 ≤ 2 from the center pixel p:
        2
    2   1   2
2   1   0   1   2
    2   1   2
        2
Pixels with D4 = 1 are the 4-neighbors of p.

Distance (cont.)
The D8 distance (chessboard distance) is defined as
D8(p,q) = max(|x−s|, |y−t|)
Pixels with D8 ≤ 2 from the center pixel p:
2   2   2   2   2
2   1   1   1   2
2   1   0   1   2
2   1   1   1   2
2   2   2   2   2
Pixels with D8 = 1 are the 8-neighbors of p.
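The three distance measures follow directly from their definitions; a minimal sketch (function names are ours):

```python
import math

def d4(p, q):
    """City-block distance: |x - s| + |y - t|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: max(|x - s|, |y - t|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def de(p, q):
    """Euclidean distance: sqrt((x - s)**2 + (y - t)**2)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# For p = (0,0) and q = (1,2): D4 = 3, D8 = 2, De = sqrt(5).
```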

Boundary (Border or Contour)
The boundary of a region R is the set of points that are adjacent to
points in the complement of R.
Equivalently, it is the set of pixels in the region that have
at least one background neighbor.
Inner Border
Outer Border

Moiré Pattern Effect : Special Case of Sampling
Moiré patterns occur when the frequencies of two superimposed
periodic patterns are close to each other.

Human vision: Spatial Frequency vs Contrast

Human vision: Ability to distinguish differences in brightness
Regions with 5% brightness difference