Lect 01: Digital Image Processing, Part 1

Uploaded by LelisaAH, Mar 11, 2025 (118 slides)

About This Presentation

Chapter 1


Slide Content

CS589-04 Digital Image Processing
Lecture 1
Introduction & Fundamentals
Spring 2008

Weeks 1 & 2 2
Introduction to the course
► Article Reading and Project
Medical image analysis (MRI/PET/CT/X-ray tumor
detection/classification)
Face, fingerprint, and other object recognition
Image and/or video compression
Image segmentation and/or denoising
Digital image/video watermarking/steganography
and detection
Whatever you’re interested in …

Weeks 1 & 2 3
Introduction to the course
► Evaluation of article reading and project
Report
Article reading
— Submit a survey of the articles you read and the list of the
articles
Project
— Submit an article including introduction, methods,
experiments, results, and conclusions
— Submit the project code, the readme document, and some
testing samples (images, videos, etc.) for validation
Presentation

Weeks 1 & 2 4
Journals & Conferences
in Image Processing
► Journals:
— IEEE T IMAGE PROCESSING
— IEEE T MEDICAL IMAGING
— INTL J COMP. VISION
— IEEE T PATTERN ANALYSIS MACHINE INTELLIGENCE
— PATTERN RECOGNITION
— COMP. VISION AND IMAGE UNDERSTANDING
— IMAGE AND VISION COMPUTING
… …
► Conferences:
— CVPR: Comp. Vision and Pattern Recognition
— ICCV: Intl Conf on Computer Vision
— ACM Multimedia
— ICIP
— SPIE
— ECCV: European Conf on Computer Vision
— CAIP: Intl Conf on Comp. Analysis of Images and Patterns
… …

►What is an image?
►A digital image can be considered a discrete representation of data possessing both spatial (layout) and intensity (color) information.
►It is often useful to treat an image as a multidimensional signal.
Weeks 1 & 2 5

Weeks 1 & 2 6
Introduction
► What is Digital Image Processing?

Digital Image
— a two-dimensional function f(x, y)
x and y are spatial coordinates
The amplitude of f is called the intensity or gray level at the point (x, y)
Digital Image Processing
— processing digital images by means of a computer; it covers low-, mid-, and high-level processes
low-level: inputs and outputs are images
mid-level: outputs are attributes extracted from input images
high-level: recognition of an ensemble of individual objects
Pixel
— an element of a digital image

Weeks 1 & 2 7
Origins of Digital Image
Processing


A picture sent by submarine cable between London and New York: the transmission time was reduced from more than a week to less than three hours

Weeks 1 & 2 8
Origins of Digital Image
Processing

Weeks 1 & 2 9
Sources for Images
► Electromagnetic (EM) energy spectrum
► Acoustic
► Ultrasonic
► Electronic
► Synthetic images produced by computer

Weeks 1 & 2 10
Electromagnetic (EM) energy spectrum
Major uses
Gamma-ray imaging: nuclear medicine and astronomical observations
X-rays: medical diagnostics, industry, and astronomy, etc.
Ultraviolet: lithography, industrial inspection, microscopy, lasers, biological imaging,
and astronomical observations
Visible and infrared bands: light microscopy, astronomy, remote sensing, industry,
and law enforcement
Microwave band: radar
Radio band: medicine (such as MRI) and astronomy

Weeks 1 & 2 11
Examples: Gamma-Ray Imaging

Weeks 1 & 2 12
Examples: X-Ray Imaging

Weeks 1 & 2 13
Examples: Ultraviolet Imaging

Weeks 1 & 2 14
Examples: Light Microscopy Imaging

Weeks 1 & 2 15
Examples: Visual and Infrared Imaging

Weeks 1 & 2 16
Examples: Visual and Infrared Imaging

Weeks 1 & 2 17
Examples: Infrared Satellite Imaging

(Figure panels: USA 1993, USA 2003)

Weeks 1 & 2 18
Examples: Infrared Satellite Imaging

Weeks 1 & 2 19
Examples: Automated Visual Inspection

Weeks 1 & 2 20
Examples: Automated Visual Inspection


The area in which
the imaging
system detected
the plate
Results of
automated
reading of the
plate content
by the system

Weeks 1 & 2 21
Example of Radar Image

Weeks 1 & 2 22
Examples: MRI (Radio Band)

Weeks 1 & 2 23
Examples: Ultrasound Imaging

Weeks 1 & 2 24
Fundamental Steps in DIP


(Figure: block diagram of the fundamental steps; recoverable captions:)
— Result is more suitable than the original; improving the appearance
— Extracting image components; partitioning an image into its constituent parts or objects
— Representing an image for computer processing

Weeks 1 & 2 25
Light and EM Spectrum

E = hν = hc/λ, h: Planck's constant.

Weeks 1 & 2 26
Light and EM Spectrum
►The colors that humans perceive in an object are determined by the nature of the light reflected from the object.

e.g., green objects reflect light with wavelengths primarily in the 500 to 570 nm range while absorbing most of the energy at other wavelengths

Weeks 1 & 2 27
Light and EM Spectrum
►Monochromatic light: void of color
Intensity is its only attribute, ranging from black to white
Monochromatic images are referred to as gray-scale images
►Chromatic light bands: 0.43 to 0.79 μm
The quality of a chromatic light source is described by:
Radiance: total amount of energy
Luminance (lm): the amount of energy an observer perceives from a light source
Brightness: a subjective descriptor of light perception that is practically impossible to measure. It embodies the achromatic notion of intensity and is one of the key factors in describing color sensation.

Weeks 1 & 2 28
Image Acquisition
Transform
illumination
energy into
digital images

Weeks 1 & 2 29
Image Acquisition Using a Single Sensor

Weeks 1 & 2 30
Image Acquisition Using Sensor Strips

Weeks 1 & 2 31
Image Acquisition Process

Weeks 1 & 2 32
A Simple Image Formation Model

f(x, y) = i(x, y) r(x, y)

f(x, y): intensity at the point (x, y)
i(x, y): illumination at the point (x, y)
(the amount of source illumination incident on the scene)
r(x, y): reflectance/transmissivity at the point (x, y)
(the amount of illumination reflected/transmitted by the object)

where 0 < i(x, y) < ∞ and 0 < r(x, y) < 1
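As a quick sanity check of the formation model, a minimal sketch (the illumination value and reflectance range are my own illustrative numbers, not from the slides) that builds an image as the pointwise product of an illumination field and a reflectance field:

```python
import numpy as np

# Synthesize f(x, y) = i(x, y) * r(x, y) with 0 < i < inf and 0 < r < 1.
M, N = 4, 4
i = np.full((M, N), 800.0)  # uniform illumination (arbitrary units, my choice)
# Reflectances spanning roughly black velvet (0.01) to snow (0.93):
r = np.linspace(0.01, 0.93, M * N).reshape(M, N)
f = i * r                   # intensity at each point
print(f.shape, f.min() > 0)
```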

Weeks 1 & 2 33
Some Typical Ranges of Illumination
► Illumination
Lumen — A unit of light flow or luminous flux
Lumen per square meter (lm/m²) — The metric unit of measure for illuminance of a surface
On a clear day, the sun may produce in excess of 90,000 lm/m² of illumination on the surface of the Earth
On a cloudy day, the sun may produce less than 10,000 lm/m² of illumination on the surface of the Earth
On a clear evening, the moon yields about 0.1 lm/m² of illumination
The typical illumination level in a commercial office is about 1,000 lm/m²

Weeks 1 & 2 34
Some Typical Ranges of Reflectance
► Reflectance
0.01 for black velvet
0.65 for stainless steel
0.80 for flat-white wall paint
0.90 for silver-plated metal
0.93 for snow

Weeks 1 & 2 35
Image Sampling and Quantization
Digitizing the
coordinate
values
Digitizing the
amplitude
values

Weeks 1 & 2 36
Image Sampling and Quantization

Weeks 1 & 2 37
Representing Digital Images

Weeks 1 & 2 38
Representing Digital Images
►The representation of an M×N numerical array as

            [ f(0,0)     f(0,1)     ...  f(0,N-1)   ]
f(x, y) =   [ f(1,0)     f(1,1)     ...  f(1,N-1)   ]
            [ ...        ...        ...  ...        ]
            [ f(M-1,0)   f(M-1,1)   ...  f(M-1,N-1) ]
Weeks 1 & 2 39
Representing Digital Images
►The representation of an M×N numerical array as

      [ a(0,0)     a(0,1)     ...  a(0,N-1)   ]
A =   [ a(1,0)     a(1,1)     ...  a(1,N-1)   ]
      [ ...        ...        ...  ...        ]
      [ a(M-1,0)   a(M-1,1)   ...  a(M-1,N-1) ]

Weeks 1 & 2 40
Representing Digital Images
►The representation of an M×N numerical array in MATLAB

            [ f(1,1)   f(1,2)   ...  f(1,N) ]
f(x, y) =   [ f(2,1)   f(2,2)   ...  f(2,N) ]
            [ ...      ...      ...  ...    ]
            [ f(M,1)   f(M,2)   ...  f(M,N) ]

Weeks 1 & 2 41
Representing Digital Images
► Discrete intensity interval [0, L-1], L = 2^k
► The number b of bits required to store an M × N digitized image:
b = M × N × k
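A quick worked instance of the storage formula (the 1024×1024, 8-bit figures are my example, not from the slides):

```python
# b = M * N * k for a 1024x1024 image with k = 8 bits per pixel.
M, N, k = 1024, 1024, 8
L = 2 ** k         # number of intensity levels
b = M * N * k      # total bits
print(L, b, b // 8)  # 256 levels, 8,388,608 bits = 1,048,576 bytes
```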

Weeks 1 & 2 42
Representing Digital Images

Weeks 1 & 2 43
Spatial and Intensity Resolution
►Spatial resolution
— A measure of the smallest discernible detail in an
image
— stated with line pairs per unit distance, dots (pixels) per
unit distance, dots per inch (dpi)
►Intensity resolution
— The smallest discernible change in intensity level
— stated with 8 bits, 12 bits, 16 bits, etc.

Weeks 1 & 2 44
Spatial and Intensity Resolution

Weeks 1 & 2 45
Spatial and Intensity Resolution

Weeks 1 & 2 46
Spatial and Intensity Resolution

Weeks 1 & 2 47
Image Interpolation
►Interpolation — Process of using known data
to estimate unknown values
e.g., zooming, shrinking, rotating, and geometric correction
►Interpolation (sometimes called resampling)
— an imaging method to increase (or decrease) the
number of pixels in a digital image.
Some digital cameras use interpolation to produce a larger image
than the sensor captured or to create digital zoom
http://www.dpreview.com/learn/?/key=interpolation

Weeks 1 & 2 48
Image Interpolation:
Nearest Neighbor Interpolation

f1(x2, y2) = f(round(x2), round(y2)) = f(x1, y1)

f1(x3, y3) = f(round(x3), round(y3)) = f(x1, y1)
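The nearest-neighbor rule can be sketched as a small helper (the function name and test image are mine, for illustration):

```python
import numpy as np

# Nearest-neighbor interpolation: each output sample takes the value of the
# closest input pixel, f1(x2, y2) = f(round(x2), round(y2)).
def nearest_neighbor(f, x, y):
    return f[int(round(x)), int(round(y))]

f = np.array([[10, 20],
              [30, 40]])
print(nearest_neighbor(f, 0.4, 0.4))  # closest pixel is (0, 0) -> 10
print(nearest_neighbor(f, 0.6, 0.9))  # closest pixel is (1, 1) -> 40
```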

Weeks 1 & 2 49
Image Interpolation:
Bilinear Interpolation

f(x, y) = (1-a)(1-b) f(l, k) + a(1-b) f(l+1, k)
        + (1-a)b f(l, k+1) + ab f(l+1, k+1)

where l = floor(x), k = floor(y), a = x - l, b = y - k.
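A direct sketch of the bilinear formula (helper name and test image are mine):

```python
import math
import numpy as np

# f(x,y) = (1-a)(1-b) f(l,k) + a(1-b) f(l+1,k) + (1-a)b f(l,k+1) + ab f(l+1,k+1)
# with l = floor(x), k = floor(y), a = x - l, b = y - k.
def bilinear(f, x, y):
    l, k = math.floor(x), math.floor(y)
    a, b = x - l, y - k
    return ((1 - a) * (1 - b) * f[l, k] + a * (1 - b) * f[l + 1, k]
            + (1 - a) * b * f[l, k + 1] + a * b * f[l + 1, k + 1])

f = np.array([[0.0, 10.0],
              [20.0, 30.0]])
print(bilinear(f, 0.5, 0.5))  # average of the four neighbors -> 15.0
```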

Weeks 1 & 2 50
Image Interpolation:
Bicubic Interpolation

►The intensity value assigned to point (x, y) is obtained by the following equation:

f(x, y) = Σ_{i=0}^{3} Σ_{j=0}^{3} a_ij x^i y^j

►The sixteen coefficients are determined by using the sixteen nearest neighbors.
http://en.wikipedia.org/wiki/Bicubic_interpolation

Weeks 1 & 2 51
Examples: Interpolation

Weeks 1 & 2 52
Examples: Interpolation

Weeks 1 & 2 53
Examples: Interpolation

Weeks 1 & 2 54
Examples: Interpolation

Weeks 1 & 2 55
Examples: Interpolation

Weeks 1 & 2 56
Examples: Interpolation

Weeks 1 & 2 57
Examples: Interpolation

Weeks 1 & 2 58
Examples: Interpolation

Weeks 1 & 2 59
Basic Relationships Between Pixels
►Neighborhood
►Adjacency
►Connectivity
►Paths
►Regions and boundaries

Weeks 1 & 2 60
Basic Relationships Between Pixels
►Neighbors of a pixel p at coordinates (x, y)

4-neighbors of p, denoted by N4(p):
(x-1, y), (x+1, y), (x, y-1), and (x, y+1).

4 diagonal neighbors of p, denoted by ND(p):
(x-1, y-1), (x+1, y+1), (x+1, y-1), and (x-1, y+1).

8 neighbors of p, denoted by N8(p):
N8(p) = N4(p) ∪ ND(p)
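The three neighbor sets can be sketched directly (function names are mine):

```python
# N4(p): the four horizontal/vertical neighbors of (x, y).
def n4(x, y):
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

# ND(p): the four diagonal neighbors of (x, y).
def nd(x, y):
    return {(x - 1, y - 1), (x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1)}

# N8(p) = N4(p) U ND(p).
def n8(x, y):
    return n4(x, y) | nd(x, y)

print(len(n8(5, 5)))  # 8 neighbors
```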

Weeks 1 & 2 61
Basic Relationships Between Pixels
►Adjacency
Let V be the set of intensity values

4-adjacency: Two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).

8-adjacency: Two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).

Weeks 1 & 2 62
Basic Relationships Between Pixels
►Adjacency
Let V be the set of intensity values

m-adjacency: Two pixels p and q with values from V are m-adjacent if

(i) q is in the set N4(p), or
(ii) q is in the set ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are from V.

Weeks 1 & 2 63
Basic Relationships Between Pixels
►Path

A (digital) path (or curve) from pixel p with coordinates (x0, y0) to pixel q with coordinates (xn, yn) is a sequence of distinct pixels with coordinates

(x0, y0), (x1, y1), …, (xn, yn)

where (xi, yi) and (xi-1, yi-1) are adjacent for 1 ≤ i ≤ n.
Here n is the length of the path.

If (x0, y0) = (xn, yn), the path is a closed path.
We can define 4-, 8-, and m-paths based on the type of adjacency used.

Weeks 1 & 2 64
Examples: Adjacency and Path
0 1 1
0 2 0
0 0 1
V = {1, 2}

Weeks 1 & 2 65
Examples: Adjacency and Path
0 1 1
0 2 0
0 0 1
V = {1, 2}
8-adjacent

Weeks 1 & 2 66
Examples: Adjacency and Path
0 1 1
0 2 0
0 0 1
V = {1, 2}
8-adjacent   m-adjacent

Weeks 1 & 2 67
Examples: Adjacency and Path
(pixel values with their (row, column) coordinates)
0 (1,1)   1 (1,2)   1 (1,3)
0 (2,1)   2 (2,2)   0 (2,3)
0 (3,1)   0 (3,2)   1 (3,3)
V = {1, 2}
8-adjacent   m-adjacent
The 8-paths from (1,3) to (3,3):
(i) (1,3), (1,2), (2,2), (3,3)
(ii) (1,3), (2,2), (3,3)
The m-path from (1,3) to (3,3):
(1,3), (1,2), (2,2), (3,3)

Weeks 1 & 2 68
Basic Relationships Between Pixels
►Connected in S
Let S represent a subset of pixels in an image. Two pixels p with coordinates (x0, y0) and q with coordinates (xn, yn) are said to be connected in S if there exists a path

(x0, y0), (x1, y1), …, (xn, yn)

where (xi, yi) ∈ S for all 0 ≤ i ≤ n.

Weeks 1 & 2 69
Basic Relationships Between Pixels
Let S represent a subset of pixels in an image
►For every pixel p in S, the set of pixels in S that are connected to p is called a connected component of S.
►If S has only one connected component, then S is called a connected set.
►We call R a region of the image if R is a connected set.
►Two regions, Ri and Rj, are said to be adjacent if their union forms a connected set.
►Regions that are not adjacent are said to be disjoint.

Weeks 1 & 2 70
Basic Relationships Between Pixels
►Boundary (or border)
The boundary of the region R is the set of pixels in the region that have one or more neighbors that are not in R.
If R happens to be an entire image, then its boundary is defined as the set of pixels in the first and last rows and columns of the image.
►Foreground and background
An image contains K disjoint regions, Rk, k = 1, 2, …, K. Let Ru denote the union of all the K regions, and let (Ru)^c denote its complement.

All the points in Ru are called the foreground;
All the points in (Ru)^c are called the background.

Weeks 1 & 2 71
Question 1
►In the following arrangement of pixels, are the two
regions (of 1s) adjacent? (if 8-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
Region 1
Region 2

Weeks 1 & 2 72
Question 2
►In the following arrangement of pixels, are the two
parts (of 1s) adjacent? (if 4-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
Part 1
Part 2

Weeks 1 & 2 73
►In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
Region 1
Region 2

Weeks 1 & 2 74
►In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
foreground
background

Weeks 1 & 2 75
Question 3
►In the following arrangement of pixels, the circled
point is part of the boundary of the 1-valued pixels if
8-adjacency is used, true or false?
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0

Weeks 1 & 2 76
Question 4
►In the following arrangement of pixels, the circled
point is part of the boundary of the 1-valued pixels if
4-adjacency is used, true or false?
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0

Weeks 1 & 2 77
Distance Measures
►Given pixels p, q and z with coordinates (x, y), (s, t), (u, v) respectively, the distance function D has the following properties:
a. D(p, q) ≥ 0 [D(p, q) = 0 iff p = q]
b. D(p, q) = D(q, p)
c. D(p, z) ≤ D(p, q) + D(q, z)

Weeks 1 & 2 78
Distance Measures
The following are the different distance measures:
a. Euclidean Distance:
De(p, q) = [(x-s)^2 + (y-t)^2]^(1/2)
b. City-Block Distance:
D4(p, q) = |x-s| + |y-t|
c. Chessboard Distance:
D8(p, q) = max(|x-s|, |y-t|)
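The three distance measures can be sketched as one-liners (function names are mine):

```python
import math

def d_e(p, q):   # Euclidean distance
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d_4(p, q):   # city-block distance
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d_8(p, q):   # chessboard distance
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(d_e(p, q), d_4(p, q), d_8(p, q))  # 5.0 7 4
```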

Weeks 1 & 2 79
Question 5
►In the following arrangement of pixels, what’s the
value of the chessboard distance between the
circled two points?
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0

Weeks 1 & 2 80
Question 6
►In the following arrangement of pixels, what’s the
value of the city-block distance between the circled
two points?
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0

Weeks 1 & 2 81
Question 7
►In the following arrangement of pixels, what’s the
value of the length of the m-path between the
circled two points?
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0

Weeks 1 & 2 82
Question 8
►In the following arrangement of pixels, what’s the
value of the length of the m-path between the
circled two points?
0 0 0 0 0
0 0 1 1 0
0 0 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0

Weeks 1 & 2 83
Introduction to Mathematical Operations in DIP
►Array vs. Matrix Operation

A = [ a11  a12 ]        B = [ b11  b12 ]
    [ a21  a22 ]            [ b21  b22 ]

Matrix product (matrix product operator *):

A*B = [ a11·b11 + a12·b21    a11·b12 + a12·b22 ]
      [ a21·b11 + a22·b21    a21·b12 + a22·b22 ]

Array product (array product operator .*):

A.*B = [ a11·b11    a12·b12 ]
       [ a21·b21    a22·b22 ]
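The same distinction in NumPy, where `@` is the matrix product and `*` is the element-wise (array) product (the sample matrices are mine):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])
print(A @ B)  # matrix product:  [[19 22], [43 50]]
print(A * B)  # array (element-wise) product: [[ 5 12], [21 32]]
```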

Weeks 1 & 2 84
Introduction to Mathematical Operations in DIP
►Linear vs. Nonlinear Operation

H[f(x, y)] = g(x, y)

H is said to be a linear operator if it satisfies additivity and homogeneity:

H[ai fi(x, y) + aj fj(x, y)]
  = H[ai fi(x, y)] + H[aj fj(x, y)]     (additivity)
  = ai H[fi(x, y)] + aj H[fj(x, y)]     (homogeneity)
  = ai gi(x, y) + aj gj(x, y)

H is said to be a nonlinear operator if it does not meet the above qualification.

Weeks 1 & 2 85
Arithmetic Operations
►Arithmetic operations between images are array
operations. The four arithmetic operations are
denoted as
s(x,y) = f(x,y) + g(x,y)
d(x,y) = f(x,y) – g(x,y)
p(x,y) = f(x,y) × g(x,y)
v(x,y) = f(x,y) ÷ g(x,y)

Weeks 1 & 2 86
Example: Addition of Noisy Images for Noise Reduction
Noiseless image: f(x,y)
Noise: n(x,y) (at every pair of coordinates (x,y), the noise is uncorrelated and has zero average value)
Corrupted image: g(x,y)
g(x,y) = f(x,y) + n(x,y)
Reducing the noise by adding a set of noisy images, {gi(x,y)}:

ḡ(x,y) = (1/K) Σ_{i=1}^{K} gi(x,y)

Weeks 1 & 2 87
Example: Addition of Noisy Images for Noise Reduction

ḡ(x,y) = (1/K) Σ_{i=1}^{K} gi(x,y)

E{ḡ(x,y)} = E{ (1/K) Σ_{i=1}^{K} gi(x,y) }
          = (1/K) Σ_{i=1}^{K} E{ f(x,y) + ni(x,y) }
          = f(x,y) + (1/K) Σ_{i=1}^{K} E{ ni(x,y) }
          = f(x,y)

σ²_ḡ(x,y) = (1/K²) Σ_{i=1}^{K} σ²_gi(x,y) = (1/K²) Σ_{i=1}^{K} σ²_n(x,y) = (1/K) σ²_n(x,y)
Weeks 1 & 2 88
Example: Addition of Noisy Images for Noise ReductionExample: Addition of Noisy Images for Noise Reduction
►In astronomy, imaging under very low light levels frequently causes sensor noise to render single images virtually useless for analysis.
►In astronomical observations, noise is reduced by observing the same scene over long periods of time. Image averaging is then used to reduce the noise.
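The averaging idea can be sketched numerically (the synthetic image and noise parameters are my own illustration): with K noisy copies, the averaged noise variance falls as 1/K, so the averaged image sits much closer to the noiseless one.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.full((64, 64), 100.0)   # noiseless image (made up for the demo)
K = 100
# K corrupted copies g_i = f + n_i, zero-mean uncorrelated noise:
g = [f + rng.normal(0.0, 10.0, f.shape) for _ in range(K)]
g_bar = np.mean(g, axis=0)     # g_bar = (1/K) * sum of g_i

single_err = np.abs(g[0] - f).mean()
avg_err = np.abs(g_bar - f).mean()
print(avg_err < single_err)    # averaging reduces the noise
```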

Weeks 1 & 2 89

Weeks 1 & 2 90
An Example of Image Subtraction: Mask Mode Radiography
Mask h(x,y): an X-ray image of a region of a patient’s body
Live images f(x,y): X-ray images captured at TV rates after injection
of the contrast medium
Enhanced detail g(x,y)
g(x,y) = f(x,y) - h(x,y)
The procedure gives a movie showing how the contrast medium
propagates through the various arteries in the area being
observed.

Weeks 1 & 2 91

Weeks 1 & 2 92
An Example of Image Multiplication

Weeks 1 & 2 93
Set and Logical Operations

Weeks 1 & 2 94
Set and Logical Operations
► Let A be the elements of a gray-scale image
The elements of A are triplets of the form (x, y, z), where x and y are spatial coordinates and z denotes the intensity at the point (x, y):

A = {(x, y, z) | z = f(x, y)}

► The complement of A is denoted A^c:

A^c = {(x, y, K - z) | (x, y, z) ∈ A}
K = 2^k - 1; k is the number of intensity bits used to represent z

Weeks 1 & 2 95
Set and Logical Operations
► The union of two gray-scale images (sets) A and B is defined as the set

A ∪ B = {max_z(a, b) | a ∈ A, b ∈ B}

Weeks 1 & 2 96
Set and Logical Operations

Weeks 1 & 2 97
Set and Logical Operations

Weeks 1 & 2 98
Spatial Operations
► Single-pixel operations
Alter the values of an image’s pixels based on the intensity, e.g.,

s = T(z)
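A common single-pixel operation is the image negative, T(z) = L - 1 - z (my example; the slide leaves T unspecified):

```python
import numpy as np

# Single-pixel operation s = T(z): the negative of an 8-bit image (L = 256).
L = 256
z = np.array([[0, 64],
              [128, 255]], dtype=np.int32)
s = L - 1 - z
print(s)  # [[255 191], [127 0]]
```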

Weeks 1 & 2 99
Spatial Operations
► Neighborhood operations
The value of this pixel is determined by a specified operation involving the pixels in the input image with coordinates in Sxy

Weeks 1 & 2 100
Spatial Operations
► Neighborhood operations

Weeks 1 & 2 101
Geometric Spatial Transformations
► Geometric transformation (rubber-sheet transformation)
— A spatial transformation of coordinates:
(x, y) = T{(v, w)}
— intensity interpolation that assigns intensity values to the spatially transformed pixels.
► Affine transform

                      [ t11  t12  0 ]
[x  y  1] = [v  w  1] [ t21  t22  0 ]
                      [ t31  t32  1 ]
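A sketch of the affine form [x y 1] = [v w 1] T, here with T chosen (my choice, for illustration) as a translation by (2, 3):

```python
import numpy as np

tx, ty = 2.0, 3.0
T = np.array([[1.0, 0.0, 0.0],    # affine matrix: identity rotation/scale,
              [0.0, 1.0, 0.0],    # translation entries in the last row,
              [tx,  ty,  1.0]])   # matching the row-vector convention above
vw1 = np.array([5.0, 7.0, 1.0])   # input point (v, w) in homogeneous form
xy1 = vw1 @ T
print(xy1[:2])  # translated point (x, y) = (7, 10)
```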

Weeks 1 & 2 102

Weeks 1 & 2 103
Intensity Assignment
► Forward Mapping
(x, y) = T{(v, w)}
It’s possible that two or more pixels can be transformed to the same location in the output image.
► Inverse Mapping
(v, w) = T⁻¹{(x, y)}
The nearest input pixels determine the intensity of the output pixel value.
Inverse mappings are more efficient to implement than forward mappings.

Weeks 1 & 2 104
Example: Image Rotation and Intensity Interpolation

Weeks 1 & 2 105
Image Registration
► Input and output images are available but the
transformation function is unknown.
Goal: estimate the transformation function and use it to
register the two images.
► One of the principal approaches for image registration
is to use tie points (also called control points)
 The corresponding points are known precisely in the
input and output (reference) images.

Weeks 1 & 2 106
Image Registration
►A simple model based on bilinear approximation:

x = c1·v + c2·w + c3·vw + c4
y = c5·v + c6·w + c7·vw + c8

where (v, w) and (x, y) are the coordinates of tie points in the input and reference images.
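Four tie points give four equations in the four unknowns c1..c4, which can be solved directly (the tie-point coordinates below are made up for illustration):

```python
import numpy as np

# Estimate c1..c4 in x = c1*v + c2*w + c3*v*w + c4 from four tie points.
vw = np.array([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)])  # input points
x = np.array([1.0, 3.0, 2.0, 5.0])  # corresponding x in the reference image

# One row [v, w, v*w, 1] per tie point: a 4x4 linear system.
A = np.column_stack([vw[:, 0], vw[:, 1], vw[:, 0] * vw[:, 1], np.ones(4)])
c = np.linalg.solve(A, x)
print(np.allclose(A @ c, x))  # True: the model reproduces the tie points
```

The y-coordinate model (c5..c8) is estimated the same way from the corresponding y values.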

Weeks 1 & 2 107
Image Registration

Weeks 1 & 2 108
Image Transform
►A particularly important class of 2-D linear transforms, denoted T(u, v):

T(u, v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x, y) r(x, y, u, v)

where f(x, y) is the input image,
r(x, y, u, v) is the forward transformation kernel,
variables u and v are the transform variables,
u = 0, 1, 2, ..., M-1 and v = 0, 1, ..., N-1.

Weeks 1 & 2 109
Image Transform
►Given T(u, v), the original image f(x, y) can be recovered using the inverse transformation of T(u, v):

f(x, y) = Σ_{u=0}^{M-1} Σ_{v=0}^{N-1} T(u, v) s(x, y, u, v)

where s(x, y, u, v) is the inverse transformation kernel,
x = 0, 1, 2, ..., M-1 and y = 0, 1, ..., N-1.

Weeks 1 & 2 110
Image Transform

Weeks 1 & 2 111
Example: Image Denoising by Using DCT Transform

Weeks 1 & 2 112
Forward Transform Kernel

T(u, v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x, y) r(x, y, u, v)

The kernel r(x, y, u, v) is said to be SEPARABLE if
r(x, y, u, v) = r1(x, u) r2(y, v)

In addition, the kernel is said to be SYMMETRIC if r1(x, u) is functionally equal to r2(y, v), so that
r(x, y, u, v) = r1(x, u) r1(y, v)

Weeks 1 & 2 113
The Kernels for 2-D Fourier Transform
The forward kernel:
r(x, y, u, v) = e^(-j2π(ux/M + vy/N))
where j = √(-1)
The inverse kernel:
s(x, y, u, v) = (1/MN) e^(j2π(ux/M + vy/N))

Weeks 1 & 2 114
2-D Fourier Transform

T(u, v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x, y) e^(-j2π(ux/M + vy/N))

f(x, y) = (1/MN) Σ_{u=0}^{M-1} Σ_{v=0}^{N-1} T(u, v) e^(j2π(ux/M + vy/N))
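The forward sum above can be checked against NumPy's FFT, which uses the same e^(-j2π(ux/M + vy/N)) kernel with no forward normalization (the random test image is mine):

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.random((4, 4))
M, N = f.shape

# Evaluate T(u, v) directly from the definition.
x, y = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
T = np.empty((M, N), dtype=complex)
for u in range(M):
    for v in range(N):
        T[u, v] = np.sum(f * np.exp(-2j * np.pi * (u * x / M + v * y / N)))

print(np.allclose(T, np.fft.fft2(f)))  # True: matches the FFT
```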

Weeks 1 & 2 115
Probabilistic Methods
Let zi, i = 0, 1, 2, ..., L-1, denote the values of all possible intensities in an M × N digital image. The probability, p(zk), of intensity level zk occurring in a given image is estimated as

p(zk) = nk / MN,

where nk is the number of times that intensity zk occurs in the image, and

Σ_{k=0}^{L-1} p(zk) = 1

The mean (average) intensity is given by

m = Σ_{k=0}^{L-1} zk p(zk)

Weeks 1 & 2 116
Probabilistic Methods
The variance of the intensities is given by

σ² = Σ_{k=0}^{L-1} (zk - m)² p(zk)

The nth moment of the intensity variable z is

μn(z) = Σ_{k=0}^{L-1} (zk - m)^n p(zk)
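These histogram-based statistics can be sketched on a tiny 2×2 image with k = 2 bits, so L = 4 intensity levels (the image is my own example):

```python
import numpy as np

L = 4
img = np.array([[0, 1],
                [1, 3]])
nk = np.bincount(img.ravel(), minlength=L)  # occurrence counts n_k
p = nk / img.size                           # p(z_k) = n_k / MN
z = np.arange(L)
m = np.sum(z * p)                           # mean: sum of z_k * p(z_k)
var = np.sum((z - m) ** 2 * p)              # variance: sum of (z_k - m)^2 * p(z_k)
print(p.sum(), m, var)  # 1.0 1.25 1.1875
```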

Weeks 1 & 2 117
Example: Comparison of Standard Deviation Values
(Figure: three images with standard deviations 14.3, 31.6, and 49.2)

Weeks 1 & 2 118
Homework
http://cramer.cs.nmt.edu/~ip/assignments.html