CEC366 Unit -1 FUNDAMENTALS OF DIGITAL IMAGE PROCESSING


About This Presentation

Covers the first unit of Image Processing in the Anna University syllabus.


Slide Content

CEC366 IMAGE PROCESSING
A Picture is worth a thousand words
UNIT I DIGITAL IMAGE FUNDAMENTALS
Steps in Digital Image Processing – Components – Elements of Visual Perception –
Image Sensing and Acquisition – Image Sampling and Quantization – Relationships
between pixels - Color image fundamentals - RGB, HSI models, Two-dimensional
mathematical preliminaries, 2D transforms - DFT, DCT

IMAGE PROCESSING - OBJECTIVES
Improvement of pictorial information for
human interpretation
Processing of image data for storage,
transmission and representation for
autonomous machine perception

IMAGE PROCESSING - IMPORTANCE

IMAGE
Image – a two-dimensional function f(x, y), where x and y are spatial (plane)
coordinates, and the amplitude of f at any pair of coordinates (x, y) is called
the intensity or grey level of the image at that point.
Digital image – when x, y and the amplitude values of f are all finite,
discrete quantities, the image is called a digital image.

Digital Image Representation

[Figure: an image and its representation in the computer as an array of numbers]

What is Digital Image Processing?
Processing of digital images by means of a digital computer is called digital
image processing.
Note: a digital image is composed of a finite number of elements, each of which
has a particular location and value. These elements are referred to as picture
elements, image elements, pels or pixels.
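
As a minimal illustration (a sketch assuming Python with NumPy; not part of the
original slides), a grayscale digital image is simply a 2-D array whose entries
are the pixel intensities:

import numpy as np

# A tiny 4x4 8-bit grayscale "image": each entry is the grey level at (x, y)
f = np.array([[ 12,  50,  50,  12],
              [ 50, 200, 200,  50],
              [ 50, 200, 255,  50],
              [ 12,  50,  50,  12]], dtype=np.uint8)

print(f.shape)    # (4, 4) -> M x N image
print(f[2, 2])    # intensity (grey level) of the pixel at row 2, column 2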

Application of DIP
•Digital photography
•Satellite Imaging- remote sensing
•Human authentication – biometrics
•Medical diagnosis
•Autonomous machine perception
•Industrial Inspection
•Law enforcement
•Film industry

Application of DIP based on
Illuminating sources
Gamma ray imaging – Nuclear medicine and
astronomical observations.
X-ray imaging – Medical diagnostics (eg: X-
ray films, Angiography), Industrial
inspection.
Imaging in the UV band -Lithography,
Industrial inspection, Biological imaging and
astronomical observation.

Areas of application of DIP
Imaging in the visible and infra red bands -
Astronomy, remote sensing, Industry and
law enforcement.
Imaging in the Microwave band – Radar
Imaging in the Radio band – Medicine
(MRI) and astronomy.

Examples: Industrial Inspection
• Human operators are
expensive, slow and
unreliable.
• Make machines do the
job instead.

• Industrial vision systems
are used in all kinds of
industries

Examples: PCB Inspection
Printed Circuit Board (PCB) inspection
•Machine inspection is used to determine that all
components are present and that all solder joints are
acceptable
•Both conventional imaging and x-ray imaging are used

Examples: Law Enforcement
Image processing techniques
are used extensively by law
enforcers
•Number plate recognition for
speed cameras/automated toll
systems
•Fingerprint recognition

Fundamental steps in DIP
[Block diagram] The stages are:
•Image acquisition
•Image enhancement
•Image restoration
•Colour image processing
•Wavelets and multiresolution processing
•Compression
•Morphological processing
•Segmentation
•Representation and description
•Object recognition
All stages interact with a common knowledge base.

Fundamental Steps in Digital Image Processing
The following slides repeat the block diagram above, highlighting one stage at a
time: Image Acquisition, Image Enhancement, Image Restoration, Morphological
Processing, Segmentation, Representation & Description, Object Recognition,
Image Compression and Colour Image Processing, all starting from the problem
domain.

Elements of digital image
processing systems

[Block diagram] Components of a general-purpose image processing system:
•Image sensors (sensing the problem domain)
•Specialized image processing hardware
•Computer
•Image processing software
•Mass storage
•Image displays
•Hardcopy devices
•Network

Cont.,
• Image acquisition
• Storage
• Processing
• Communication
• Display
• Image acquisition: two elements are required
1) A physical sensing device sensitive to the energy radiated by the object
(e.g., an x-ray imaging system), whose electrical output is proportional to the
incident energy (light intensity)
2) A digitizer, which converts the electrical output of the sensing device into
digital form

Cont.,
• Storage: an 8-bit image of size 1024 × 1024 pixels requires about one
megabyte (2^20 = 1,048,576 bytes) of storage.
• Digital storage for image processing falls into three categories:
1) Short term storage- use during processing
(eg. Computer memory)
2) On-line storage – Fast recall.
(eg. Magnetic disks, Magneto optical)
3) Archival storage – Infrequent access.
(eg. Magnetic tapes, Optical disks)
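
The storage figure above can be checked with a short calculation (an
illustrative Python sketch; the function name is ours):

# Storage for an uncompressed image: rows * cols * bytes per pixel
def image_bytes(rows, cols, bits_per_pixel=8):
    return rows * cols * bits_per_pixel // 8

print(image_bytes(1024, 1024, 8))    # 1048576 bytes, about one megabyte
print(image_bytes(1024, 1024, 24))   # 3145728 bytes for a 24-bit RGB image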

CONT.,
• Processing and networking: digital images are processed by algorithms, i.e.,
mostly in software.
• Some applications need specialized image processing hardware to achieve the
required throughput; a typical example is a digitizer together with an
arithmetic logic unit (ALU) that performs operations such as averaging images
for noise reduction at frame rates.
• Large-scale image processing tasks typically combine a general-purpose
computer with dedicated image processing hardware.
• Software: image processing software consists of specialized modules that
perform specific tasks.

Cont.,
• Communication:
1) local communication between IP systems.
2) Remote communication
• Display :
•Monochrome and color TV monitors.
•Random access CRT.
•Printing devices.

ELEMENTS OF VISUAL
PERCEPTION

Introduction
•In many image processing applications, the
objective is to help a human observer perceive the
visual information in an image. Therefore, it is
important to understand the human visual
system.
•The human visual system consists mainly of the
eye (image sensor or camera), optic nerve
(transmission path), and brain (image information
processing unit or computer).
•It is one of the most sophisticated image
processing and analysis systems.

Outline
•Structure of the human eye
•Image formation in the human eye
•Brightness adaptation and discrimination

Structure of Human Eye

•Three Membranes enclose the eye:
1. The cornea and sclera - outer cover
2. The choroid
3. The Retina
CORNEA and SCLERA
• Cornea -Tough, Transparent tissue that covers
the anterior surface of the eye
• Sclera - Continuous with cornea, Opaque
membrane, covers remaining part

CHOROID
• Lies directly below the sclera.
• Contains a network of blood vessels that is the major source of nutrition to
the eye.
• Includes the ciliary body and the iris diaphragm.
LENS
•Made up of concentric layers of fibrous cells.
•About 60% to 70% water, about 6% fat, and more protein than any other tissue
in the eye.
•The lens and the ciliary muscle focus the light reflected from objects onto
the retina to form an image of the objects.

RETINA
•Pattern vision is afforded by the distribution of
discrete light receptors over the surface of the retina.
•Two types of photoreceptors: rods and cones (light
sensors)
 Cones
•There are 6 to 7 million cones in each eye.
•Concentrated in the central portion of the retina
called the fovea.
•Highly sensitive to color.
•Each cone is connected to its own nerve end, so humans can resolve fine
details.
• Cone vision is called photopic or BRIGHT-LIGHT
VISION

 Rods
•There are 75 to 150 million rods in each eye.
•Distributed over the retina surface.
•Several rods are connected to a single nerve end.
•Rods don’t discern fine details.
•Rods give a general picture of the field of view.
•Not involved in color vision; sensitive to low levels of illumination.
• Rod vision is called scotopic or DIM-LIGHT
VISION.

 Fovea
•Circular indentation in center of retina, about
1.5mm diameter, dense with cones.
•Photoreceptors around fovea responsible for spatial
vision (still images).
•Photoreceptors around the periphery responsible for
detecting motion.
 Blind spot
• Point on retina where optic nerve emerges, devoid
of photoreceptors.

•The fovea is about 1.5 mm in diameter.
•A 1.5 mm × 1.5 mm square area contains roughly 337,000 cones.
•In resolving power this is comparable to a medium-resolution 5 mm × 5 mm CCD
imaging chip.

Image formation of the eye
•Flexible lens : the principal difference from an
ordinary optical lens.
•Controlled by the tension in the fibers of the ciliary
body
•To focus on distant objects – flattened
•To focus on objects near eye – thicker
•Near-sighted and far-sighted

[Figure: image-formation geometry of the eye. An object of height x at distance
y from the lens produces a retinal image of height h; with focal length
f = 14 to 17 mm, x/y = h/f, which gives h = 2.55 mm in the example below.]

•Focal length of the eye: 14 to 17 mm.
•Example: for a 15 m high object viewed from 100 m, let h be the height in mm
of its retinal image; then 15/100 = h/17, so h ≈ 2.55 mm.
•The retinal image falls primarily on the area of the fovea.

Brightness Adaptation
•The range of light intensity levels to which the human visual system can adapt
is enormous, on the order of 10^10, from the scotopic threshold to the glare
limit.
•Subjective brightness (intensity as perceived by the visual system) is a
logarithmic function of the light intensity incident on the eye.
•The visual system cannot operate over such a range simultaneously. It
accomplishes this large variation by changing its overall sensitivity, a
phenomenon known as brightness adaptation.
•The current sensitivity level of the visual system is called the brightness
adaptation level.

Brightness Discrimination
•Brightness discrimination is the ability of the eye
to discriminate between changes in light intensity
at any specific adaptation level.
•The quantity I
c
/I, where I
c is the increment of
illumination discriminable 50% of the time with
background illumination I, is called the Weber
ratio.
FDP ON DIP - DEPT. OF ECE, SAEC

•Weber ratio experiment: I is the background illumination and ΔI_c is the
increment of illumination added to it.
•A small Weber ratio ΔI_c/I indicates good brightness discrimination; a large
Weber ratio indicates poor discrimination.

•Brightness discrimination is poor at low levels of illumination and improves
significantly as the background illumination increases.
•The two branches in the curve indicate that at low levels of illumination
vision is carried out by the rods, whereas at high levels it is carried out by
the cones.

Perceived Brightness
•Three phenomena clearly demonstrate that perceived brightness is not a simple
function of intensity:
๏ Mach band effect
๏ Simultaneous contrast
๏ Optical illusions

•First phenomenon – Mach bands: the visual system tends to undershoot or
overshoot around the boundaries of regions of different intensities.

•Second phenomenon – simultaneous contrast: a region's perceived brightness
does not depend simply on its intensity; a spot of constant intensity appears
to become darker as the background gets lighter.

•Third phenomenon – optical illusions: the eye fills in non-existing
information or wrongly perceives the geometrical properties of objects.

Content
•2D Transforms
1.DFT
2.DCT
3.KLT
4.SVD

Why Do Transforms?
•Fast computation
•Conceptual insights for various image processing
•Obtain transformed data as measurement
•For efficient storage and transmission

Image Transforms
•Many times, image processing tasks are best
performed in a domain other than the spatial domain.
•Key steps:
(1) Transform the image
(2) Carry the task(s) in the transformed domain.
(3) Apply inverse transform to return to the spatial
domain.

Image Transforms (cont’d)
•Forward transformation:
T(u,v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x,y) r(x,y,u,v),   u = 0, 1, ..., M-1,  v = 0, 1, ..., N-1
where r(x,y,u,v) is the forward transformation kernel.
•Inverse transformation:
f(x,y) = Σ_{u=0}^{M-1} Σ_{v=0}^{N-1} T(u,v) s(x,y,u,v),   x = 0, 1, ..., M-1,  y = 0, 1, ..., N-1
where s(x,y,u,v) is the inverse transformation kernel.

Image Transforms (cont’d)
•A kernel is said to be separable if:
r(x,y,u,v) = r1(x,u) r2(y,v)
•A kernel is said to be symmetric if r1 and r2 are functionally equal:
r(x,y,u,v) = r1(x,u) r1(y,v)

Cont.,
•Continuous Fourier Transform (FT)
–1D
–2D
•Discrete Fourier Transform (DFT)
–1D
–2D
•Fast Fourier Transform (FFT)

Continuous Fourier Transform (FT)
•Transforms a signal (i.e., function) from the spatial
domain to the frequency domain.
F(u) = ∫_{-∞}^{∞} f(x) e^{-j2πux} dx,   where j = √(-1)

How do frequencies show up in an image?
•High frequencies correspond to quickly varying
information (e.g., edges)
•Low frequencies correspond to slowly varying
information (e.g., continuous surface)
[Figure: original image, its high-pass filtered version, and its low-pass
filtered version]

Discrete Fourier Transform (DFT)
(cont’d)
•Forward DFT
•Inverse DFT

Extending DFT to 2D
•Assume that f(x,y) is M x N image.
•Forward DFT
•Inverse DFT:

Extending DFT to 2D (cont’d)
•Special case: f(x,y) is N x N image.
•Forward DFT
•Inverse DFT
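
For reference, the 2-D DFT pair can be evaluated numerically with NumPy's FFT
routines (a sketch, not part of the slides; NumPy places the 1/MN normalisation
in the inverse transform, so slides using a different normalisation differ only
by a constant factor):

import numpy as np

f = np.random.rand(8, 8)            # an 8 x 8 stand-in "image"
F = np.fft.fft2(f)                  # forward 2-D DFT: F(u, v)
f_back = np.fft.ifft2(F)            # inverse 2-D DFT

print(np.allclose(f, f_back.real))  # True: forward then inverse recovers f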

DFT Properties: (1) Separability
•The 2D DFT can be computed using 1D transforms
only:
Forward DFT:
Inverse DFT:
The kernel is separable:
e^{-j2π(ux + vy)/N} = e^{-j2πux/N} · e^{-j2πvy/N}

DFT Properties: (1) Separability
(cont’d)
•Rewrite F(u,v) as follows:
•Let’s set:
•Then:

DFT Properties: (1) Separability (cont’d)
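
Separability means the 2-D DFT can be obtained from 1-D DFTs taken first along
the rows and then along the columns. A small numerical check (an assumed
Python/NumPy sketch, not the slides' own code):

import numpy as np

f = np.random.rand(8, 8)

# 1-D FFT of every row, then 1-D FFT of every column of the result
rows = np.fft.fft(f, axis=1)
F_separable = np.fft.fft(rows, axis=0)

print(np.allclose(F_separable, np.fft.fft2(f)))  # True: same as direct 2-D DFT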

DFT Properties: (2) Periodicity
•The DFT and its inverse are periodic with period N

DFT Properties: (3) Symmetry
• If f(x,y) is real, then F(u,v) = F*(-u,-v), so the magnitude spectrum is
symmetric: |F(u,v)| = |F(-u,-v)|.

DFT Properties: (4) Translation
• Translation in the spatial domain: f(x - x0, y - y0)  ↔  F(u,v) e^{-j2π(u x0 + v y0)/N}
• Translation in the frequency domain: f(x,y) e^{j2π(u0 x + v0 y)/N}  ↔  F(u - u0, v - v0)

DFT Properties: (4) Translation
(cont’d)
•To move the origin of F(u,v) to the centre (N/2, N/2), multiply f(x,y) by
(-1)^{x+y} before taking the DFT.
[Figure: spectrum with no translation vs. after translation]
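
Both routes to a centred spectrum, the (-1)^(x+y) trick above and shifting the
transform afterwards, can be compared directly (sketch; np.fft.fftshift
performs the latter):

import numpy as np

f = np.random.rand(8, 8)
x, y = np.meshgrid(np.arange(8), np.arange(8), indexing='ij')

F_centred_1 = np.fft.fft2(f * (-1.0) ** (x + y))   # translation property
F_centred_2 = np.fft.fftshift(np.fft.fft2(f))      # shift the spectrum directly

print(np.allclose(F_centred_1, F_centred_2))       # True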

DFT Properties: (5) Rotation
•Rotating f(x,y) by θ rotates F(u,v) by θ

DFT Properties: (6) Distributive
•The DFT is distributive over addition but not over multiplication:
F{f1 + f2} = F{f1} + F{f2},   but   F{f1 · f2} ≠ F{f1} · F{f2}

DFT Properties: (7) Scale
•a f(x,y) ↔ a F(u,v)   and   f(ax, by) ↔ (1/|ab|) F(u/a, v/b)

DFT Properties: (8) Average value
•The average value of f(x,y) is  f̄ = (1/N²) Σ_x Σ_y f(x,y).
•Evaluating the DFT at u = 0, v = 0 gives F(0,0), which is proportional to this
sum, so the DC term F(0,0) carries the average intensity of the image.

Magnitude and Phase of DFT
•What is more important?
•Hint: use inverse DFT to reconstruct the image using
magnitude or phase only information
[Figure: magnitude spectrum and phase spectrum of an image]

Magnitude and Phase of DFT (cont’d)
Reconstructed image using
magnitude only
(i.e., magnitude determines the
contribution of each component!)
Reconstructed image using
phase only
(i.e., phase determines
which components are present!)
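
This can be verified numerically; the sketch below (assuming Python/NumPy and
using a random array as a stand-in for a grayscale image) reconstructs from
magnitude-only and phase-only spectra:

import numpy as np

img = np.random.rand(64, 64)          # stand-in for a grayscale image
F = np.fft.fft2(img)
mag, phase = np.abs(F), np.angle(F)

# Reconstruct using magnitude only (zero phase) and phase only (unit magnitude)
from_mag = np.fft.ifft2(mag).real
from_phase = np.fft.ifft2(np.exp(1j * phase)).real

# from_phase keeps the spatial structure (edges); from_mag does not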

Energy Compaction !
Most unitary transforms pack a large fraction of the
energy of the image into relatively few of the transform
coefficients. This means that relatively few of the
transform coefficients have significant values. This
property is very useful for compression purposes

Energy Preservation

1-D:   Σ_{x=0}^{N-1} |f(x)|² = Σ_{u=0}^{N-1} |g(u)|²

2-D:   Σ_{x=0}^{N-1} Σ_{y=0}^{N-1} |f(x,y)|² = Σ_{u=0}^{N-1} Σ_{v=0}^{N-1} |g(u,v)|²

A unitary transformation preserves the signal energy; this is called the energy
preservation property.
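
Parseval's relation can be checked for the unitary DFT (a sketch; NumPy's
norm='ortho' option gives the unitary scaling):

import numpy as np

f = np.random.rand(16, 16)
g = np.fft.fft2(f, norm='ortho')            # unitary 2-D DFT

energy_spatial = np.sum(np.abs(f) ** 2)
energy_transform = np.sum(np.abs(g) ** 2)

print(np.isclose(energy_spatial, energy_transform))   # True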

DCT
One-dimensional signals
For a 1-D signal f(x), x = 0, 1, ..., N-1, the DCT is defined as

C(u) = a(u) Σ_{x=0}^{N-1} f(x) cos[ (2x+1)uπ / 2N ],   u = 0, 1, ..., N-1

with
a(u) = √(1/N) for u = 0,   a(u) = √(2/N) for u = 1, ..., N-1.

The inverse DCT (IDCT) is defined as

f(x) = Σ_{u=0}^{N-1} a(u) C(u) cos[ (2x+1)uπ / 2N ],   x = 0, 1, ..., N-1
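
A direct, if slow, implementation of this 1-D DCT pair (an illustrative Python
sketch; scipy.fft.dct with norm='ortho' computes the same transform
efficiently):

import numpy as np

def dct_1d(f):
    """Forward DCT: C(u) = a(u) * sum_x f(x) cos[(2x+1)u*pi / 2N]."""
    N = len(f)
    x = np.arange(N)
    a = np.full(N, np.sqrt(2.0 / N)); a[0] = np.sqrt(1.0 / N)
    return np.array([a[u] * np.sum(f * np.cos((2 * x + 1) * u * np.pi / (2 * N)))
                     for u in range(N)])

def idct_1d(C):
    """Inverse DCT: f(x) = sum_u a(u) C(u) cos[(2x+1)u*pi / 2N]."""
    N = len(C)
    u = np.arange(N)
    a = np.full(N, np.sqrt(2.0 / N)); a[0] = np.sqrt(1.0 / N)
    return np.array([np.sum(a * C * np.cos((2 * x + 1) * u * np.pi / (2 * N)))
                     for x in range(N)])

f = np.random.rand(8)
print(np.allclose(f, idct_1d(dct_1d(f))))   # True: the pair is invertible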

DCT CONT.,
Two-dimensional signals
For 2-D signals the DCT pair is defined as

C(u,v) = a(u) a(v) Σ_{x=0}^{N-1} Σ_{y=0}^{N-1} f(x,y) cos[ (2x+1)uπ / 2N ] cos[ (2y+1)vπ / 2N ]

f(x,y) = Σ_{u=0}^{N-1} Σ_{v=0}^{N-1} a(u) a(v) C(u,v) cos[ (2x+1)uπ / 2N ] cos[ (2y+1)vπ / 2N ]

where a(u) and a(v), u, v = 0, 1, ..., N-1, are defined as in the 1-D case.

1-D Basis Functions N=8
[Figure: the eight 1-D DCT basis functions for u = 0, 1, ..., 7, each plotted
with amplitude between -1.0 and 1.0]

Properties of the DCT transform
•The DCT is a real and orthogonal transform. This
property makes it attractive in comparison to the
Fourier transform.
•The DCT has excellent energy compaction
properties. For that reason it is widely used in
image compression standards (as for example
JPEG standards).
•The N×N cosine transform is very close to the KL transform.
•There are fast algorithms to compute the DCT, similar to
the FFT for computing the DFT.
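
Energy compaction can be demonstrated by keeping only a few low-frequency DCT
coefficients of a block and reconstructing, much as JPEG does (an illustrative
sketch; the 8×8 ramp block is a stand-in for real image data):

import numpy as np
from scipy.fft import dctn, idctn

block = np.arange(64, dtype=float).reshape(8, 8)   # smooth stand-in block
C = dctn(block, norm='ortho')                      # 2-D DCT of the block

mask = np.zeros_like(C)
mask[:4, :4] = 1                                   # keep 16 lowest-frequency coefficients
approx = idctn(C * mask, norm='ortho')

print(np.sum((C * mask) ** 2) / np.sum(C ** 2))    # fraction of energy retained (close to 1)
print(np.max(np.abs(block - approx)))              # small reconstruction error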

2-D Basis Functions N=4
[Figure: the 2-D DCT basis images for N = 4, indexed by u, v = 0, 1, 2, 3]

2-D Basis Functions N=8

Cont.,

DRAWBACKS

Original Images

Karhunen-Loeve Images

Properties of KLT
•The KL transform coefficients are uncorrelated and have zero mean; the
transform itself is data dependent.
•It is optimal in the mean-square-error sense: truncation error is minimized.
•It has excellent energy compaction.
•There is no general fast algorithm for computing it, unlike the DFT and DCT.
•It minimizes the expected number of transform coefficients required for a
given distortion, i.e., it achieves the minimum distortion for a given number
of retained coefficients.

Applications of KLT
Applications
1. (non-universal) compression
2. Pattern recognition: e.g., eigenfaces
3. Analyze the principal (“dominating”) components
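
The KL transform can be sketched numerically by estimating the covariance of a
set of sample vectors (e.g., flattened image blocks) and projecting onto its
eigenvectors (an illustrative sketch, not the slides' own derivation; the data
here is random):

import numpy as np

# Rows of X are sample vectors (e.g., flattened 8x8 image blocks)
X = np.random.rand(500, 64)

mean = X.mean(axis=0)
cov = np.cov(X - mean, rowvar=False)        # data-dependent covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)      # eigen-decomposition
order = np.argsort(eigvals)[::-1]           # sort by decreasing eigenvalue
A = eigvecs[:, order]                       # KLT basis vectors (columns)

Y = (X - mean) @ A                          # KLT coefficients: zero mean, uncorrelated
# Off-diagonal covariance of Y is ~0, i.e., the coefficients are uncorrelated
print(np.max(np.abs(np.cov(Y, rowvar=False) - np.diag(eigvals[order]))))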

SVD Transform



Let U be an M×N matrix. Then U U^t and U^t U are non-negative, symmetric
matrices with identical non-zero eigenvalues.
If r = rank(U), there are at most r non-zero eigenvalues λ_m, m = 1, 2, ..., r.
Let φ_m, m = 1, 2, ..., r be the eigenvectors of U U^t and ψ_m, m = 1, 2, ..., r
be the eigenvectors of U^t U. Then U can be written as a sum of r rank-1
matrices:

U = Σ_{m=1}^{r} √λ_m φ_m ψ_m^t

Properties
1. Once λ_m, φ_m and ψ_m, m = 1, 2, ..., r are known, U is completely
determined.
2. The SVD is not a unitary transform.
3. The truncated sum U_k = Σ_{m=1}^{k} √λ_m φ_m ψ_m^t, k ≤ r, is the best
least-squares rank-k approximation of U.

Example: for U = [2.3 1.2; 0.2 0.2], the eigenvalues of U^t U are λ1 = 6.803
and λ2 = 0.007, so the single term √λ1 φ1 ψ1^t already gives an excellent
least-squares rank-1 approximation of U.
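
numpy.linalg.svd returns the singular values σ_m = √λ_m and both sets of
eigenvectors directly, so property 3 can be checked on the example above
(a sketch, not part of the slides):

import numpy as np

U_mat = np.array([[2.3, 1.2],
                  [0.2, 0.2]])

phi, sigma, psi_t = np.linalg.svd(U_mat)      # U = phi @ diag(sigma) @ psi_t
print(sigma ** 2)                             # eigenvalues of U^t U: ~[6.803, 0.007]

# Best least-squares rank-1 approximation: keep only the largest singular value
U_1 = sigma[0] * np.outer(phi[:, 0], psi_t[0, :])
print(U_1)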

Color Image Processing
Session III

Contents
•Color fundamentals.
•Types of Color Models.
•RGB Models.
•HSI Models.
•Conversion of Color Models.

Cont.,
•Color is a powerful descriptor that often
simplifies object identification and extraction
from a scene.
•Humans can differentiate thousands of color
shades and intensities of gray scale.

Color Fundamentals

Color Fundamentals
•Primary colors:
•Red (R), Green (G)
and Blue(B).
•Secondary colors:
magenta (red +
blue), cyan (green
+ blue), and yellow
(red + green)

Color Fundamentals
The characteristics generally used to distinguish
one color from another are Brightness, Hue, and
Saturation.
Hue: represents the dominant color as perceived by an observer.
Saturation: Relative purity or the amount of white
light mixed with a hue

Cont.,
•Hue and saturation taken together are
called Chromaticity, and therefore, a color
may be characterized by its Brightness and
Chromaticity.

Color Models
The purpose of a color model (also called color space or
color system) is to facilitate the specification of colors
in some standard, generally accepted way.
RGB (red, green, blue) : monitor, video camera.
CMY(cyan, magenta, yellow),CMYK (CMY, black)
model for color printing.
HSI model, which corresponds closely with the way
humans describe and interpret color.

Cont.,
• Most color models in use today are oriented either
towards hardware or towards applications (such as
creation of color graphics for animation).
•In digital image processing, the hardware-oriented model most commonly used in
practice is the RGB model (color monitors and color video cameras).

Color Models -- RGB Model
•In RGB model each color appears in its primary
spectral components of red, green and blue.
• This model is based on a Cartesian co-ordinate
system.

Color Models -- RGB Model

•R, G and B values lie along three axes, each ranging over [0, 1]; the grey
scale runs along the diagonal joining black and white.
•If each component is quantized into 256 levels [0, 255], the total number of
different colors that can be produced is (2^8)^3 = 2^24 = 16,777,216 colors.
•RGB safe colors: each component is quantized into only 6 of the 256 levels
(0, 51, 102, 153, 204, 255), giving 6^3 = 216 colors.

Cont.,
•Image represented in the RGB color model consists
of three component images, one for each primary
color.
•When fed into an RGB monitor, these three images
combine on the phosphor screen to produce a
composite color image.
•The number of bits used to represent each pixel in
RGB space is called the pixel depth.

Cont.,
•In an RGB image in which each of the red, green and blue component images is
an 8-bit image, each pixel has a depth of 24 bits.
•The term full-color image is often used to denote a 24-bit RGB color image.
•The corresponding 24-bit RGB color cube is shown below.

24 bit RGB color cube

Color Models -- RGB Model

Color Models -- RGB Model
•For most
graphics
images used
for Internet
applications, a
set of 216
colors has
been selected
to represent
“safe colors”
which should
be reliably
displayed on
computer
monitors.

Color Models -- RGB Model

Cont.,
• RGB and CMY are ideally suited for hardware
implementation.
•These models are not suited for human
interpretation.
•When humans view a color object, they describe it by its hue, saturation and
brightness.

The HSI Color Model
•Humans describe color in terms of hue, saturation and brightness.
–Hue: describes the pure color, e.g., pure yellow, orange or red.
–Saturation: measures the degree to which a pure color is diluted by white
light.
–Brightness: a subjective descriptor that is difficult to measure.
•Comparison:
–The RGB model is ideal for image color
generation.
–The HSI model is an ideal tool for developing
image processing algorithms based on color
descriptions.

Cont.,
•Humans do not describe a color image in terms of amounts of the three primary
colors.
•The intensity (gray level) is the most useful descriptor of monochromatic
images.
•This quantity is definitely measurable and easily interpretable.

Conceptual Relationship RGB - HSI
•All points contained in the plane segment defined by the intensity axis and
the boundaries of the cube have the same hue.

Hue Measurement

The HSI Color Model

Conversions
•From RGB to HSI
•S=1-[3/(R+G+B)][min(R, G, B)]
•I=(R+G+B)/3






H = θ            if B ≤ G
H = 360° − θ     if B > G

where

θ = cos^{-1} { (1/2)[(R − G) + (R − B)] / [ (R − G)² + (R − B)(G − B) ]^{1/2} }
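
A direct translation of these formulas for a single pixel, with R, G, B in
[0, 1] (an illustrative sketch; the small epsilon guards against division by
zero for pure grey pixels):

import numpy as np

def rgb_to_hsi(R, G, B, eps=1e-8):
    I = (R + G + B) / 3.0
    S = 1.0 - 3.0 * min(R, G, B) / (R + G + B + eps)
    num = 0.5 * ((R - G) + (R - B))
    den = np.sqrt((R - G) ** 2 + (R - B) * (G - B)) + eps
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    H = theta if B <= G else 360.0 - theta
    return H, S, I

print(rgb_to_hsi(1.0, 0.0, 0.0))   # pure red -> H = 0, S = 1, I = 1/3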

Conversions (Cont.)
•RG sector (0° ≤ H < 120°):
B = I(1 − S)
R = I[ 1 + S cos H / cos(60° − H) ]
G = 3I − (R + B)

Conversions (Cont.)
•GB sector (120° ≤ H < 240°):
First subtract 120°:  H = H − 120°
R = I(1 − S)
G = I[ 1 + S cos H / cos(60° − H) ]
B = 3I − (R + G)

Conversions (Cont.)
•BR sector (240° ≤ H ≤ 360°):
First subtract 240°:  H = H − 240°
G = I(1 − S)
B = I[ 1 + S cos H / cos(60° − H) ]
R = 3I − (G + B)
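
The reverse direction follows the three sectors above (a sketch; H in degrees,
S and I in [0, 1]):

import numpy as np

def hsi_to_rgb(H, S, I):
    H = H % 360.0
    if H < 120.0:                                  # RG sector
        B = I * (1.0 - S)
        R = I * (1.0 + S * np.cos(np.radians(H)) / np.cos(np.radians(60.0 - H)))
        G = 3.0 * I - (R + B)
    elif H < 240.0:                                # GB sector
        H -= 120.0
        R = I * (1.0 - S)
        G = I * (1.0 + S * np.cos(np.radians(H)) / np.cos(np.radians(60.0 - H)))
        B = 3.0 * I - (R + G)
    else:                                          # BR sector
        H -= 240.0
        G = I * (1.0 - S)
        B = I * (1.0 + S * np.cos(np.radians(H)) / np.cos(np.radians(60.0 - H)))
        R = 3.0 * I - (G + B)
    return R, G, B

print(hsi_to_rgb(0.0, 1.0, 1.0 / 3.0))   # back to pure red: (1.0, 0.0, 0.0)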

The HSI Color Model

Color Models -- HSI Model

Color Models -- HSI Model

Full Color Image Processing
•Two processing methods:
–(1) process each channel (or color
component) separately, as if the color image
were three gray scale images;
–(2) process all channels with each pixel
represented as a vector.
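
A sketch of method (1), smoothing each channel independently with a 3×3 box
filter (illustrative only; the function and array names are ours, and a real
implementation would use a library convolution):

import numpy as np

def smooth_per_channel(img, k=3):
    """Method (1): smooth each color channel independently with a k x k box filter."""
    pad = k // 2
    out = np.zeros_like(img, dtype=float)
    padded = np.pad(img.astype(float), ((pad, pad), (pad, pad), (0, 0)), mode='edge')
    for c in range(img.shape[2]):                  # treat each channel as a gray-scale image
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j, c] = padded[i:i + k, j:j + k, c].mean()
    return out

img = np.random.rand(16, 16, 3)                    # stand-in RGB image of shape (M, N, 3)
smoothed = smooth_per_channel(img)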

Color Transformations
RGB<->HSI<->CMYK

Color Transformations
Example

Color Complement Transformations
Example

Tone and Color Corrections
Tone Corrections

Smoothing and Sharpening
Color Image Smoothing

Smoothing and Sharpening
Color Image Smoothing

Color Edge Detection

Color Edge Detection

Noise in Color Images

Noise in Color Images
Fig. 6.48(d)