What is Spatial Resolution?


About This Presentation

The presentation serves as an introduction to spatial resolution and other related terms.


Slide Content

What is Spatial Resolution?
A presentation for better understanding!
S.A. Quadri
CEDEC, USM, Malaysia

Effect of spatial resolution on visualization
(Satellite image reference: http://visibleearth.nasa.gov/view_rec.php?id=1427)

Image resolution
It is an umbrella term that describes the detail an image holds.
The term applies to raster digital images, film images, and other types of images.
Higher resolution means more image detail.
Image resolution can be measured in various ways.
Resolution quantifies how close lines can be to each other and still be visibly resolved.
Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the overall size of a picture (lines per picture height, also known simply as lines, TV lines, or TVL), or to angular subtense.
Line pairs are often used instead of lines.
A line pair comprises a dark line and an adjacent light line.
A line is either a dark line or a light line.
A resolution of 10 lines per millimeter means 5 dark lines alternating with 5 light lines, or 5 line pairs per millimeter (5 LP/mm).
Photographic lens and film resolution are most often quoted in line pairs per millimeter.

Resolution of digital images
The resolution of digital images can be described in many different ways.
The term resolution is often used for a pixel count in digital imaging, even though American, Japanese, and international standards specify that it should not be so used, at least in the digital camera field.
• An image of N pixels high by M pixels wide can have any resolution less than N lines per picture height, or N TV lines. But when pixel counts are referred to as resolution, the convention is to describe the pixel resolution with a set of two positive integers, where the first number is the number of pixel columns (width) and the second is the number of pixel rows (height), for example 640 by 480.
• Another popular convention is to cite resolution as the total number of pixels in the image, typically given as a number of megapixels, which can be calculated by multiplying pixel columns by pixel rows and dividing by one million (see the sketch after this list).
• Other conventions include describing pixels per length unit or pixels per area unit, such as pixels per inch or per square inch.
• According to the same standards, the number of effective pixels that an image sensor or digital camera has is the count of elementary pixel sensors that contribute to the final image, as opposed to the number of total pixels, which includes unused or light-shielded pixels around the edges.
None of these pixel resolutions are true resolutions, but they are widely referred to as such; they serve as upper bounds on image resolution.
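The megapixel convention above is plain arithmetic; a minimal sketch (the function name is illustrative):

```python
def megapixels(width_px: int, height_px: int) -> float:
    """Total pixel count in megapixels: columns * rows / 1,000,000."""
    return width_px * height_px / 1_000_000

# A 2048 x 1536 image has 3,145,728 pixels, i.e. about 3.1 megapixels.
print(megapixels(2048, 1536))  # 3.145728
```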

Effect of pixel resolutions
Below is an illustration of how the same image might appear at different pixel resolutions, if the pixels were poorly rendered as sharp squares (normally, a smooth image reconstruction from pixels would be preferred, but for illustration of pixels, the sharp squares make the point better).

Further Explanation
An image that is 2048 pixels in width and 1536 pixels in height has a total of 2048 × 1536 = 3,145,728 pixels.
One could refer to it as 2048 by 1536, or as a 3.1-megapixel image.
Unfortunately, the count of pixels is not a real measure of the resolution of digital camera images, because:
Color image sensors are typically set up to alternate color filter types over the light-sensitive individual pixel sensors.
Digital images ultimately require a red, green, and blue value for each pixel to be displayed or printed, but one individual pixel in the image sensor will only supply one of those three pieces of information.
The image has to be interpolated or demosaiced to produce all three colors for each output pixel.
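To make the last point concrete, here is a hedged sketch of the simplest possible demosaic, assuming a hypothetical RGGB Bayer layout; real cameras use far more sophisticated interpolation:

```python
import numpy as np

def demosaic_nearest(raw: np.ndarray) -> np.ndarray:
    """Minimal demosaic for an RGGB Bayer mosaic (assumed layout).

    Each 2x2 cell of `raw` holds one red, two green, and one blue
    sample; all four output pixels in that cell reuse those samples,
    so three color values per pixel come from one measurement each.
    """
    raw = raw.astype(np.float64)
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    for y in range(0, h - 1, 2):
        for x in range(0, w - 1, 2):
            r = raw[y, x]                            # red sample
            g = (raw[y, x + 1] + raw[y + 1, x]) / 2  # average of two greens
            b = raw[y + 1, x + 1]                    # blue sample
            rgb[y:y + 2, x:x + 2] = (r, g, b)        # broadcast to 2x2x3
    return rgb
```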

Spatial resolution
The measure of how closely lines can be resolved in an image is called spatial resolution, and it depends on properties of the system creating the image, not just the pixel resolution in pixels per inch (ppi).
For practical purposes, the clarity of the image is decided by its spatial resolution, not the number of pixels in an image.
In effect, spatial resolution refers to the number of independent pixel values per unit length.
• The spatial resolution of computer monitors is generally 72 to 100 lines per inch, corresponding to pixel resolutions of 72 to 100 ppi.
• With scanners, optical resolution is used to distinguish spatial resolution from the number of pixels per inch.
• In geographic information systems (GIS), spatial resolution is measured by the ground sample distance (GSD) of an image, the pixel spacing on the Earth's surface (see the sketch after this list).
• In astronomy, one often measures spatial resolution in data points per arcsecond subtended at the point of observation, since the physical distance between objects in the image depends on their distance away, and this varies widely with the object of interest.
• In electron microscopy, line or fringe resolution refers to the minimum separation detectable between adjacent parallel lines (e.g. between planes of atoms), while point resolution refers to the minimum separation between adjacent points that can be both detected and interpreted, e.g. as adjacent columns of atoms.
• In stereoscopic 3D images, spatial resolution could be defined as the spatial information recorded or captured by the two viewpoints of a stereo camera (left and right cameras).
It could be argued that such "spatial resolution" adds information to an image that would then not depend solely on pixel count or dots per inch alone when classifying and interpreting the overall resolution of a given photographic image or video frame.
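As a hedged aside, the GSD mentioned above follows from pinhole-camera geometry; a sketch with illustrative variable names (the example values are hypothetical, not SPOT's actual parameters):

```python
def ground_sample_distance(pixel_pitch_m: float,
                           altitude_m: float,
                           focal_length_m: float) -> float:
    """Ground sample distance of a nadir-looking camera:
    GSD = detector pixel pitch * flying height / focal length."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Hypothetical sensor: 10 um pixels, 500 km altitude, 0.5 m focal length.
print(ground_sample_distance(10e-6, 500_000, 0.5))  # 10.0 m on the ground
```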

Spatial Resolution and Pixel Count
Note the difference!
(Comparison images: spatial resolution vs. pixel count)

Spectral resolution
Color images distinguish light of different spectra.
Multi-spectral images resolve even finer differences of spectrum or wavelength than is needed to reproduce color; that is, they have higher spectral resolution (narrower wavelength bands).
Temporal resolution
Movie cameras and high-speed cameras can resolve events at different points in time.
The time resolution used for movies is usually 15 to 30 frames per second (frames/s),
while high-speed cameras may resolve 100 to 1000 frames/s, or even more.
Radiometric resolution
Radiometric resolution determines how finely a system can represent or distinguish differences of intensity,
and is usually expressed as a number of levels or a number of bits, for example the 8 bits (256 levels) typical of computer image files.
The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be
represented, at least in theory.
In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the
number of bits of representation.
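Since radiometric resolution is quoted in bits, the number of representable levels is simply 2^bits; a one-line check:

```python
def radiometric_levels(bits: int) -> int:
    """Number of representable intensity levels for a given bit depth."""
    return 2 ** bits

print(radiometric_levels(8))   # 256, typical of computer image files
print(radiometric_levels(12))  # 4096
```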

Resolution in various media
This is a list of resolutions for various media.
Analog and early digital
352×240 : Video CD
300×480 : Umatic, Betamax, VHS, Video8
350×480 : Super Betamax, Betacam
420×480 : LaserDisc, Super VHS, Hi8
640×480 : Analog broadcast (NTSC)
670×480 : Enhanced Definition Betamax
768×576 : Analog broadcast (PAL, SECAM)
Digital
720×480 : D-VHS, DVD, miniDV, Digital8, Digital Betacam
720×480 : Widescreen DVD (anamorphic)
1280×720 : D-VHS, HD DVD, Blu-ray, HDV (miniDV)
1440×1080 : HDV (miniDV)
1920×1080 : HDV (miniDV), AVCHD, HD DVD, Blu-ray, HDCAM SR
2048×1080 : 2K Digital Cinema
4096×2160 : 4K Digital Cinema
7680×4320 : UHDTV
Film
35 mm film is scanned for release on DVD at 1080 or 2000 lines as of 2005.
However, some photography sources give 5380×3620 as the resolution of 35 mm film.
That corresponds to about 19.5 megapixels, though the spatial resolution of the film itself is unchanged by the scan.
IMAX, including IMAX HD and OMNIMAX: approximately 10,000×7,000 (7,000 lines) resolution.
That is about 70 megapixels, which may be considered the largest resolution among these media.

Spatial Resolution and Pixel Size
The terms image resolution and pixel size are often used interchangeably.
In reality, they are not equivalent. An image sampled at a small pixel size does not necessarily have high resolution.
The following three images illustrate this point. The first image is a SPOT image with a 10 m pixel size.
It was derived by merging a SPOT panchromatic image of 10 m resolution with a SPOT multispectral image of 20 m resolution.
The effective resolution is thus determined by the resolution of the panchromatic image, which is 10 m.
This image is further processed to degrade the resolution while maintaining the same pixel size.
The next two images are blurred versions of the image with coarser resolution, but still digitized at the same pixel size of 10 m.
Even though they have the same pixel size as the first image, they do not have the same resolution.
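A minimal sketch of the degradation just described, assuming a Gaussian blur as the processing step (the slides do not say which filter was actually used):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade_resolution(img: np.ndarray, sigma_px: float) -> np.ndarray:
    """Blur an image without resampling it: the output has exactly the
    same number of pixels (same pixel size), but fine detail is removed,
    so the effective spatial resolution is lower."""
    return gaussian_filter(img.astype(float), sigma=sigma_px)

# Two progressively blurred versions of the same 10 m pixel grid:
# blurred_a = degrade_resolution(img, 1.0)
# blurred_b = degrade_resolution(img, 2.0)
```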

RESOLUTION AND SHARPNESS
To determine resolution, a test raster with increasingly fine bars and gaps is normally used. A common example in real images would be a picket fence viewed in perspective.
In the image of the fence, shown in Fig. 1, it is evident that the gaps between the boards become increasingly difficult
to discriminate as the distance becomes greater.
This effect is the basic problem of every optical image.
In the foreground of the image, where the boards and gaps have not yet been squeezed together by the perspective, a
large difference in brightness is recognized.
The more the boards and gaps are squeezed together in the distance, the less difference is seen in the brightness.
To better understand this effect, the brightness values are shown along the yellow arrow in an x / y diagram (Fig. 2).
The brightness difference seen in the y-axis is called contrast.
The curve itself functions like a harmonic oscillation; because the brightness does not change over time but spatially
from left to right, the x-axis is called spatial frequency.
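The contrast read off the y-axis is usually quantified as Michelson contrast; the slides do not give the formula, but the standard definition is:

```latex
C = \frac{L_{\max} - L_{\min}}{L_{\max} + L_{\min}}
```

where L_max and L_min are the largest and smallest brightness values along the arrow; C ranges from 0 (no contrast) to 1 (full contrast).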

It can be clearly seen in Fig. 1 that the finer the reproduced structure, the more the contrast
will be “slurred” at that point in the image.
The limit of the resolution has been reached when one can no longer clearly differentiate
between the structures.
This means the resolution limit (red circle indicated in Fig. 2) lies at the spatial frequency
where there is just enough contrast left to clearly differentiate between board and gap.

Resolution = Sharpness?
Are resolution and sharpness the same? By looking at the images shown below, one can quickly determine which image
is sharper.
Although the image on the left comprises twice as many pixels, the image on the right, whose contrast at coarse details
is increased with a filter, looks at first glance to be distinctly sharper.
The resolution limit describes how much information makes up each image, but not how a person evaluates this information.
The human eye, in fact, is able to resolve extremely fine details.
This ability is also valid for objects at a greater distance.
The decisive physiological point, however, is that fine details do not contribute to the subjective perception of
sharpness.
Therefore, it’s important to clearly separate the two terms, resolution and sharpness.

MTF
Modulation transfer function describes the relationship between resolution and sharpness, and is the basis for a
scientific confirmation of the phenomenon described earlier.
The modulation component in MTF means approximately the same as contrast.
If we evaluate the contrast (modulation) not only where the resolution reaches its limit, but over as many spatial
frequencies as possible and connect these points with a curve, we arrive at the so-called MTF.
As shown in the figure, the x-axis shows the already-established spatial frequency expressed in lp/mm, while the y-axis shows the modulation instead of the brightness.
A modulation of 1 (or 100%) corresponds to the contrast between a completely white and a completely black image.
The higher the spatial frequency, in other words the finer the structures in the image, the lower the transferred modulation (lp = line pairs).
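A hedged numerical sketch of the MTF idea: generate sine gratings at several spatial frequencies, pass them through a stand-in optical system (a Gaussian blur standing in for a real lens), and measure the transferred modulation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def modulation(signal: np.ndarray) -> float:
    """Michelson modulation: (Lmax - Lmin) / (Lmax + Lmin)."""
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

x = np.linspace(0, 1, 4000)  # a 1 mm strip, finely sampled
for lp_per_mm in (5, 20, 50):
    grating = 0.5 + 0.5 * np.sin(2 * np.pi * lp_per_mm * x)  # modulation 1
    blurred = gaussian_filter1d(grating, sigma=20)  # stand-in for the lens
    print(lp_per_mm, round(modulation(blurred), 3))  # drops as frequency rises
```

Plotting the measured modulation against spatial frequency and connecting the points yields exactly the MTF curve described above.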
Conclusions:
• Sharpness does not depend only on resolution.
• The modulation at lower spatial frequencies is essential.
• Contrast in coarse details is significantly more important for the impression of sharpness than contrast at the resolution limit.

Resolution of the human eye
The fovea of the human eye (the part of the retina that is responsible for sharp central vision) includes
about 140 000 sensor-cells per square millimeter.
This means that if two objects are projected with a separation of more than 4 μm on the fovea, a human with normal visual acuity (20/20) can resolve them.
On the object side, this corresponds to 0.2 mm at a distance of 1 m (about 0.7 minutes of arc).
In practice of course, this depends on whether the viewer is concentrating only on the center of the
viewing field, whether the object is moving very slowly or not at all, and whether the object has good
contrast to the background. Allowing for some tolerance, this would be around 0.3 mm at a distance of 1 m (= 1.03 minutes of arc). In a certain range, one can assume a linear relation between distance and detail size.
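The object-side numbers follow from small-angle geometry; a quick sketch to check them (the function name is mine):

```python
import math

def detail_size_mm(distance_m: float, angle_arcmin: float) -> float:
    """Smallest resolvable detail (in mm) at a given viewing distance,
    for a visual angle in arcminutes: size = distance * tan(angle)."""
    return 1000 * distance_m * math.tan(math.radians(angle_arcmin / 60))

print(detail_size_mm(1.0, 1.03))   # ~0.30 mm at 1 m, as on the slide
print(detail_size_mm(10.0, 1.03))  # ~3 mm at 10 m (the linear relation)
```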

This hypothesis can easily be verified!
Pin the test pattern displayed in the figure below on a well-lit wall and walk 10 m away.
One should be able to clearly differentiate between the lines and gaps in the figure.
Of course, this requires an ideal visual acuity of 20/20.
Nevertheless, if you can't resolve the pattern in the figure,
you might consider paying a visit to an ophthalmologist!

Let us see the significance of spatial resolution and various other related terms:
Four main types of information contained in an optical image are often utilized for
image interpretation:
•Radiometric Information (i.e. brightness, intensity, tone),
•Spectral Information (i.e. color, hue),
•Textural Information,
•Geometric and Contextual Information.
They are illustrated in the following examples.
How do we interpret optical images?

There are different types of images :
•Panchromatic Images
•Multispectral Images
•Color Composite Images
•True Color Composite images
•False Color Composite images
•Natural Color Composite

Panchromatic image
A panchromatic image consists of only one band.
It is usually displayed as a greyscale image.
A panchromatic image may be interpreted similarly to a black-and-white aerial photograph of the area.
The radiometric information is the main information type utilized in the interpretation.
A panchromatic image extracted from a SPOT panchromatic scene at a ground resolution of 10 m.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

Multispectral Images
A multispectral image consists of several bands of data.
For visual display, each band of the image may be displayed one band at a time as a greyscale image, or in a combination of three bands at a time as a color composite image.
Interpretation of a multispectral color composite image will require knowledge of the spectral reflectance signatures of the targets in the scene.
In this case, the spectral information content of the image is utilized in the interpretation.
The following three images show the three bands of a multispectral image extracted from a SPOT multispectral scene at a ground resolution of 20 m.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

Color Composite Images
In displaying a color composite image, three primary colors (red, green and blue) are used.
When these three colors are combined in various proportions, they produce different colors in the visible
spectrum.
Associating each spectral band (not necessarily a visible band) to a separate primary color results in a
color composite image.
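A minimal sketch of this band-to-color assignment using numpy (the band names and the per-band contrast stretch are illustrative choices, not prescribed by the slides):

```python
import numpy as np

def color_composite(band_r: np.ndarray,
                    band_g: np.ndarray,
                    band_b: np.ndarray) -> np.ndarray:
    """Stack three spectral bands into an RGB display image,
    scaling each band to the 0..1 range independently."""
    def stretch(b: np.ndarray) -> np.ndarray:
        b = b.astype(float)
        return (b - b.min()) / (b.max() - b.min() + 1e-12)
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])
```

Assigning the red, green, and blue spectral bands this way gives the "true color" composite described next; assigning other bands gives the false or natural colour composites that follow.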

True Color Composite
If a multispectral image consists of the three visual primary color bands (red, green, blue), the three bands may be combined to produce a "true color" image.
The bands 3 (red band), 2 (green band) and 1 (blue band) of a LANDSAT TM image or an IKONOS multispectral image can be assigned respectively to the R, G, and B colors for display.
In this way, the colors of the resulting color composite image closely resemble what would be observed by the human eye.
A 1-m resolution true-color IKONOS image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

False Color Composite
The display color assignment for any band of a multispectral image can be done in an entirely arbitrary manner.
In this case, the color of a target in the displayed image does not have any resemblance to its actual colour.
The resulting product is known as a false colour composite image.
There are many possible schemes of producing false colour composite images.
Some schemes are suitable for detecting certain objects in the image.
False colour composite multispectral SPOT image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

Natural Colour Composite
For optical images lacking one or more of the three visual primary colour bands (i.e. red, green and blue), the spectral
bands (some of which may not be in the visible region) may be combined in such a way that the appearance of the
displayed image resembles a visible colour photograph, i.e. vegetation in green, water in blue, soil in brown or grey, etc.
Some people refer to this composite as a "true colour" composite. However, this term is misleading since in many instances the colours are only simulated to look similar to the "true" colours of the targets. The term "natural colour" is preferred.
Natural colour composite multispectral SPOT image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

Vegetation Indices
Different bands of a multispectral image may be combined to accentuate the vegetated areas.
One such combination is the ratio of the near-infrared band to the red band. This ratio is known as the Ratio Vegetation Index (RVI):
RVI = NIR / Red
Normalized Difference Vegetation Index (NDVI)
Since vegetation has high NIR reflectance but low red reflectance, vegetated areas will have higher RVI
values compared to non-vegetated areas. Another commonly used vegetation index is the Normalized
Difference Vegetation Index (NDVI) computed by
NDVI = (NIR - Red) / (NIR + Red)
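Both indices are simple per-pixel band arithmetic; a numpy sketch (the small epsilon guard is my addition, to avoid division by zero):

```python
import numpy as np

def rvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Ratio Vegetation Index: NIR / Red."""
    return nir / (red + 1e-12)

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-12)

# Vegetation has high NIR and low red reflectance, so NDVI approaches +1.
print(ndvi(np.array([0.5]), np.array([0.08])))  # ~0.72
```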

Textural Information
Texture is an important aid in visual image interpretation, especially for high spatial resolution imagery.
It is also possible to characterize the textural features numerically, and algorithms for computer-aided automatic discrimination of different textures in an image are available.
IKONOS 1-m resolution pan-sharpened color image of an oil palm plantation.
Even though the general colour is green throughout, three distinct land cover types can be identified from the image texture.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)

Remote Sensing Satellites
Optical remote sensing makes use of visible, near-infrared, and short-wave infrared sensors to form images of the earth's surface by detecting the solar radiation reflected from targets on the ground.
Different materials reflect and absorb differently at different wavelengths.
Thus, the targets can be differentiated by their spectral reflectance signatures in the remotely sensed images.
Optical remote sensing systems are classified into the following types, depending on the number of spectral bands used in the imaging process.
Several remote sensing satellites are currently available, providing imagery suitable for various types of applications.
Each of these satellite-sensor platforms is characterized by:
• Wavelength bands employed in image acquisition,
• Spatial resolution of the sensor,
• The coverage area and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the imaging system.

In terms of the spatial resolution, the satellite imaging systems can be classified into:
•Low resolution systems (approx. 1 km or more)
•Medium resolution systems (approx. 100 m to 1 km)
•High resolution systems (approx. 5 m to 100 m)
•Very high resolution systems (approx. 5 m or less)
In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:
•Optical imaging systems (include visible, near infrared, and shortwave infrared systems)
•Thermal imaging systems
•Synthetic aperture radar (SAR) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral bands used:
•Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems
•Multispectral (several spectral bands) systems
•Superspectral (tens of spectral bands) systems
•Hyperspectral (hundreds of spectral bands) systems
Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands &
polarization modes used in data acquisition, e.g.:
•Single frequency (L-band, or C-band, or X-band)
•Multiple frequency (Combination of two or more frequency bands)
•Single polarization (VV, or HH, or HV)
•Multiple polarization (Combination of two or more polarization modes)

References :
http://www.ssec.wisc.edu/sose/pirs/pirs_m2_res.html
http://www.crisp.nus.edu.sg/~research/tutorial/opt_int.htm
http://www.arri.de/fileadmin/media/arri.com/downloads/Camera/Tutorials/SystemsTechnologyBrochure.pdf
http://en.wikipedia.org/wiki/Image_resolution

Thank You