This unit introduces remote sensing as the process of gathering information about the Earth from a distance, primarily using satellite and aerial technologies. It covers the historical development, key components such as energy sources, electromagnetic spectrum, and types of sensors. The unit also explores active versus passive remote sensing, atmospheric effects, energy interactions with surface features, and data acquisition techniques. Additionally, it discusses satellite types, resolutions, and recent advancements in satellite technology and their applications.
UNIT 1 – Components of Remote Sensing
Introduction
Humans are closely associated with remote sensing in day-to-day activities, collecting and inferring useful information about the surroundings sensed through the eyes. Our eyes act as sensors limited to recording only the visible portion of electromagnetic energy, and our brain acts as a processing unit that stores the viewed information for a limited time. This limitation led mankind to develop techniques capable of acquiring information about an object or phenomenon over almost the entire range of the electromagnetic spectrum. The data so acquired are stored on some medium (e.g. DVDs, CDs) for future interpretation and analysis. Present-day sensors are installed on board satellite platforms, are capable of imaging large portions of the Earth, and continuously transmit the digital data electronically to ground stations. The science of remote sensing has continuously evolved in its data acquisition methods, its data processing techniques, and the variety of applications it serves.
Remote sensing technology has advanced particularly towards a variety of applications related to land, water and the atmosphere, e.g. water resources development and management, soil and mineral exploration, agricultural and land use practices, air quality monitoring, disaster management and mitigation, ocean studies and many more.
Definition
Our eyes are an excellent example of a remote sensing device: they gather information about the surroundings by judging the amount and nature of visible light energy from some external source (such as the Sun or an artificial light source) as it reflects off objects in our field of view. Gathering information where the sensor and the object are not in direct contact with each other may be termed remote sensing; examples include reading a newspaper, looking across a window, or taking in a perspective view from a high terrace. In contrast, a thermometer, which must be in contact with the phenomenon it measures, is not a remote sensing device. In the broadest sense, remote sensing can be defined as the science of acquiring information about the Earth using instruments that are not in direct contact with the Earth's surface or features, usually from aircraft or satellites. The instrument aboard the satellite or aircraft is usually a sensor capable of acquiring information across the electromagnetic spectrum (visible light, infrared, microwave/radar, etc.). Remote sensing offers the ability to observe and collect data for large areas relatively quickly, and it is an important data source for Geographical Information Systems (GIS).
Components of RS
Every remote sensing process involves an interaction of incident radiation with the target of interest: the radiation falling on the target is altered according to the physical properties of the target, and the reflected radiation is recorded by the sensor. This is illustrated by imaging systems (referred to as optical remote sensing), in which the following seven elements of remote sensing are involved. It should also be noted that remote sensing involves the sensing of emitted energy and the use of non-imaging sensors as well (referred to as thermal remote sensing).
Components of RS
i) Source of Illumination (I) – the foremost requirement for any remote sensing process is an energy source which illuminates, or provides electromagnetic energy to, the target of interest.
ii) Radiation and the Atmosphere (II) – as the energy propagates from its source to the target, it interacts with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target back to the sensor.
iii) Interaction with the Target (III) – once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the characteristics of both the target and the radiation.
iv) Recording of Energy by the Sensor (IV) – after the energy has been scattered by or emitted from the target, a sensor is required to collect and record the electromagnetic radiation.
v) Transmission, Reception, and Processing (V) – the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
vi) Interpretation and Analysis (VI) – the processed image is interpreted, visually or digitally, to extract information about the target which was illuminated.
vii) End Users and Applications (VII) – the final element of the remote sensing process is achieved when the information extracted from the imagery reveals something new or assists in solving a particular problem.
Components of remote sensing.
Energy source and electromagnetic spectrum
The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). This energy is in the form of electromagnetic radiation. Two characteristics of electromagnetic radiation are particularly important for understanding remote sensing: wavelength and frequency. The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests. Frequency refers to the number of cycles of a wave passing a fixed point per unit of time. The two are related through the speed of light: c = λν. Understanding the characteristics of electromagnetic radiation in terms of wavelength and frequency is crucial to understanding the information to be extracted from remote sensing data.
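As a quick numerical illustration of this relation (a minimal Python sketch, not part of the original slides; the example wavelengths are chosen only for illustration), the frequency for any wavelength follows directly from ν = c/λ:

# c = wavelength * frequency, so frequency = c / wavelength
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m):
    """Return the frequency (Hz) for a wavelength given in metres."""
    return C / wavelength_m

# Example wavelengths: blue light, red light and a microwave (radar) band
for label, wl in [("blue 0.45 um", 0.45e-6),
                  ("red 0.67 um", 0.67e-6),
                  ("microwave 5.6 cm", 0.056)]:
    print(label, "->", f"{frequency_hz(wl):.3e} Hz")

Visible light thus corresponds to frequencies of the order of 10^14 Hz, while microwaves used by radar sensors lie in the GHz range.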
Energy Sources
Platforms and Sensors
A platform is the stage on which a sensor or camera is mounted to acquire information about a target under investigation; in other words, it is the vehicle from which the sensor is operated. For remote sensing applications, sensors should be mounted on suitable, stable platforms. As the platform height increases, the spatial resolution becomes coarser while the observational coverage increases.
Types of platforms:
1. Ground-based platforms
2. Air-borne platforms
3. Space-borne platforms
Ground-based Platforms
Ground-based platforms are used to record detailed information about objects or features of the Earth's surface. They are developed for scientific understanding of signal–object and signal–sensor interactions, and include both laboratory and field studies used in designing sensors and in the identification and characterization of land features. Example: a crane-mounted sensor.
Air-borne Platforms
Airborne platforms were the sole non-ground-based platforms for early remote sensing work. Aircraft remote sensing systems may also be referred to as sub-orbital, airborne, or aerial remote sensing systems.
At present, airplanes are the most common airborne platform; other observation platforms include balloons, drones and rockets.
Space-borne Platforms
In space-borne remote sensing, sensors are mounted on board a spacecraft (space shuttle or satellite) orbiting the Earth. Space-borne or satellite platforms involve a high one-time cost but a relatively low cost per unit area of coverage, and they can acquire imagery of the entire Earth without requiring permission. Space-borne imaging ranges from altitudes of about 250 km to 36,000 km.
a. Spacecraft as platform: Remote sensing is also conducted from the space shuttle or artificial satellites. Artificial satellites are man-made objects which revolve around another object. A satellite can cover much more of the land surface than an aircraft and can monitor areas on a regular basis. Later, with the LANDSAT and SPOT satellite programmes, space photography received a higher impetus.
Active and Passive Remote Sensing
Passive sensors: Passive sensors do not have their own source of energy. The Earth's surface is illuminated by solar energy, and the sensor receives either the solar energy reflected from the surface or the electromagnetic energy emitted by the surface itself. A photographic camera is a passive sensor when it is used in sunlight, without its flash.
Active sensors: In active remote sensing, energy is generated and sent from the remote sensing platform towards the targets, and the energy reflected back from the targets is recorded by sensors on board the platform. Most microwave remote sensing is done through active remote sensing.
Effect of the Atmosphere
In many respects, remote sensing can be thought of as a reading process. Using various sensors, we remotely collect data that are analysed to obtain information about the objects, areas or phenomena being investigated. In most cases the sensors are air-borne or space-borne electromagnetic sensors used for inventorying. The sensors record the energy reflected or emitted by the target features. In remote sensing, all radiation traverses the atmosphere for some distance before reaching the sensor. As the radiation passes through the atmosphere, the gases and particles in the atmosphere interact with it, causing changes in its magnitude, wavelength, velocity, direction and polarization.
Energy Interactions
The radiation from the energy source passes through some distance of atmosphere before being detected by the remote sensor. The distance travelled by the radiation through the atmosphere is called the path length, which varies with the remote sensing technique and the energy source. The effect of the atmosphere on the radiation depends on the properties of the radiation (such as magnitude and wavelength), the atmospheric conditions and the path length. The intensity and spectral composition of the incident radiation are altered by these atmospheric effects. The interaction of electromagnetic radiation with atmospheric particles may be a surface phenomenon (e.g. scattering) or a volume phenomenon (e.g. absorption). Scattering and absorption are the main processes that alter the properties of electromagnetic radiation in the atmosphere.
Spectral Reflectance Curve for Vegetation
The spectral reflectance curve for healthy green vegetation exhibits a characteristic "peak-and-valley" shape. The peaks indicate strong reflection and the valleys indicate predominant absorption of energy in the corresponding wavelength bands. In general, healthy vegetation is a very good absorber of electromagnetic energy in the visible region. The valleys in the visible portion of the spectrum are due to the pigments in plant leaves: the palisade cells containing sacs of green pigment (chlorophyll) strongly absorb energy in the wavelength bands centred at 0.45 μm and 0.67 μm (blue and red). Reflection, on the other hand, peaks for green in the visible region, which makes our eyes perceive healthy vegetation as green. However, only 10–15% of the incident energy is reflected in the green band.
Healthy vegetation therefore shows a brighter response in the NIR region than in the green region. As leaf structure varies considerably between plant species, reflectance measurements in this range often permit discrimination between species, even when they look the same at visible wavelengths. If a plant is subjected to some form of stress that interrupts its normal growth and productivity, it may decrease or cease chlorophyll production. The result is less absorption in the blue and red bands in the palisade layer; the red and blue bands are then reflected along with the green band, giving stressed vegetation a yellow or brown colour. In stressed vegetation the NIR bands are also no longer reflected by the mesophyll cells but are instead absorbed by the stressed or dead cells, causing dark tones. Beyond 1.3 μm, energy incident upon plants is essentially absorbed or reflected, with little to no transmittance.
Similar to reflection and absorption, transmittance of electromagnetic radiation by vegetation also varies with wavelength: it is low in the visible region and increases in the infrared region. Vegetation canopies generally display a layered structure, so the energy transmitted by one layer is available for reflection or absorption by the layers below it. Due to this multi-layer reflection, the total infrared reflection from thicker canopies is greater than from thin canopy cover, and the density of the vegetation canopy can therefore be interpreted from the reflected NIR. As the reflectance in the IR bands of the EMR spectrum varies with leaf structure and canopy density, measurements in the IR region can be used to discriminate tree or vegetation species.
Spectral Reflectance of Soil
Some of the factors affecting soil reflectance are moisture content, soil texture (proportion of sand, silt and clay), surface roughness, the presence of iron oxide, and organic matter content. These factors are complex, variable and interrelated. For example, the presence of moisture in soil decreases its reflectance; as with vegetation, this effect is greatest in the water absorption bands at 1.4, 1.9 and 2.7 μm. Similar absorption characteristics are displayed by clay soils, which have hydroxyl-ion absorption bands at 1.4 and 2.2 μm. Soil moisture content is strongly related to soil texture: coarse, sandy soils are usually well drained, resulting in low moisture content and relatively high reflectance, while poorly drained, fine-textured soils generally have lower reflectance. In the absence of water, however, the soil itself exhibits the reverse tendency, i.e. coarse-textured soils appear darker than fine-textured soils.
Spectral Reflectance of Water
Water provides a semi-transparent medium for electromagnetic radiation; the radiation may be reflected, transmitted or absorbed in water. The spectral responses vary with the wavelength of the radiation and the physical and chemical characteristics of the water. Water in liquid form shows high reflectance in the visible region between 0.4 μm and 0.6 μm, while wavelengths beyond 0.7 μm are almost completely absorbed. Thus clear water appears in darker tones in an NIR image. Locating and delineating water bodies with remote sensing data is therefore done more easily at reflected infrared wavelengths because of this absorption property. The reflectance from a water body can stem from an interaction with the water's surface (specular reflection), with material suspended in the water, or with the bottom of the water body. Even in deep water, where bottom effects are negligible, the reflectance properties of a water body are a function not only of the water itself but also of the material in the water.
The reflectance of water changes with the chlorophyll concentration involved: an increase in chlorophyll concentration tends to decrease reflectance at blue wavelengths and increase reflectance at green wavelengths. These changes have been used in remote sensing to monitor the presence of algae and to estimate their concentration. Variation in spectral reflectance in the visible region can be used to differentiate shallow from deep water, clear from turbid water, and rough from smooth water bodies. Reflectance in the NIR range is generally used for delineating water bodies and for studying algal blooms and phytoplankton concentration in water.
Some of the important applications of remote sensing technology include:
a) Environmental monitoring and assessment (global warming etc.)
b) Land use and land cover change detection and monitoring
c) Prediction of agricultural yield and crop health monitoring
d) Sustainable resource exploration and management
e) Ocean and wetland studies
f) Weather forecasting
g) Defence and military surveillance
h) Broadcasting and telecommunication
SATELLITES
When a satellite is launched into space, it moves in a well-defined path around the Earth, called the orbit of the satellite. The gravitational pull of the Earth and the velocity of the satellite are the two basic factors that keep a satellite in a particular orbit. The spatial and temporal coverage of the satellite depends on the orbit. There are three basic types of orbits in use:
1. Geo-synchronous orbits
2. Polar or near-polar orbits
3. Sun-synchronous orbits
Characteristics of satellite orbits:
• Orbital period
• Altitude
• Apogee and perigee
• Inclination
• Nadir, ground track and zenith
Geosynchronous Orbit
A geostationary or geosynchronous orbit is one in which the time required for the satellite to complete one revolution is the same as that for the Earth to rotate once about its polar axis. In order to achieve this orbital period, geosynchronous orbits are at very high altitude, nearly 36,000 km. Geostationary orbits lie in the equatorial plane, i.e. with an inclination of 0 degrees, so from a point on the equator the satellite appears to be stationary. The satellites revolve in the same direction as the Earth (west to east). Geostationary or geosynchronous orbits are used for communication and meteorological satellites. Examples: INSAT, Meteosat, GOES, GMS etc.
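The ~36,000 km figure can be checked with Kepler's third law. The short Python sketch below is not from the slides; it uses standard assumed constants and finds the altitude at which the orbital period equals one sidereal day:

import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2 (assumed standard value)
R_EQUATOR = 6378137.0       # Earth's equatorial radius, m
T_SIDEREAL = 86164.1        # length of one sidereal day, s

# Kepler's third law: T^2 = 4*pi^2 * a^3 / mu  ->  a = (mu * T^2 / (4*pi^2))^(1/3)
a = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
print(f"Geostationary altitude ~ {(a - R_EQUATOR) / 1000:.0f} km")   # about 35,786 km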
Polar (or Near-Polar) Orbits
These orbits have an inclination of nearly 90 degrees. Polar orbits are usually medium or low orbits (approximately 700–800 km) compared to geosynchronous orbits; consequently the orbital period is shorter, typically 90–103 minutes, and satellites in polar orbits make more than one revolution around the Earth in a single day. The National Oceanic and Atmospheric Administration (NOAA) series of satellites, such as NOAA-17 and NOAA-18, are examples of polar-orbiting satellites. Taking advantage of the rotation of the Earth on its own axis, a new segment of the Earth comes under the satellite's view on each pass. The satellite's orbit and the rotation of the Earth work together to allow complete coverage of the Earth's surface once the satellite has completed one full cycle of orbits.
Sun-synchronous Orbits
A sun-synchronous orbit is a special case of a polar orbit. Like a polar orbit, the satellite travels from the north to the south pole as the Earth turns below it. In a sun-synchronous orbit, the satellite passes over the same part of the Earth at roughly the same local time each day. These orbits lie between about 700 and 800 km altitude and are used for satellites that need a constant amount of sunlight on the scene. A typical sun-synchronous satellite completes about 14 orbits a day, and each successive orbit is shifted over the Earth's surface by around 2875 km at the equator. The satellite's ground track is also shifted westward in longitude by about 1.17 degrees (approximately 130.5 km at the equator) every day. Landsat and IRS satellites are typical examples of sun-synchronous, near-polar satellites; the early Landsat missions had a repeat cycle of 18 days, completing about 14 orbits each day.
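The spacing and drift figures quoted above can be reproduced from the orbit geometry. The Python sketch below is only illustrative and assumes a Landsat-1 style orbit of 251 revolutions per 18-day repeat cycle; it relates the number of orbits per day to the equatorial spacing of successive ground tracks and converts the 1.17° daily drift into kilometres:

import math

R_EQUATOR_KM = 6378.137
equator_km = 2 * math.pi * R_EQUATOR_KM           # equatorial circumference, ~40,075 km

# Assumed Landsat-1 style orbit: 251 revolutions per 18-day repeat cycle
orbits_per_day = 251 / 18                         # ~13.9 (roughly the "14 orbits a day" above)
track_spacing_km = equator_km / orbits_per_day    # spacing of successive ground tracks at the equator
daily_drift_km = equator_km * 1.17 / 360          # the 1.17 deg/day westward shift expressed in km

print(f"Orbits per day          : {orbits_per_day:.2f}")
print(f"Track spacing at equator: {track_spacing_km:.0f} km")   # ~2875 km
print(f"Daily westward drift    : {daily_drift_km:.1f} km")     # ~130 km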
Resolution
Resolution is defined as the ability of an entire remote sensing system, including the lens, antenna, display, exposure, processing and other factors, to render a sharply defined image. The resolution of a remote sensing system is of different types:
• Spectral resolution
• Radiometric resolution
• Spatial resolution
• Temporal resolution
Spatial Resolution
A digital image consists of an array of pixels, each containing information about a small area on the land surface that is treated as a single object. Spatial resolution is a measure of the area or size of the smallest dimension on the Earth's surface over which an independent measurement can be made by the sensor; it is expressed as the size of the pixel on the ground in metres. A measure of pixel size is given by the Instantaneous Field of View (IFOV). The IFOV is the angular cone of visibility of the sensor, i.e. the area on the Earth's surface that is seen at one particular moment in time. The IFOV depends on the altitude of the sensor above the ground and the viewing angle of the sensor.
The size of the area viewed on the ground can be obtained by multiplying the IFOV (in radians) by the distance from the sensor to the ground (illustrated in the sketch below). This area on the ground is called the ground resolution or ground resolution cell, and is also referred to as the spatial resolution of the remote sensing system. For a homogeneous feature to be detected, its size generally has to be equal to or larger than the resolution cell. If more than one feature is present within the IFOV or ground resolution cell, the recorded signal is a mixture of the signals from all the features; because the average brightness of all features in that resolution cell is recorded, any one particular feature among them may not be detectable. Based on spatial resolution, satellite systems can be classified as follows:
• Low resolution systems
• Moderate resolution systems
• High resolution systems
• Very high resolution systems
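A minimal sketch of the IFOV relation described above, using purely hypothetical sensor values (a 0.1 mrad IFOV flown at 800 km altitude), might look like this:

def ground_cell_size_m(ifov_rad, altitude_m):
    """Ground resolution cell = IFOV (in radians) x sensor-to-ground distance."""
    return ifov_rad * altitude_m

# Hypothetical example: 0.1 mrad IFOV at 800 km altitude -> 80 m ground cell
print(ground_cell_size_m(0.1e-3, 800e3))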
Remote sensing systems with spatial resolution coarser than 1 km are generally considered low resolution systems; MODIS and AVHRR are examples of such coarse resolution sensors. Systems with spatial resolution of 100 m to 1 km are considered moderate resolution systems, e.g. IRS WiFS (188 m) and band 6 (the thermal infrared band) of Landsat TM (120 m). Systems with spatial resolution approximately in the range 5–100 m are classified as high resolution systems, e.g. Landsat ETM+ (30 m) and IRS LISS-III. Very high resolution systems are those which provide finer than 5 m spatial resolution, e.g. GeoEye and QuickBird. The ratio of distance on an image or map to the actual ground distance is referred to as scale; large-scale maps/images provide finer spatial resolution than small-scale maps/images.
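These classes can be written down as a simple lookup. The function below is only an illustration of the thresholds listed above; the function name and the example pixel sizes are invented for this sketch:

def resolution_class(pixel_size_m):
    """Classify a system by its ground pixel size, using the thresholds above."""
    if pixel_size_m < 5:
        return "very high resolution (< 5 m), e.g. GeoEye, QuickBird"
    if pixel_size_m <= 100:
        return "high resolution (5-100 m), e.g. Landsat ETM+, IRS LISS-III"
    if pixel_size_m <= 1000:
        return "moderate resolution (100 m - 1 km), e.g. IRS WiFS, Landsat TM band 6"
    return "low resolution (> 1 km), e.g. MODIS, AVHRR"

for size in (0.6, 30, 188, 1100):
    print(size, "m ->", resolution_class(size))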
Spectral Resolution
Spectral resolution represents the spectral bandwidth of the filter and the sensitivity of the detector. It may be defined as the ability of a sensor to define fine wavelength intervals, or the ability of a sensor to resolve the energy received within a spectral bandwidth so as to characterize different constituents of the Earth's surface. The finer the spectral resolution, the narrower the wavelength range for a particular channel or band. Many remote sensing systems are multi-spectral, recording energy over several separate wavelength ranges at various spectral resolutions. A recent development is hyper-spectral sensors, which detect hundreds of very narrow spectral bands. Generally, surface features can be distinguished better from multiple narrow bands than from a single wide band.
Radiometric Resolution
The radiometric resolution of a sensor is a measure of how many grey levels are measured between pure black (no reflectance) and pure white. In other words, radiometric resolution represents the sensitivity of the sensor to the magnitude of the electromagnetic energy. The finer the radiometric resolution of a sensor, the more sensitive it is to small differences in reflected or emitted energy; in other words, the system can measure a greater number of grey levels. Radiometric resolution is expressed in bits: the finer the radiometric resolution, the more grey levels the system can record and hence the more detail that can be captured in the image.
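Because radiometric resolution is expressed in bits, the number of recordable grey levels is simply 2 raised to the bit depth. A tiny illustrative sketch (the bit depths are chosen only as examples):

def grey_levels(bits):
    """Number of grey levels a sensor can record = 2 ** bit depth."""
    return 2 ** bits

for bits in (6, 8, 11):
    print(f"{bits}-bit quantisation -> {grey_levels(bits)} grey levels")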
Temporal Resolution
Temporal resolution describes how often an object is sampled, i.e. how often data are obtained for the same area. The absolute temporal resolution of a remote sensing system, the time taken to image the same area at the same viewing angle a second time, is equal to the repeat cycle of the satellite. The effective temporal resolution of a sensor, however, also depends on a variety of factors, including the satellite/sensor capabilities, the swath overlap, and latitude: because of some degree of overlap in the imaging swaths of adjacent orbits, more frequent imaging of some areas is possible.
Images of the same area of the Earth's surface at different periods of time show the variation in the spectral characteristics of different features or areas over time. Such multi-temporal data are essential for studies such as the following (a short illustrative sketch follows the list):
• Land use / land cover classification
• Temporal variation in land use / land cover
• Monitoring of dynamic events such as:
  – Cyclones
  – Floods
  – Volcanoes
  – Earthquakes
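As one illustration of how multi-temporal data might be used, the sketch below is not from the slides: it applies a simple band-differencing approach, with made-up NIR reflectance values and an arbitrary threshold, to flag pixels whose response changed strongly between two co-registered acquisitions, as might happen when flood water covers previously dry land:

import numpy as np

def change_mask(band_t1, band_t2, threshold=0.1):
    """Flag pixels whose reflectance changed by more than `threshold`
    between two co-registered images of the same area."""
    diff = band_t2.astype(float) - band_t1.astype(float)
    return np.abs(diff) > threshold

# Hypothetical 3x3 NIR reflectance patches from two dates
nir_date1 = np.array([[0.40, 0.42, 0.41],
                      [0.39, 0.40, 0.43],
                      [0.41, 0.40, 0.42]])
nir_date2 = np.array([[0.41, 0.10, 0.12],   # low NIR suggests standing water
                      [0.40, 0.11, 0.42],
                      [0.42, 0.40, 0.41]])
print(change_mask(nir_date1, nir_date2))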