Atmospheric Effects and Corrections
Lecture 6
Terminology
• Radiant flux
• Irradiance
• Radiance
• Reflection
• Transmittance
Radiance (LT) from paths 1, 3, and 5 contains intrinsic valuable spectral information about the target of interest.
Conversely, the path radiance (Lp) from paths 2 and 4 includes diffuse sky irradiance or radiance from neighboring areas on the ground. This path radiance generally introduces unwanted radiometric noise in the remotely sensed data and complicates the image interpretation process.
Radiance received at a remote sensor
• Path 1 contains spectral solar irradiance ( Eo) that was attenuated very little before illuminating the terrain within the IFOV.
• We are interested in the solar irradiance from a specific solar zenith angle ( θo)
• The amount of irradiance reaching the terrain is a function of the atmospheric transmittance at this angle (Tθo).
• If all of the irradiance makes it to the ground, then the atmospheric transmittance equals one. If none of the irradiance makes it to the ground, then the atmospheric transmittance is zero.
Radiance received at a remote sensor
• Path 2 contains spectral diffuse sky irradiance (Ed) that never reaches the target study area because of scattering in the atmosphere.
• This energy is often scattered into the IFOV of the sensor system.
• Rayleigh scattering of blue light contributes much to this diffuse sky irradiance. Hence the blue-band image produced by a remote sensor system is often much brighter than any of the other bands and contains much unwanted diffuse sky irradiance that was scattered into the IFOV of the sensor system.
• Therefore, if possible, we want to minimize its effects. This quantity is referred to as the upward reflectance of the atmosphere (Edu).
Radiance received at a remote sensor
• Path 3 contains modified energy from the Sun that has undergone some Rayleigh, Mie, and/or nonselective scattering and perhaps some absorption and reemission before illuminating the study area.
• Its spectral composition and polarization may be somewhat different from the energy that reaches the ground from path 1.
• This quantity is also referred to as the downward reflectance of the atmosphere (Edd).
Radiance received at a remote sensor
• Path 4 contains radiation that was reflected or scattered by nearby terrain covered by snow, concrete, soil, water, and/or vegetation into the IFOV of the sensor system.
• The energy does not actually illuminate the study area of interest. Therefore, if possible, we would like to minimize its effects.
• Path 2 and Path 4 combine to produce what is commonly referred to as Path Radiance, Lp.
Radiance received at a remote sensor
Path 5 is energy that was also reflected from nearby terrain into the atmosphere, but then scattered or reflected onto the study area.
Generally insignificant.
Radiance received at a remote sensor
Images are arrays of pixels, where each pixel is represented by a brightness value or grey level, generally between 0 and 255. These values are called DNs.
We can determine the radiance at the sensor for any pixel from its DN value, between 0 and 255:
Lpxl = (k × DNpix) + Lmin,  where  k = (Lmax − Lmin) / DNmax

Lmax and Lmin are the maximum and minimum measurable radiances of the sensor; k and Lmin are also called the gain and offset of the detector. This information is provided by the sensor manufacturer.
Radiance received at a remote sensor

BAND                1      2      3      4      5      7
Lmin (W/m²/sr/μm)  -1.5   -2.8   -1.2   -1.5   -0.37  -0.15
Lmax (W/m²/sr/μm)  152.1  296.8  204.3  206.2  27.19  14.38

Preflight TM-4 and TM-5 spectral range values (from NASA, 1986, Table C-8)
DN value of a pixel in bands 1 and 7 is 100; the maximum DN value in both bands is 255.

Radiance at the pixel in band 1? In band 7?

For band 1: k = (152.1 + 1.5)/255 = 0.602353;  Lpix = (100 × 0.602353) − 1.5 = 58.74 W/m²/sr/μm

For band 7: k = (14.38 + 0.15)/255 = 0.05698;  Lpix = (100 × 0.05698) − 0.15 = 5.55 W/m²/sr/μm
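The DN-to-radiance conversion above can be sketched in a few lines of Python; the gain/offset values are the preflight TM band 1 and band 7 figures from the table, and the function name is illustrative:

```python
def dn_to_radiance(dn, l_min, l_max, dn_max=255):
    """Convert a digital number to at-sensor spectral radiance (W/m2/sr/um).

    k = (L_max - L_min) / DN_max is the detector gain; L_min is the offset.
    """
    k = (l_max - l_min) / dn_max
    return k * dn + l_min

# Preflight TM values: band 1 (L_min = -1.5, L_max = 152.1),
# band 7 (L_min = -0.15, L_max = 14.38); DN = 100 in both bands.
radiance_b1 = dn_to_radiance(100, -1.5, 152.1)    # ~58.7 W/m2/sr/um
radiance_b7 = dn_to_radiance(100, -0.15, 14.38)   # ~5.5 W/m2/sr/um
```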
Radiance received at a remote sensor

The irradiance (Esunλ) of the Sun at a specific wavelength (λ) and a solar zenith angle of θ0 is Esunλ cos θ0.

Remote sensing systems sense wavebands rather than specific wavelengths. The available irradiance (Eo) in a specific waveband between λ1 and λ2 in the area of interest is

Eo = ∫ from λ1 to λ2 of (Esunλ cos θ0 / d²) dλ   or   Eo = EsunΔλ cos θ0 Δλ / d²

where Δλ = λ2 − λ1 is very small and EsunΔλ is the average irradiance in the band Δλ; d (in AU) accounts for the varying distance of the Earth from the Sun. If the reflectance of the pixel of interest is R, then the radiant exitance of the pixel is

Epxl = (Esun cos θ0 / d²) R

We know that, for a Lambertian surface, exitance and radiance are related by

Epxl = ∫ from 0 to 2π ∫ from 0 to π/2 of Lpxl cos θ sin θ dθ dφ = π Lpxl

However, the atmosphere scatters and absorbs a proportion of the solar irradiance. If the scattered or diffuse sky irradiance is Ed and Tθo is the atmospheric transmission, i.e., the proportion of radiance transmitted by the atmosphere, in the direction θ0, then the total irradiance at the pixel is

Eincident = (Esun Tθo cos θ0 + Ed) / d²

R is the reflectance. The radiance from the pixel due to this irradiance is

Lpxl = {(Esun Tθo cos θ0 + Ed) R} / (π d²)

If the atmospheric transmission in the direction θv is Tθv, then the radiance Lsensor arriving at the sensor after traversing the atmosphere is

Lsensor = Tθv {(Esun Tθo cos θ0 + Ed) R} / (π d²) + Lpath

where Lpath is the path radiance.
Radiance received at a remote sensor

How do we recover the reflectance R of the pixel from Lsensor?

Lsensor − Lpath = Tθv {(Esun Tθo cos θ0 + Ed) R} / (π d²)

Assume Ed is negligible:

Lsensor − Lpath = Tθv Tθo Esun cos θ0 R / (π d²)

Solving for the reflectance,

R = π d² (Lsensor − Lpath) / (Tθv Tθo Esun cos θ0)

Here d, Esun, and θ0 are known, and Lsensor follows from the DN value; Lpath and the transmittances Tθv and Tθo must still be estimated.
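The inversion for reflectance can be sketched as follows. This is a minimal illustration assuming Ed is negligible; the transmittance, path-radiance, and radiance inputs are hypothetical, and the function name is illustrative:

```python
import math

def surface_reflectance(l_sensor, l_path, e_sun, theta0_deg,
                        t_v=1.0, t_o=1.0, d_au=1.0):
    """Invert the at-sensor radiance equation for reflectance R,
    assuming the diffuse sky irradiance Ed is negligible:
    R = pi * d^2 * (L_sensor - L_path) / (T_v * T_o * E_sun * cos(theta0))."""
    cos_t0 = math.cos(math.radians(theta0_deg))
    return (math.pi * d_au ** 2 * (l_sensor - l_path)
            / (t_v * t_o * e_sun * cos_t0))

# hypothetical inputs: band-1 radiance 58.7, no path radiance,
# perfectly transmitting atmosphere, 30-degree solar zenith angle
r = surface_reflectance(l_sensor=58.7, l_path=0.0,
                        e_sun=1997.0, theta0_deg=30.0)
```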
Bidirectional Reflectance Distribution Function
The bidirectional reflectance distribution function (BRDF) is a theoretical concept that describes the relationship between 1) the geometric characteristics of the solar irradiance, and 2) the remote sensing system viewing geometry; hence the bidirectional terminology (Sandmeier, 1996; Jensen, 2000)
f(θi, φi; θr, φr; λ) = dLr(θi, φi; θr, φr; λ) / dEi(θi, φi; λ)
Bidirectional Reflectance Distribution Function

It is very difficult to acquire BRDF information about a surface because a) the Sun is constantly moving across the sky, and/or b) it is difficult to acquire multiple images of the terrain from various angles of view in a short period of time.

This problem resulted in the invention of the goniometer: a specialized instrument that measures spectral reflectance in a specified number of directions distributed throughout the hemisphere above a particular surface in a very short time (5-10 minutes), allowing scientists to generate a useful BRDF for that surface.
Bidirectional Reflectance Distribution Function
Bidirectional reflectance factor (R)

R(θi, φi; θr, φr; λ) = [dLr(θi, φi; θr, φr; λ) / dLref(θi, φi; θr, φr; λ)] × Rref(θi, φi; θr, φr; λ)

dLr is the energy reflected from a surface in a specific direction, divided by the radiance dLref reflected from a loss-less Lambertian reference panel measured under identical illumination geometry. Rref is a calibration coefficient determined for the spectral reflectance panel used.
The bidirectional reflectance factor (R) is then normalized to an anisotropy factor (ANIF) to analyze the spectral variability in BRDF data.

The ANIF is calculated by normalizing the bidirectional reflectance data R to the nadir reflectance Ro using the equation (Sandmeier et al., 1998a; Sandmeier and Itten, 1999)

ANIF(θi, φi; θr, φr; λ) = R(θi, φi; θr, φr; λ) / Ro(λ)
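The ANIF normalization is a simple ratio; a minimal sketch with hypothetical reflectance values (the function name is illustrative):

```python
def anif(r_directional, r_nadir):
    """Anisotropy factor: bidirectional reflectance factors normalized
    to the nadir reflectance Ro; a Lambertian surface gives ANIF = 1."""
    return [r / r_nadir for r in r_directional]

# hypothetical reflectances at several view angles, nadir value 0.20
factors = anif([0.18, 0.20, 0.26], r_nadir=0.20)
```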
Bidirectional Reflectance Distribution Function (Jensen and Schill, 2000)

[Figures: measured BRDF plots showing reflectance at view zenith angles from −75° to +75° and view azimuth angles from 0° to 270°; atmospheric absorption bands are marked.]
Bidirectional Reflectance Distribution Function

An understanding of BRDF is needed in remote sensing to correct for Sun illumination angle and sensor viewing angle effects for:
• Mosaicking images,
• Deriving albedo,
• Improving land cover classification accuracy,
• Enhancing cloud detection,
• Correcting for atmospheric conditions, and
• Identifying bands that are least impacted by BRDF and recognizing optimal sun/sensor angles of view
(Myneni et al., 1995; Woodcock et al., 1997).
Bidirectional Reflectance Distribution Function
Accurate computation of the BRDF is required for:

• Making corrections to reflectance measurements of features measured from nadir or off-nadir pointing remote sensing systems.
• Identifying bands that are least impacted by BRDF, recognizing optimal sun/sensor angles of view, and providing insight into radiometrically adjusting remotely sensed data to minimize BRDF effects.
Objectives of atmospheric corrections
The overarching goal of remote sensing: to identify the composition of objects on the ground from remote sensing data.

Spectral reflectance curves are used for this purpose. However, radiance-at-the-sensor is contaminated by path radiance due to the atmosphere, so spectral reflectance estimated directly from remote sensing data is incorrect.

We have to correct the radiance-at-the-sensor to remove atmospheric effects.
When is the atmospheric correction really required??
• Mono-temporal data: NO
• Classification: NO
• Change monitoring and detection: YES
• Composition mapping, spectral analysis: YES
Radiometric calibration

DN
↓ Sensor calibration: gain and offset
Radiance at sensor
↓ Atmospheric correction: image measurement, ground measurements, atmospheric models, sensor view path atmospheric radiance, sensor view path atmospheric transmittance
Radiance at ground
↓ Solar and topographic correction: solar exo-atmospheric spectral irradiance, solar path atmospheric transmittance, down-scattered radiance, solar angle, DEM
Surface reflectance
Atmospheric corrections: Techniques
• Histogram minimum method aka dark object subtraction – the bootstrap approach
• Empirical line method
• Radiative transfer models – Physical-based approach
Estimation of LP : Dark object subtraction
• Dark-object subtraction techniques derive the corrected DN (digital number) values solely from the digital data, with no outside information.
• This type of correction involves subtracting a constant DN value from the entire digital image.
• The assumption is that there is a high probability that at least a few pixels within an image should be black (0% reflectance). If there are no pixels with zero values, that is the effect of atmospheric scattering.
• For example, there are about 45 million pixels in a single TM band, so there is a very high probability that at least one of them should be black.
Estimation of LP : Dark object subtraction
LANDSAT ETM+ BANDS

Band 1: 450–515 nm
Band 2: 525–605 nm
Band 3: 630–690 nm
Band 4: 775–900 nm
Band 5: 1550–1750 nm
Band 7: 2090–2350 nm

(Water absorption regions separate the infrared bands.)
Water bodies have ~0% reflectance in the IR region, hence zero DN.

Non-zero values over water bodies in the IR are a consequence of path radiance.

Subtract the non-zero value over water bodies from all pixels. That would make water bodies perfectly non-reflecting.

In the visible bands, shadows should be black in the absence of path radiance. Hence non-zero values over shadowed areas can be used for dark pixel correction.
Estimation of LP : Dark object subtraction
• Histograms of pixel values in all bands
• Pixel values of low-reflectance areas near zero:
  • exposures of dark-colored rocks
  • deep shadows
  • clear water
• The lowest pixel values in the visible and near-infrared are an approximation to the atmospheric path radiance
• These minimum values are subtracted from the image
Estimation of LP : Dark object subtraction
How will you calculate path radiance for all bands? For example, calculate the path radiance for a pixel whose DN value is 53 in band 1.
ETM+ Solar Spectral Irradiances, watts/(m² · μm)

Band 1: 1997
Band 2: 1812
Band 3: 1533
Band 4: 1039
Band 5: 230.8
Band 7: 84.90
Band 8: 1362
Earth–Sun distance (AU) by day of year:

Day  Distance | Day  Distance | Day  Distance | Day  Distance | Day  Distance
1    .98331   | 74   .99446   | 152  1.01403  | 227  1.01281  | 305  .99253
15   .98365   | 91   .99926   | 166  1.01577  | 242  1.00969  | 319  .98916
32   .98536   | 106  1.00353  | 182  1.01667  | 258  1.00566  | 335  .98608
46   .98774   | 121  1.00756  | 196  1.01646  | 274  1.00119  | 349  .98426
60   .99084   | 135  1.01087  | 213  1.01497  | 288  .99718   | 365  .98333
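Combining the calibration table with the dark-object assumption, the path radiance implied by a dark-object DN can be sketched as follows (DN = 53 is the example above; the function name is illustrative):

```python
def path_radiance(dn_dark, l_min, l_max, dn_max=255):
    """Dark-object subtraction: treat the radiance of the darkest pixel
    (which 'should' be zero-reflectance) as the path radiance Lp."""
    k = (l_max - l_min) / dn_max    # detector gain
    return k * dn_dark + l_min      # radiance of the dark object

# dark-object DN of 53 in TM band 1 (L_min = -1.5, L_max = 152.1)
lp_band1 = path_radiance(53, -1.5, 152.1)   # ~30.4 W/m2/sr/um
```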
• DN values of correlated bands are plotted
• A least-squares line is fit using standard regression methods
• The resulting offset is an approximation of the atmospheric path radiance and is subtracted from the image
Estimation of LP : Dark object subtractionRegression technique
Estimation of LP : Dark object subtractionRegression technique
Does it always work?
The key criterion of an atmospheric correction algorithm: quantify atmospheric influences on satellite image radiometry while remaining insensitive to surface reflection effects.
Estimation of LP : Dark object subtractionRegression technique
So how to correct this image?
Manually select several clear and hazy area pixels in the image
Two spectral bands are selected based on the following criteria:
• The spectral responses of different land cover types, under clear atmospheric conditions, should be highly correlated in the two bands. This will result in a well-defined surface response vector in spectral space called “clear line” (CL)
• The effect of haze should be markedly different in the two bands so that increased atmospheric contamination manifests in increased shift away from the CL
• Typically we would select blue and red bands
Apply a transformation whose coefficients define a direction orthogonal to the CL and whose response magnitude is proportional to the deviation from this line
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
Y. Zhang et al., 2002 (RSE)
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
Schematic diagram of the TM1–TM3 spectral space illustrating the conceptual components of the HOT. Under clear-sky conditions, radiances of common surface cover types, coded A–K, exhibit high correlation and define a 'clear line' (CL). The effect of haze of increasing optical depth, illustrated by the numerical sequences 1–18, is to cause pixels to 'migrate' away from the CL. The HOT quantifies the atmospheric contamination level at a pixel location by its perpendicular distance, in spectral space, from the CL.
Y. Zhang et al., 2002 (RSE)
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
Y. Zhang et al., 2002 (RSE)
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
1. Select two correlated bands (bands showing similar reflectance characteristics for all objects) that are affected to different degrees by scattering due to atmospheric components.
Example: Bands 1 (Blue) and 3 (Red) of ETM/TM
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
2. Mask out areas with obvious haze.
3. Select some very clear areas that are unaffected by clouds/haze.
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
4. Plot DN (Blue band – X axis) vs DN (Red band – Y axis) of pixels from clear area
[Scatter plot: DN of Band 1 (Blue) on the x-axis vs DN of Band 3 (Red) on the y-axis for clear-area pixels, with the fitted clear line.]
5. Fit the pixel DNs to the clear line generated by linear regression (slope = α, offset = β on the x-axis).
6. The haze vector is orthogonal to the clear line.
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
10. Calculate the HOT for all pixels as the offset of a pixel from the clear line in the haze-vector direction:

HOT = (DNblue − β) sin α − DNred cos α,  or, disregarding the offset,  HOT = DNblue sin α − DNred cos α
7. Plot the clear line.
8. Plot DN (Blue) vs DN (Red) for all pixels in the image.
9. The haze vector is orthogonal to the clear line, hence haze pixels can be identified.

[Scatter plot: DN of Band 1 (Blue) vs DN of Band 3 (Red) for all image pixels, showing the clear line (slope α, x-axis offset β) and the orthogonal haze vector.]
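The clear-line fit and HOT calculation can be sketched as follows. This is a minimal illustration of the HOT geometry, not the full Zhang et al. (2002) implementation; the pixel values are hypothetical:

```python
import math

def fit_clear_line(dn_blue, dn_red):
    """Least-squares fit DN_red = slope * DN_blue + intercept over clear pixels."""
    n = len(dn_blue)
    mx = sum(dn_blue) / n
    my = sum(dn_red) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(dn_blue, dn_red))
             / sum((x - mx) ** 2 for x in dn_blue))
    return slope, my - slope * mx

def hot(dn_blue, dn_red, slope):
    """HOT = DN_blue * sin(alpha) - DN_red * cos(alpha), alpha = atan(slope);
    this is the (offset-disregarding) distance from the clear line."""
    alpha = math.atan(slope)
    return dn_blue * math.sin(alpha) - dn_red * math.cos(alpha)

# hypothetical clear pixels defining the clear line, then a hazy pixel
slope, offset = fit_clear_line([20, 40, 60, 80], [30, 60, 90, 120])
h_clear = hot(40, 60, slope)   # pixel on the clear line -> HOT near zero
h_hazy = hot(90, 60, slope)    # pixel shifted off the line -> larger HOT
```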
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
11. Generate HOT Image and determine the HOT values for clear areas and hazy areas
(Not the same image as in the previous slide)
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
12. Plot HOT histogram for different HOT levels for clear and hazy areas
Clear areas
Haze areas
Increasing HOT => increasing haze
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
13. Plot histogram lower bound versus HOT for bands TM1–TM3
14. Estimate radiometric adjustment using a method similar to “dark object subtraction” to normalize the image to the radiometric level of the clearest areas.
From the Step 13 plot, note that for band TM1 (Blue) the histogram lower bound for clear pixels (HOT = 30) is approximately 20 DNs. Consider a hazy pixel with an observed HOT level of 40: it is a member of a histogram with a lower bound of 27. This implies that this hazy pixel should have its band 1 DN level reduced by 7 during the radiometric adjustment phase. This procedure can be used to adjust all bands for which the histogram analysis has been done.
Estimation of LP : Haze Removal Algorithm Haze Optimization Transform (HOT)
Results
Empirical line method

One dark (X1) and one bright (X2) object are selected on the image; both can be clearly identified on the ground as well.

Ground reflectances of X1 and X2 are measured using a field radiometer (R_X1 and R_X2).

Radiances-at-the-sensor of X1 and X2 are calculated from the image (L_X1 and L_X2).

The two points are plotted on a graph and joined by a line, and the slope (s) and intercept (a) of the line are measured.

The equation of the line is then used to convert all radiance values into reflectance values:

R = (L − a) s

where R is reflectance, a the offset, and s the slope.
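The empirical line fit through the two targets can be sketched as follows (all measurement values are hypothetical, and the function name is illustrative):

```python
def empirical_line(l_x1, r_x1, l_x2, r_x2):
    """Fit R = (L - a) * s through a dark (X1) and a bright (X2) target:
    s is the slope in reflectance per radiance, a the radiance where R = 0."""
    s = (r_x2 - r_x1) / (l_x2 - l_x1)
    a = l_x1 - r_x1 / s
    return s, a

# hypothetical field measurements: dark and bright calibration targets
s, a = empirical_line(l_x1=12.0, r_x1=0.10, l_x2=52.0, r_x2=0.50)
reflectance = (30.0 - a) * s   # convert any at-sensor radiance to reflectance
```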
Advantages and disadvantages of bootstrap techniques??
Model-based atmospheric corrections -Visibility
The farthest distance at which one can see a large, black object against the sky at the horizon. It is determined by:
• the optical properties of the atmosphere;
• the amount and distribution of light;
• the characteristics of the objects observed;
• the properties of the human eye.

Visibility is reduced by the absorption and scattering of light by both gases and particles; light scattering by particles is the most important phenomenon responsible for visibility degradation. In clean (background) atmospheric conditions one can see over distances of up to several hundred km; in polluted atmospheric conditions visibility can drop to around 10 km.
VIS = 3.912 / εext   (Koschmieder equation)

where εext is the extinction coefficient (per km) at 550 nm. It is the sum of the extinction coefficients of all gases and particles that attenuate light, and is therefore a measure of the loss of radiation per unit distance.
Model-based atmospheric corrections – Extinction Coefficient
Transmittance:  T = I / I₀ = transmitted radiation / incident radiation

Absorbance:  A = log₁₀(1/T) = −log₁₀ T,  or, for gases,  A = ln(1/T) = −ln T

Beer's Law: for monochromatic plane-parallel light entering a medium perpendicular to the surface of the medium,

T = e^(−εcL)  so that  A = εcL

where c is the molar concentration, L the light path length, and ε the molar absorptivity of the medium. Molar absorptivity is constant, and absorbance is proportional to concentration, for a given substance dissolved in a given solute and measured at a given wavelength. Molar absorptivities are commonly called molar extinction coefficients.

Units of the extinction coefficient: M⁻¹ cm⁻¹
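These relations can be sketched directly (the medium's ε, c, and L values are hypothetical):

```python
import math

def transmittance(epsilon, c, length):
    """Beer's law: T = exp(-epsilon * c * L) for plane-parallel light."""
    return math.exp(-epsilon * c * length)

def absorbance(t):
    """A = ln(1/T); under Beer's law this recovers epsilon * c * L."""
    return math.log(1.0 / t)

# hypothetical medium: epsilon = 0.5 M^-1 cm^-1, c = 2 M, path length 1 cm
t = transmittance(0.5, 2.0, 1.0)   # e^-1, about 0.37
a = absorbance(t)                  # recovers epsilon * c * L = 1.0
```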
Optical thickness or Optical depth

The optical depth expresses the quantity of light removed from a beam by scattering or absorption during its path through a medium.

If I₀ is the intensity of radiation at the source and I is the observed intensity after a given path, then the optical depth δ is defined by:

I = I₀ e^(−δ)

Hence T = e^(−δ) = e^(−εcL), i.e., δ ≈ εcL

δ = δMolecular-scattering + δaerosol + δMolecular-absorption
Optical thickness or Optical depth

Optical thickness (δ) has three components:

Optical thickness due to molecular scattering (δMolecular-scattering):
• mainly due to N2 and O2
• depends on the pressure level and can be calculated for a given ground elevation.

Optical thickness due to molecular absorption (δMolecular-absorption):
• mainly due to O3, CO2, and H2O
• O3 and CO2 can be averaged over large areas.
• water vapour absorption is significant and varies with time and space.

Optical thickness due to aerosol (δaerosol): aerosol scattering is significant and varies with time and space.
Atmospheric transmittance T = e^(−δ) is therefore a function of:
• N2 and O2 concentration, - which depends on the pressure level and can be calculated for a given ground elevation.
• O3 and CO2 and H2O concentration- O3 and CO2 can be averaged over large areas.
- water vapour absorption is significant and varies with time and space.
• Aerosol concentration - varies with time and space.
R = π d² {(c0 + c1 · DN) − Lpath} / (T E cos θ0)

where c0 and c1 are the detector offset and gain.

What is known?
• Sun-earth distance (d)
• DN value
• Zenith angle (θ0)
• Incoming solar irradiance (E)

What is unknown?
• Path radiance (Lpath)
• Transmittance (T)

Radiative transfer codes are used to estimate the unknowns.
Radiative Transfer

• The physical phenomenon of energy transfer in a medium
• In our case, it refers to electromagnetic radiation in the atmosphere
• The propagation of the radiation through the atmosphere is affected by the processes of absorption and scattering, as well as atmospheric emission
• The equations of radiative transfer describe these interactions mathematically
• Accurate calculation requires a tremendous amount of data
Radiative Transfer Model: MODTRAN(Moderate resolution atmospheric transmission)
• Developed in 1989 and patented by the U.S. Air Force
• A computer program (FORTRAN code) designed to model the transmission of electromagnetic radiation through the atmosphere
• Uses radiative transfer equations
• Covers the ultraviolet through the infrared
Radiative Transfer Model based Algorithms for atmospheric corrections
• ATmospheric CORrection (ATCOR)
• ATmosphere REMoval (ATREM)
• Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes (FLAASH)

All the above algorithms use radiative transfer models of the atmosphere such as MODerate resolution atmospheric TRANsmission (MODTRAN).
RADIATIVE TRANSFER MODELS: used for modeling the transfer of electromagnetic energy through different layers of the atmosphere
INPUTS TO RADIATIVE TRANSFER MODELS: • Solar azimuth • Location • Wavelength (bands) • Ground elevation • Sensor view angle • Atmospheric transmittance• Atmospheric scattering
WITH ABOVE PARAMETERS, RADIATIVE TRANSFER MODELS YIELD PATH RADIANCE AND TRANSMITTANCE FOR ESTIMATING SURFACE REFLECTANCE
PRACTICAL PROBLEMS:
1. The atmosphere is homogeneous neither vertically nor horizontally
2. Wavelength dependence of radiative transfer
3. Viewing and illumination geometry dependence
PRACTICAL PROBLEM: The atmosphere is homogeneous neither vertically nor horizontally.
• Not realistically possible to measure atmospheric parameters for all pixels through the entire atmospheric column
PRACTICAL SOLUTION USED IN ALL ALGORITHMS:
• Define standard atmospheres:
  • mid-latitude summer atmosphere
  • US standard atmosphere 1976
  • standard tropical atmosphere
  • desert tropical (arid) atmosphere
  • fall (autumn) atmosphere
  • mid-latitude winter
  • subarctic winter
• Vertically profile the different standard atmospheres at a number of locations for:
  • air pressure,
  • air temperature,
  • O2, O3, N2, CO2 concentrations
• Use the relation T = e^(−δ) = e^(−εCL) => δ = εCL to estimate δMol-scatter due to O2 and N2, and δMol-absorp due to CO2 and O3
• δscatter due to aerosols? δabsorp due to water vapour?
The unknown parameters:
• Aerosol type and concentration
• Water vapour concentration
Water vapour concentration
The sensor must have bands in one of the following regions:

• 1.05–1.21 μm
• 0.87–1.02 μm
• 0.77–0.87 μm

The band depth can be used to estimate the water vapour content (pixel-wise).
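A generic continuum-interpolated band-depth calculation can be sketched as follows; this illustrates the idea rather than the specific ratio technique used by any particular correction code, and the radiance values are hypothetical:

```python
def band_depth(wl_left, l_left, wl_right, l_right, wl_center, l_center):
    """Band depth = 1 - L_center / L_continuum, where the continuum is
    linearly interpolated between two shoulder bands outside the feature."""
    frac = (wl_center - wl_left) / (wl_right - wl_left)
    l_continuum = l_left + frac * (l_right - l_left)
    return 1.0 - l_center / l_continuum

# hypothetical radiances around the ~0.94 um water-vapour feature,
# with shoulders at 0.87 and 1.02 um
depth = band_depth(0.87, 100.0, 1.02, 90.0, 0.94, 57.0)
```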
Aerosol: nature and concentration
δAerosol ≈ εAerosol CAerosol

• For εAerosol:
  • Estimate from visibility:
    • Get the user input for visibility
    • Use the Koschmieder equation (VIS = 3.912/ε) to estimate the extinction coefficient from the visibility
(Aerosol optical depth is not critical if the visibility is high, i.e. >40 km.)
• For CAerosol:
  • Concentration is difficult to estimate for every atmospheric condition, therefore standard types are used: rural, urban, desert, maritime
  • The concentrations of aerosols are measured for different visibility ranges and different aerosol types, and are stored in lookup tables
• Typical user input values are:
  • sensor type (LANDSAT/ASTER, etc.)
  • solar azimuth
  • sensor viewing angle
  • latitude-longitude
  • standard atmosphere in the image
  • visibility
  • aerosol type
  • average ground elevation
• Precompiled LUTs:
  • CO2, N2, O2, O3 concentrations and extinction coefficients (in other words, optical depths due to these gases) for different atmosphere types
  • Aerosol concentrations and extinction coefficients (optical depths) at different visibility ranges for different aerosol types
• Concentration (optical depth) due to water vapour is calculated from absorption feature depths
MODTRAN models the atmospheric propagation of EM radiation (radiative transfer) in the 0.2 to 100 μm spectral range.
• Algorithms use a catalogue (LUT) of atmospheric correction values compiled for different combinations of the above inputs (previous slide)
The catalogue (LUT) consists of atmospheric correction functions for:
1. Different standard atmospheres (altitude profiles of pressure, air temperature, and gas (O2, N2, CO2, O3) concentrations):
• mid-latitude summer atmosphere
• US standard atmosphere 1976
• standard tropical atmosphere
• desert tropical (arid) atmosphere
• fall (autumn) atmosphere
• mid-latitude winter
• subarctic winter
2. Different aerosol types: rural, urban, desert, maritime
3. Different aerosol concentrations (aerosol optical depths) defined by the visibility. The range provided is 5-40 km; calculated values are 5, 7, 10, 15, 23, and 40 km. Values for 4 and 80 km are obtained by linear extrapolation. The conditions range from hazy to very clear.
ATCOR-2 LUT derived using MODTRAN
4. Water vapour concentrations (calculated from absorption band depths; optionally user defined)
5. Different ground elevations ranging from 0 to 1 km (calculated values are for 0, 0.5, and 1 km ASL; other values are interpolated)
6. Solar zenith angles ranging from 0° to 70° in steps of 10°
7. Different functions for each sensor and each band: the atmospheric correction functions depend on the spectral response of the sensor, thus there are different functions for each sensor and each band
8. Different sensor view angles
The above parameters can be specified by the user, or are read from the image header.For illustration - in ATCOR-2, the number of entries in the look-up tables for the six reflective bands of Landsat TM is about 9000, i.e. 12 x 7 x 6 x 3 x 6 = 9072, including 12 atmospheres, 7 solar zenith angles, 6 visibilities, 3 ground elevations, and 6 bands.
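The interpolation between precomputed visibility entries can be sketched as follows (the aerosol optical depth values in the table are hypothetical, not actual ATCOR entries):

```python
def interp_visibility(vis, table):
    """Linearly interpolate a LUT quantity (e.g. aerosol optical depth)
    between the visibilities precomputed in the catalogue; values outside
    the tabulated range are clamped to the nearest entry."""
    pts = sorted(table)
    if vis <= pts[0]:
        return table[pts[0]]
    if vis >= pts[-1]:
        return table[pts[-1]]
    for lo, hi in zip(pts, pts[1:]):
        if lo <= vis <= hi:
            f = (vis - lo) / (hi - lo)
            return table[lo] + f * (table[hi] - table[lo])

# hypothetical aerosol optical depths at the calculated visibilities (km)
lut = {5: 0.70, 7: 0.55, 10: 0.42, 15: 0.31, 23: 0.22, 40: 0.14}
aod = interp_visibility(30.0, lut)
```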
Measured atmospheric data can also be used to calculate new files of look-up tables for the catalogue.
After estimating the path radiance, the global flux, and the atmospheric transmission, apply the following equation to derive the surface reflectance:

R = π d² {(c0 + c1 · DN) − Lpath} / (T Eg)

where c0 and c1 are the detector offset and gain, and Eg is the global flux at the ground.