The atmospheric effects on radiometric data recorded in the Landsat multispectral scanner system (MSS) bands are compiled for cases of representative and ideal atmospheric conditions. The effects are expressed as the difference between the Earth's surface spectral reflectivity, a_0, and the surface–atmosphere system spectral reflectivity, a_s, derived from the satellite data,
a_s − a_0 = −a_0[1 + (1/μ_0)](B + W) + 2a_0²B + g(μ_0)B/(2μ_0)
where μ_0 is the cosine of the solar zenith angle, B and W are the backscattering and absorption optical thicknesses, respectively, and the function g(μ_0) is the anisotropy of backscattering to the zenith from the direct beam. This formula is accurate only for an atmosphere of low optical thickness. Also, the equation applies only to large areas of uniform reflectivity, because adjacency effects, in which light reflected from the terrain surrounding the object pixel is subsequently scattered into the field of view by the atmosphere, are not considered.
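As a numerical illustration of the formula above, the following sketch evaluates a_s − a_0 for assumed parameter values; the function name and all input values are hypothetical and chosen only to show plausible magnitudes, not taken from the source.

```python
import math

def reflectivity_difference(a0, mu0, B, W, g_mu0):
    """Thin-atmosphere difference a_s - a_0 between the system and
    surface spectral reflectivities (hypothetical helper):
    a_s - a_0 = -a0[1 + 1/mu0](B + W) + 2*a0^2*B + g(mu0)*B/(2*mu0)
    """
    return (-a0 * (1 + 1 / mu0) * (B + W)
            + 2 * a0**2 * B
            + g_mu0 * B / (2 * mu0))

# Assumed illustrative inputs (not from the source): solar zenith
# angle 30 deg, surface reflectivity 0.2, backscattering optical
# thickness 0.05, absorption optical thickness 0.02, g(mu0) = 1.0.
mu0 = math.cos(math.radians(30))
diff = reflectivity_difference(a0=0.2, mu0=mu0, B=0.05, W=0.02, g_mu0=1.0)

# Sanity check: with no atmosphere (B = W = 0) the difference vanishes.
assert reflectivity_difference(0.2, mu0, 0.0, 0.0, 1.0) == 0.0
```

With these assumed inputs the absorption and surface-attenuation terms partly cancel the backscattering terms, leaving a difference of a few thousandths in reflectivity, consistent with the low-optical-thickness regime the formula requires.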
It is concluded that in the quantitative monitoring of surface changes from satellites, scattering effects predominate in some applications (for example, bathymetric mapping of coastal waters), whereas absorption effects predominate in others (for example, monitoring desert fringe areas). Different measurements are appropriate for assessing the scattering effects than for assessing the absorption effects.
These effects on the monitoring of surface changes by the use of Landsat MSS data are discussed in terms of departures of the actual atmosphere at the time of a satellite passage from a 'minimal' atmosphere having no aerosols and characterized by gaseous absorption corresponding to minimal water vapour amounts.