The atmosphere affects the spatial and spectral distribution of the electromagnetic radiation originating from the sun before it reaches the earth’s surface, and it also attenuates the reflected energy subsequently recorded by a satellite sensor. Absorption by gases and scattering by molecules and aerosols are examples of atmospheric processes that influence both incident and reflected radiation. Knowledge of these processes, which vary over time, must be brought into play to correct the satellite sensor readings for them.
Thus, the amount of reflected energy recorded by a sensor must be processed to separate the atmospheric disturbances from the actual reflectance of the objects on the surface of the earth. This step may not always be needed, as it depends on the intended use of the satellite image. Atmospheric image correction requires information on the atmospheric conditions present at the time of image acquisition (Richter and Schläpfer, 2012).
While atmospheric correction may not be important for certain applications (e.g., when conducting land cover classification for a single year), it is essential when performing a time-series analysis of crop growth. For example, when comparing the spectral characteristics of a pixel or group of pixels (an object) acquired on different dates, removing the influence of atmospheric conditions prior to comparison is crucial.
During atmospheric correction, the image pixel values (known as Digital Numbers, or DNs) are converted to a physically interpretable measure, often referred to, and interpreted, as surface reflectance (Chen and Cheng, 2012; Richter and Schläpfer, 2012). This conversion generally entails two steps. The first is radiometric calibration, which involves (a) the conversion of DNs to top-of-atmosphere radiance, and then (b) the conversion of that radiance to top-of-atmosphere reflectance. The radiance values obtained in step (a) can be interpreted as radiance observable just outside the earth’s atmosphere; their derivation from the DNs can normally be done with just the metadata that is delivered with the image. Radiance is an amount of radiation, so it is an absolute quantity. Reflectance is the ratio of the amount of radiation reflected off an object to the amount of radiation hitting it, so it is a dimensionless ratio. Top-of-atmosphere reflectance, step (b), can be derived from the radiance using the same image metadata.
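The two calibration steps can be sketched as follows. This is a minimal illustration of the general formulas, not the exact procedure of any particular sensor: the calibration coefficients (gain, offset), the band solar irradiance value, and the sun geometry below are made-up placeholder values, and real coefficients must be taken from the image metadata.

```python
import math

def dn_to_toa_radiance(dn, gain, offset):
    """Step (a): convert a raw Digital Number to top-of-atmosphere radiance
    using the linear calibration coefficients from the image metadata."""
    return gain * dn + offset

def toa_radiance_to_reflectance(radiance, esun, sun_elevation_deg, earth_sun_dist_au):
    """Step (b): convert TOA radiance (absolute) to TOA reflectance
    (a dimensionless ratio). esun is the band's mean exoatmospheric
    solar irradiance; the earth-sun distance is in astronomical units."""
    sun_zenith = math.radians(90.0 - sun_elevation_deg)
    return (math.pi * radiance * earth_sun_dist_au ** 2) / (esun * math.cos(sun_zenith))

# Illustrative values only (not taken from any real scene):
radiance = dn_to_toa_radiance(dn=120, gain=0.01, offset=-0.1)
reflectance = toa_radiance_to_reflectance(radiance, esun=1536.0,
                                          sun_elevation_deg=55.0,
                                          earth_sun_dist_au=1.0)
```

Note that the radiance is an absolute value in physical units, while the resulting reflectance falls between 0 and 1, which is what makes it comparable across acquisition dates.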
In the next phase, the top-of-atmosphere reflectance is converted to surface reflectance (also known as bottom-of-atmosphere reflectance or, in vegetation studies, top-of-canopy reflectance). Top-of-canopy reflectance can be understood as the reflectance that would be measured just above the vegetation. This phase requires knowledge of the atmospheric conditions present during the image acquisition time frame. The resulting image is called atmospherically corrected. Some image provider agencies (such as the United States Geological Survey) now deliver atmospherically corrected images free of charge on their EarthExplorer website (search for “Landsat Surface Reflectance”).
Atmospheric correction modules are scarce in open-source remote sensing applications (but see the recent ESA-NASA ACIX exercise). Even in proprietary software, atmospheric correction modules (such as ATCOR and FLAASH for Erdas) must often be acquired under a separate license. A number of atmospheric correction methods can, however, be implemented in standard RS/GIS tools or through home-brewed code.
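As an example of such a home-brewed method, dark-object subtraction (DOS) is one of the simplest image-based correction techniques: it assumes the darkest pixels in a scene (e.g., deep water or shadow) should have near-zero reflectance, so any signal they carry is attributed to atmospheric path radiance and subtracted from the whole band. The sketch below is a generic illustration, not the method used in any particular software package; the sample array and the percentile choice are arbitrary.

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.01):
    """Estimate the atmospheric path-radiance contribution from the darkest
    pixels of the band and subtract it everywhere, clipping at zero so no
    negative values remain."""
    dark_value = np.percentile(band, percentile)
    corrected = band - dark_value
    return np.clip(corrected, 0.0, None)

# Tiny made-up band for illustration:
band = np.array([[120., 340., 95.],
                 [410., 88., 600.]])
corrected = dark_object_subtraction(band)
```

DOS is crude compared to a radiative transfer model, since it ignores the wavelength dependence and spatial variability of the atmosphere, but it needs no ancillary atmospheric data, which is why it is easy to implement in ordinary RS/GIS tools.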
In the STARS automated image processing workflow, atmospheric correction is performed with the 6S radiative transfer model, a FORTRAN program that underlies the MODIS atmospheric correction algorithm and was adapted in the STARS project to cater for other image types (mostly those delivered by DigitalGlobe and RapidEye). Detailed information about the program can be found here.
The atmospheric correction code eventually produces three outputs: the top-of-atmosphere radiance, the top-of-atmosphere reflectance, and the surface reflectance. In the current data format, the cell values must be divided by 10,000 to obtain reflectance values in the 0–1 range.
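The rescaling is a single division; the small sketch below shows it applied to a few example cell values (the input numbers are illustrative, not taken from an actual product).

```python
import numpy as np

SCALE = 10000.0  # scale factor stated for the delivered data format

def to_reflectance(scaled_values):
    """Convert scaled integer surface-reflectance cell values to the
    physical 0-1 reflectance range by dividing by the scale factor."""
    return np.asarray(scaled_values, dtype=np.float64) / SCALE

refl = to_reflectance([2500, 10000, 0])
# refl is now [0.25, 1.0, 0.0]
```

Storing reflectance as scaled integers rather than floats is a common space-saving convention in distributed satellite products, so the same pattern applies to many other surface reflectance datasets.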