The brightness temperature is a measure of the radiance of the microwave radiation traveling upward from the top of the atmosphere to the satellite, expressed in units of the temperature of an equivalent blackbody. The brightness temperature (or TB) is the fundamental parameter measured by passive microwave radiometers. At Remote Sensing Systems, brightness temperatures measured at different microwave frequencies are used to derive wind, vapor, cloud, rain, and SST products. Despite differences in sensor frequencies, channel resolutions, instrument operation, and other radiometer characteristics, RSS uses uniform processing techniques to produce high-quality, carefully intercalibrated data, with a brightness temperature record spanning multiple instruments over several decades. At the bottom of this page, we include information on access to our brightness temperature data and links to more detailed information.
What is Brightness Temperature?
Satellite passive microwave radiometers measure raw antenna counts from which we determine the antenna temperature and then calculate the brightness temperature of the Earth. Large antennas are used for the various channels of the radiometer, and during operation, each antenna feedhorn passes a hot and cold target in order to provide consistently calibrated raw counts. Brightness temperature (also referred to as TB) is a measure of the radiance of microwave radiation traveling upward from the top of Earth's atmosphere. The conversion from radiometer counts to top-of-the-atmosphere TB is called the calibration process. Several calibration processing steps are required to derive the TB values. Microwave radiometer TB are considered a fundamental climate data record and are the values from which we derive ocean measurements of wind speed, water vapor, cloud liquid water, rain rate, and sea surface temperature.
We obtain antenna temperature data files for each microwave radiometer from data sources such as NASA, NOAA, DMSP, or NRL. To ensure a climate-quality, inter-calibrated dataset of ocean products, we first reverse engineer the data in these files to raw radiometer antenna counts. This process removes corrections or adjustments that may have been added by the data provider. Once we have the raw counts, we move forward as described below.
Calculating TB from raw radiometer counts is a complex, multi-step process in which a number of effects must be accurately characterized and adjustments made to account for them. These effects include radiometer non-linearity, imperfections in the calibration targets, emission from the primary antenna, and antenna pattern adjustments. RSS TB are consistently calibrated so that the TB measurements from all sensors can be used to construct a multi-decadal time series. A rain-free ocean is used as the absolute calibration reference: our state-of-the-art radiative transfer model (RTM) of the ocean and intervening atmosphere can predict the rain-free top-of-the-atmosphere TB to a high degree of accuracy. A complete description of the calibration of all SSM/I instruments is available; though that document describes only the SSM/I sensors, the approach applies to the other radiometers.
Several of the necessary steps are summarized by microwave radiometer in the table below and are discussed further in the sections that follow.
Calibration Steps for Microwave Radiometers
[Table: which calibration steps (geolocation, attitude adjustment, along-scan correction, antenna pattern correction, hot load correction, antenna emissivity correction) apply to each radiometer]
The first step is geolocation. Knowing the exact location of each measurement is required for any subsequent collocation or comparison. To verify geolocation, we difference ascending and descending passes and check that small ocean islands do not appear to 'move'. Geolocation is not always performed by RSS, as shown in the table [NRL = Naval Research Laboratory, GSFC = Goddard Space Flight Center]. The geolocation correction is distinct from the correction for instrument mounting errors (also called roll/pitch/yaw corrections), which must also be addressed.
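The ascending-minus-descending island check can be sketched as follows; the grid layout, contrast weighting, and centroid estimate here are illustrative assumptions, not the actual RSS procedure:

```python
import numpy as np

def island_shift(asc_grid, desc_grid, lats, lons):
    """Estimate the apparent displacement of a small island between
    ascending and descending passes (a geolocation sanity check).

    Each grid holds brightness temperatures over a small ocean region
    containing one island (land is much warmer than ocean at these
    frequencies). The TB-weighted centroid of each grid locates the
    island; a nonzero ascending-minus-descending offset hints at a
    geolocation or pointing error.
    """
    def centroid(grid):
        w = grid - grid.min()                    # weight by contrast with ocean
        lat = (w.sum(axis=1) @ lats) / w.sum()   # rows correspond to latitudes
        lon = (w.sum(axis=0) @ lons) / w.sum()   # columns correspond to longitudes
        return lat, lon

    a_lat, a_lon = centroid(asc_grid)
    d_lat, d_lon = centroid(desc_grid)
    return a_lat - d_lat, a_lon - d_lon
```

A well-geolocated sensor should return offsets near zero; a systematic offset in one direction points at a timing or pointing error.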
The remaining corrections listed in the table are performed by comparing antenna temperatures with those simulated by our radiative transfer model. The Remote Sensing Systems' atmospheric radiative transfer model (RTM) for the ocean surface and intervening atmosphere has been continually developed and refined for more than 30 years, and is highly accurate in the 1-100 GHz (microwave) spectrum for ocean observations. The ocean surface model components include polarimetric wind speed and direction with dependencies on surface emissivity and scattering. The atmospheric components of our RTM rely on the most recent and relevant measurements of oxygen and vapor.
Attitude adjustment corrects for spacecraft pointing errors. Spacecraft pointing is determined by a number of different methods, the preferred being a star tracker; another is a horizon-balancing sensor. For SSM/I no pointing information was given, so the nominal pointing was assumed to be correct. TMI has a dynamic pointing correction that changes within an orbit because the horizon sensor used prior to the orbit boost is not as accurate as a star tracker. After the orbit boost, the horizon sensor was disabled and pointing was determined from two on-board gyroscopes, also less accurate than a star tracker. AMSR-E had no pointing problems, as the Aqua satellite carried a star tracker. The AMSR on ADEOS-II needed a dynamic correction, while WindSat needed only a simple fixed correction to the roll/pitch/yaw.
As the mirror rotates, the view near the edges of the Earth scene begins to contain obstructions such as the satellite itself or part of the cold mirror. Additionally, during the scan, the antenna sidelobe pattern may pick up contributions from different parts of the spacecraft. Every instrument needs this along-scan correction.
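One way to picture the along-scan correction is as a per-scan-position bias estimated against the RTM over rain-free ocean; the array shapes and the simple averaging here are illustrative assumptions, not the operational RSS algorithm:

```python
import numpy as np

def remove_along_scan_bias(ta_obs, ta_rtm):
    """Estimate and remove an along-scan bias.

    ta_obs and ta_rtm are (n_scans, n_positions) arrays of observed and
    RTM-simulated antenna temperatures over rain-free ocean scenes.
    Averaging the residual over many scans isolates the bias tied to
    scan position (e.g. intrusions near the scan edges); subtracting it
    yields corrected observations.
    """
    bias = (ta_obs - ta_rtm).mean(axis=0)   # one bias value per scan position
    return ta_obs - bias
```

In practice the bias table would be built from a large ensemble of scenes and then applied to all subsequent data, rather than recomputed per granule.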
Next we perform an antenna pattern correction (APC). The APC is determined pre-launch and consists of spillover and cross-polarization values. After launch, these values are adjusted so that the measured antenna temperatures match the RTM-simulated antenna temperatures. This correction is needed for all instruments.
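A simplified APC inversion can be sketched as a 2×2 linear solve; the mixing model and the spillover/cross-polarization numbers below are hypothetical illustrations, not the operational coefficients:

```python
import numpy as np

def invert_apc(ta_v, ta_h, spillover=0.03, cross_pol=0.02, t_spill=2.7):
    """Recover scene brightness temperatures from antenna temperatures.

    Illustrative model: each measured antenna temperature mixes the two
    polarizations through a cross-polarization leakage x, while a
    spillover fraction s of the antenna pattern sees cold space:

        TA_p = (1 - s) * ((1 - x) * TB_p + x * TB_q) + s * t_spill

    Solving the resulting 2x2 system gives the V- and H-pol scene TB.
    """
    a = np.array([
        [(1 - spillover) * (1 - cross_pol), (1 - spillover) * cross_pol],
        [(1 - spillover) * cross_pol, (1 - spillover) * (1 - cross_pol)],
    ])
    b = np.array([ta_v - spillover * t_spill, ta_h - spillover * t_spill])
    tb_v, tb_h = np.linalg.solve(a, b)
    return tb_v, tb_h
```

The post-launch tuning described above amounts to adjusting `spillover` and `cross_pol` until measured and RTM-simulated antenna temperatures agree.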
Only some of the radiometers need hot load thermal gradient corrections. The determination of TB from counts relies on two known temperatures to infer the Earth scene temperature. For each scan, the antenna feedhorns view a mirror that reflects cold space (a known temperature of 2.7 K) and a hot absorber whose temperature is measured by several precision thermistors. Assuming a linear response, the Earth scene temperatures are then determined by fitting a slope to these two known measurements (hot and cold). This 2-point calibration system continuously compensates for variations in the radiometer gain and noise temperatures. This seemingly simple calibration methodology is fraught with subtle difficulties. The cold mirror is relatively trouble-free as long as we note when the moon intrudes on the cold space view and remove moon-affected values. The hot absorber has been more problematic: the thermistors often do not adequately capture thermal gradients across it. For example, a hot load correction is required for AMSR-E because of a design flaw in the AMSR-E hot load. During the course of an orbit, large thermal gradients develop within the hot load due to solar heating, making it difficult to determine the average effective temperature from the thermistor readings, which may themselves vary by up to 15 K. Several other radiometers have had similar, but smaller, issues.
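The 2-point calibration described above can be sketched in a few lines; the counts and temperatures in the example are hypothetical, and real processing adds a non-linearity correction on top of this:

```python
def counts_to_scene_temperature(c_earth, c_cold, c_hot, t_cold, t_hot):
    """Linear two-point calibration: map raw counts to scene temperature.

    c_cold and c_hot are the counts observed while viewing the cold-space
    mirror and the hot absorber; t_cold and t_hot are their known
    temperatures (2.7 K for cold space, thermistor-derived for the hot
    load). Assumes a linear radiometer response.
    """
    gain = (t_hot - t_cold) / (c_hot - c_cold)   # kelvin per count
    return t_cold + gain * (c_earth - c_cold)

# Hypothetical example: cold view at 1000 counts, hot load (300 K) at 9000 counts
ta_scene = counts_to_scene_temperature(5000, 1000, 9000, 2.7, 300.0)
```

The hot load difficulties discussed above enter through `t_hot`: if thermal gradients bias the thermistor-derived hot load temperature, every scene temperature on that scan inherits the error.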
Lastly, the main reflector is assumed to be a perfect reflector with an emissivity of 0.0, but this is not always the case. A bias in the TMI measurements was attributed to degradation of the primary antenna: atomic oxygen present at TMI's low altitude (350 km) led to rapid oxidation of the thin, vapor-deposited aluminum coating on the graphite primary antenna. The measured radiation is therefore a combination of the reflected Earth scene and the antenna's own emission. The antenna emissivity was deduced during the calibration procedure to be 3.5%, constant for all TMI channels. The emissivity correction uses additional information from instrument thermistors to estimate the antenna temperature, thereby reducing the effect of temporal variations. The SSMIS instruments also have an emissive antenna, with an emissivity that appears to increase with frequency from 0.5% to 3.5%.
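The emissive-reflector correction amounts to inverting a simple mixing relation; this sketch assumes the measured value is a linear blend of the reflected scene and the reflector's own emission, with the 3.5% TMI emissivity from the text as the default:

```python
def correct_emissive_reflector(ta_measured, t_reflector, emissivity=0.035):
    """Remove the contribution of an emissive main reflector.

    The measured antenna temperature mixes the reflected Earth scene
    with emission from the reflector itself:

        ta_measured = (1 - e) * ta_scene + e * t_reflector

    so the scene temperature is recovered by inverting that relation.
    t_reflector is the reflector's physical temperature, estimated from
    instrument thermistors; e = 0.035 is the value deduced for TMI.
    """
    return (ta_measured - emissivity * t_reflector) / (1.0 - emissivity)
```

For SSMIS, where the emissivity varies with frequency, `emissivity` would be a per-channel value rather than a single constant.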
Data Availability and Access
Brightness temperatures are treated as an intermediate product, not a typical Earth Systems Data Record (ESDR). Our brightness temperature data for various instruments are available via different data centers listed in the table below.
| Instrument/Satellite | RSS Brightness Temperature Data Availability |
| --- | --- |
| SSM/I on DMSP | RSS V7 TBs distributed by NOAA NCDC in netCDF format |
| SSMIS on DMSP | RSS V7 TBs distributed by NOAA NCDC in netCDF format |
| WindSat on Coriolis | RSS V7 TBs not publicly available |
| TMI on TRMM | RSS V7 TBs not publicly available |
| AMSR-E on Aqua | RSS V7 TBs distributed by NSIDC (note: NSIDC uses a different version number in their system) |
| AMSR2 on GCOM-W1 | RSS V7 TBs not publicly available |
Two documents that further describe the contents of the netCDF RSS V7 TB data products for SSM/I and SSMIS are available.
Brightness temperature data are available for the SSM/I and SSMIS sensors during the following time periods:
| Instrument | Start Date | Stop Date |
| --- | --- | --- |
| F08 SSM/I | Jul 1987 | Dec 1991 |
| F10 SSM/I | Dec 1990 | Nov 1997 |
| F11 SSM/I | Dec 1991 | May 2000 |
| F13 SSM/I | May 1995 | Nov 2009 |
| F14 SSM/I | May 1997 | Aug 2008 |
| F15 SSM/I | Dec 1999 | present (do not use after Aug 2006 for climate study) |
| F16 SSMIS | Oct 2003 | present |
| F17 SSMIS | Dec 2006 | present |
| F18 SSMIS | Oct 2009 | present (data are NOT currently available at RSS) |
| F19 SSMIS | Apr 2014 | present (data are NOT currently available at RSS) |