Electrical Substitution Radiometers (ESRs) are devices that measure optical
radiation by comparison to an equivalent amount of electrical power. ESRs are
sometimes referred to as Electrically Calibrated Radiometers (ECRs) or simply
as absolute radiometers. The fundamentals of an ESR can be understood by
reference to Fig. 1. The radiant flux (optical power)
Φp is incident upon a receiving cavity designed to
optimize the collection of radiation. Upon absorption of the radiation, the
cavity will experience a temperature rise. The receiver is coupled to a
constant temperature heat sink at a reference temperature T0
with a thermal conductor of conductance G. Ignoring losses due to
radiation, convection and stray thermal conductance, the equilibrium (long
time) temperature rise is given by
T - T0 = Φp/G.
When the shutter is closed to intercept the light beam, the electrical power supplied to
the cavity is increased by an amount sufficient to hold the cavity temperature,
as determined by the temperature sensor system, at its shutter-open value. Ignoring the
corrections and losses mentioned earlier, the optical power is given by the
difference between the electrical powers applied with the shutter closed and open,
Φp = Pclosed - Popen.
Figure 1. Schematic diagram of the essential components of an electrical substitution radiometer. The total light flux Φp measured in watts is collected by a receiving cavity usually shaped in the form of a cone. The temperature of the cavity, T, and the temperature of the heat sink, T0, are monitored by temperature sensor systems. When the shutter is closed, electrical power equivalent to the optical power is applied by the power supply system thereby establishing the optical power.
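The substitution principle above can be checked with a short numerical sketch. All of the values below (the conductance G, the flux Φp, and the electrical bias) are illustrative assumptions, not parameters of any real instrument.

```python
# Minimal sketch of the electrical-substitution principle.
# All numerical values (G, phi_p, p_open) are illustrative assumptions.

def equilibrium_rise(total_power, G):
    """Equilibrium temperature rise T - T0 = P/G (losses ignored)."""
    return total_power / G

def substituted_power(p_closed, p_open):
    """Optical power inferred from the electrical powers that hold the
    cavity at the same temperature with the shutter closed and open."""
    return p_closed - p_open

G = 1.0e-4       # thermal conductance to the heat sink, W/K (assumed)
phi_p = 1.0e-3   # incident radiant flux, W (assumed)
p_open = 0.2e-3  # small electrical bias applied with the shutter open, W

# Shutter open: the cavity is heated by the optical flux plus the bias.
rise_open = equilibrium_rise(phi_p + p_open, G)

# Shutter closed: electrical power alone must reproduce the same rise.
p_closed = p_open + phi_p
rise_closed = equilibrium_rise(p_closed, G)

assert abs(rise_open - rise_closed) < 1e-12
# The substituted electrical power recovers the radiant flux.
assert abs(substituted_power(p_closed, p_open) - phi_p) < 1e-12
```

In a real instrument the servo that equalizes the two temperatures, and the corrections for radiative and conductive losses, carry most of the experimental difficulty; the arithmetic of the substitution itself is this simple.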
ESRs have been in use for 75 years or more, and their history and development have been described in considerable detail by Hengstberger. Work on ESR technology was pioneered early in the twentieth century by Coblentz at NIST, at that time the National Bureau of Standards (NBS). He developed a number of radiometers and used them for diverse purposes in photometry and radiometry, including an early measurement of the Stefan-Boltzmann constant. For a variety of technical reasons, and because the SI photometric unit, the candela, was defined until 1979 in terms of the output of standard candles and eventually of a fixed-point platinum blackbody, radiometry and photometry depended upon the characterization of sources for the maintenance of units. The 1979 redefinition of the candela in terms of optical power helped spur the shift of radiometric and photometric measurements to detector-based technology.
The incorporation of detectors into photometric and radiometric standards was assisted by several developments in the 1970's and 1980's. High quality silicon photodiodes became available and provided a convenient device with which to make optical measurements in the visible wavelength region. Electrical substitution devices were designed and constructed to operate at cryogenic temperatures in order to increase the sensitivity of the devices and to reduce the uncertainties due to radiation and conduction losses. The first cryogenic radiometer at NIST was constructed by Ginnings and Reilly in 1972 to measure thermodynamic temperatures above 0 °C. For a variety of reasons this project did not achieve the desired results, but by building on the experience gained by Ginnings and Reilly, in the mid-1970's Yokley built a cryogenic radiometer system at NIST to measure the radiation temperature of low temperature blackbodies used in a low background environment. This device was used for a number of years to perform specialized calibrations of low flux sources but was not engineered to perform high accuracy measurements and as a consequence produced results with relative uncertainties of several percent.
Quinn and Martin at the National Physical Laboratory (NPL) in the UK developed a high accuracy cryogenic radiometer for use in a radiometric determination of the Stefan-Boltzmann constant. The system that the NPL team developed allowed for the determination of the Stefan-Boltzmann constant with a relative combined standard uncertainty of 0.013%. This important benchmark work contains a detailed analysis of the errors and uncertainties associated with a cryogenic ESR and has led to the adoption of these devices as fundamental radiometric standards with relative combined standard uncertainties of less than 0.01%. NPL staff later developed this device into a radiometer for laser power sources and thereby provided a comparative technique to establish high accuracy radiometric units with other stable detectors. The technology has evolved, and at least two companies are offering commercial versions of absolute cryogenic radiometers. The availability and accuracy of these instruments have resulted in their employment by a number of national standards laboratories to provide the basis of radiometric measurement.
The NIST high accuracy cryogenic radiometer (HACR) is shown in Fig. 2 and is described in detail in the technical literature. The heart of the instrument is the absorbing cavity, which is connected to a thermally controlled heat sink held at 5 K. The apparatus is evacuated with a vacuum pumping system and has cryogenic fluid reservoirs to provide the low temperature environment for the cavity absorption and electrical heating system operation. Polarized optical radiation from a laser system enters the vacuum vessel of the HACR through a window at Brewster's angle.
Figure 2. Sectional drawing of the NIST high accuracy cryogenic radiometer.
A typical mode of operation for the HACR is shown schematically in Fig. 3 and is described in the literature [25, 26]. A secondary or transfer standard detector (TSD) is inserted into the laser beam and intercepts the same beam as measured by the HACR. A TSD is a detector that is calibrated directly with the HACR and can then be used to transfer the detector response unit to other calibration systems. After appropriate corrections for HACR entrance window transmittance and other systematic effects, the absolute response of the TSD is deduced. In Fig. 3 the TSD is depicted as a trap detector, so called because of the arrangement of several silicon photodetectors in a configuration designed to absorb a large portion of the incident light. These silicon devices work well in the 400 nm to 1000 nm wavelength region, and different types of TSDs are used in other wavelength regions.
Figure 3. Schematic of the HACR optical arrangement used to characterize transfer standard detectors for use in other optical measurements. Various lasers provide radiation for wavelengths in the UV to the IR. Each wavelength used requires careful characterization of the entrance window for its transmittance and scattering properties.
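The transfer step can be sketched numerically as follows. The variable names, the use of a single scalar window-transmittance correction, and all numerical values are illustrative assumptions rather than the actual NIST procedure, which involves several additional systematic corrections.

```python
# Hedged sketch of tying a transfer standard detector (TSD) to the HACR.
# All names and numbers below are illustrative assumptions.

def beam_power_before_window(phi_hacr, tau_window):
    """Power in the laser beam outside the cryostat. The HACR absorbs
    the beam after the Brewster-angle entrance window, so divide the
    measured power by the window transmittance."""
    return phi_hacr / tau_window

def tsd_responsivity(photocurrent, beam_power):
    """Absolute responsivity (A/W) of the TSD at this laser wavelength."""
    return photocurrent / beam_power

# Illustrative measurement at one laser wavelength.
tau_window = 0.9997   # window transmittance (assumed, measured separately)
phi_hacr = 0.800e-3   # optical power registered by the HACR, W (assumed)
i_tsd = 0.404e-3      # TSD photocurrent in the same beam, A (assumed)

power = beam_power_before_window(phi_hacr, tau_window)
resp = tsd_responsivity(i_tsd, power)   # absolute responsivity, A/W
```

The essential point is that the TSD and the HACR view the same beam, so the TSD inherits the HACR's optical power scale once the window and other systematic effects are accounted for.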
The TSD can be calibrated directly using the HACR at selected laser wavelengths, with the responsivity for regions between the laser wavelengths determined from a knowledge of the physics of the detector. In the case of silicon photodiodes a considerable amount of effort has gone into modeling the responsivity as a function of wavelength, with the result that the responsivity from 400 nm to 1000 nm can be accurately inferred. Gentile and co-workers used this technique to realize the NIST unit of detector spectral responsivity with a relative combined standard uncertainty of less than 0.04% in the wavelength range of 406 nm to 920 nm. Other types of detectors can be calibrated using this technique in different wavelength regions, including spectrally flat absorptive bolometer detectors and other semiconductor devices [18, 28]. The complete calibration for semiconductor devices consists of determining the absolute spectral responsivity of the device as a function of wavelength based upon the HACR measurements and appropriate modeling. This information, together with data on the responsivity spatial uniformity and temperature stability of the detector, allows the device to be used as a TSD in other spectral radiometric measurement circumstances.
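As one illustration of this interpolation, the sketch below writes the responsivity of an ideal silicon detector as R(λ) = qλ/(hc) and scales it by a smooth fit to the internal quantum efficiency implied at the laser lines. The quadratic fit and all calibration values are invented stand-ins for the detailed physical model used in practice.

```python
# Sketch of interpolating silicon responsivity between laser lines.
# Calibration values and the quadratic fit are illustrative assumptions.
import numpy as np

Q = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light in vacuum, m/s

def ideal_responsivity(wavelength_m):
    """Responsivity of a loss-free silicon detector: R = q*lambda/(h*c)."""
    return Q * wavelength_m / (H * C)

# HACR calibration points: wavelength (nm) and measured responsivity (A/W).
# These numbers are invented for illustration only.
lam_cal_nm = np.array([406.0, 488.0, 633.0, 920.0])
r_cal = np.array([0.320, 0.388, 0.505, 0.738])

# Internal quantum efficiency implied at the laser lines.
iqe_cal = r_cal / ideal_responsivity(lam_cal_nm * 1e-9)

# Fit a smooth low-order polynomial to the IQE (the real model is physical,
# not polynomial) so the responsivity can be evaluated at any wavelength.
coef = np.polyfit(lam_cal_nm, iqe_cal, 2)

def responsivity(wavelength_nm):
    """Interpolated absolute responsivity (A/W) between the laser lines."""
    return np.polyval(coef, wavelength_nm) * ideal_responsivity(wavelength_nm * 1e-9)
```

Because the internal quantum efficiency of a good silicon photodiode is close to unity and varies slowly, interpolating it rather than the responsivity itself keeps the interpolation error small.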
As an alternative to a semiconductor detector with a limited response range, a bolometer can be utilized as a TSD by carefully characterizing the stability and the spectral absorption of the absorbing surface material used on the bolometer. A careful, high accuracy characterization of the elements of the detector that determine its relative spectral response allows it to be calibrated with the HACR at visible and near infrared (IR) wavelengths, with its response elsewhere inferred from the relative spectral response of the device. In some wavelength regions lasers may not be readily available for use in the manner shown in Fig. 3. One approach to this problem has been the development of cryogenic radiometers that operate with monochromatic light provided by conventional optical sources. These sources typically have less optical power per unit wavelength than a laser, and hence the use and characterization of the radiometer can require greater care to achieve high levels of accuracy. In all cases, use of a cryogenic radiometer relies upon the availability of suitable window materials, which may present technical challenges in some wavelength regions.
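The bolometer procedure amounts to scaling a well-characterized relative spectral response by a single absolute tie point measured against the HACR. The sketch below illustrates that scaling; the tie wavelength, the response values, and the dictionary representation are illustrative assumptions.

```python
# Sketch: absolute calibration of a spectrally flat bolometer from a
# measured relative spectral response plus one HACR tie point.
# All numbers below are illustrative assumptions.

def absolute_response(rel_curve, tie_wavelength_nm, abs_at_tie):
    """Scale a relative spectral response so that it equals the
    HACR-derived absolute response at the tie (laser) wavelength."""
    scale = abs_at_tie / rel_curve[tie_wavelength_nm]
    return {lam: r * scale for lam, r in rel_curve.items()}

# Relative response of the absorbing surface (arbitrary units), assumed
# nearly flat, as for a good black coating; wavelengths in nm.
rel_curve = {633: 1.000, 2000: 0.998, 10000: 0.995}

# Absolute response measured against the HACR at the 633 nm laser line
# (e.g., in V/W for a bolometer; value assumed).
abs_curve = absolute_response(rel_curve, 633, 1250.0)
```

The accuracy of the inferred IR response then rests entirely on how well the relative spectral response of the absorbing surface is known, which is why its characterization must be done with such care.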