News & Views
How Does Thermal Imaging Work?
Jul 25 2014
Thermal imaging produces images using the infrared part of the electromagnetic (EM) spectrum, which lies between visible light and microwaves.
What Is Infrared?
All objects with a temperature above absolute zero emit radiation as a result of the thermal motion of their molecules. It is the rotational and vibrational transitions in molecules that emit infrared radiation.
Objects emit radiation over a wide range of wavelengths, and the intensity at each wavelength depends on the object's temperature. As the temperature of an object increases, the thermal motion of its molecules also increases, and the emitted radiation grows more intense and peaks at shorter wavelengths. Infrared radiation spans wavelengths of 700 nm to 1 mm; the visible range is 400-700 nm.
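The shift of peak emission with temperature can be sketched with Wien's displacement law. This is a minimal illustration, not part of the original article; the function name is my own.

```python
# Wien's displacement law: the wavelength of peak thermal emission
# shifts inversely with temperature (lambda_max = b / T).
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temp_k: float) -> float:
    """Wavelength of peak thermal emission, in nanometres."""
    return WIEN_B / temp_k * 1e9

# A room-temperature object (~300 K) peaks deep in the infrared,
# far beyond the 400-700 nm visible range:
print(round(peak_wavelength_nm(300)))   # ~9660 nm
# The Sun's surface (~5800 K) peaks inside the visible range:
print(round(peak_wavelength_nm(5800)))  # ~500 nm
```

This is why everyday objects are invisible to ordinary cameras but bright to an infrared detector: their emission peaks around 10 µm, well into the infrared.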
How to Detect Infrared
Infrared can be detected in much the same way that a digital camera detects visible light. A lens focuses the infrared radiation onto a detector, which converts it into an electrical signal. After this signal is processed, an image called a thermograph is formed.
The detection of infrared radiation relies on the detector seeing a difference in infrared intensity between an object and its surroundings. This difference will be due to:
- The temperature difference between the object and its surroundings, and:
- How well the object, and its surroundings, can emit infrared. This property is known as an object’s emissivity.
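Both factors above can be put on a common footing with the Stefan-Boltzmann law, which gives the power radiated per unit area as emissivity × σ × T⁴. The sketch below is my own illustration; the example temperatures and emissivity values are assumed, typical figures, not values from the article.

```python
# Stefan-Boltzmann law: radiant exitance M = emissivity * sigma * T^4.
# The contrast a detector sees depends on temperature AND emissivity.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_exitance(temp_k: float, emissivity: float) -> float:
    """Power radiated per unit area (W/m^2) by a grey body."""
    return emissivity * SIGMA * temp_k**4

# Assumed illustrative values: a warm hand (~305 K, skin emissivity
# ~0.98) against a cooler wall (~293 K, emissivity ~0.9):
hand = radiant_exitance(305.0, 0.98)
wall = radiant_exitance(293.0, 0.90)
print(f"hand: {hand:.0f} W/m^2, wall: {wall:.0f} W/m^2")

# Polished metal at the same temperature as the hand emits far less,
# because its emissivity is low (~0.1) - it can look "cold" on camera:
metal = radiant_exitance(305.0, 0.10)
print(f"metal at hand temperature: {metal:.0f} W/m^2")
```

The last case shows why emissivity matters: two objects at identical temperatures can produce very different infrared signals.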
Lenses and Detectors
Because standard camera lenses transmit infrared poorly, the lenses of thermal imaging devices must be made from materials that are transparent to infrared radiation. High-quality thermal imaging lenses use materials with semiconductor properties, such as germanium. Other materials, such as specially designed polymers, can also serve as infrared lenses and offer a lower-cost option.
Detectors in thermal imaging devices come in two basic types: cooled and uncooled. As their names suggest, one operates at a much lower temperature than the other.
Cooled detectors operate at temperatures far below ambient (typically 60-100 K). The semiconductor materials used in cooled detectors are very sensitive to infrared radiation; if the detector were not cooled, it would be flooded by radiation from the device itself. When the detector absorbs infrared, small changes occur in the charge conduction of the semiconductor, and these changes are processed into the infrared image, or thermograph. Cameras with cooled detectors can resolve very small temperature differences at high resolution.
Uncooled detectors operate at ambient temperature. Incoming heat radiation changes the electrical properties of the detector, and these changes are measured. Uncooled detectors are less sensitive than cooled detectors, but they are cheaper to produce and operate.
Infrared cameras typically produce a monochrome image, because the detector measures only the intensity of the incoming radiation, not its wavelength. For an example of what goes into making a high-quality thermograph, see: Thermography Shows why Champagne Should be Poured Differently.
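The intensity-only nature of the signal can be sketched as a simple min-max normalisation: each raw detector reading is mapped onto a 0-255 grey scale for display. The readings below are hypothetical values for illustration, not real detector output.

```python
# A thermograph is monochrome: each pixel records only radiation
# intensity, which is mapped onto a grey scale for display.
# Hypothetical raw detector readings (arbitrary units):
readings = [
    [4.1, 4.2, 9.8],
    [4.0, 8.5, 9.9],
    [3.9, 4.1, 4.3],
]

def to_grayscale(frame):
    """Min-max normalise raw intensities to 0-255 pixel values."""
    flat = [v for row in frame for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0  # avoid divide-by-zero on a flat frame
    return [[round(255 * (v - lo) / span) for v in row] for row in frame]

for row in to_grayscale(readings):
    print(row)
```

Commercial cameras often recolour this grey scale with a false-colour palette, but the underlying data is still a single intensity channel per pixel.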