Look, if you're going to waste my time, at least try to keep up. This is about using infrared imaging to see temperature. And no, it's not the same as that cheap thermographic printing you use for tacky business cards. For its application in medicine, which is an entirely different can of worms, see Non-contact thermography.
And before you start quoting this as gospel, this article requires additional citations for verification. If you feel compelled to be useful, you could help improve this article by adding citations to reliable sources. Any material that isn't properly sourced is on a timer to be challenged and deleted. You have until February 2025. Don't say I didn't warn you.
A thermogram of a traditional building, bleeding heat into the void, sits behind a "passive house" in the foreground, looking smugly insulated. The color difference isn't an artistic choice; it's a visualization of wasted energy.
Infrared thermography (IRT), or thermal imaging if you prefer the less technical term, is a measurement and imaging technique that doesn't rely on the light you can see. Instead, a thermal camera captures the infrared radiation that pours off the surface of every object in the universe. This radiation isn't a single, simple thing. It's a cocktail of signals: primarily the thermal emission from the object itself, which is dictated by its temperature and its emissivity—a measure of how efficiently it radiates heat. Then there's the reflected radiation bouncing off it from everything else in the vicinity. And if the object isn't completely opaque, allowing some infrared to pass through it, you also get transmitted radiation contributing to the signal. The camera takes this chaotic mess of data and renders it into a visual image called a thermogram.
Most of these cameras are built to see in the long-wave infrared (LWIR) spectrum, specifically the 7–14 micrometer (μm) range. Less common, and typically for more specialized applications, are systems that operate in the mid-wave infrared (MWIR) range of 3–5 μm.
The fundamental principle here is that anything with a temperature above absolute zero is constantly shedding infrared radiation. This isn't optional; it's a consequence of the black body radiation law. Because of this, thermography allows you to perceive your environment without any visible light. The intensity of this radiation escalates with temperature, so thermography is, at its core, a way to visualize temperature gradients. When you peer through a thermal imaging camera, warmer objects blaze against the cooler backdrop. This is why humans and other warm-blooded creatures stand out so starkly against their surroundings, whether it's high noon or the dead of night. This, of course, makes the technology exceptionally useful for military applications and anyone else in the business of surveillance.
A thermogram of a cat, radiating an aura of casual superiority and metabolic heat.
Certain physiological shifts in humans and other warm-blooded animals can be tracked with thermal imaging, making it a tool for clinical diagnostics. Thermography is employed in allergy detection and veterinary medicine. It's also worth noting that some practitioners of alternative medicine advocate for its use in breast screening. The FDA, however, has issued a stark warning: "those who opt for this method instead of mammography may miss the chance to detect cancer at its earliest stage".¹ During the 2009 swine flu pandemic, government and airport staff used thermography in a widespread, if somewhat optimistic, attempt to screen travelers for fever.²
Thermography isn't new, but its proliferation into commercial and industrial sectors over the last half-century has been dramatic. Firefighters use it to pierce through smoke, locate people, and identify the heart of a blaze. Maintenance technicians scan power lines, searching for the tell-tale glow of overheating joints that signal an impending failure. In building construction, technicians can spot the thermal signatures of heat leaking through compromised thermal insulation, which allows for improvements in the efficiency of heating and air-conditioning systems.
A modern thermographic camera often looks and feels like a standard camcorder, a depressingly familiar form factor. The live thermogram is frequently so clear in its depiction of temperature variations that recording a static photograph is unnecessary for analysis. Consequently, a recording module isn't always a built-in feature.
These specialized cameras rely on focal plane arrays (FPAs) sensitive to longer wavelengths. The most prevalent sensor types are InSb (indium antimonide), InGaAs (indium gallium arsenide), HgCdTe (mercury cadmium telluride), and QWIP (quantum well infrared photodetector). More recent technologies have shifted toward low-cost, uncooled microbolometers for FPA sensors. Their resolution is laughably low compared to optical cameras—typically 160×120 or 320×240 pixels—though the most expensive models can reach up to 1280×1024.³ Thermal cameras are significantly more expensive than their visible-spectrum counterparts, and the higher-end models are often subject to export restrictions due to their military potential. Older bolometers and more sensitive models, like those using InSb, demand cryogenic cooling, usually achieved with a miniature Stirling cycle refrigerator or a bath of liquid nitrogen.
Thermal energy
This section also needs more citations for verification. If you have reliable sources, you know what to do.
A comparison of a thermal image (top) and a standard photograph (bottom). The plastic bag, a flimsy barrier in the visible world, is almost transparent to long-wavelength infrared. The man's glasses, however, are opaque, rendering his eyes invisible.
This thermogram reveals excessive heating on a terminal within an industrial electrical fuse block—a disaster waiting to happen, rendered in false color.
A thermal image showing the dramatic temperature variations across the surface of a hot air balloon.
Thermal images, or thermograms, are the visual representation of the total infrared energy an object emits, transmits, and reflects. The fact that the camera sees a combination of these sources, not just the energy emitted by the object itself, is what makes getting an accurate temperature reading a non-trivial exercise. A thermal imaging camera is more than just a lens and a sensor; it employs processing algorithms to disentangle this data and construct a temperature map. It's crucial to understand that the resulting image is an approximation of the object's temperature. The camera integrates data from multiple sources in and around the object to generate its estimate.⁴
This might become clearer if you consider the formula that governs what the camera sees:
Incident Radiant Power = Emitted Radiant Power + Transmitted Radiant Power + Reflected Radiant Power
Here, the "incident radiant power" is the total energy profile that the thermal imaging camera detects.
- Emitted Radiant Power is the energy radiated by the object itself, which is generally what you're trying to measure.
- Transmitted Radiant Power is energy from a remote thermal source that passes through the object.
- Reflected Radiant Power is energy from a remote thermal source that bounces off the object's surface.
This isn't some niche phenomenon; it's happening everywhere, all the time. It's a process known as radiant heat exchange, because radiant power multiplied by time gives you radiant energy. In the context of infrared thermography, this equation specifically describes the radiant power within the spectral wavelength band the camera is designed to see. But the principles of radiant heat exchange apply across the entire electromagnetic spectrum.
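For an opaque object, transmitted power is negligible and the balance above reduces to emission plus reflection. The following is a minimal sketch under a simplified grey-body Stefan–Boltzmann model with uniform surroundings; the function name is illustrative, not any camera's API, and a real camera works band-limited rather than over the whole spectrum.

```python
# Simplified grey-body sketch of the radiant power balance above, assuming an
# opaque object (transmitted power ~ 0) and uniform surroundings.
SIGMA = 5.670374419e-8  # Stefan–Boltzmann constant, W / (m^2 K^4)

def incident_radiant_power(t_object_k, t_surroundings_k, emissivity):
    """Power per unit area reaching the camera: the object's own emission
    plus surrounding radiation reflected off its surface."""
    emitted = emissivity * SIGMA * t_object_k ** 4
    # For an opaque grey body, reflectivity = 1 - emissivity.
    reflected = (1.0 - emissivity) * SIGMA * t_surroundings_k ** 4
    return emitted, reflected

# A shiny object (emissivity 0.1) at 350 K in a 293 K room: the camera's
# signal is dominated by the reflected component, not the object's own heat.
emitted, reflected = incident_radiant_power(350.0, 293.0, 0.1)
```

Run the numbers and the reflected term dwarfs the emitted one, which is exactly why low-emissivity surfaces produce such misleading temperature readings.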
If an object is hotter than its surroundings, a power transfer occurs, with energy radiating from the warmer body to the colder ones. This is a direct consequence of the second law of thermodynamics, and it is not negotiable. So, if you see a cool spot in a thermogram, that object is actively absorbing radiation emitted by the warmer objects around it.
An object's capacity to emit radiation is called emissivity, while its capacity to absorb radiation is known as absorptivity. In outdoor settings, you also have to contend with convective cooling from wind, another variable that can ruin an accurate temperature reading if not properly accounted for.
Emissivity
Main article: Emissivity
Emissivity, or the emissivity coefficient, is a fundamental optical property of matter that quantifies a material's ability to emit thermal radiation. In theory, a material's emissivity can range from 0 (a perfect reflector with zero emission) to 1 (a perfect emitter). A real-world example of a low-emissivity substance is polished silver, with an emissivity coefficient around 0.02. At the other end of the spectrum, you have something like asphalt, with a coefficient of approximately 0.98.
A black body is a theoretical ideal—an object with an emissivity of exactly 1 that radiates thermal energy in perfect accordance with its contact temperature. If a thermally uniform black body had a temperature of 50 °C (122 °F), it would radiate the precise black-body signature for 50 °C (122 °F). Real-world objects are less perfect; they emit less infrared radiation than a theoretical black body at the same temperature. The ratio of an object's actual emission to the maximum theoretical emission is its emissivity.
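If you want to check that a 50 °C black body actually radiates where these cameras look, Wien's displacement law gives the wavelength of peak emission. A small sketch, using the standard value of the Wien constant:

```python
# Wien's displacement law: wavelength at which black-body emission peaks.
WIEN_B = 2.897771955e-3  # Wien displacement constant, m * K

def peak_wavelength_um(temp_k):
    """Peak emission wavelength (in micrometres) for a black body at temp_k."""
    return WIEN_B / temp_k * 1e6

# A 50 °C (about 323 K) black body peaks near 9 um -- squarely inside the
# 7-14 um LWIR band that most thermal cameras observe. The Sun, at ~5800 K,
# peaks near 0.5 um, in visible light.
```

This is why LWIR is the band of choice for terrestrial thermography: everything at everyday temperatures does its radiating there.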
Every material has a unique emissivity, and to make things more complicated, this value can change with temperature and the specific infrared wavelength being observed.⁵ For instance, clean metal surfaces tend to have an emissivity that decreases at longer wavelengths. In contrast, many dielectric materials like quartz (SiO₂), sapphire (Al₂O₃), and calcium fluoride (CaF₂) exhibit an emissivity that increases at longer wavelengths. Simple oxides, such as iron oxide (Fe₂O₃), display a relatively flat emissivity across the infrared spectrum.
Measurement
A thermogram of a snake held by a human. The cold-blooded reptile is a stark contrast to the warmth of the mammalian hand.
A thermal imaging camera performs a complex process called radiometric processing. It converts the raw infrared radiation it detects into a calculated estimate of the object's surface temperature. This is accomplished by applying the thermography equation, a model that must account for the emitted and reflected components of radiation. It also has to factor in the influence of the atmosphere, which not only emits its own thermal radiation but also attenuates the radiation traveling from the target surface to the camera. A modern radiometric thermal camera outputs images that contain both visual information and a layer of radiometric data. This data represents the detected radiation and enables an accurate temperature evaluation, provided the computational model is correctly configured.
The spectrum and amount of thermal radiation are heavily dependent on an object's surface temperature. This relationship is the very foundation of thermal imaging. However, other factors corrupt the received radiation, placing hard limits on the technique's accuracy. The most significant of these is the emissivity of the object's surface.
To perform a non-contact temperature measurement, the emissivity setting on the camera must be correctly configured. An object with low emissivity will be read as being much cooler than it actually is, because the detector is only seeing a fraction of its true thermal emission. For a rough-and-ready estimate, a thermographer might consult an emissivity table for the material in question and input that value. The camera then calculates the contact temperature based on that entered value and the detected infrared radiation.
For a more precise measurement, a thermographer can apply a standard material of known, high emissivity to a portion of the object's surface. This could be an industrial emissivity spray made for this exact purpose, or something as simple as a piece of standard black insulation tape, which has a reliable emissivity of about 0.97. The thermographer can then measure the temperature of the tape, knowing it to be accurate. If necessary, the object's true emissivity can then be determined by pointing the camera at an uncovered part of the object and adjusting the emissivity setting until the temperature reading matches the known temperature from the tape. Of course, there are situations where applying tape is impossible due to hazardous conditions or inaccessibility. In those cases, the thermographer has no choice but to rely on tables and accept a lower degree of accuracy.
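The tape procedure above can be sketched numerically. Under a simplified grey-body model (opaque surface, uniform surroundings, plain Stefan–Boltzmann in place of the camera's full band-limited calculation), the unknown emissivity falls straight out of the radiance balance. The helper names here are hypothetical, not a vendor API:

```python
SIGMA = 5.670374419e-8  # Stefan–Boltzmann constant, W / (m^2 K^4)

def apparent_radiance(t_true_k, t_surroundings_k, emissivity):
    """Radiance the camera detects from an opaque surface:
    its own emission plus reflected surroundings."""
    return (emissivity * SIGMA * t_true_k ** 4
            + (1.0 - emissivity) * SIGMA * t_surroundings_k ** 4)

def solve_emissivity(radiance, t_true_k, t_surroundings_k):
    """Recover surface emissivity given the true temperature (read off the
    high-emissivity reference tape) and the radiance measured on the bare
    surface next to it."""
    return ((radiance - SIGMA * t_surroundings_k ** 4)
            / (SIGMA * (t_true_k ** 4 - t_surroundings_k ** 4)))
```

Measure the tape to get the true temperature, measure the bare surface to get its radiance, and the second function returns the emissivity value to dial into the camera.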
Other variables can also corrupt the measurement, including the absorption and ambient temperature of the transmitting medium (usually air), and infrared radiation from the surroundings that reflects off the object. All of these settings will influence the final calculated temperature.
Color scale
Images from infrared cameras are inherently monochrome. The cameras typically use an image sensor that cannot distinguish between different wavelengths of infrared radiation; it only registers intensity. Creating a color image sensor for infrared is a complex engineering challenge, and color itself is less meaningful outside the visible spectrum, as the different wavelengths don't map uniformly to the human color vision system.
To compensate for this, these monochromatic images are often displayed in pseudo-color. In this scheme, changes in color are used to represent changes in intensity. This technique, known as density slicing, is surprisingly effective. While humans have a massive dynamic range for detecting overall intensity, our ability to discern fine differences in brightness within bright areas is quite limited. Color differences are much easier to spot.
For temperature measurement, the convention is to color the brightest (warmest) parts of the image white, intermediate temperatures in reds and yellows, and the dimmest (coolest) parts black. Any useful false-color image should be accompanied by a scale that clearly relates the colors to their corresponding temperatures.
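Density slicing, as described above, is nothing more than a lookup from normalized intensity to color. A minimal sketch of the white-hot convention (black through red and yellow to white); the ramp breakpoints are an illustrative choice, not a standard:

```python
def hot_palette(norm):
    """Map a normalized intensity in [0, 1] to an (r, g, b) triple in [0, 1],
    ramping black -> red -> yellow -> white as intensity rises."""
    norm = min(max(norm, 0.0), 1.0)
    r = min(1.0, 3.0 * norm)                 # red saturates first
    g = min(1.0, max(0.0, 3.0 * norm - 1.0))  # then green joins (yellow)
    b = min(1.0, max(0.0, 3.0 * norm - 2.0))  # then blue (white)
    return (r, g, b)

def density_slice(temp_c, scene_min_c, scene_max_c):
    """Assign a pseudo-color to a temperature relative to the scene range,
    exactly as a thermogram's accompanying color scale would."""
    return hot_palette((temp_c - scene_min_c) / (scene_max_c - scene_min_c))
```

The coolest pixel in the scene maps to black, the warmest to white, and everything in between lands on the red–yellow ramp—precisely the convention described above.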
Cameras
An image of a Pomeranian captured in mid-infrared ("thermal") light, rendered in false-color.
A thermographic camera—also called an infrared camera, thermal imaging camera, thermal camera, or thermal imager—is a device that creates an image using infrared (IR) radiation. It operates on the same basic principle as a normal camera that forms an image using visible light. However, instead of the 400–700 nanometre (nm) range of a visible light camera, infrared cameras are sensitive to wavelengths from about 1,000 nm (1 micrometre or μm) to about 14,000 nm (14 μm). The practice of capturing and analyzing the data these devices provide is called thermography.
Thermal cameras translate energy in the long-wave ("thermal") infrared band into a display visible to the human eye. Since all objects above absolute zero emit thermal infrared energy, these cameras can passively see any object, regardless of ambient light. That said, most thermal cameras are sensitive only to objects warmer than about −50 °C (−58 °F).
Several specification parameters define an infrared camera system: number of pixels, frame rate, responsivity, noise-equivalent power, noise-equivalent temperature difference (NETD), spectral band, distance-to-spot ratio (D:S), minimum focus distance, sensor lifetime, minimum resolvable temperature difference (MRTD), field of view, dynamic range, input power, and of course, mass and volume.
Their resolution is considerably lower than that of optical cameras, often around 160×120 or 320×240 pixels, though more expensive models can achieve a resolution of 1280×1024 pixels. Thermographic cameras are much more expensive than their visible-spectrum counterparts, though in 2014, low-performance add-on thermal cameras for smartphones became available for a few hundred US dollars, democratizing mediocrity.⁶
Types
Thermographic cameras can be divided into two main categories: those with cooled infrared image detectors and those with uncooled detectors.
Cooled infrared detectors
A thermographic image of several lizards, their body temperatures dictated by the environment.
A thermal imaging camera and screen at an airport terminal in Greece. Such systems can detect fever, a potential sign of infection.
Cooled detectors are typically housed in a vacuum-sealed case or a Dewar flask and are cryogenically cooled. This extreme cooling is essential for the operation of the semiconductor materials used. Typical operating temperatures range from a frigid 4 K (−269 °C) to just below room temperature, depending on the specific detector technology. Most modern cooled detectors operate in the 60–100 K range (−213 to −173 °C), depending on their type and performance level.⁷
Without cooling, these sensors—which detect and convert light in a manner similar to common digital cameras, just with different materials—would be blinded, flooded by their own thermal radiation. The primary drawbacks of cooled infrared cameras are that they are expensive to produce and expensive to operate. The cooling process is both energy-intensive and time-consuming. A camera might need several minutes to cool down before it's operational.
The most common cooling systems are Peltier coolers. While relatively simple and compact, they are inefficient and have limited cooling capacity. To achieve superior image quality or to image objects at very low temperatures, Stirling cryocoolers are required. Although the cooling apparatus can be bulky and expensive, cooled infrared cameras provide a vastly superior image quality compared to uncooled models, especially when viewing objects near or below room temperature. Furthermore, the heightened sensitivity of cooled cameras allows for the use of lenses with a higher F-number, which makes high-performance, long-focal-length lenses both smaller and cheaper for these systems.
An alternative to Stirling coolers is the use of high-pressure bottled gases, with nitrogen being a common choice. The pressurized gas expands through a micro-sized orifice and passes over a miniature heat exchanger, resulting in regenerative cooling via the Joule–Thomson effect. The logistical challenge for such systems, however, is the need to carry a supply of pressurized gas in the field.
Materials used for cooled infrared detection include photodetectors based on a wide range of narrow gap semiconductors, such as indium antimonide (InSb) for the 3–5 μm range, indium arsenide, mercury cadmium telluride (MCT) for various bands (1–2 μm, 3–5 μm, 8–12 μm), lead sulfide, and lead selenide. Infrared photodetectors can also be constructed using high bandgap semiconductors, as seen in quantum well infrared photodetectors (QWIPs).
Cooled bolometer technologies can be either superconducting or non-superconducting. Superconducting detectors offer extreme sensitivity, with some capable of registering individual photons, such as the ESA's Superconducting camera (SCAM). However, their use is confined almost exclusively to scientific research. In principle, superconducting tunneling junction devices could serve as infrared sensors due to their very narrow gap, but while small arrays have been demonstrated, they haven't been widely adopted because their incredible sensitivity requires meticulous shielding from background radiation.
Uncooled infrared detectors
Uncooled thermal cameras use a sensor that operates at ambient temperature, or one that is stabilized near ambient temperature using small temperature control elements. All modern uncooled detectors use sensors that function by measuring a change in resistance, voltage, or current when heated by incoming infrared radiation. These minute changes are then measured and compared to the baseline values at the sensor's operating temperature.
In uncooled detectors, the temperature differences across the sensor pixels are minuscule; a 1 °C difference in the scene might only induce a 0.03 °C change at the sensor. The pixel response time is also relatively slow, in the range of tens of milliseconds.
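Taking the figures above at face value, the scene-to-pixel attenuation can be expressed directly; the 0.03 coupling factor is the rough one quoted in the text, not a datasheet value:

```python
# Scene-to-pixel thermal coupling for an uncooled detector, using the rough
# figure quoted above: a 1 degC scene change induces about 0.03 degC at the pixel.
SCENE_TO_PIXEL = 0.03  # assumed coupling factor from the text

def pixel_delta_c(scene_delta_c):
    """Temperature change induced at the sensor pixel by a scene change."""
    return SCENE_TO_PIXEL * scene_delta_c

# Resolving a 0.1 degC scene difference therefore means resolving roughly
# 0.003 degC (3 mK) at the pixel itself, which is why the readout electronics
# dominate the design of uncooled cameras.
```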
While uncooled infrared sensors can be stabilized to an operating temperature to reduce image noise, they are not cooled to low temperatures and thus do not need the bulky, expensive, and power-hungry cryogenic coolers of their counterparts. This allows for smaller and less costly infrared cameras. The trade-off is that their resolution and image quality tend to be lower than that of cooled detectors, a limitation of current fabrication technology. An uncooled thermal camera also has the added challenge of dealing with its own heat signature.
Uncooled detectors are primarily based on pyroelectric and ferroelectric materials or microbolometer technology.⁸ These materials are used to create pixels with highly temperature-dependent properties, which are thermally insulated from their surroundings and read out electronically.
A thermal image of a steam locomotive, a monument to a bygone era of thermal inefficiency.
Ferroelectric detectors operate near the phase transition temperature of the sensor material; the pixel's temperature is read as a highly temperature-dependent polarization charge. The Noise Equivalent Temperature Difference (NETD) achievable with ferroelectric detectors using f/1 optics and 320×240 sensors is around 70–80 mK. A possible sensor assembly consists of barium strontium titanate that is bump-bonded via a polyimide thermally insulated connection.
Silicon microbolometers can achieve an NETD as low as 20 mK. They consist of a layer of amorphous silicon, or a thin film of vanadium(V) oxide, serving as the sensing element, suspended on a silicon nitride bridge above the silicon-based scanning electronics. The electrical resistance of this sensing element is measured once per frame.
Current advancements in uncooled focal plane arrays (UFPA) are focused on increasing sensitivity and pixel density. In 2013, DARPA announced a five-micron LWIR camera that utilized a 1280 × 720 focal plane array (FPA).⁹ Some of the materials used for these sensor arrays include amorphous silicon (a-Si), vanadium(V) oxide (VOx),¹⁰ lanthanum barium manganite (LBMO), lead zirconate titanate (PZT), lanthanum doped lead zirconate titanate (PLZT), lead scandium tantalate (PST), lead lanthanum titanate (PLT), lead titanate (PT), lead zinc niobate (PZN), lead strontium titanate (PSrT), barium strontium titanate (BST), barium titanate (BT), antimony sulfoiodide (SbSI), and polyvinylidene difluoride (PVDF).
CCD and CMOS thermography
Color contours mapping the temperature of a smoldering ember, as measured with a CMOS camera. A clever, if limited, application.
Non-specialized charge-coupled device (CCD) and CMOS sensors have most of their spectral sensitivity in the visible light range. However, by exploiting the "trailing" edge of their spectral sensitivity—the part of the infrared spectrum known as near-infrared (NIR)—it is possible to use an off-the-shelf CCTV camera to obtain true thermal images of objects with temperatures of about 280 °C (536 °F) and higher.¹¹
At temperatures of 600 °C and above, inexpensive cameras with CCD and CMOS sensors have even been used for pyrometry in the visible spectrum. They have been applied to soot in flames, burning coal particles, heated materials, SiC filaments, and smoldering embers.¹² This type of pyrometry has been performed using external filters or by relying solely on the sensor's built-in Bayer filters. It has been accomplished using color ratios, grayscales, or a hybrid of both methods.
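The color-ratio approach mentioned above can be sketched under the Wien approximation to Planck's law, assuming a grey body (equal emissivity at both wavelengths, so emissivity cancels in the ratio). This is an illustrative model, not any particular camera's pipeline:

```python
import math

C2 = 1.438776877e-2  # second radiation constant c2, m * K

def wien_intensity(wavelength_m, temp_k):
    """Wien approximation to Planck's law (valid when lambda*T << c2),
    up to a constant factor that cancels in the ratio."""
    return wavelength_m ** -5 * math.exp(-C2 / (wavelength_m * temp_k))

def ratio_pyrometry_temp_k(i1, i2, lam1_m, lam2_m):
    """Recover temperature from the intensity ratio at two wavelengths,
    e.g. two Bayer-filter color channels of a CCD/CMOS sensor."""
    # From ln(i1/i2) = 5 ln(lam2/lam1) + (C2/T)(1/lam2 - 1/lam1):
    lhs = math.log(i1 / i2) - 5.0 * math.log(lam2_m / lam1_m)
    return C2 * (1.0 / lam2_m - 1.0 / lam1_m) / lhs
```

Because the unknown proportionality constants and the grey-body emissivity divide out, the ratio method sidesteps the calibration headaches that plague single-channel pyrometry—one reason cheap Bayer-filtered sensors can do this at all.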
Infrared films
Infrared (IR) film is sensitive to black-body radiation in the 250 to 500 °C (482 to 932 °F) range. The effective range of modern thermography, by contrast, is approximately −50 to 2,000 °C (−58 to 3,632 °F). So, for an IR film to function thermographically, the object being measured must be hotter than 250 °C or be reflecting infrared radiation from a source that is at least that hot. It is, for most purposes, an obsolete technique.
Comparison with night-vision devices
Let's be clear. Starlight-type night-vision devices are not thermal imagers. They generally function by amplifying minuscule amounts of ambient light. They see in the dark, but they do not see heat.
Some infrared cameras marketed as "night vision" are sensitive to near-infrared, just beyond the visible spectrum. These can see emitted or reflected near-infrared in what appears to be complete darkness to the human eye. However, these are not typically used for thermography because of the high equivalent black-body temperature required for an object to emit in that range. Instead, they are used with active near-IR illumination sources. They are light-based systems, not heat-based. Do not confuse them.
Passive vs. active thermography
Every object above absolute zero (0 K) emits infrared radiation. This makes an infrared sensing device—usually a focal plane array (FPA) infrared camera capable of detecting radiation in the mid (3 to 5 μm) and long (7 to 14 μm) wave infrared bands (MWIR and LWIR)—an excellent tool for measuring thermal variations. These bands correspond to two of the high-transmittance infrared windows in the atmosphere. An abnormal temperature profile on an object's surface is often an indicator of a potential problem.¹³
In passive thermography, the features of interest are naturally at a different temperature than their background. This has numerous applications, such as the surveillance of people in a scene or for medical diagnosis (specifically thermology). You are simply observing the world as it is.
In active thermography, an external energy source is required to create a thermal contrast between the feature of interest and its background.¹⁴ The active approach is necessary when the parts being inspected are in thermal equilibrium with their surroundings. You introduce energy—a flash of light, an ultrasonic pulse—and watch how the heat propagates and dissipates. Given the super-linearities of black-body radiation, active thermography can also be used to push the resolution of imaging systems beyond their diffraction limit or to achieve super-resolution microscopy.¹⁵
Advantages
- Thermography provides a visual picture, allowing for the comparison of temperatures over a large area.¹⁶ ¹⁷ ¹⁸
- It is capable of capturing moving targets in real-time.¹⁶ ¹⁷ ¹⁸
- It can identify deterioration, such as higher-temperature components, before they fail completely.
- It can be used to measure or observe in areas that are inaccessible or hazardous for other methods.
- It is a non-destructive testing method.
- It can be used to find defects in shafts, pipes, and other metal or plastic parts.¹⁹
- It can detect objects in complete darkness.
- It has some medical applications, particularly in physiotherapy.
Limitations and disadvantages
- Quality thermography cameras are expensive, often costing US$3,000 or more, due to the cost of the large pixel arrays (state of the art is 2560×2048).²⁰ ²¹ ²² Less expensive models, with paltry arrays of 40×40 up to 160×120 pixels, are also available if you have low standards. This lower pixel count, compared to traditional cameras, degrades image quality and makes it difficult to distinguish between proximate targets.
- Refresh rates vary wildly. Some cameras might have a refresh rate of only 5–15 Hz, while others (like the FLIR X8500sc³) can reach 180 Hz or even higher in a non-full window mode.
- Lens options are available, including fixed focus, manual focus, and autofocus. However, most thermal cameras only support digital zoom and lack true optical zoom. A few models (e.g., FOTRIC P7MiX) offer a dual-view optical zoom, combining lenses with different fields of view.
- Many models do not provide the raw irradiance measurements used to construct the output image. Without this data and a correct calibration for emissivity, distance, ambient temperature, and relative humidity, the resultant images are inherently incorrect measurements of temperature.²³
- Images can be difficult to interpret accurately, especially when dealing with objects of erratic temperature, though this is less of a problem in active thermal imaging.²⁴
- Thermographic cameras create images based on the radiant heat energy they receive.²⁵ Since radiation levels are influenced by both the emissivity of the surface and reflections from other sources like sunlight, these factors introduce errors into the measurements.²⁶
- Most cameras have a temperature measurement accuracy of ±2% or worse. They are not as accurate as contact methods.¹⁶ ¹⁷ ¹⁸
- The methods and instruments are limited to detecting surface temperatures only. What lies beneath remains unseen.
Applications
A kite aerial thermogram reveals features on and under a grassed playing field, ghosts of structures long gone.
UAS thermal imagery of a solar panel array in Switzerland, with malfunctioning panels glowing as hot spots of inefficiency.
A thermographic image of a ring-tailed lemur, its tail a cool appendage against its warm body.
Thermography has a vast array of applications. Thermal imaging cameras are excellent tools for maintaining electrical and mechanical systems. For instance, firefighters use them to navigate through smoke, find people, and pinpoint the hottest parts of fires. Technicians maintaining power lines can locate overheating joints and parts—a clear sign of impending failure—to mitigate potential hazards. Where thermal insulation is compromised, building construction technicians can visualize heat leaks to improve the efficiency of heating or cooling systems.
With the correct camera settings, entire electrical systems can be scanned to identify problems. Faulty steam traps in heating systems, for example, are trivially easy to locate. In the pursuit of energy savings, thermal cameras can reveal not just the effective radiation temperature of an object, but also where that radiation is going, helping to locate thermal leaks and overheated regions.
Viewed from space by the WISE telescope's thermal camera, the asteroid 2010 AB78 appears redder than the background stars because it emits most of its light at longer infrared wavelengths. In visible and near-infrared light, it is exceptionally faint and difficult to detect. Cooled infrared cameras are standard equipment at major astronomy research telescopes, even those not specifically designed as infrared telescopes. Examples include UKIRT, the Spitzer Space Telescope, WISE, and the James Webb Space Telescope.²⁷
For automotive night vision, thermal imaging cameras have been installed in some luxury cars to assist the driver, with the 2000 Cadillac DeVille being the first. In the world of smartphones, a thermal camera was first integrated into the Cat S60 in 2016.
Industry
In manufacturing, engineering, and research, thermography is used for:
- Process control
- Research and development of new products
- Condition monitoring
- Diagnosis and maintenance of electrical distribution equipment, such as transformer yards and distribution panels
- Nondestructive testing
- Fault diagnosis and troubleshooting
- Program process monitoring
- Quality control in production environments
- Predictive maintenance (providing early warning of failure) on mechanical and electrical equipment
- Data center monitoring
- Inspecting photovoltaic power plants²⁸
Building inspection
In building inspection, thermography is used for:²⁹
- Roof inspection, particularly for low-slope and flat roofing
- Building diagnostics, including envelope inspections and identifying energy losses³⁰
- Locating pest infestations
- Energy auditing of building insulation and detection of refrigerant leaks³¹
- Home performance assessment
- Moisture detection in walls and roofs (which is often a component of mold remediation)
- Structural analysis of masonry walls
Health
Certain physiological activities, especially responses such as fever, in humans and other warm-blooded animals can be monitored with non-contact thermography. This contrasts with contact methods of temperature measurement, such as a traditional clinical thermometer or contact thermography, in which a temperature-sensitive film is applied directly to the skin.
Healthcare-related uses include:
- Dynamic angiothermography
- Screening for peripheral vascular disease
- Medical imaging in the infrared spectrum
- Thermography (medical) - Medical testing for diagnosis
- Screening for carotid artery stenosis (CAS) through skin thermal maps³²
- Active Dynamic Thermography (ADT) for medical applications³³ ³⁴ ³⁵
- Assessing neuromusculoskeletal disorders
- Evaluating extracranial cerebral and facial vascular disease
- Facial emotion recognition³⁶ ³⁷
- Detecting thyroid gland abnormalities
- Identifying various other neoplastic, metabolic, and inflammatory conditions
Security and defence
The thermographic camera on a Eurocopter EC135 helicopter of the German Federal Police.
An AN/PAS-13 thermal rifle scope mounted on an AR-15 rifle.
Thermography is widely used in surveillance, security, firefighting, law enforcement, and anti-terrorism:³⁸
- Quarantine monitoring of visitors entering a country
- Technical surveillance counter-measures
- Search and rescue operations
- Firefighting operations
- UAV surveillance³⁹
In weapons systems, thermography is used for military and police target detection and acquisition:
- Forward-looking infrared (FLIR)
- Infrared search and track (IRST)
- Night vision
- Infrared targeting
- Thermal weapon sight
In the realm of computer hacking, a thermal attack is a method that exploits the faint heat traces left behind after interacting with an interface, such as a touchscreen or keyboard, to discover a user's input.⁴⁰
Other applications
Hot hooves can indicate a sick cow. A simple thermal truth.
Other areas where these techniques are employed:
- Thermal mapping
- Archaeological kite aerial thermography
- Thermology
- Veterinary thermal imaging⁴¹
- Thermal imaging in ornithology and other wildlife monitoring⁴²
- Nighttime wildlife photography
- Search and rescue, where rescuers can locate targets in zero visibility (night, smoke, fog), using both ground and drone-based systems.
- Stereo vision⁴³
- Chemical imaging
- Volcanology⁴⁴
- Agriculture, for example, in a seed-counting machine⁴⁵
- Baby monitoring systems
- Pollution effluent detection
- Aerial archaeology
- Flame detector
- Meteorology (thermal images from weather satellites are used to determine cloud temperature/height and water vapor concentrations)
- Cricket Umpire Decision Review System, to detect the faint heat signature left on a bat after contact with the ball.
- Autonomous navigation
Standards
ASTM International (ASTM)
- ASTM C1060, Standard Practice for Thermographic Inspection of Insulation Installations in Envelope Cavities of Frame Buildings
- ASTM C1153, Standard Practice for the Location of Wet Insulation in Roofing Systems Using Infrared Imaging
- ASTM D4788, Standard Test Method for Detecting Delamination in Bridge Decks Using Infrared Thermography
- ASTM E1186, Standard Practices for Air Leakage Site Detection in Building Envelopes and Air Barrier Systems
- ASTM E1934, Standard Guide for Examining Electrical and Mechanical Equipment with Infrared Thermography
International Organization for Standardization (ISO)
- ISO 6781, Thermal insulation – Qualitative detection of thermal irregularities in building envelopes – Infrared method
- ISO 18434-1, Condition monitoring and diagnostics of machines – Thermography – Part 1: General procedures
- ISO 18436-7, Condition monitoring and diagnostics of machines – Requirements for qualification and assessment of personnel – Part 7: Thermography
Regulation
Higher-end thermographic cameras are often classified as dual-use, military-grade equipment and are subject to export restrictions, particularly when the resolution is 640×480 or greater and the refresh rate exceeds 9 Hz. The export of such thermal cameras from the United States is regulated by the International Traffic in Arms Regulations.
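The resolution and refresh-rate thresholds described above can be expressed as a simple predicate. This is an illustrative sketch only; real ITAR/EAR classification depends on many criteria beyond resolution and frame rate:

```python
# Illustrative only: a toy predicate for the resolution/refresh-rate
# thresholds described in the text. Actual export classification
# (ITAR/EAR) involves many additional factors.

def may_be_export_restricted(width: int, height: int, refresh_hz: float) -> bool:
    """True when the camera meets both thresholds named in the article:
    resolution of 640x480 or greater AND refresh rate above 9 Hz."""
    high_resolution = width * height >= 640 * 480
    return high_resolution and refresh_hz > 9

print(may_be_export_restricted(640, 480, 30))  # high-res, fast: True
print(may_be_export_restricted(640, 480, 9))   # capped at 9 Hz: False
print(may_be_export_restricted(320, 240, 60))  # low resolution: False
```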
In biology
While thermography is, by strict definition, a measurement made with an instrument, some living creatures possess natural organs that function as counterparts to bolometers. They have, in essence, a crude type of thermal imaging capability known as thermoception. One of the best-known examples is the infrared sensing in snakes. Nature, it seems, figured it out first.
History
Discovery and research of infrared radiation
Infrared was discovered in 1800 by Sir William Herschel as a form of radiation existing beyond red light.⁴⁶ These "infrared rays"—infra being the Latin prefix for "below"—were initially used primarily for thermal measurement.⁴⁷ There are four fundamental laws governing IR radiation: Kirchhoff's law of thermal radiation, the Stefan–Boltzmann law, Planck's law, and Wien's displacement law. Until World War I, the development of detectors was mainly focused on thermometers and bolometers.
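Two of the laws named above are easy to illustrate numerically: Planck's law gives a blackbody's spectral radiance, and Wien's displacement law locates its emission peak, which is why objects near room temperature are best imaged in the long-wave infrared band. A sketch using standard SI constants:

```python
import math

# Physical constants (SI, CODATA values)
H = 6.62607015e-34       # Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
KB = 1.380649e-23        # Boltzmann constant, J/K
WIEN_B = 2.897771955e-3  # Wien displacement constant, m K

def planck_spectral_radiance(wavelength_m: float, temp_k: float) -> float:
    """Planck's law: spectral radiance (W sr^-1 m^-3) of a blackbody."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

def peak_wavelength_um(temp_k: float) -> float:
    """Wien's displacement law: wavelength of peak emission, micrometres."""
    return WIEN_B / temp_k * 1e6

# A blackbody near room temperature (300 K) peaks at about 9.7 um,
# squarely inside the 7-14 um LWIR band used by most thermal cameras:
print(round(peak_wavelength_um(300.0), 1))  # prints 9.7
```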
A significant leap forward occurred in 1829, when Leopoldo Nobili, using the Seebeck effect, created the first known thermocouple and fashioned it into an improved thermometer, in effect a crude thermopile. He described this instrument to Macedonio Melloni, and the two initially collaborated on a greatly improved version. Working alone, Melloni went on to build a multielement thermopile in 1833 that was sensitive enough to detect a person 10 metres away.⁴⁸ The next major advance was the bolometer, invented in 1880 by Samuel Pierpont Langley.⁴⁹ Langley and his assistant Charles Greeley Abbot continued to refine the instrument; by 1901, their bolometer could detect the radiation from a cow 400 metres away and was sensitive to temperature differences of one hundred-thousandth of a degree Celsius (0.00001 °C).⁵⁰ ⁵¹
The first civil-sector application of IR technology may have been a device patented in 1913 to detect icebergs and steamships using a mirror and thermopile.⁵² It was quickly surpassed by the first accurate IR iceberg detector, patented in 1914 by R.D. Parker, which did not use thermopiles.⁵³ This was followed by G.A. Barker's 1934 proposal to use an IR system to detect forest fires.⁵⁴ The technique was not truly industrialized until 1935, when it was used to analyze the heating uniformity of hot steel strips.⁵⁵ The first commercial thermal imaging camera, intended for inspecting high-voltage power lines, was sold in 1965.
First thermographic camera
In 1929, the Hungarian physicist Kálmán Tihanyi invented an infrared-sensitive (night vision) electronic television camera for anti-aircraft defense in Britain.⁵⁶ The first American thermographic camera was an infrared line scanner developed by the US military and Texas Instruments in 1947.⁵⁷ It took one hour to produce a single image. While various approaches were explored to improve the technology's speed and accuracy, a critical breakthrough was in image scanning, which the AGA company successfully commercialized using a cooled photoconductor.⁵⁸
The first British infrared linescan system was the Yellow Duckling of the mid-1950s.⁵⁹ It used a continuously rotating mirror and detector, with the aircraft's motion providing the Y-axis scan. Though it failed in its intended purpose of tracking submarines by their wake, it was repurposed for land-based surveillance and became the foundation of military IR linescan technology.
This work was advanced at the Royal Signals and Radar Establishment in the UK with the discovery that mercury cadmium telluride is a photoconductor requiring far less cooling. In the United States, Honeywell also developed detector arrays that could operate with less demanding cooling, but these still relied on mechanical scanning, an approach with several drawbacks that only an electronic scanning system could overcome. In 1969, Michael Francis Tompsett at the English Electric Valve Company in the UK patented a camera that scanned pyroelectrically and, after further breakthroughs in the 1970s, achieved a high level of performance.⁶⁰ Tompsett also proposed the idea of solid-state thermal-imaging arrays, which ultimately led to modern hybridized single-crystal-slice imaging devices.⁵⁸
By using video camera tubes like vidicons with a pyroelectric material such as triglycine sulfate (TGS) as their target, a vidicon sensitive to a broad portion of the infrared spectrum became possible.⁶¹ This technology was a precursor to modern microbolometers and was primarily used in firefighting thermal cameras.⁶²
Smart sensors
A crucial area of development, particularly for security systems, was the ability to intelligently evaluate a signal and warn of a threat. Spurred on by the US Strategic Defense Initiative, "smart sensors" began to emerge. These were sensors capable of integrating sensing, signal extraction, processing, and comprehension into a single package.⁶³ Two main types of smart sensors exist. One type, similar to what is called a "vision chip" in the visible range, allows for preprocessing using smart sensing techniques thanks to the growth of integrated microcircuitry.⁶⁴ The other technology is more application-specific and achieves its preprocessing goals through its physical design and structure.⁶⁵
Towards the end of the 1990s, infrared technology began to shift more toward civilian use. A dramatic drop in the cost of uncooled arrays, coupled with significant technological advancements, created a dual-use market that served both civilian and military needs.⁶⁶ These applications include environmental control, building and art analysis, medical diagnostics, and automotive guidance and collision avoidance systems.⁶⁷ ⁶⁸ ⁶⁹ ⁷⁰ ⁷¹ ⁷²