Hyperspectral imaging technology has enabled many advances in precision agriculture, which requires more than just basic RGB information.
The Food and Agriculture Organization of the United Nations predicts that the world population will grow by more than 30% by 2050, reaching some 9 billion people, all of whom will need to be fed. To meet that demand, farmers must adopt new precision-farming processes that enable them to produce more crops more efficiently. The variability of the environment and its interpretation require tools that can manage all biophysical and productive factors in their complexity.
Remote sensing techniques have evolved rapidly thanks to technological progress and the spread of multispectral cameras. Hyperspectral imaging is the capture and processing of an image at a very large number of wavelengths. While multispectral imaging evaluates a scene with three or four bands (red, green, blue, and near-infrared), hyperspectral imaging splits the image into tens or hundreds of bands. By applying the technique of spectroscopy, which identifies materials based on how light behaves when it strikes a subject, hyperspectral imaging obtains a full spectrum of data for each pixel in the image of a scene.
Unlike radiography, hyperspectral imaging is a non-destructive, non-contact technology that can be used without damaging the object being analyzed. For example, a drone with a hyperspectral camera can detect plant diseases, weeds, soil erosion problems, and can also estimate crop yields.
Remote sensing has driven manufacturers to market hyperspectral cameras for applications ranging from precision farming to water quality control. The underlying phenomenon is spectroscopy, a method of investigation that measures and reads the spectra resulting from the interaction of electromagnetic radiation with matter. It is related to the absorption, emission, or scattering of electromagnetic radiation by atoms or molecules. In practice, the fundamental principle consists of directing a beam of electromagnetic radiation at the desired sample and observing how it responds to that stimulus.
The electromagnetic spectrum can be divided into seven regions according to wavelength, and different instruments are designed for different regions. When light hits a surface, it is reflected, absorbed, or transmitted to different degrees depending on the characteristics of the medium. The incident energy can thus be divided into three fractions: 𝛼, absorbance, the energy absorbed by the body per unit of incident energy; 𝜌, reflectance, the energy reflected by the body per unit of incident energy; and 𝜏, transmittance, the energy passing through the body per unit of incident energy (figure 1).
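Because these three fractions account for all of the incident energy, they obey a simple conservation balance:

```latex
\alpha + \rho + \tau = 1
```

For an opaque body, 𝜏 = 0, so the reflectance seen by a remote sensor reduces to 𝜌 = 1 − 𝛼.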
Unreflected light can be absorbed by the medium, in which case it is dissipated as heat, or refracted. Refraction originates from the fact that in any medium other than a vacuum, electromagnetic waves travel at a speed lower than c (the speed of light in vacuum), and that speed changes from one medium to another (figures 1 and 2).
Each body emits energy in proportion to its surface temperature. Wien's displacement law locates the peak of this emission within the electromagnetic spectrum: 𝜆max · Tsup ≈ 2,898 𝜇m·K.
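Wien's law is straightforward to apply numerically. A minimal sketch (the function name is illustrative):

```python
# Wien's displacement law: lambda_max * T = b, with b ≈ 2898 µm·K.
WIEN_B_UM_K = 2898.0  # displacement constant, in µm·K

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength of peak thermal emission, in micrometres."""
    return WIEN_B_UM_K / temperature_k

# A surface at ~300 K (ambient temperature) peaks in the thermal infrared:
print(round(peak_wavelength_um(300.0), 2))   # 9.66 µm
# The Sun's photosphere (~5800 K) peaks in the visible:
print(round(peak_wavelength_um(5800.0), 2))  # 0.5 µm
```

This is why thermal cameras operate around 8 to 14 µm while cameras for reflected sunlight operate in the visible and near-infrared.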
Another factor to consider is radiance, which measures the amount of energy coming from a unit area into a unit solid angle. It is defined as L = P / (A · Ω · cos 𝜃), expressed in W·m⁻²·sr⁻¹.
Where P is the power [W], A is the radiating surface [m²], Ω is the solid angle [sr], and 𝜃 is the angle between the surface normal and the direction of observation [°].
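The definition can be sketched numerically. This is a minimal illustration of the formula above, not tied to any particular instrument:

```python
import math

def radiance(power_w: float, area_m2: float,
             solid_angle_sr: float, theta_deg: float) -> float:
    """L = P / (A * Omega * cos(theta)), in W·m^-2·sr^-1."""
    return power_w / (area_m2 * solid_angle_sr
                      * math.cos(math.radians(theta_deg)))

# 1 W emitted from 1 cm^2 into 0.1 sr, viewed along the normal:
print(radiance(1.0, 1e-4, 0.1, 0.0))  # 100000.0 W·m^-2·sr^-1
```

Note that the cos 𝜃 term makes the measured value depend on viewing geometry, which is exactly why the observation angle matters for calibration.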
These parameters make it possible to define the view and, thus, the measurement. The angle of observation and incidence influence the reflection of the energy and thus its measurement.
The sensor, whether multispectral or hyperspectral, does not directly measure radiance; it measures voltages, which the instrument's electronics convert into digital image values. The process of converting these values back into radiances in the different spectral bands is called "radiance calibration" or "radiometric calibration". The optical sensors used in remote sensing support both spatial and temporal analyses. A single acquisition usually consists of an image representing the distribution of radiance in space for a specific band.
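A common linear model for radiometric calibration applies a per-band gain and offset to the raw digital numbers (DNs). The sketch below uses made-up coefficients, not values from any real instrument:

```python
import numpy as np

def calibrate(dn_cube: np.ndarray, gain: np.ndarray,
              offset: np.ndarray) -> np.ndarray:
    """Convert a (rows, cols, bands) cube of raw DNs to radiance,
    using the linear model L = gain * DN + offset per band."""
    return dn_cube * gain + offset  # NumPy broadcasting applies per-band coefficients

dn = np.full((4, 4, 3), 1000.0)               # raw 12-bit ADC counts (illustrative)
gain = np.array([0.01, 0.02, 0.015])          # hypothetical W·m^-2·sr^-1 per DN
offset = np.array([0.5, 0.3, 0.4])            # hypothetical dark offsets
L = calibrate(dn, gain, offset)
print(L.shape)      # (4, 4, 3)
print(L[0, 0, 0])   # 10.5
```

In practice, the gain and offset for each band come from laboratory characterization of the instrument against a known radiance source.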
The main unit of a hyperspectral camera is the optical sensor, which acquires a series of images across the electromagnetic spectrum following the physical processes of reflection and refraction described in the previous paragraph. The images are processed by a digital signal processor to extract the required information. The resulting spectral signature acts as a recognition key for physical, chemical, or biological processes; an accurate interpretation of the signature and its variations over time leads to the identification of the biochemical elements under study.
Data sets of hyperspectral origin generally consist of more than 100 spectral bands of relatively narrow bandwidths (5-10 nm), while multispectral data sets usually consist of about 5-10 bands of relatively large bandwidths. The spectral regions commonly used are those of the visible, Near Infrared, and Short-Wave Infrared.
Hyperspectral sensors are passive elements; they collect information as a set of images representing different bands of the electromagnetic spectrum. These images are combined to form a cube of hyperspectral data, which can be processed and analyzed to read the spectral data for a wide range of applications.
A hyperspectral cube consists of a set of images layered one above the other. Each image represents a particular wavelength band, and in a hyperspectral image, each pixel consists of a spectrum over an appropriate spectral region. Each object has a unique characteristic between the different wavelength bands. This unique characteristic is its “spectral signature” (figure 3).
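The cube structure and the per-pixel spectral signature can be illustrated with a small NumPy sketch. The "vegetation-like" signature below is a simplified assumption for illustration, not measured data:

```python
import numpy as np

# A hyperspectral cube: two spatial axes plus one spectral axis.
rows, cols, bands = 100, 100, 120
wavelengths = np.linspace(400, 1000, bands)   # nm, visible through NIR
cube = np.zeros((rows, cols, bands))

# Give one pixel a toy vegetation-like signature: low reflectance in the
# visible, then a step up (the "red edge") around 700 nm into the NIR.
cube[50, 50, :] = np.where(wavelengths < 700, 0.05, 0.45)

# The spectral signature of a pixel is simply its vector along the band axis:
signature = cube[50, 50, :]
print(signature.shape)               # (120,)
print(signature[0], signature[-1])   # 0.05 0.45
```

Classification algorithms then compare each pixel's signature against reference signatures to identify the material at that location.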
Most hyperspectral imaging systems consist of an imaging lens, a narrow slit, a diffraction grating, and a two-dimensional focal plane array (FPA) detector (usually CCD or CMOS).
The image is projected through the slit onto the diffraction grating, where the light is dispersed into its wavelengths before being projected onto the focal plane array. At each X-Y coordinate, a pixel is illuminated at a level that depends on the intensity of the light at that position and wavelength; scanning the scene line by line through the narrow slit builds up a three-dimensional data matrix.
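This line-scanning ("pushbroom") acquisition can be sketched as stacking per-line frames into a cube. The `read_line` function below is a stand-in for one FPA readout, producing placeholder data:

```python
import numpy as np

cols, bands, n_lines = 64, 32, 50  # across-track pixels, spectral bands, scan lines

def read_line(i: int) -> np.ndarray:
    """Stand-in for one FPA frame: one spatial line dispersed across all
    bands, shape (cols, bands). Illustrative random data only."""
    rng = np.random.default_rng(i)
    return rng.random((cols, bands))

# Platform motion (drone, satellite) sweeps the slit across the scene;
# stacking the frames yields the hyperspectral cube.
cube = np.stack([read_line(i) for i in range(n_lines)], axis=0)
print(cube.shape)  # (50, 64, 32) -> (along-track, across-track, spectral)
```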
In its simplest form, CCD sensor technology is the full-frame CCD, in which the entire light-sensitive area is exposed. The data is read out by shifting the pixel matrix one line at a time into the readout registers, which in turn transfer one pixel at a time to the converter.
The other sensor type, the CMOS or Active Pixel Sensor (APS), contains most of the necessary functionality in each individual pixel. The light arrives through the lens and passes through a color filter before reaching the pixel matrix. Once the filtered light reaches the matrix, each individual pixel converts it into an amplified voltage, which is then processed by the rest of the sensor. The main parts of a CMOS sensor are the color filter, the pixel matrix, the digital controller, and the analog-to-digital converter.
Although CCDs and APSs are both digital image sensors that exploit the photoelectric effect, they differ in their data processing and construction techniques. The differences are already noticeable in the architecture of the two sensors.
The main advantage of hyperspectral instruments is undoubtedly the great accuracy that can be achieved in reconstructing the spectral signature, continuous across the electromagnetic spectrum, from measurements that are discrete in the spectrum. The main disadvantage lies in the large volume of data to be processed, although this has improved noticeably over time thanks to the advent of embedded microelectronics (figures 4 and 5).
By combining the spectral information provided by spectroscopy with the spatial information provided by imaging, hyperspectral imaging (HSI) offers improved knowledge of the composition and distribution of components in a product (table 1).
The realization of hyperspectral sensors involves different materials for different fields of application. Silicon is used for acquisition in the ultraviolet, visible, and near-infrared (NIR) regions; indium arsenide (InAs) and gallium arsenide (GaAs) have a spectral response between 900 and 1,700 nm; indium gallium arsenide (InGaAs) extends that range to 2,600 nm; and mercury cadmium telluride (MCT or HgCdTe) offers a wide spectral range and high quantum efficiency, covering the mid-infrared region (about 2,500 to 25,000 nm) as well as the NIR region (from about 800 nm).
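Choosing a detector material by target wavelength can be expressed as a simple lookup. The ranges below are simplified from the paragraph above; real detector responses overlap and taper off gradually rather than cutting off sharply:

```python
# Approximate detector-material ranges in nm (simplified, illustrative).
RANGES = [
    ("Si",     400,  1000),   # visible and near-infrared
    ("InGaAs", 900,  2600),
    ("HgCdTe", 800,  25000),  # NIR through mid-infrared
]

def candidate_materials(wavelength_nm: float) -> list:
    """All materials whose (simplified) response covers the wavelength."""
    return [name for name, lo, hi in RANGES if lo <= wavelength_nm <= hi]

print(candidate_materials(550))    # ['Si']
print(candidate_materials(1500))   # ['InGaAs', 'HgCdTe']
```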
Hyperspectral for agriculture
Hyperspectral imaging can be used for a variety of applications, including mineralogy, agriculture, astronomy, and surveillance, often through UAV solutions. Precision agriculture in particular requires more than just basic RGB information: video images taken while flying over a field would not be able to accurately distinguish real plants from fake ones. By looking at the spectral content of each pixel, hyperspectral solutions can detect chlorophyll or very small color changes in foliage.
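One standard way to quantify that spectral content is the Normalized Difference Vegetation Index (NDVI), computed from a red band and a near-infrared band. The reflectance values below are illustrative:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero

# Healthy vegetation reflects strongly in the NIR and weakly in the red,
# so its NDVI approaches 1; soil or artificial material scores much lower.
leaf = float(ndvi(np.array([0.45]), np.array([0.05]))[0])
soil = float(ndvi(np.array([0.25]), np.array([0.20]))[0])
print(round(leaf, 2))  # 0.8
print(round(soil, 2))  # 0.11
```

A hyperspectral sensor goes beyond such two-band indices, since its full per-pixel spectrum can separate stresses (disease, nutrient deficiency, water shortage) that produce similar NDVI values.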
An example is the inspection of orange groves for citrus blight, a disease that destroys the vitality of trees and can spread throughout the grove. One of its first signs is a by-product secreted on the surface of the leaves. Mounted on a UAV flying 300 to 400 meters above the crops, a hyperspectral imager can detect this by-product quickly over a large area.
In precision farming, a distinction is made between ground sensing technologies, also known as proximal sensing, and remote sensing technologies. The former collect data on crop and process conditions through fixed or mobile ground sensors, while the latter use satellite- or drone-mounted optical sensors to generate multispectral and hyperspectral images of crops.
The use of drones reduces the time needed to acquire images and data, and allows great flexibility of intervention together with greater spatial resolution. Drones also optimize the use of pesticides, fertilizer, and seeds, with estimated water savings of up to 25% compared with more traditional technologies (figure 6).
Gamaya is an analysis platform that combines hyperspectral imaging data with corresponding historical weather and climate records to provide farmers with pest and disease warnings, yield forecasts, and input application rate requirements.
FluroSat uses various sensing methods to capture and analyze hyperspectral images of cotton and wheat fields to predict disease and help farmers make crop health decisions. Multi-spectral cameras can measure whether a plant is healthy or not, but hyperspectral images can go further and diagnose the exact reason for that state. In addition, the FluroSat platform displays nitrogen maps and suggests to agronomists the exact locations where to take tissue samples.
ImpactVision uses hyperspectral imaging to help companies in the food chain to determine the quality and maturity of food products. In meat, ImpactVision can determine tenderness, allowing meat producers to guarantee the quality of their meat for a premium price. Another example is avocado ripeness, which ImpactVision can measure through images.
Hyperspectral analysis has the potential to develop applications in any discipline. Data analysis is delicate and must be treated in a careful and conscious way in order to avoid providing incorrect information. As far as the control and protection of the environment is concerned, hyperspectral remote sensing proves to be useful both in the prevention phase and in the subsequent phase of analysis of the consequences of any event. In the study of landslides, for example, during the prevention phase it is possible to control the humidity and soil deformation, the arrangement of the lithological layers, the vegetation cover, and water losses. In the phase following the landslide event, the damage can be estimated by delimiting the area affected by the phenomenon.
This article was originally published on EE Times Europe.
Maurizio Di Paolo Emilio holds a Ph.D. in Physics and is a telecommunication engineer and journalist. He has worked on various international projects in the field of gravitational wave research. He collaborates with research institutions to design data acquisition and control systems for space applications. He is the author of several books published by Springer, as well as numerous scientific and technical publications on electronics design.