“A satellite has no conscience.” — Edward R. Murrow
This sentence may be true insofar as a satellite is a non-living entity, but I firmly believe it has the potential to awaken the consciousness of the human race. Since the first photograph of Earth from space was taken by a camera aboard a sub-orbital V-2 rocket launched by the US in 1946, satellite imagery has come a long way.
Over the years it has shown us how our actions affect our planet, whether by detecting holes in the ozone layer caused by the excessive chlorofluorocarbons emitted by our air conditioners and air fresheners, or by revealing the environmental changes caused by global warming.
Satellite imagery has also played a vital role in science and astrophysics. Astronomical imaging gave us the first-ever pictorial proof of a black hole, something so mysterious we thought we would never see it, and space-based observations have contributed to everything from estimating the age of the Sun to surveying the Kuiper Belt, the vast ring of icy bodies beyond Neptune’s orbit. Had it not been for satellite imagery, we would be living in constant unawareness of the state of our planet and the marvelous universe that surrounds it.
In this blog, we’ll walk through the domain of satellite imagery: we’ll start with a summary of its history, move on to the different technologies it uses, and finish with its future scope, where we’ll talk about the latest developments in satellite imaging.
History of Satellite Imagery
The very first images from space were of our planet, and they were taken on suborbital flights. As mentioned before, the U.S. launched a V-2 flight on October 24, 1946, that took one image every 1.5 seconds. With an apogee of 65 miles (105 km), the flight captured photos from a height five times higher than the previous record of 13.7 miles (22 km), set by the Explorer II balloon mission in 1935.
The first orbital photographs of Earth were taken on August 14, 1959, by the U.S. satellite Explorer 6. The first satellite photographs of the Moon were taken on October 6, 1959, by the Soviet satellite Luna 3, on a mission to photograph the far side of the Moon.
The Blue Marble photograph, taken from space in 1972, has since become very popular in the media and among the public. Also in 1972, the United States started the Landsat program, the largest program for the acquisition of imagery of Earth from space. The Landsat Data Continuity Mission (later renamed Landsat 8), the most recent Landsat satellite at the time, was launched on 11 February 2013. In 1977, the first real-time satellite imagery was acquired by the United States’s KH-11 satellite system.
These satellite images were made up of pixels. The first, crude image taken by Explorer 6 shows a sunlit area of the Central Pacific Ocean and its cloud cover. The photo was taken on August 14, 1959, when the satellite was about 17,000 mi (27,000 km) above the surface of the earth, crossing Mexico.
The Various Technologies used in Satellite Imagery
There are many types of satellite imagery, but for the sake of this blog we’ll mainly talk about:
- Visible Imagery
- Infrared Imagery
- Water Vapour Imagery
- Hyperspectral Imaging
- Images made by measuring energy
Visible Imagery
Visible satellite pictures can only be viewed during the day since clouds reflect the light from the sun. In these images, clouds show up as white, the ground is normally grey, and the water is dark. In winter, the snow-covered ground will be white, which can make distinguishing clouds more difficult. To help differentiate between clouds and snow, looping pictures can be helpful; clouds will move while the snow won’t. The snow-covered ground can also be identified by looking for terrain features, such as rivers or lakes. Rivers will remain dark in the imagery as long as they are not frozen. If the rivers are not visible, they are probably covered with clouds. Visible imagery is also very useful for seeing thunderstorm clouds building. Satellites will see the developing thunderstorms in their earliest stages before they are detected on radar.
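To make the looping trick concrete, here is a minimal sketch in Python (with NumPy and invented reflectance values; the brightness and motion thresholds are illustrative assumptions, not operational values) that flags bright pixels which move between two frames as cloud, and bright pixels that stay put as snow:

```python
import numpy as np

def classify_bright_pixels(frame_a, frame_b, bright_thresh=0.6, motion_thresh=0.1):
    """Label bright pixels as 'cloud' if they change between two
    consecutive frames, and as 'snow' if they stay put.

    frame_a, frame_b: 2D arrays of visible reflectance in [0, 1],
    imagined as a few minutes apart.
    """
    bright = frame_a > bright_thresh                   # candidate cloud/snow pixels
    moved = np.abs(frame_b - frame_a) > motion_thresh  # reflectance changed => motion
    labels = np.full(frame_a.shape, "clear", dtype=object)
    labels[bright & moved] = "cloud"    # bright and moving
    labels[bright & ~moved] = "snow"    # bright and stationary
    return labels

# Toy 3x3 scene: a static snow field (top left) and one drifting cloud pixel.
t0 = np.array([[0.8, 0.8, 0.1],
               [0.8, 0.8, 0.1],
               [0.1, 0.7, 0.1]])
t1 = np.array([[0.8, 0.8, 0.1],
               [0.8, 0.8, 0.1],
               [0.1, 0.1, 0.7]])   # the 0.7 "cloud" pixel has moved east
print(classify_bright_pixels(t0, t1))
```

In practice forecasters loop many frames and judge motion by eye, but the same stationary-versus-moving idea carries over.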
Infrared Imagery
Infrared satellite pictures show clouds both day and night. Instead of using sunlight to reflect off of clouds, the clouds are identified by satellite sensors that measure the heat radiating off of them. The sensors also measure the heat radiating off the surface of the earth. Clouds will be colder than land and water, so they are easily identified. Infrared imagery is useful for determining thunderstorm intensity. Strong to severe thunderstorms will normally have very cold tops. Infrared imagery can also be used for identifying fog and low clouds. The fog product combines two different infrared channels to see fog and low clouds at night, which show up as dark areas on the imagery.
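As a rough sketch of both ideas (entirely invented brightness temperatures; the 220 K storm cutoff, the 2 K fog cutoff, and the roughly 3.9 and 10.3 micrometer channel pairing are illustrative assumptions, not an operational recipe):

```python
import numpy as np

# Invented brightness temperatures (kelvin) for two infrared channels,
# following the common shortwave (~3.9 um) / longwave (~10.3 um) pairing
# used for night-time fog detection.
bt_longwave = np.array([[290.0, 210.0, 285.0],
                        [288.0, 205.0, 284.0],
                        [287.0, 286.0, 283.0]])
bt_shortwave = np.array([[290.0, 209.0, 282.0],
                         [288.0, 204.0, 281.0],
                         [287.0, 286.0, 280.0]])

# Very cold cloud tops suggest strong convection; 220 K is an illustrative cutoff.
strong_storms = bt_longwave < 220.0

# Fog/low stratus at night: water droplets emit less at 3.9 um than at
# 10.3 um, so a clearly negative shortwave-minus-longwave difference is a hint.
fog_signal = (bt_shortwave - bt_longwave) < -2.0

print("possible severe convection:\n", strong_storms)
print("possible fog/low cloud:\n", fog_signal)
```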
Water Vapour Imagery
Water vapor imagery is created using a wavelength sensitive to the moisture content in the atmosphere. In this imagery, bright blue and white areas indicate the presence of high water vapor or moisture content, whereas dark orange and brown areas indicate little or no moisture present.
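As a small illustration of this color convention, the sketch below builds a brown-to-orange-to-blue-to-white colormap (the specific color stops are our own choice) and applies it to an invented, normalized moisture field:

```python
import numpy as np
from matplotlib.colors import LinearSegmentedColormap

# Illustrative colormap matching the description: dry air in browns and
# oranges, moist air in blues and whites.
wv_cmap = LinearSegmentedColormap.from_list(
    "water_vapor", ["saddlebrown", "darkorange", "steelblue", "white"])

# Invented, normalized moisture field: 0 = bone dry, 1 = saturated.
moisture = np.array([[0.05, 0.30, 0.90],
                     [0.10, 0.55, 0.95],
                     [0.20, 0.60, 0.85]])

rgba = wv_cmap(moisture)   # maps each pixel to an RGBA color
print(rgba.shape)          # (3, 3, 4): an image ready for display
```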
Along with these, there are different imaging techniques used to produce the types of satellite imagery above. We shall discuss them below:
Hyperspectral Imaging
Hyperspectral imaging, like other spectral imaging, collects and processes information from across the electromagnetic spectrum. The goal of hyperspectral imaging is to obtain the spectrum for each pixel in the image of a scene, in order to find objects, identify materials, or detect processes. There are three general branches of spectral imagers: push broom scanners and the related whisk broom scanners (spatial scanning), which read images over time; band sequential scanners (spectral scanning), which acquire images of an area at different wavelengths; and snapshot hyperspectral imaging, which uses a staring array to generate an image in an instant.
Whereas the human eye sees the color of visible light in mostly three bands (long wavelengths, perceived as red; medium wavelengths, perceived as green; and short wavelengths, perceived as blue), spectral imaging divides the spectrum into many more bands. This technique of dividing images into bands can be extended beyond the visible. In hyperspectral imaging, the recorded spectra have fine wavelength resolution and cover a wide range of wavelengths. Hyperspectral imaging measures continuous spectral bands, as opposed to multiband imaging, which measures spaced spectral bands.
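Because every pixel carries a full spectrum, hyperspectral data is naturally stored as a three-dimensional “cube”: two spatial axes plus one spectral axis. The sketch below uses invented numbers and the spectral angle, one common similarity measure, to compare each pixel’s spectrum against a stand-in reference spectrum, which is how material identification is often framed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented hyperspectral cube: 4 x 4 pixels, 100 contiguous narrow bands.
rows, cols, bands = 4, 4, 100
cube = rng.random((rows, cols, bands))

# Every pixel holds an entire spectrum, not just three color values.
pixel_spectrum = cube[2, 3, :]          # 100 reflectance samples
print(pixel_spectrum.shape)             # (100,)

def spectral_angle(spectrum, reference):
    """Angle between two spectra; small angles mean similar materials."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

reference = rng.random(bands)           # stand-in library spectrum
angles = np.apply_along_axis(spectral_angle, 2, cube, reference)
print(angles.shape)                     # (4, 4): one match score per pixel
```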
In astronomy, hyperspectral imaging is used to produce a spatially resolved spectral image. Since a spectrum is an important diagnostic, having a spectrum for each pixel allows more science cases to be addressed. In astronomy, this technique is commonly referred to as integral field spectroscopy; examples include FLAMES and SINFONI on the Very Large Telescope, as well as the Advanced CCD Imaging Spectrometer on the Chandra X-ray Observatory.
Measuring Energy
Each of the Advanced Baseline Imager’s (ABI) 16 channels measures the amount of reflected or emitted energy in a specific wavelength of light along the electromagnetic spectrum to obtain information about Earth’s atmosphere, land, or ocean. ABI’s spectral bands include two visible channels, four near-infrared channels, and ten infrared channels. ABI’s visible channels only see during the daytime, much like our own eyes, because they only capture sunlight reflected off the earth. In contrast, ABI’s infrared channels detect energy that is not visible to the human eye. The infrared channels collect energy that is emitted by objects such as the surface of the earth and clouds, so ABI can detect infrared energy day or night. Think of ABI as acting like night-vision goggles, which use image-enhancement technology to see all available light. Each ABI channel is useful for viewing a particular feature, such as cloud type, water vapor in the atmosphere, ozone, carbon dioxide, or areas of ice or snow.
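To keep that channel layout straight, here is a small sketch encoding the grouping described above; the band numbers and approximate center wavelengths follow the published ABI band list, and the day-only flag simply marks the reflective (visible and near-infrared) bands:

```python
# ABI's 16 channels, grouped as in the text: 2 visible, 4 near-infrared,
# and 10 infrared. Wavelengths are approximate band centers in micrometers.
ABI_BANDS = {
    "visible":       {1: 0.47, 2: 0.64},
    "near_infrared": {3: 0.86, 4: 1.37, 5: 1.61, 6: 2.24},
    "infrared":      {7: 3.90, 8: 6.19, 9: 6.93, 10: 7.34, 11: 8.44,
                      12: 9.61, 13: 10.33, 14: 11.19, 15: 12.27, 16: 13.27},
}

# Reflective bands only see reflected sunlight, so they work in daytime;
# emissive infrared bands sense heat, so they work day or night.
day_only = sorted(ABI_BANDS["visible"]) + sorted(ABI_BANDS["near_infrared"])
day_and_night = sorted(ABI_BANDS["infrared"])
print("day only:", day_only)            # [1, 2, 3, 4, 5, 6]
print("day and night:", day_and_night)  # [7, 8, ..., 16]
```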
Translating Satellite Data
Information from ABI is turned into radio waves and transmitted to antennas on the ground. The information is then sent to computers at a satellite data processing center. It is transmitted in binary code, a numeric language that uses only two digits, 0 and 1, arranged in eight-character strings that computer software can understand. A numeric value for the amount of light detected by each object is digitally recorded, and the computer then translates that information into imagery.
Satellite imagery, like pictures on television, is made up of tiny squares called pixels (short for picture elements). Each pixel in satellite imagery represents the relative reflected or emitted light energy recorded for that part of the image. The binary code information transmitted from the satellite assigns a value to each segment of the electromagnetic spectrum. These numbers can be converted to grayscale (each pixel in an image represents only an amount of light, and is composed exclusively of shades of gray) to produce a black and white image in pixel form.
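As a toy version of that pipeline (entirely invented values, assuming one eight-character binary string per pixel), the sketch below decodes a stream of binary strings into digital numbers and reshapes them into a small grayscale image:

```python
import numpy as np

# Simulated downlink: each eight-character binary string is one pixel's
# recorded light value (a "digital number" from 0 to 255).
bitstream = ["00000000", "01111111", "11111111",
             "11111111", "01111111", "00000000",
             "00000000", "11111111", "01111111"]

# Decode binary strings to integers, then arrange them into a 3x3 grid.
digital_numbers = np.array([int(bits, 2) for bits in bitstream],
                           dtype=np.uint8)
image = digital_numbers.reshape(3, 3)

# Each value is already a shade of gray: 0 = black, 255 = white.
print(image)
```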
Scientists can assign colors to the “bands” of ABI data based on what color of light they represent on the electromagnetic spectrum. To build a composite image from satellite data that makes sense to the human eye, we need to use colors from the visible portion of the electromagnetic spectrum — red, green, and blue. These three primary colors can be combined at different intensities to form all possible colors. The color red is assigned to the band that represents red light, and blue to the band of data that represents blue light. ABI does not have a true “green” band, so this information is simulated using a lookup table that was created using data from the Japan Meteorological Agency’s Advanced Himawari Imager, which is very similar to the ABI but contains a “green” channel.
Combining data from multiple ABI channels provides even more information. Certain combinations of data from different channels allow us to highlight features of interest. When combined in a particular “recipe,” a single image is created, with the colors combining to form all the possible colors perceivable by human vision. The result is a variety of red-green-blue or “RGB” composite imagery, which can highlight atmospheric and surface features that are difficult or more time-consuming to distinguish using single-channel images alone. RGBs provide critical information that gives forecasters situational awareness and helps them understand rapidly changing weather. Often, geographical details like country and state boundaries are added to imagery to help orient the viewer.
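Putting these pieces together, here is a hedged sketch of a true-color style composite. The red and blue bands are used directly, and since ABI has no native green band, the sketch substitutes a simple weighted blend of red, blue, and the near-infrared “veggie” band; the blend weights are a common rough approximation, not the official Himawari-derived lookup table described above:

```python
import numpy as np

def simple_true_color(red, blue, veggie_nir):
    """Stack bands into an RGB composite with a synthesized green channel.

    The weights below are an illustrative approximation for the missing
    green band, not the operational lookup-table product.
    """
    green = 0.45 * red + 0.45 * blue + 0.10 * veggie_nir
    rgb = np.stack([red, green, blue], axis=-1)  # shape (..., 3)
    return np.clip(rgb, 0.0, 1.0)                # keep reflectances displayable

# Invented 2x2 reflectance fields for ABI bands 2 (red), 1 (blue), 3 (NIR).
red  = np.array([[0.8, 0.2], [0.1, 0.6]])
blue = np.array([[0.7, 0.3], [0.2, 0.5]])
nir  = np.array([[0.4, 0.9], [0.1, 0.3]])
print(simple_true_color(red, blue, nir))
```

The NIR term brightens vegetated pixels, which is why this blend reads as a plausible green to the eye even though no green band was measured.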
Future of Satellite Imaging
With the increase in the number of satellites across various domains, so much so that even students can now send satellites with basic imaging payloads to space, it is safe to say that satellite imaging as a payload is headed in the right direction. Another piece of news that has caught the entire world’s attention is the launch of the much-anticipated James Webb Space Telescope, which will set a new standard for infrared satellite imagery: it is estimated that the telescope will be able to see stars from a period near the occurrence of the Big Bang, a fascinating prospect in itself.
References
https://en.wikipedia.org/wiki/Satellite_imagery#History
https://www.weather.gov/mrx/sattype
https://en.wikipedia.org/wiki/Hyperspectral_imaging#Astronomy