What is Remote Sensing? The Definitive Guide

What is Remote Sensing?
Remote sensing is the science of measuring the physical properties of an area without being there. It allows users to capture, visualize, and analyze objects and features on the Earth's surface. By collecting imagery, we can classify it into land cover and run other types of analysis.
Chapter 1. Sensor Types
Remote sensing uses a sensor mounted on a platform to capture an image. For example, airplanes, satellites, and UAVs are the specialized platforms that carry sensors.

The diagram below shows the major remote sensing technologies and their typical altitudes.
TYPES OF SENSORS
Each type of sensor has its own advantages and disadvantages. When you want to capture imagery, you have to consider factors like flight restrictions, image resolution and coverage.
For example, satellites capture data on a global scale. But drones are a better fit for flying in small areas. Finally, airplanes and helicopters take the middle ground.

IMAGE RESOLUTION
For earth observation, you also have to consider image resolution. Remote sensing divides image resolution into three different types:
- Spatial resolution
- Spectral resolution
- Temporal resolution
SPATIAL RESOLUTION
Spatial resolution is the amount of detail each pixel of an image represents. High spatial resolution means more detail and a smaller pixel size, whereas lower spatial resolution means less detail and a larger pixel size.
Typically, UAV imagery has one of the highest spatial resolutions. Even though satellites fly at the highest altitudes, some are still capable of a pixel size of 50 cm or finer.
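To make the trade-off concrete, here is a minimal sketch relating pixel size to ground coverage. The scene dimensions below are illustrative round numbers, not specs of any particular sensor:

```python
def footprint_km(pixel_size_m: float, width_px: int, height_px: int) -> tuple:
    """Ground footprint (km) of an image given its pixel size and dimensions."""
    return (width_px * pixel_size_m / 1000.0, height_px * pixel_size_m / 1000.0)

# A 30 m satellite-style scene of 6000 x 6000 pixels covers ~180 x 180 km,
# while a 5 cm UAV survey with the same pixel count covers only ~300 x 300 m.
print(footprint_km(30.0, 6000, 6000))   # (180.0, 180.0)
print(footprint_km(0.05, 6000, 6000))   # roughly 0.3 x 0.3 km
```

This is the core trade-off: for a fixed sensor size, finer pixels mean a much smaller area captured per image.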
READ MORE: Maxar Satellite Imagery: Worldview, GeoEye and IKONOS
SPECTRAL RESOLUTION
Spectral resolution is the amount of spectral detail in a band. High spectral resolution means narrower bands, whereas low spectral resolution means broader bands covering more of the spectrum.

TEMPORAL RESOLUTION

Temporal resolution is the time it takes for a satellite to revisit the same area, which depends on its orbit. UAVs, airplanes, and helicopters can fly on demand. But satellites orbit the Earth in set paths.
Global Positioning System (GPS) satellites are in medium Earth orbit (MEO). Because they follow continuous orbital paths, revisit times are consistent. This means a GPS receiver can almost always lock onto four or more satellites for an accurate position fix.
READ MORE: Trilateration vs Triangulation - How GPS Receivers Work
TYPES OF ORBITS
The three types of orbits are:
- Geostationary orbits match the Earth's rate of rotation.
- Sun-synchronous orbits keep the angle of sunlight on the surface of the Earth as consistent as possible.
- Polar orbits pass above or nearly above both poles of Earth.

It's the satellite's height above the Earth's surface that determines the time it takes for a complete orbit. If a satellite has a higher altitude, the orbital period increases.
We categorize orbits by their altitude:
- Low Earth Orbit (LEO)
- Medium Earth Orbit (MEO)
- High Earth Orbit (HEO)
We often find weather, communications, and surveillance satellites in high Earth orbit. But CubeSats, the ISS, and other satellites are often in low Earth orbit.
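The altitude-period relationship can be sketched with Kepler's third law for a circular orbit. The altitudes below are nominal examples for each orbit class, not exact figures for any one satellite:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's standard gravitational parameter (m^3/s^2)
R_EARTH = 6_371_000.0      # mean Earth radius (m)

def orbital_period_minutes(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km * 1000.0        # semi-major axis (m)
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60.0

print(round(orbital_period_minutes(400), 1))    # ISS-like LEO: ~92 min
print(round(orbital_period_minutes(20200), 1))  # GPS-like MEO: ~12 hours
print(round(orbital_period_minutes(35786), 1))  # geostationary: ~24 hours
```

Notice how a geostationary altitude yields a period of roughly one sidereal day, which is exactly why those satellites appear fixed over one spot on Earth.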
Chapter 2. Types of Remote Sensing
The two types of remote sensing sensors are:
- Passive sensors
- Active sensors
ACTIVE SENSORS
The main difference with active sensors is that this type of sensor illuminates its target with its own energy source. Then, active sensors measure the reflected signal. For example, Radarsat-2 is an active sensor that uses synthetic aperture radar.

Imagine the flash of a camera. It brightens its target. Next, it captures the return light. This is the same principle of how active sensors work.
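As a toy sketch of the timing principle behind ranging sensors like radar, the distance to a target falls out of the round-trip delay of the pulse. The delay value below is illustrative:

```python
C = 299_792_458.0  # speed of light (m/s)

def slant_range_m(echo_delay_s: float) -> float:
    """Distance to a radar target from the round-trip echo delay.

    The pulse travels out and back, so divide the total path by two.
    """
    return C * echo_delay_s / 2.0

# A pulse returning after ~5.3 milliseconds travelled to a target roughly
# 794 km away -- on the order of a LEO SAR satellite's altitude.
print(slant_range_m(5.3e-3) / 1000)
```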
PASSIVE SENSORS
Passive sensors measure energy that originates from the sun. When sunlight reflects off the Earth's surface, passive sensors capture that reflected light.
For example, Landsat and Sentinel are passive sensors. They capture images by sensing reflected sunlight in the electromagnetic spectrum.

Passive remote sensing measures reflected energy emitted from the sun, whereas active remote sensing illuminates its target and measures its backscatter.
Chapter 3. The Electromagnetic Spectrum
The electromagnetic spectrum ranges from short wavelengths (like X-rays) to long wavelengths (like radio waves).
Our eyes only see the visible range (red, green, and blue). But other types of sensors can see beyond human vision. Ultimately, this is why remote sensing is so powerful.

ELECTROMAGNETIC SPECTRUM

Our eyes are sensitive to the visible spectrum (390-700 nm). But engineers design sensors to capture beyond these wavelengths in the atmospheric window.
For example, near-infrared (NIR) is in the 700-1400 nm range. Vegetation appears green to our eyes because it reflects more green light than red or blue.
But healthy vegetation reflects even more strongly in the near-infrared. That's why we use indexes like NDVI to classify vegetation.
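The idea behind NDVI is a simple band ratio: (NIR - Red) / (NIR + Red). Here is a minimal NumPy sketch with made-up reflectance values for illustration:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # small epsilon avoids divide-by-zero

# Toy reflectance values: vegetation reflects strongly in NIR, water does not.
nir = np.array([0.50, 0.05])   # [vegetation, water]
red = np.array([0.08, 0.04])
print(ndvi(nir, red))          # vegetation ~0.72, water ~0.11
```

Values near +1 indicate dense, healthy vegetation; values near 0 or below suggest bare ground, water, or built-up surfaces.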
SPECTRAL BANDS
Spectral bands are groups of wavelengths. For example, ultraviolet, visible, near-infrared, thermal infrared, and microwave are spectral bands.
We categorize each spectral region based on its frequency (ν) or wavelength. There are two types of imagery for passive sensors:
- Multispectral imagery
- Hyperspectral imagery
The main difference between multispectral and hyperspectral is the number of bands and how narrow the bands are. Hyperspectral images have hundreds of narrow bands, whereas multispectral images consist of 3 to 10 wider bands.
MULTISPECTRAL
Multispectral imagery generally refers to 3 to 10 bands. For example, Landsat-8 captures 11 separate bands for each scene:

- Coastal aerosol (0.43-0.45 µm)
- Blue (0.45-0.51 µm)
- Green (0.53-0.59 µm)
- Red (0.64-0.67 µm)
- Near-infrared NIR (0.85-0.88 µm)
- Short-wave infrared SWIR 1 (1.57-1.65 µm)
- Short-wave infrared SWIR 2 (2.11-2.29 µm)
- Panchromatic (0.50-0.68 µm)
- Cirrus (1.36-1.38 µm)
- Thermal infrared TIRS 1 (10.60-11.19 µm)
- Thermal infrared TIRS 2 (11.50-12.51 µm)
HYPERSPECTRAL
Hyperspectral imagery has much narrower bands (10-20 nm). A hyperspectral image can have hundreds or even thousands of bands.

For example, Hyperion (part of the EO-1 satellite) produces 220 spectral bands (0.4-2.5 µm).
Chapter 4. Image Classification

When you examine a photo and try to pull out features and characteristics from it, you are performing image interpretation. We use image interpretation in forestry, military, and urban applications.
We can interpret features because all objects have their own unique chemical composition. In remote sensing, we distinguish these differences by obtaining their spectral signature.
SPECTRAL SIGNATURES
Consider the mining industry: there are over 4,000 natural minerals on Earth, and each has its own chemical composition that makes it different from the others.
It's the object's chemical composition that drives its spectral signature. You can classify each mineral because it has a unique spectral signature. The more spectral bands you have, the greater the potential for image classification.
A spectral signature is the amount of energy an object reflects at each wavelength. Differences in spectral signatures are how we tell objects apart.
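One simple way to act on this idea is minimum-distance matching: compare a pixel's band values against reference signatures and pick the closest. The signatures and class names below are hypothetical toy values, not a real spectral library:

```python
import numpy as np

# Hypothetical 4-band reflectance signatures (Blue, Green, Red, NIR)
signatures = {
    "water":      np.array([0.10, 0.08, 0.05, 0.02]),
    "vegetation": np.array([0.04, 0.10, 0.06, 0.50]),
    "bare soil":  np.array([0.15, 0.20, 0.25, 0.30]),
}

def classify_pixel(pixel: np.ndarray) -> str:
    """Assign the class whose reference signature is closest (Euclidean distance)."""
    return min(signatures, key=lambda c: np.linalg.norm(pixel - signatures[c]))

print(classify_pixel(np.array([0.05, 0.11, 0.07, 0.45])))  # vegetation
```

Real classifiers are more sophisticated, but the principle is the same: more bands give each material a more distinctive signature to match against.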
IMAGE CLASSIFICATION
When you assign classes to features on the ground, this is the process of image classification.

The three main methods to classify images are:
- Supervised classification
- Unsupervised classification
- Object-based image analysis
The goal of image classification is to produce land use/land cover maps. This is how we use remote sensing software to classify water, wetlands, trees, and urban areas in land cover.
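Unsupervised classification, for instance, groups pixels by spectral similarity without training data. Here is a minimal k-means sketch in plain NumPy on toy two-band pixels (production work would use dedicated tools, and these values are illustrative):

```python
import numpy as np

def kmeans(pixels: np.ndarray, k: int, iters: int = 20, seed: int = 0) -> np.ndarray:
    """Minimal k-means: cluster pixels (n_pixels x n_bands) into k spectral classes."""
    rng = np.random.default_rng(seed)
    # start from k randomly chosen pixels as initial cluster centers
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        # assign each pixel to its nearest center
        labels = np.argmin(np.linalg.norm(pixels[:, None] - centers, axis=2), axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels

# Two obvious spectral groups: dark (water-like) and bright (soil-like) pixels.
pixels = np.array([[0.05, 0.04], [0.06, 0.05], [0.40, 0.45], [0.42, 0.43]])
print(kmeans(pixels, k=2))  # first two pixels share one class, last two the other
```

The analyst then inspects each cluster and assigns it a land cover label, which is what distinguishes unsupervised from supervised classification.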
Chapter 5. Applications and Uses
There are hundreds of applications of remote sensing. From weather forecasting to GPS, it's satellites in space that monitor, protect, and guide us in our daily lives.
LOCAL ISSUES
Commonly, we use UAVs, helicopters, and airplanes for local issues. But satellites can be useful for local study areas as well.
Here are some of the common sensor technologies:
- Light Detection and Ranging (LiDAR)
- Sound navigation ranging (Sonar)
- Radiometers and spectrometers
We use Light Detection and Ranging (LiDAR) and Sonar. Both are ideal for building topographic models. But the main difference between the two is "where": while LiDAR is best suited for land, Sonar works better underwater.

By using these technologies, we build digital elevation models. With these topographic models, we can predict flood risk, locate archaeological sites, and delineate watersheds (to name a few).
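The gridding step behind a basic elevation model can be sketched as binning scattered points into cells and averaging. This is a deliberately simplified illustration (real LiDAR processing filters ground returns and interpolates gaps); the coordinates below are made up:

```python
import numpy as np

def points_to_dem(x, y, z, cell: float, nx: int, ny: int) -> np.ndarray:
    """Grid scattered elevation points into a simple DEM by averaging per cell."""
    dem = np.full((ny, nx), np.nan)             # NaN marks cells with no returns
    cols = (np.asarray(x) / cell).astype(int)   # which column each point lands in
    rows = (np.asarray(y) / cell).astype(int)   # which row each point lands in
    z = np.asarray(z, dtype=float)
    for r in range(ny):
        for c in range(nx):
            hits = (rows == r) & (cols == c)
            if hits.any():
                dem[r, c] = z[hits].mean()      # average elevation in this cell
    return dem

# Four points falling into two cells of a 1 x 2 grid with 10 m cells.
x = [2.0, 7.0, 12.0, 18.0]
y = [5.0, 5.0, 5.0, 5.0]
z = [100.0, 102.0, 110.0, 112.0]
print(points_to_dem(x, y, z, cell=10.0, nx=2, ny=1))  # [[101. 111.]]
```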
GLOBAL ISSUES
As the world becomes more globalized, we are just starting to see the proliferation of remote sensing. For example, satellites tackle issues including:
- Navigating with global positioning systems
- Climate change monitoring
- Arctic surveillance
Satellite information is fundamentally important if we are going to solve some of the major challenges of our time. All things considered, it's an expanding field reaching new heights.

For issues like climate change, natural resources, disaster management, and the environment, remote sensing provides a wealth of information on a global scale.