What is Remote Sensing? The Definitive Guide


Remote sensing is the science of obtaining information about the physical properties of an area without being physically present. It allows users to capture, visualize, and analyze objects and features on the Earth’s surface. By collecting imagery, we can classify it into land cover classes.


  • Sensor Types
  • Types of Remote Sensing
  • The Electromagnetic Spectrum
  • Image Classification
  • Applications and Uses

Chapter 1. Sensor Types

Remote sensing uses a sensor to capture an image. Platforms such as airplanes, satellites, and UAVs carry these specialized sensors.

The diagram below shows the major remote sensing technologies and their typical altitudes.

Remote Sensing Technologies


Each type of sensor has its own advantages and disadvantages. When you want to capture imagery, you have to consider factors like flight restrictions, image resolution and coverage.

For example, satellites capture data on a global scale. But drones are a better fit for flying in small areas. Finally, airplanes and helicopters take the middle ground.

Remote Sensing Sensor Types Advantages Disadvantages


For Earth observation, you also have to consider image resolution. Remote sensing divides image resolution into three types:

  • Spatial resolution
  • Spectral resolution
  • Temporal resolution

Spatial resolution is the level of pixel detail in an image. High spatial resolution means more detail and smaller pixel sizes, whereas lower spatial resolution means less detail and larger pixel sizes.

Typically, drones capture images with some of the highest spatial resolutions. Even though satellites sit far higher in the atmosphere, some are capable of pixel sizes as fine as 50 cm.

READ MORE: DigitalGlobe Satellite Imagery: Worldview, GeoEye and IKONOS

Spatial Resolution Comparison
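To put these numbers in perspective, here is a quick sketch of how pixel size translates into ground detail. The 5 cm drone pixel is an illustrative assumption; 50 cm is the satellite figure above, and 30 m is Landsat's multispectral pixel size.

```python
def pixels_per_hectare(pixel_size_m):
    """Number of pixels covering one hectare (a 100 m x 100 m square)."""
    pixels_per_side = 100.0 / pixel_size_m
    return pixels_per_side ** 2

# 5 cm (drone, assumed), 50 cm (high-res satellite), 30 m (Landsat)
for platform, size in [("Drone", 0.05), ("Satellite", 0.50), ("Landsat", 30.0)]:
    print(f"{platform}: {pixels_per_hectare(size):,.0f} pixels per hectare")
```

The same hectare that a Landsat pixel barely resolves contains millions of drone pixels, which is why drones dominate fine-detail mapping of small areas.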


Spectral resolution is the amount of spectral detail in a band. High spectral resolution means the bands are narrower, whereas low spectral resolution has broader bands covering more of the spectrum.

Spectral Resolution


Temporal resolution is the time it takes for a satellite to revisit the same area of the Earth. UAVs, airplanes, and helicopters are completely flexible. But satellites orbit the Earth in set paths.

Global Positioning System (GPS) satellites are in medium Earth orbit (MEO). Because they follow continuous orbital paths, revisit times are consistent. This means a GPS receiver can almost always lock onto four or more satellites for an accurate position fix.

READ MORE: Trilateration vs Triangulation – How GPS Receivers Work


The three types of orbits are:

  • Geostationary orbits match the Earth’s rate of rotation.
  • Sun synchronous orbits keep the angle of sunlight on the surface of the Earth as consistent as possible.
  • Polar orbits pass above or nearly above both poles of the Earth.

It’s the satellite’s height above the Earth’s surface that determines the time it takes to complete an orbit. If a satellite has a higher altitude, its orbital period increases.
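This altitude-period relationship follows from Kepler's third law. As a rough sketch, assuming circular orbits and standard values for Earth's gravitational parameter and mean radius:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0      # mean Earth radius, m

def orbital_period_hours(altitude_km):
    """Orbital period from Kepler's third law: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_km * 1000.0  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600.0

print(f"ISS (~400 km, LEO):    {orbital_period_hours(400):.1f} h")
print(f"GPS (~20200 km, MEO):  {orbital_period_hours(20200):.1f} h")
print(f"GEO (35786 km):        {orbital_period_hours(35786):.1f} h")
```

Running this shows the ISS circling in about an hour and a half, GPS satellites in about 12 hours, and geostationary satellites in just under 24 hours, matching the Earth's rotation.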

We categorize orbits by their altitude:

  • Low Earth Orbit (LEO)
  • Medium Earth Orbit (MEO)
  • High Earth Orbit (HEO)

We often find weather, communications, and surveillance satellites in high Earth orbit. But CubeSats, the ISS, and other satellites are often in low Earth orbit.

Satellite Orbits Altitude

Chapter 2. Types of Remote Sensing

The two types of remote sensing sensors are:

  • Passive sensors
  • Active sensors

The main difference between active and passive sensors is that active sensors illuminate their target with their own energy source. Then, they measure the reflected signal. For example, Radarsat-2 is an active sensor that uses synthetic aperture radar.

Imagine the flash of a camera. It brightens its target. Next, it captures the return light. This is the same principle of how active sensors work.

Active Remote Sensing


Passive sensors measure energy that originates from the sun. When sunlight reflects off Earth’s surface, passive sensors capture that light.

For example, Landsat and Sentinel are passive sensors. They capture images by sensing reflected sunlight in the electromagnetic spectrum.

Passive Remote Sensing

Passive remote sensing measures reflected sunlight, whereas active remote sensing illuminates its target and measures the backscatter.

Chapter 3. The Electromagnetic Spectrum

The electromagnetic spectrum ranges from short wavelengths (like X-rays) to long wavelengths (like radio waves).

Our eyes only see the visible range (red, green, and blue). But other types of sensors can see beyond human vision. Ultimately, this is why remote sensing is so powerful.

Electromagnetic Spectrum


Our eyes are sensitive to the visible spectrum (390-700 nm). But engineers design sensors to capture wavelengths beyond this range, within atmospheric windows.
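Wavelength and frequency are two sides of the same coin, related by the speed of light (c = λν). A quick sketch converting some of the wavelengths above into frequencies:

```python
# Frequency from wavelength: nu = c / lambda
C = 299_792_458.0  # speed of light in a vacuum, m/s

def frequency_thz(wavelength_nm):
    """Frequency in terahertz for a given wavelength in nanometers."""
    return C / (wavelength_nm * 1e-9) / 1e12

print(f"Blue (~450 nm):  {frequency_thz(450):.0f} THz")
print(f"Red  (~700 nm):  {frequency_thz(700):.0f} THz")
print(f"NIR (~1000 nm):  {frequency_thz(1000):.0f} THz")
```

Shorter wavelengths mean higher frequencies, which is why X-rays sit at one end of the spectrum and radio waves at the other.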

For example, near-infrared (NIR) is in the 700-1400 nm range. Vegetation reflects green light, which is why it appears green to our eyes. But it reflects near-infrared even more strongly. That’s why we use indices like NDVI to classify vegetation.

NDVI example
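As a minimal sketch, NDVI (the Normalized Difference Vegetation Index) is computed as (NIR − Red) / (NIR + Red). The reflectance values below are made-up illustrations, not real measurements:

```python
import numpy as np

# Toy red and near-infrared reflectance values (0-1). In practice these
# come from satellite bands, e.g. a red band and an NIR band.
red = np.array([0.08, 0.10, 0.30])   # healthy veg, sparse veg, bare soil
nir = np.array([0.50, 0.30, 0.32])

# NDVI = (NIR - Red) / (NIR + Red); values range from -1 to +1.
ndvi = (nir - red) / (nir + red)
print(ndvi)  # higher values indicate denser, healthier vegetation
```

Because healthy vegetation reflects NIR strongly and absorbs red, its NDVI is high; bare soil and water fall near or below zero.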

Spectral bands are groups of wavelengths. For example, ultraviolet, visible, near-infrared, thermal infrared, and microwave are spectral bands.

We categorize each spectral region based on its frequency (ν) or wavelength. There are two types of imagery for passive sensors:

  • Multispectral imagery
  • Hyperspectral imagery

The main difference between multispectral and hyperspectral imagery is the number of bands and how narrow the bands are. Hyperspectral images have hundreds of narrow bands, whereas multispectral images consist of 3-10 wider bands.


Multispectral imagery generally refers to 3 to 10 bands. For example, Landsat 8 captures 11 separate bands for each scene:

  • Coastal aerosol (0.43-0.45 um)
  • Blue (0.45-0.51 um)
  • Green (0.53-0.59 um)
  • Red (0.64-0.67 um)
  • Near infrared NIR (0.85-0.88 um)
  • Short-wave infrared SWIR 1 (1.57-1.65 um)
  • Short-wave infrared SWIR 2 (2.11-2.29 um)
  • Panchromatic (0.50-0.68 um)
  • Cirrus (1.36-1.38 um)
  • Thermal infrared TIRS 1 (10.60-11.19 um)
  • Thermal infrared TIRS 2 (11.50-12.51 um)

An example of multispectral imagery
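One way to see the "wider bands" point is to compute each band's width from the ranges listed above:

```python
# Band ranges (in micrometers) from the Landsat 8 list above.
landsat8_bands = {
    "Coastal aerosol": (0.43, 0.45),
    "Blue": (0.45, 0.51),
    "Green": (0.53, 0.59),
    "Red": (0.64, 0.67),
    "NIR": (0.85, 0.88),
    "SWIR 1": (1.57, 1.65),
    "SWIR 2": (2.11, 2.29),
    "Panchromatic": (0.50, 0.68),
    "Cirrus": (1.36, 1.38),
    "TIRS 1": (10.60, 11.19),
    "TIRS 2": (11.50, 12.51),
}

# Band width in nanometers (1 um = 1000 nm). Compare these with the
# 10-20 nm widths typical of hyperspectral bands.
for name, (lo, hi) in landsat8_bands.items():
    print(f"{name}: {(hi - lo) * 1000:.0f} nm wide")
```

Most of these bands are tens to hundreds of nanometers wide, an order of magnitude broader than typical hyperspectral bands.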


Hyperspectral imagery has much narrower bands (10-20 nm). A hyperspectral image can have hundreds of these narrow bands.

For example, Hyperion (part of the EO-1 satellite) produces 220 spectral bands (0.4-2.5 um).

Example of hyperspectral imagery

Chapter 4. Image Classification

When you examine a photo and try to pull out features and characteristics from it, you are performing image interpretation. We use image interpretation in forestry, the military, and urban environments.

We can interpret features because all objects have their own unique chemical composition. In remote sensing, we distinguish these differences by obtaining their spectral signature.

Object Based Image Classification Layers


Take the mining industry: there are over 4,000 natural minerals on Earth, and each has its own chemical composition that makes it different from the others.

It’s an object’s chemical composition that drives its spectral signature. You can classify each mineral because it has its own unique spectral signature. More spectral bands give greater potential in image classification.

A spectral signature is the amount of energy reflected in a particular wavelength. Differences in spectral signatures are how we tell objects apart.

When you assign classes to features on the ground, this is the process of image classification.

The three main methods to classify images are:

  • Unsupervised classification
  • Supervised classification
  • Object-based image analysis

The goal of image classification is to produce land use/land cover maps. By using remote sensing software, this is how we classify water, wetlands, trees, and urban areas in land cover.
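As a toy sketch of the idea, a supervised minimum-distance classifier assigns each pixel to the class whose mean spectral signature is closest. The signatures below are made-up illustrative values, not real measurements:

```python
import numpy as np

# Illustrative (made-up) mean spectral signatures per class, given as
# reflectance in four bands: blue, green, red, NIR.
signatures = {
    "water":      np.array([0.10, 0.08, 0.05, 0.02]),
    "vegetation": np.array([0.05, 0.10, 0.06, 0.50]),
    "urban":      np.array([0.20, 0.22, 0.24, 0.26]),
}

def classify(pixel):
    """Minimum-distance classifier: pick the class whose mean
    signature is closest (Euclidean distance) to the pixel."""
    return min(signatures, key=lambda c: np.linalg.norm(pixel - signatures[c]))

print(classify(np.array([0.06, 0.11, 0.07, 0.45])))  # -> vegetation
```

Real workflows train these signatures from labeled samples (supervised) or let clustering find them (unsupervised), but the core step of matching pixels to spectral signatures is the same.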

Image Classification Techniques Object-Based

Chapter 5. Applications and Uses

There are hundreds of applications of remote sensing. From weather forecasting to GPS, it’s satellites in space that monitor, protect, and guide us in our daily lives.


Commonly, we use UAVs, helicopters, and airplanes for local issues. But satellites can be useful for local study areas as well.

Here are some of the common sensor technologies:

  • Light Detection and Ranging (LiDAR)
  • Sound navigation ranging (Sonar)
  • Radiometers and spectrometers

Light Detection and Ranging (LiDAR) and sound navigation ranging (Sonar) are both ideal for building topographic models. But the main difference between the two is “where”: LiDAR is best suited for land, while Sonar works better underwater.

By using these technologies, we build digital elevation models. Using these topographic models, we can predict flood risk, find archaeological sites, and delineate watersheds (to name a few applications).
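Both LiDAR and Sonar measure distance the same way: time a pulse's round trip and multiply by the wave's speed. A minimal sketch (the travel times are illustrative):

```python
# range = wave_speed * travel_time / 2 (the pulse goes out and back)
C_LIGHT = 299_792_458.0   # speed of light (vacuum value, approx. for air), m/s
C_SOUND_WATER = 1_500.0   # typical speed of sound in seawater, m/s

def pulse_range_m(travel_time_s, wave_speed):
    """Distance to a target from a pulse's two-way travel time."""
    return wave_speed * travel_time_s / 2.0

# A LiDAR return after ~6.67 microseconds corresponds to about 1 km.
print(f"LiDAR: {pulse_range_m(6.67e-6, C_LIGHT):.0f} m")
# A Sonar echo after 0.2 s corresponds to about 150 m of water depth.
print(f"Sonar: {pulse_range_m(0.2, C_SOUND_WATER):.0f} m")
```

The huge difference in wave speeds is why LiDAR timing is measured in nanoseconds while Sonar echoes take a leisurely fraction of a second.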


As the world becomes more globalized, we are just starting to see the proliferation of remote sensing. For example, satellites tackle issues including:

  • Navigating with global positioning systems
  • Climate change monitoring
  • Arctic surveillance

Satellite information is fundamentally important if we are going to solve some of the major challenges of our time. All things considered, it’s an expanding field reaching new heights.

For issues like climate change, natural resources, disaster management, and the environment, remote sensing provides a wealth of information on a global scale.

Remote Sensing Trends


