What makes a radar image beautiful?

🌍📡 The Quality of a SAR Image: Don’t Get It Twisted!

Elise Colin

When it comes to assessing the quality of a SAR image, don’t mix everything up — resolution, signal-to-noise ratio, bandwidth, ambiguities… and don’t compare X-band with P-band. The list goes on, and the technical criteria are anything but simple.

I’ve often been asked what I thought about the quality of an image acquired by some new SAR sensor, usually by someone expecting a definitive judgment. At other times, I’ve heard blunt remarks like: “Your radar image looks ugly.”

At moments like these, I can’t help but feel like Christian awkwardly trying to woo Roxane in Cyrano de Bergerac. Roxane’s response? “Is that all? Come on, elaborate, it’s the theme, after all. No doubt, yes… but what else?”.

Because evaluating the quality of a radar image goes far beyond appearances. Beneath its apparent ugliness, there often lies a wealth of exploitable information. So what makes a radar image truly beautiful? Let’s dive into the technical criteria and nuances that define quality. 🚀

The classic criteria to consider

🔍 1. Spatial Resolution
Resolution defines the ability to distinguish two closely spaced objects. In SAR, it depends both on the bandwidth of the transmitted signal and the technological design of the system.

In radar terminology, we refer to frequency bandwidth, but in practice, two key parameters are distinguished:

  • Central frequency (f0): The average frequency of the transmitted signal.
  • Bandwidth (B): The range of frequencies covered by the signal, which determines the ability to separate two targets in range.

The simplified radar resolution formulas are as follows:

Range Resolution: Δrange = c / (2B)
where c is the speed of light (3×10⁸ m/s) and B is the signal bandwidth (in Hz).
This formula shows that distance resolution is inversely proportional to bandwidth: a wider band improves resolution.

Azimuth Resolution: Δazimuth ≈ λR / L
where λ is the wavelength of the radar signal (λ = c/f0), R is the range, and L is the effective length of the antenna synthesized by the motion of the sensor (the synthetic aperture).

This formula highlights that azimuth resolution improves with a longer synthetic aperture and a shorter wavelength, for a given range.

So beware: one might conclude that resolution along the range axis is independent of the center frequency. But:
- first, the bandwidth can never exceed twice the center frequency (negative frequencies don't exist), so low-frequency systems are inherently limited in achievable bandwidth;
- second, in real life, azimuth resolution is limited by the fact that targets are not perfectly isotropic (they do not reflect the same amount of signal in every direction from which they are illuminated), and this effect grows stronger as frequency increases.
Another warning: pixel size can be chosen differently from spatial resolution. You can always oversample an image (choose a much smaller pixel spacing), but its resolution will remain the same; an image can also be undersampled!
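As a quick sanity check, the two simplified formulas above can be evaluated in a few lines of Python. The parameters (X-band at 9.65 GHz, 300 MHz bandwidth, 700 km range, 5 km synthetic aperture) are illustrative assumptions, not any particular system's specification.

```python
C = 3e8  # speed of light (m/s)

def range_resolution(bandwidth_hz):
    """Slant-range resolution: c / (2B)."""
    return C / (2.0 * bandwidth_hz)

def azimuth_resolution(wavelength_m, slant_range_m, synthetic_aperture_m):
    """Azimuth resolution: lambda * R / L, with L the synthetic aperture length."""
    return wavelength_m * slant_range_m / synthetic_aperture_m

f0 = 9.65e9      # illustrative X-band center frequency (Hz)
lam = C / f0     # wavelength, about 3.1 cm

print(f"range resolution:   {range_resolution(300e6):.2f} m")        # 0.50 m
print(f"azimuth resolution: {azimuth_resolution(lam, 700e3, 5e3):.2f} m")
```

Note how doubling the bandwidth halves the range resolution, while the center frequency only enters through the azimuth term.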

📶 2. The Signal-to-Noise Ratio (SNR)
A good SNR ensures an image with less noise and, consequently, better visibility of details. The SNR is defined as the ratio between the power of the useful signal and that of the unwanted noise present in the image, and is usually expressed in decibels (dB).

A high SNR indicates that the signal amplitude is significantly greater than the noise, making it easier to detect and interpret structures in the image. Conversely, a low SNR makes it more challenging to extract complex information due to the overwhelming presence of noise.

It is possible to have sensors with very good spatial resolution but suffering from high noise levels, thus compromising the effective quality of the extracted information. This issue is particularly pronounced in satellite imaging, where noise levels are often higher than in airborne imaging. This is due to the greater distance between the sensor and the target, the energy constraints of onboard systems, and atmospheric interferences.

In SAR, the NESZ (Noise-Equivalent Sigma Zero) is the most commonly used metric to capture the effect of system noise on image quality. It can be analytically predicted during the radar design phase and empirically measured over “dark” targets in the SAR image.

These two images from the Capella sensor have the same resolution but two different NESZ values: -20 dB on the left, -10 dB on the right. It's not the resolution that differs here! (Source: https://vekom.com/wp-content/uploads/2020/12/Capella_Space_SAR_System_Performance.pdf)
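To make the noise-floor idea concrete, here is a toy Python sketch: a simulated intensity image with a known noise level, then an empirical estimate of that level from a "dark" patch, in the spirit of how NESZ is measured on real images. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy intensity image: a noise-only background plus one bright patch.
noise_power = 1e-2   # assumed system noise floor (linear units)
signal_power = 1.0
image = rng.exponential(noise_power, size=(100, 100))          # dark area
image[40:60, 40:60] += rng.exponential(signal_power, (20, 20)) # "target" area

# Empirical noise floor from a dark patch, as one would do for NESZ:
noise_est = image[:20, :20].mean()
snr_db = 10 * np.log10(image[40:60, 40:60].mean() / noise_est)

print(f"estimated noise floor: {10 * np.log10(noise_est):.1f} dB")
print(f"estimated SNR:         {snr_db:.1f} dB")
```

Lowering the noise floor by 10 dB (say, -10 dB to -20 dB NESZ) directly buys 10 dB of SNR over the same scene, without touching the resolution.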

🌫️ 3. Ambiguities (Range and Azimuth)
Ambiguities in Synthetic Aperture Radar (SAR) imaging are artifacts that occur when energy reflected from an object is misinterpreted and appears replicated elsewhere in the image. These positioning or repetition errors result from limitations in the spatial or temporal sampling of radar signals. Such ambiguities can distort data interpretation by generating ghost copies of real targets, complicating the analysis.

Range ambiguities occur when the interval between two successive pulses (the PRT, Pulse Repetition Time) is too short to separate echoes arriving from different distances. Conversely, if this interval is too long (i.e., the PRF is too low), the Doppler spectrum is aliased, superposing energy from targets located in different azimuth directions.

Examples of azimuth ambiguities — From Left to right: TerraSAR-X StripMap image, COSMOS-SkyMed, RADARSAT-2 Standard mode image, Sentinel-1 IW image — FROM Leng, Xiangguang & Ji, Kefeng & Zhou, Shilin & Zou, Huanxin. (2017). Azimuth Ambiguities Removal in Littoral Zones Based on Multi-Temporal SAR Images. Remote Sensing. 9. 10.3390/rs9080866.
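The PRT/PRF trade-off behind these ambiguities can be sketched numerically. The two constraints below are the standard textbook ones; the PRF, platform velocity, and antenna length are illustrative values, not a real sensor's.

```python
C = 3e8  # speed of light (m/s)

def max_unambiguous_range(prf_hz):
    """Echoes from beyond c / (2 * PRF) fold back as range ambiguities."""
    return C / (2.0 * prf_hz)

def min_prf_for_azimuth(platform_velocity, antenna_length):
    """The Doppler bandwidth ~ 2*v/L_ant must be sampled at PRF >= 2*v/L_ant,
    otherwise azimuth (Doppler) aliasing creates ghost targets."""
    return 2.0 * platform_velocity / antenna_length

prf = 1500.0   # Hz, illustrative
v = 7500.0     # platform velocity (m/s), illustrative
L_ant = 10.0   # physical antenna length (m), illustrative

print(f"max unambiguous range: {max_unambiguous_range(prf) / 1e3:.0f} km")
print(f"min PRF for azimuth:   {min_prf_for_azimuth(v, L_ant):.0f} Hz")
```

With these numbers the chosen PRF sits exactly at the azimuth limit: raising it would extend the Doppler margin but shrink the unambiguous swath, which is the designer's dilemma in a nutshell.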

🌀 4. Post-processing: Granularity or Speckle
Speckle is a typical phenomenon in coherent images such as SAR. Although it appears as noise, it contains valuable information about the texture and structure of targets. Its attenuation must be balanced to retain useful detail. Some image providers include or exclude speckle post-processing.
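A minimal sketch of speckle statistics and multilooking, assuming fully developed speckle (exponentially distributed intensity over a uniform area): averaging pixels in blocks reduces the coefficient of variation at the cost of resolution.

```python
import numpy as np

rng = np.random.default_rng(1)

def multilook(intensity, n_looks):
    """Average n_looks x n_looks blocks: trades resolution for speckle reduction."""
    h, w = intensity.shape
    h, w = h - h % n_looks, w - w % n_looks
    blocks = intensity[:h, :w].reshape(h // n_looks, n_looks, w // n_looks, n_looks)
    return blocks.mean(axis=(1, 3))

# Fully developed speckle: exponential intensity, so std == mean
# (coefficient of variation = 1) on a homogeneous area.
img = rng.exponential(1.0, size=(400, 400))
ml = multilook(img, 4)   # 16 looks -> coefficient of variation ~ 1/4

print(f"1-look  CV: {img.std() / img.mean():.2f}")
print(f"16-look CV: {ml.std() / ml.mean():.2f}")
```

This is why a "grainy" single-look image is not necessarily worse: the grain carries the coherent information that filtering deliberately throws away.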

📈 5. Amplitude dynamics and quantization levels

Most radars record and transmit the original data in 16 bits (65,536 intensity levels), which are often reduced to 8 bits (256 levels) for visual interpretation and computer analysis.
Beware of quantization, too: it can degrade the perceived image quality, not least because the dynamic range involved in radar is so huge!
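As an illustration of why this matters, here is a sketch of a common workaround: mapping amplitudes to 8 bits through a clipped dB scale rather than a linear rescale, which would crush the huge radar dynamics into a handful of levels. The clip limits below are arbitrary choices, not a standard.

```python
import numpy as np

def to_8bit_db(amplitude, db_min=-30.0, db_max=10.0):
    """Map amplitudes to 8 bits via a clipped dB scale.
    A linear rescale over the same data would put almost every pixel
    into the lowest few of the 256 levels."""
    power_db = 20 * np.log10(np.maximum(amplitude, 1e-12))
    scaled = (power_db - db_min) / (db_max - db_min)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

amp = np.array([0.03, 0.1, 1.0, 3.0, 100.0])  # ~70 dB of dynamics
print(to_8bit_db(amp))
```

Everything below the chosen floor saturates to 0 and everything above the ceiling to 255, so the limits themselves are part of what the viewer perceives as "image quality".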

Physics is what it is!

When observing the same object, it would be naive to expect identical information — and therefore the same visual appearance — in two radar images, even if they share similar characteristics in terms of resolution, ambiguities, and noise levels. Physics imposes its own rules: changing the central wavelength fundamentally alters how the wave is backscattered by the same medium.

For instance, images acquired at lower frequencies often appear less visually appealing. However, this is not a flaw but rather a direct consequence of the physics governing wave-matter interactions. Lower frequencies capture different types of information — typically less dense in bright points but potentially richer in large-scale structures and better suited for penetrating diffuse media, such as vegetation or soil.

Today, X-band sensors are proliferating, offering fine spatial resolution and an aesthetic often perceived as more detailed. In contrast, sensors operating at lower frequencies are far less common. This is why we eagerly await the BIOMASS mission, which will observe the Earth in the P-band. But let’s be clear: do not expect images resembling those — more familiar and visually detailed — acquired by Sentinel-1. BIOMASS will deliver an entirely different perspective, one far better suited to exploring deep structures and environmental dynamics.

X-band (left) and P-band (right) images of an area containing tropical forest and cultivation. From Dependence of P-Band Interferometric Height on Forest Parameters from Simulation and Observation.

These criteria do not make images beautiful, but they do make them more usable:

📈 1. Radiometric accuracy and calibration
Radiometric accuracy and calibration are fundamental aspects of SAR radar imaging, especially when leveraging data for quantitative measurements and rigorous scientific applications. Unlike purely qualitative analyses based on the visual appearance of images, quantitative interpretation relies on an accurate evaluation of the amplitudes and intensities of the measured radar signals.

In a SAR system, radiometry refers to the ability to accurately measure the intensity of the signal backscattered by a given target. This intensity depends on the electromagnetic properties of the surface (e.g., roughness, moisture) as well as the acquisition geometry (e.g., incidence angle). High radiometric accuracy ensures that the measured intensity faithfully reflects physical reality, which is essential for applications such as surface parameter estimation, agricultural crop analysis, and the monitoring of glaciers and snow cover.

Achieving this radiometric accuracy requires calibration, which is indispensable for correcting instrumental and geometric effects that may distort measurements. Calibration generally falls into two categories:

  1. Internal Calibration — This uses built-in reference signals within the radar system itself to compensate for variations caused by sensor electronics, such as amplifier gain fluctuations or thermal instabilities.
  2. External Calibration — This involves the use of calibration targets (e.g., corner reflectors or calibrated spheres) placed on the ground to adjust for global errors and ensure consistency across multiple acquisitions or sensors.

Radiometric fidelity is particularly critical for physical and statistical model inversion. These models establish relationships between SAR measurements (such as intensity or polarization) and the physical properties of observed targets (e.g., vegetation height or soil moisture content). Any error in calibration or amplitude measurements can introduce significant biases, jeopardizing the reliability of the results — no matter how beautiful the image may appear.
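As a sketch of how such a calibration is applied in practice, the conversion below follows the general form used by several sensors (for instance Sentinel-1's σ⁰ = DN²/A², where A comes from a calibration look-up table). The digital numbers and gain values here are made up for illustration.

```python
import numpy as np

def sigma0_db(dn, calibration_gain):
    """Convert digital numbers to calibrated sigma-zero in dB, applying a
    per-pixel gain in the spirit of Sentinel-1's sigma0 = DN^2 / A^2.
    Both inputs are arrays of the same shape."""
    sigma0 = (dn.astype(float) ** 2) / (calibration_gain.astype(float) ** 2)
    return 10 * np.log10(np.maximum(sigma0, 1e-12))

dn = np.array([[100.0, 200.0], [400.0, 800.0]])   # hypothetical digital numbers
gain = np.full_like(dn, 1000.0)                    # hypothetical calibration LUT
print(sigma0_db(dn, gain))
```

Without the gain term, comparing intensities across acquisitions or sensors is meaningless: the same field could appear 3 dB "wetter" simply because an amplifier drifted.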

🧭 2. Geometric localization
To be fully exploitable for mapping and geospatial analysis, SAR images must be accurately georeferenced, meaning they need to be aligned with a precise geographic coordinate system. This process ensures that each pixel in the image corresponds to an actual position on the Earth’s surface.

Unlike optical sensors, SAR images are acquired in radar geometry (slant range and azimuth) and must be transformed into geographic coordinates (latitude, longitude, and altitude). This transformation requires:

  • Topographic correction models: These compensate for the effects of terrain variations, such as distortions caused by relief (exaggeration or compression of slopes and valleys).
  • Platform motion compensation: Satellites or aircraft are rarely perfectly stable; variations in orientation (roll, pitch) must be accounted for to avoid geometric distortions.
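As a tiny illustration of the geometry involved: even over flat terrain, projecting the slant-range pixel spacing onto the ground depends on the incidence angle, which is why the same sensor yields different ground sampling across the swath. The spacing and angles below are illustrative.

```python
import math

def ground_range_spacing(slant_spacing_m, incidence_deg):
    """Flat-terrain projection of slant-range spacing onto the ground:
    ground = slant / sin(incidence). Steep incidence stretches pixels."""
    return slant_spacing_m / math.sin(math.radians(incidence_deg))

for inc in (20, 35, 50):
    print(f"{inc} deg incidence: {ground_range_spacing(1.0, inc):.2f} m on the ground")
```

Real geocoding adds terrain height and platform attitude on top of this, but the flat-Earth projection already shows why near-range pixels cover more ground than far-range ones.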

High geometric accuracy is crucial for several applications:

  • Precision mapping: For instance, monitoring landslides or managing natural resources.
  • SAR interferometry (InSAR): Where georeferencing errors can compromise the analysis of millimeter-scale ground displacements.
  • Multi-temporal analysis: Requiring perfect alignment between multiple acquisitions to detect changes over time.

Accurate geometric localization is therefore a key requirement for ensuring the reliability and usability of SAR data in scientific and operational contexts.

— — —

I hope you now have enough to fuel your praise — or sharpen your critiques — of the “beauty” — or should I say the unique charm — of a radar image.


Written by Elise Colin

Researcher with broad experience in signal and image processing, focusing on big data and AI aspects of Earth observation and medical images.
