“Serving tip” for a radar interferogram

Or how to represent a complex interferometric coherence in colors?

Elise Colin
Feb 10, 2023

Radar interferometry assumes that we have a pair of complex input images, Z1 and Z2, that have been co-registered to each other. (In the following, we consider images that are already registered.)

We then calculate the interferometric coherence parameter, using convolutions, as previously explained here:

import numpy as np
from scipy import signal

def interferometric_coherence_2D(Z1, Z2, N):
    # Local averaging with an N x N boxcar window
    win = np.ones((N, N))
    num = signal.convolve2d(Z1 * np.conj(Z2), win, mode='same')
    den1 = signal.convolve2d(Z1 * np.conj(Z1), win, mode='same')
    den2 = signal.convolve2d(Z2 * np.conj(Z2), win, mode='same')

    gamma = num / np.sqrt(np.abs(den1) * np.abs(den2))
    gamma[np.isnan(gamma)] = 0

    coherence = np.abs(gamma)
    phase = np.angle(gamma)

    return gamma, coherence, phase
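As a quick sanity check, we can run the estimator on synthetic speckle (a minimal sketch; the variable names and the condensed helper `coherence_modulus` are mine): two copies of the same scene, even with a constant phase offset, should give a coherence modulus of 1, while two independent scenes should give a low value.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
shape = (128, 128)

# Fully developed speckle: circular complex Gaussian samples
Z1 = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
Z2 = Z1 * np.exp(1j * 0.5)   # same scene with a constant phase offset
Z3 = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)  # independent scene

def coherence_modulus(Za, Zb, N=7):
    # Same estimator as above, condensed to the modulus only
    win = np.ones((N, N))
    num = signal.convolve2d(Za * np.conj(Zb), win, mode='same')
    den1 = signal.convolve2d(np.abs(Za) ** 2, win, mode='same')
    den2 = signal.convolve2d(np.abs(Zb) ** 2, win, mode='same')
    return np.abs(num) / np.sqrt(den1 * den2)

print(coherence_modulus(Z1, Z2).mean())  # ~1.0: identical scenes stay coherent
print(coherence_modulus(Z1, Z3).mean())  # low: independent scenes decorrelate
```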

How to make a nice colored representation of the result? By moving to the HSV (Hue, Saturation, Value) domain, an alternative representation of the RGB color model.
(This idea was given to me a long time ago by Patrick Imbo, during his PhD thesis. Thank you Patrick!)

In this representation, the Hue encodes the interferometric phase. The Saturation encodes the coherence value (the modulus of the complex coherence). In this way, if the phase is very noisy, the coherence will be low and the corresponding pixels will remain in gray levels instead of appearing in color.
Finally, for the Value, i.e. the brightness or total intensity, you will choose your best input image!

That is to say, if you have only a pair of images (Z1 and Z2), it is a good idea to average their intensities to reduce the speckle noise. Here is the corresponding piece of code, where gamma has been computed in the previous step:

import numpy as np
from matplotlib.colors import hsv_to_rgb

def interferometric_image(Z1, Z2, gamma):
    # Value channel: average the two intensities, then take the amplitude
    I = np.abs(Z1) ** 2 + np.abs(Z2) ** 2
    I = np.sqrt(I)
    # Clip the amplitude at mean + 1 standard deviation
    thresh = np.mean(I) + np.std(I)
    I = np.clip(I / thresh, 0, 1)

    # Hue channel: interferometric phase rescaled to [0, 1]
    ang = np.angle(gamma)
    ang = (ang - ang.min()) / (ang.max() - ang.min())

    # Saturation channel: coherence modulus
    coh = np.abs(gamma)

    rgb = hsv_to_rgb(np.dstack((ang, coh, I)))
    return rgb
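To see the HSV mapping in isolation, we can build a synthetic test card (a minimal sketch; the phase ramp, coherence ramp, and variable names are mine): phase varies along x, coherence along y, with flat brightness. Rows where the coherence is zero come out gray whatever the phase, which is exactly the behavior we want for decorrelated pixels.

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

# Synthetic test card: phase ramp along x, coherence ramp along y
ny, nx = 64, 256
phase = np.tile(np.linspace(-np.pi, np.pi, nx), (ny, 1))
coh = np.tile(np.linspace(0.0, 1.0, ny)[:, None], (1, nx))

hue = (phase + np.pi) / (2 * np.pi)   # map [-pi, pi] to [0, 1]
sat = coh                             # low coherence -> desaturated (gray)
val = np.ones_like(hue)               # flat brightness for the test card

rgb = hsv_to_rgb(np.dstack((hue, sat, val)))
print(rgb.shape)  # (64, 256, 3)
```

The bottom row (coherence 0) is uniformly gray; the top row (coherence 1) runs through the full color wheel.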

If you have a time series, it is even better to use the average value of this series as intensity! In this case, replace Z1 and Z2 by this time-averaged image.
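This temporal multilooking can be sketched as follows (a minimal example on a synthetic stack; the `stack` variable and its shape are my assumptions): average the intensities over the time axis, then take the amplitude for the Value channel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical complex SLC time series, shape (T, ny, nx)
stack = rng.standard_normal((10, 64, 64)) + 1j * rng.standard_normal((10, 64, 64))

# Temporal multilooking: average the intensities over time, then take the amplitude
I_mean = np.mean(np.abs(stack) ** 2, axis=0)
amplitude = np.sqrt(I_mean)  # use this in place of the pair average for the Value channel
print(amplitude.shape)  # (64, 64)
```

Averaging T independent looks divides the speckle's coefficient of variation by roughly sqrt(T), which is why the time-averaged image makes a much smoother brightness channel.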

Here is a comparison of interferometric extracts computed with TerraSAR-X images in “staring spotlight” mode, the highest-resolution mode of this sensor, obtained in the framework of a partnership with the DLR (2015 acquisitions).

Thanks to the spatial averaging, the pattern of the Eiffel Tower beams appears even more beautiful… As for the green areas and the river Seine, they remain in gray levels, because they decorrelate temporally.
