This simulator demonstrates synthetic aperture radar (SAR) image formation using the backprojection algorithm. The radar signal is generated at each position on a linear track, and the measured signal is backprojected onto a predefined 1 m square image grid and accumulated over all positions to form the focused image. The track Y-position is fixed at -2 m.
The radar targets are drawn as crosses on the "Imaging geometry" plot at the top left. The red circle is the radar and the blue line is the radar track. The square drawn with dashed lines is the image grid and the grey cone is the antenna beam. Press the "Play" button to start the simulation.
The different steps in image generation are explained in more detail below:
For each radar position and each point target, a baseband signal is synthesized. The round-trip delay for a target at range \( r \) is \( \tau = \frac{2r}{c} \), where \( c \) is the speed of light.
The baseband signal is modeled as a complex exponential:
\[ s(t) = A(\varphi) \exp\left(j 2\pi \left(-f_c \tau - \frac{B\tau}{T} t \right)\right) \]
where \(A(\varphi)\) is the amplitude due to the antenna pattern, \( f_c \) is the carrier frequency, \( B \) is the sweep bandwidth, and \( T \) is the sweep duration. Signals from all targets are summed. For simplicity, all targets have a reflectivity of 1 regardless of distance to the radar. For more details on why the signal looks like this, see for example one of my earlier blog posts.
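A minimal sketch of this step in Python/NumPy is shown below. The parameter values, variable names, and the boresight direction are illustrative assumptions, not values taken from the simulator; the antenna weighting implements the Gaussian beam model described further below.

```python
import numpy as np

c = 3e8                 # speed of light (m/s)
fc = 6e9                # carrier frequency (Hz), assumed value
B = 200e6               # sweep bandwidth (Hz), assumed value
T = 100e-6              # sweep duration (s)
fs = 5e6                # sampling rate (Hz)
hpbw = np.deg2rad(60)   # half-power beamwidth, assumed value

t = np.arange(int(T * fs)) / fs   # sample times within one sweep

def antenna_gain(phi):
    """Gaussian beam amplitude pattern A(phi), described later in the text."""
    return np.exp(-2 * np.log(2) * (phi / hpbw) ** 2)

def baseband_signal(radar_pos, targets):
    """Dechirped baseband signal: sum of point-target responses with reflectivity 1."""
    s = np.zeros_like(t, dtype=complex)
    for tx, ty in targets:
        dx, dy = tx - radar_pos[0], ty - radar_pos[1]
        r = np.hypot(dx, dy)          # distance from radar to target
        tau = 2 * r / c               # round-trip delay
        phi = np.arctan2(dx, dy)      # angle from boresight, assuming boresight along +y
        s += antenna_gain(phi) * np.exp(1j * 2 * np.pi * (-fc * tau - B * tau / T * t))
    return s
```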
Using a complex-valued signal might initially seem weird, since surely the signals measured in the actual hardware are real voltages. However, this IQ-signal representation is very useful for modeling baseband signals. A negative frequency means that the received RF signal frequency was below the local oscillator frequency.
This signal corresponds to a rising frequency sweep that starts at a frequency of \( f_c - B/2 \) at \(t = 0\) and ends at \(f_c + B/2\) at \(t = T\).
The RF carrier frequency \(f_c\) and RF bandwidth \(B\) can be configured. The sweep duration is fixed at 100 µs and the sampling rate is fixed at 5 MHz. With these values, the maximum baseband frequency from a target should stay below the Nyquist frequency for most sensible parameter choices.
Antenna pattern is modeled as Gaussian beam:
\[ A(\varphi) = \exp\left(-2\log(2) \left(\frac{\varphi}{\text{HPBW}}\right)^2\right) \]
where \( \varphi \) is the angle from boresight and HPBW is the half-power beamwidth.

First, a window function (\(w_{\text{r}}\)) is applied to the measured signal to decrease the sidelobes in the range direction. Many different window functions would be suitable; in this case a Taylor window is used. It has the very useful property that its sidelobe level can be controlled with a parameter. The trade-off of a low sidelobe level is that the mainlobe width increases, which decreases the range resolution of the image. The "Range window sidelobe level" parameter controls the Taylor window sidelobe level.
\[ s_w(t) = s(t)\, w_{\text{r}}(t) \]
This windowed time-domain signal can be visualized in the "Time-domain signal" plot, which can be found by pressing the "Time" button on the "Range Profile" plot. The orange trace is the real part of the signal and the blue trace is the imaginary part. The windowed signal is zero-padded and transformed using an inverse FFT to obtain a range profile:
\[ S(r) = \mathcal{F}^{-1}\{ s_w(t) \} \]
The inverse FFT is used because the baseband frequencies are negative. With the inverse FFT the resulting range bins start at index zero, and increasingly negative frequencies, corresponding to longer ranges, map to increasing positive indices. A normal FFT would generate a range profile that is swapped; using it would only change the indexing in the algorithm, and the IFFT results in cleaner code.
Zero-padding provides range oversampling, which improves interpolation accuracy and makes plots look nicer. The resulting samples represent complex reflectivity versus range.
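Continuing the sketch above, the range compression step could look like the following. SciPy's Taylor window is used here; the sidelobe level, oversampling factor, and the bin-spacing formula assume the fixed sweep parameters listed earlier.

```python
from scipy.signal.windows import taylor

def range_profile(s, sll_db=30, oversample=8):
    """Apply a Taylor range window, zero-pad, and inverse-FFT into a range profile."""
    n = len(s)
    w_r = taylor(n, nbar=4, sll=sll_db)        # sidelobe level in dB
    s_w = s * w_r
    return np.fft.ifft(s_w, n * oversample)    # zero-padded IFFT = oversampled range profile

def bin_ranges(n, oversample=8):
    """Range of each IFFT bin; with n = T*fs samples the bin spacing is c/(2*B*oversample)."""
    dr = c / (2 * B * oversample)
    return np.arange(n * oversample) * dr
```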
For each pixel \( (x,y) \) in the image grid, the distance to the radar position \( (x_n, y_n)\) is computed:
\[ r_n(x,y) = \sqrt{(x-x_n)^2 + (y-y_n)^2} \]
The range profile is interpolated at the calculated distance using linear interpolation. Each target is visible as an arc in the image, since from a measurement at a single position it is not possible to determine the angle to the target. The result is plotted on the "Measured Signal on Image Grid" plot. It is complex valued and is visualized with domain coloring: the phase of each pixel sets its hue and the absolute value sets its brightness. This is a typical way to visualize 2D complex-valued functions.
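A sketch of this interpolation onto the image grid, continuing the code above; the grid size and extent are assumed values. The real and imaginary parts are interpolated separately with `np.interp` to keep the example within plain NumPy.

```python
nx = ny = 128                        # pixels per side, illustrative
xs = np.linspace(-0.5, 0.5, nx)      # 1 m square image grid, assumed extent
ys = np.linspace(0.0, 1.0, ny)
X, Y = np.meshgrid(xs, ys)

def interpolate_on_grid(S, radar_pos, oversample=8):
    """Evaluate the range profile S at each pixel's distance to the radar."""
    r_n = np.hypot(X - radar_pos[0], Y - radar_pos[1])
    r_bins = bin_ranges(len(t), oversample)
    re = np.interp(r_n, r_bins, S.real)
    im = np.interp(r_n, r_bins, S.imag)
    return re + 1j * im
```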
A phase reference is computed for each pixel to compensate for the phase variation caused by the varying propagation delay from different radar positions:
\[ h_n(x,y) = \exp\left(j \frac{4\pi f_c}{c} r_n(x,y)\right) \]
This is the conjugated response of a target located at that pixel (it has the inverse phase of \(S(r)\)). It is plotted in the "Reference function" plot using the same complex visualization.

The interpolated measurement is multiplied by the reference function:
\[ z_n(x,y) = S(r_n(x,y)) \cdot h_n(x,y) \]
Multiplication by the conjugated reference function removes the phase contribution in the measurement caused by the distance to the pixel. If there actually is a target at that pixel location, the multiplication by the reference function always results in the same phase. If there is no target, the phase changes as the radar moves.
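As a sketch, the reference multiplication for one radar position might look like this, reusing the helpers defined above:

```python
def matched_contribution(S, radar_pos):
    """Interpolated measurement multiplied by the conjugate phase reference h_n."""
    r_n = np.hypot(X - radar_pos[0], Y - radar_pos[1])
    h_n = np.exp(1j * 4 * np.pi * fc / c * r_n)
    return interpolate_on_grid(S, radar_pos) * h_n
```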
This image is visualized in the "Reference × Measurement" plot. Note that as the radar moves, the resulting phase stays constant at locations where there are targets.
When the results from all radar positions are summed, pixels containing a target add up coherently since their phases align, producing a large amplitude at that location in the image, while pixels without a target sum incoherently and result in a low output.
A Taylor window (\(w_{\text{az}}\)) is applied across slow time (platform positions) to reduce azimuth sidelobes. The idea is the same as with the range FFT: by reducing the amplitude of the measurements near the edges of the aperture, the sidelobes in the azimuth direction are reduced, with the trade-off that the azimuth resolution worsens slightly.
The focused image is formed by coherently summing contributions from all radar positions:
\[ I(x,y) = \sum_{n=1}^{N} w_{\text{az}}(n)\, z_n(x,y) \]
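Putting the pieces together, here is a sketch of the full backprojection loop with azimuth weighting. The track extent, number of positions, and target locations are made-up example values, not the simulator's settings.

```python
N = 64                                   # number of radar positions, illustrative
track_x = np.linspace(-1.0, 1.0, N)      # linear track, assumed extent
track_y = -2.0                           # fixed track Y-position
w_az = taylor(N, nbar=4, sll=30)         # azimuth Taylor window

targets = [(0.1, 0.4), (-0.2, 0.7)]      # example point targets inside the grid
image = np.zeros((ny, nx), dtype=complex)
for n in range(N):
    pos = (track_x[n], track_y)
    s = baseband_signal(pos, targets)
    S = range_profile(s)
    image += w_az[n] * matched_contribution(S, pos)

# Magnitude image in dB, normalized to its peak.
image_db = 20 * np.log10(np.abs(image) / np.abs(image).max() + 1e-12)
```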
As measurements from more positions are added, the azimuth resolution of the image improves. The final display can show either the complex image or its magnitude in dB.
There are some preset settings that can be selected under the plots.