were and will be further proved from space by SIR-C/X-SAR, the SRTM missions of the
Space Shuttle, and the TerraSAR and Cosmo/SkyMed missions.
1.1 BACKGROUND
Data acquired by radar systems must be interpreted to yield information about the imaged
area. However, information extraction and the detection of significant features are very
time-consuming if performed semi-automatically by an image analyst, especially when a
large amount of remote sensing data must be examined. Automatic algorithms have therefore
been, and continue to be, developed to extract relevant information quickly and in a
reproducible way. When polarimetric, multi-frequency and multi-temporal images are used,
the need for automatic or semi-automatic methods is even greater, and all available
information must be fused. To be valuable, an automatic data interpretation process has to be
based on a solid understanding. In practice, images acquired from satellites require several
kinds of classification for different purposes. Image classification is an important and
difficult task which may also require taking into account the process of image generation.
For SAR images this problem is harder than in other cases, because such images are affected
by strong noise and therefore need specific processing steps. In the last part of this section we
briefly explain which range of the electromagnetic spectrum is used for remote sensing.
Several radars operate in different ranges; in particular, the Synthetic Aperture Radar works
in the microwave range shown in Figure 1.1.
Figure 1.1: Electromagnetic Spectrum.
1.2 GOAL AND MOTIVATIONS
The goal of this thesis is to present several approaches to the classification of multimodal
SAR images. Due to its advantages over optical sensors and to its steadily increasing
resolution, SAR imagery is becoming more and more important as a source of remote sensing
information. Basic features of potential interest that can be observed in SAR data are
homogeneous regions of different mean backscatter (i.e., of different cross-section or
backscatter coefficient), textured regions such as forests, edges providing information for data
classification, and strong scatterers (which are mostly related to man-made targets). However,
appropriate tools are still needed for the automated interpretation and extraction of such
features from SAR data.
In this thesis we describe several classification approaches from different points of view. We
used several methods, the main one being MRF (Markov Random Field) classification with
ICM (Iterated Conditional Mode) estimation. With this technique, many experimental tests
were performed, together with different transformations of the image data. Other important
experiments described in this thesis concern K-Means unsupervised classification. The
above-mentioned transformations were applied in order to obtain a better classification. We
ran many classifications that yield different results, and we evaluated and assessed the
processing chain to determine which approach achieved the best classification performance.
1.3 OUTLINE OF THE THESIS
The aim of this thesis is to apply different classification schemes to the analysis of multi-
feature SAR images and to describe the algorithms that were implemented to this end. In our
tests we used two types of images (a multifrequency SAR image and an L-band POLSAR
image acquired with four polarizations), and we focused on Markov Random Fields for
supervised classification and on the K-Means algorithm for unsupervised classification. To
put these parts into context, the first chapter summarizes the theory of radar and, in particular,
describes the Synthetic Aperture Radar (SAR); in addition, the properties of this category of
radar and its physical operation are described. To discuss image classification it is necessary
to introduce the theory of stochastic processes, in order to model each image pixel as a
random variable and to define the probability density functions (PDFs) that best model the
statistics of such variables. The other chapters preceding the discussion of our tests
concentrate on the description of the image transformations and of the adopted approaches.
The program we developed is based on Markov Random Fields with Iterated Conditional
Mode estimation, so a short chapter is devoted to this implementation, to the theory behind
this kind of classification and to what can be achieved with it. The other approach used in the
tests is based, as described above, on K-Means applied to POLSAR images. For this
experiment we used a commercial software package, ENVI, which provides functionalities to
manipulate and display images. The image transformations, instead, were developed in a
dedicated programming language, IDL (Interactive Data Language), an interpreted language
well suited to operations on the matrices and vectors through which we represent images.
Chapter 11 presents the description and discussion of our tests. In this part, several tests with
different transformations are described. The chapter can be divided into two main parts: one
on supervised classification with MRF and the other on unsupervised classification with
K-Means. These two parts are examined from different points of view: for example, some
classification results are good in terms of quality but require a long processing time, which
may not be acceptable for some kinds of applications. A comparison taking into account these
issues and the different transformations is therefore presented. For the first part it was
possible to evaluate the classification, because a test image was available to compute the
classification accuracy. For the second part this was not possible, since no test image was
available for these experiments; hence, a description of the results and a qualitative
comparison between classification results are provided.
2 PRINCIPLES OF SYNTHETIC
APERTURE RADAR (SAR)
2.1 SIDE-LOOKING GEOMETRY OF SAR
SYSTEMS
The geometry of a SAR system is shown in Figure 2.1.
Figure 2.1: Geometry of a side-looking radar system used for surface imaging. The radar, with an antenna of
width D, is flying at altitude H with speed v. The incidence angle for the considered resolution cell is θ_v. The
resolutions in range and azimuth are denoted by r_x and r_y, respectively.
We can see a platform moving with velocity v at altitude H, carrying a side-looking radar
antenna that illuminates the Earth’s surface with pulses of electromagnetic radiation. The
direction of travel of the platform is known as the azimuth direction and is denoted by y. The
distance from the radar track is measured in the range direction x. For simplicity, we only deal
with the case where the antenna is oriented along the flight path, i.e. not squinted.
2.1.1 BASIC PROPERTIES OF SIDE-LOOKING RADARS
Consider a radar system mounted on an airborne or spaceborne platform, as depicted in
Figure 2.1. An emitted pulse is received after ∆t = 2R/c, where c is the speed of light
(c ≈ 3·10^8 m/s) and R is the path length of the emitted wave. The emitted waves can be
described as rectangular pulses of duration τ that are sent repeatedly with a pulse repetition
frequency f_T, where the period T = 1/f_T satisfies T >> τ. The sensor resolution, that is, the
minimum distance between distinguishable objects, can then be calculated as ∆R = cτ/2 [2.3].
For the ground resolution this results in [2.3]

r_x = c τ / (2 sin θ_v)    (2.1)
where θ_v is the look angle, as shown in Figure 2.2. Thus, the ground range resolution
improves with growing range distance x. For instance, the typical look angle of RADARSAT
varies from 20° to 60°, depending on the mode used. The pulse length τ is the main parameter
available to improve the resolution, because θ_v is constrained to a certain range by the
signal-to-noise ratio. It’s important that the emitted pulses contain enough energy to guarantee
a sufficient amount of reflected intensity at the receiver but, at the same time, they must be as
short as possible. The best theoretical signal, a Dirac pulse, is technically not realizable.
Another consideration concerns the resolution in azimuth r_y, which depends on the antenna
aperture in azimuth β_y. The width of the main antenna lobe at -3 dB is given by

β_y = λ / D    (2.2)
where λ is the wavelength of the emitted signal and D denotes the antenna length in azimuth.
The resolution in the azimuth direction is

r_y = R β_y = λ R / D = λ H / (D cos θ_v)    (2.3)

It depends on the wavelength λ, the altitude H and the antenna size D. Since it is difficult to
vary H and λ greatly for physical reasons, the resolution can only be improved by increasing
the antenna size D.
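To make the orders of magnitude concrete, the following minimal Python sketch evaluates equations (2.1) and (2.3) for an assumed spaceborne geometry. All numerical values are illustrative assumptions, not parameters of a specific sensor, and Python is used here only for illustration (the implementations in this thesis use IDL and ENVI).

# Minimal numeric sketch of the real-aperture resolutions in equations (2.1) and (2.3).
# All parameter values below are illustrative assumptions, not data of a specific sensor.
import math

c = 3.0e8                        # speed of light [m/s]
tau = 10e-6                      # pulse duration [s] (assumed)
theta_v = math.radians(35.0)     # look angle (assumed)
wavelength = 0.056               # wavelength [m], C-band-like (assumed)
H = 800e3                        # platform altitude [m] (assumed)
D = 10.0                         # antenna length in azimuth [m] (assumed)

# Equation (2.1): ground range resolution of an uncompressed pulse
r_x = c * tau / (2.0 * math.sin(theta_v))

# Equation (2.3): azimuth resolution of the real aperture
R = H / math.cos(theta_v)        # slant range to the resolution cell
r_y = wavelength * R / D         # = R * beta_y with beta_y = wavelength / D

print(f"range resolution   r_x = {r_x / 1e3:.1f} km")
print(f"azimuth resolution r_y = {r_y / 1e3:.1f} km")

With these assumed values both resolutions come out on the order of kilometres, which is why pulse compression (Section 2.4.1) and the synthetic aperture (Section 2.4.2) are needed to reach useful resolutions.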
Figure 2.2: Resolution of SAR, in range and azimuth. Left: Imaging geometry in range direction. Right: target
illumination by the moving sensor used to generate the synthetic aperture.
2.2 RADIOMETRICAL AND GEOMETRICAL
EFFECTS
SAR systems measure the distance between the sensor emitting a microwave pulse and the
target reflecting the energy back to the receiving antenna. This range measurement principle
leads to specific geometric distortions in the processed SAR image, which make SAR images
more difficult to interpret than optical images, especially for inexperienced users. This kind of
degradation does not occur when imaging flat areas, e.g. agricultural land, but it does when
imaging rugged areas, e.g. mountainous regions.
The most relevant SAR-specific, topographically induced distortions are illustrated in Figure
2.3 and can be summarized as follows [2.4]:
• Foreshortening: For slopes facing the sensor, the area on the ground that is mapped
onto one SAR resolution cell [2.4] is larger than for flat terrain. This so-called
foreshortening has two consequences: 1) due to the change in ground resolution,
foreshortened areas appear compressed in the SAR image, i.e. their extension in
range direction is reduced; 2) as a consequence of the energy concentration,
foreshortened areas are characterized by brighter image gray values, since the
received energy within a resolution cell is higher due to the larger imaged area.
• Layover: Layover is an extreme case of foreshortening and occurs where, due to
steep terrain slopes, the top of a mountain is closer to the sensor than its bottom.
Layover areas appear as particularly bright regions in the image with an inverted
geometrical order. In Figure 2.3 this is depicted by points three and four: note the
reversed position of these points in the slant range image compared to their actual
location in ground coordinates.
Figure 2.3: The most relevant SAR-specific, topographically induced distortions.
• Elongation: In contrast to the foreshortening effect, slopes facing away from the
sensor lead to rather dark, elongated regions in the SAR image. The explanation of
this behaviour is the dual of the foreshortening case.
• Shadows: As in optical images, areas that are not illuminated by the radar beam are
called radar shadows. In the image, shadow areas appear as dark regions corrupted
only by thermal noise.
We can draw the following conclusions: the topography of the sensed area affects not only the
radiometry of the data (bright areas for layover and foreshortening, dark areas for shadow),
but also their geometry, by changing the image resolution in slant range with respect to the
ground range resolution as a function of the terrain slopes. The resolution of the slant range
data is therefore not constant but varies over the image.
2.2.1 RADIOMETRIC CORRECTION
In order to correct SAR layover areas, the energy accumulated in one image pixel has to be
redistributed among those ground resolution cells which are mapped onto that particular
pixel. This problem of energy redistribution is also encountered in foreshortening areas,
where each pixel contains the energy from a larger area, generally comprising more than one
ground resolution cell. The terrain facets concerned on a digital elevation model (DEM) are
adjacent to each other and can be assumed to have approximately identical local incidence
angles and reflectivity. The radiometric correction can therefore be performed by dividing by
the size of the imaged area, which, as a first approximation, is a function of the local
incidence angle; the latter is generally calculated using a DEM of similar (or, if available,
higher) resolution.
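A minimal sketch of this first-order correction is given below, assuming the local incidence angle map has already been derived from a DEM. The function name, the reference angle and the array values are illustrative assumptions, not the implementation used in this thesis.

# Sketch of the radiometric correction described above: the backscatter of each pixel is
# divided by the size of the illuminated ground area, approximated as a function of the
# local incidence angle derived from a DEM.
import numpy as np

def radiometric_correction(intensity, local_incidence_deg, reference_incidence_deg=35.0):
    """Normalize SAR intensity by the relative size of the illuminated ground area.

    For a tilted resolution cell the illuminated area scales roughly with
    1 / sin(local incidence angle); dividing by that factor (relative to an assumed
    flat-terrain reference geometry) compensates the foreshortening brightening.
    """
    theta_loc = np.radians(local_incidence_deg)
    theta_ref = np.radians(reference_incidence_deg)
    # relative illuminated area: larger for slopes facing the sensor (small theta_loc)
    area_factor = np.sin(theta_ref) / np.clip(np.sin(theta_loc), 1e-3, None)
    return intensity / area_factor

# toy example: a slope facing the sensor (20 deg local incidence) is darkened,
# a slope facing away from it (55 deg) is brightened slightly
intensity = np.array([[2.0, 1.0, 0.8]])
local_inc = np.array([[20.0, 35.0, 55.0]])
print(radiometric_correction(intensity, local_inc))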
2.2.2 GEOMETRIC CORRECTION
In the case of geometrically induced effects, the slant range image must be resampled to its
nominal ground resolution. This procedure is called geocoding and consists in minimizing the
geometric distortions and resampling the image to a homogeneous, predefined map grid. To
this end, it’s necessary to know the exact elevation of each pixel. The purpose of geocoding is
to generate a map-like representation of the satellite image, in which the SAR image is
aligned with a Cartesian map projection grid. Otherwise, the uncorrected data can hardly be
interpreted because of slant range effects, such as mountains appearing to lean towards the
sensor.
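A simplified one-dimensional sketch of the geocoding idea follows, under a flat-Earth assumption: each slant-range sample is projected to its ground-range position using the pixel elevation, and the line is then resampled onto a regular ground grid. All names and values below are illustrative assumptions, not the processing chain of an operational geocoder.

# Minimal 1-D sketch of geocoding: map slant-range samples to ground coordinates using
# the elevation of each pixel, then resample onto a homogeneous, predefined ground grid.
import numpy as np

H = 800e3                                                 # platform altitude [m] (assumed)
slant_range = np.linspace(850e3, 900e3, 256)              # slant range of each pixel [m]
elevation = 500.0 * np.sin(np.linspace(0, 3, 256)) ** 2   # DEM heights [m] (toy values)
amplitude = np.random.rand(256)                           # image line to be geocoded (toy)

# ground-range position of every slant-range sample, taking elevation into account
ground_range = np.sqrt(slant_range**2 - (H - elevation) ** 2)

# resample onto a regular ground grid (simple linear interpolation)
ground_grid = np.linspace(ground_range.min(), ground_range.max(), 256)
geocoded = np.interp(ground_grid, ground_range, amplitude)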
2.3 BASIC SCATTERING MECHANISM
Radar systems emit electromagnetic waves with wavelengths ranging from a few centimetres
up to one decimetre (corresponding to frequencies of roughly 3 to 10 GHz) and receive the
backscattered reflection from the imaged surface. The measured time between emission and
reception of the reflected wave is used to localize the scatterer or target (the term target
denotes both single objects and distributed scatterers). Figure 2.4 shows different
backscattering mechanisms, which depend on the microscopic and macroscopic properties of
the scatterer.
Figure 2.4: Different backscattering mechanisms. From left to right: Scattering from a smooth surface, double
bouncing reflection, diffuse scattering from rough surfaces, volume scattering.
• Reflection from smooth surfaces and double bouncing: a smooth surface reflects
very little of the incoming energy back to the emitting antenna. There are only two
cases in which a stronger return is received, namely when the incidence angle θ_v
between sensor and surface is zero or when “double bouncing” occurs. For this
reason, highways and calm lakes appear rather dark in SAR images, whereas
buildings usually appear bright due to the double-bounce effect.
• Reflection from rough surfaces and volume scattering: rough surfaces, i.e. non-
specular reflectors, scatter the wavefront in multiple directions, so the sensor receives
only a part of the emitted energy. This effect is called diffuse scattering. The same
effect occurs for volume scattering, where the wavefront partly penetrates the
scattering medium (a forest, for example). The amount of volume scattering and the
depth of penetration depend on the wavelength of the system.
An interesting example is the surface of the sea. When the sea is calm, the incidence angle θ_v
between sensor and surface is not zero and the energy is reflected away from the sensor, so no
reflected energy is received; for this reason the receiver maps such areas to a black image. If
the sea is slightly rough, a partial reflection of the emitted signal is obtained, and the sensor
produces a gray image. In the last case, i.e. when the sea is very rough, the sensor on the
platform receives a large amount of energy because the surface scatters in all directions,
including the direction from which the radar signal was emitted.
SAR images can be computed by measuring the reflected energy and the time delay between
emission and reception and by performing appropriate processing, based on the
electromagnetic properties of the imaged surface.
2.4 SAR FOCUSING
2.4.1 RANGE FOCUSING - PULSE COMPRESSION
Pulse compression is a technique which allows the range resolution to be increased
dramatically. In this case, the radar does not emit a rectangular signal modulated by a carrier
frequency f_c, but a signal of duration τ with a linearly modulated frequency, called a chirp
[2.1]. The phase of the emitted signal, which is limited in time by ∆t = τ, is given by

φ_x(t) = 2π (f_c t + K t^2 / 2)    (2.4)
with the signal bandwidth given by B = Kτ. The received signal is passed through a matched
filter [2.2], which is equivalent to a convolution with an ideal chirp. It can be shown that the
range resolution is then given by

r_x = c / (2 B sin θ_v)    (2.5)

This is equivalent to emitting a rectangular pulse of duration τ' = 1/B.
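The following is a minimal baseband sketch of this idea, assuming illustrative chirp parameters: it generates a linear FM chirp as in (2.4) (with the carrier f_c removed) and compresses a simulated return containing two point targets with a matched filter. It is a sketch of the principle, not the focusing algorithm of an operational SAR processor.

# Sketch of range focusing by pulse compression: a linear FM chirp is generated and
# compressed by correlation with the reference chirp (matched filtering).
import numpy as np

tau = 10e-6                  # chirp duration [s] (assumed)
B = 30e6                     # chirp bandwidth [Hz], B = K * tau (assumed)
K = B / tau                  # chirp rate
fs = 2 * B                   # complex baseband sampling rate
t = np.arange(-tau / 2, tau / 2, 1 / fs)

chirp = np.exp(1j * np.pi * K * t**2)        # baseband chirp, phase = pi * K * t^2

# simulated received signal: two point targets, delayed by 0 and 2 microseconds
rx = np.zeros(4 * len(t), dtype=complex)
for delay_samples in (0, int(2e-6 * fs)):
    rx[delay_samples:delay_samples + len(t)] += chirp

compressed = np.correlate(rx, chirp, mode="same")   # matched filter

peak = np.argmax(np.abs(compressed))
print("strongest compressed return at sample", peak)
print("compressed pulse width ~ 1/B =", 1 / B, "s, versus original duration", tau, "s")

After compression the effective pulse duration is τ' = 1/B (here about 0.03 µs instead of 10 µs), which is exactly the improvement expressed by equation (2.5).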
2.4.2 AZIMUTH FOCUSING - SYNTHETIC APERTURE
In order to improve the azimuth resolution we can use a frequency modulation, similar to the
pulse compression in range, obtained by exploiting the Doppler effect in the azimuth
direction. Specifically, flying at speed v, the sensor travels the distance ∆y = v ∆t within ∆t.
We assume that at time t = 0 the platform is at y = 0, where the radial distance R_0 to a given
target is minimal.
Since the travelled distance ∆y is small compared to R_0, the distance R(t) to the target, as a
function of time t = y/v, can be approximated by

R(t) ≈ R_0 + v^2 t^2 / (2 R_0)    (2.6)
(2.6)
with a phase shift between emitted and received signal of
⎟
⎟
⎠
⎞
⎜
⎜
⎝
⎛
+==
0
22
0
2
2)(2
2)(
R
tv
R
c
tR
ft
cy
λ
π
πφ
(2.7)
This linear frequency modulation, seen as a function of time, can be processed with a matched
filter. Since a given target stays within the antenna beam of width R_0 β_y on the ground for
the time ∆t = R_0 β_y / v, the platform movement simulates a large antenna aperture, also
called the synthetic aperture, with an improved resolution of

r_y = D / 2    (2.8)

Thus, the resolution of the synthetic aperture depends neither on the wavelength nor on the
distance to the target. Several advanced signal processing steps are needed to reach this
theoretical resolution. In addition, especially for SAR systems, it’s necessary to use methods
that compensate effects such as the range migration of targets [2.5].
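As a rough numerical illustration of this gain, the sketch below compares the real-aperture azimuth resolution of equation (2.3) with the synthetic-aperture value D/2 of equation (2.8). All parameter values are assumptions chosen only for illustration.

# Numeric sketch contrasting the real-aperture azimuth resolution (2.3) with the
# synthetic-aperture result r_y = D/2 of (2.8). Values are illustrative assumptions.
import math

wavelength = 0.056   # [m] (assumed, C-band-like)
D = 10.0             # physical antenna length in azimuth [m] (assumed)
R0 = 900e3           # closest-approach slant range [m] (assumed)
v = 7500.0           # platform speed [m/s] (assumed)

beta_y = wavelength / D                 # azimuth beamwidth, equation (2.2)
L_synth = R0 * beta_y                   # length of the synthetic aperture
t_illum = L_synth / v                   # illumination time of a target
r_y_real = wavelength * R0 / D          # real-aperture azimuth resolution
r_y_sar = D / 2.0                       # synthetic-aperture resolution, equation (2.8)

print(f"synthetic aperture length: {L_synth / 1e3:.1f} km, illumination time: {t_illum:.2f} s")
print(f"azimuth resolution: real aperture {r_y_real / 1e3:.2f} km versus SAR {r_y_sar:.1f} m")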
2.5 SPECKLE NOISE AND MULTILOOKING
REDUCTION TECHNIQUES
One of the applications of SAR systems consists in generating a map of the NRCS
(Normalized Radar Cross Section) σ°(x,y), which is related to the backscattering properties of
the surface. Each pixel of the image is obtained as the vectorial sum of the contributions of a
large number of elementary scatterers; consequently, the response of an area with constant σ°
fluctuates due to the random combination of the contributions present in each single
resolution cell. For this reason the image pixels have different gray values, and this yields a
granular effect that can be modelled as a multiplicative noise known as speckle [2.6]
(Figure 2.6).
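The sketch below illustrates this mechanism under simple assumptions (a fixed number of elementary scatterers per cell and uniformly random phases): even for a constant σ° the coherent sum produces intensities whose standard deviation is comparable to their mean, which is the multiplicative-noise behaviour of speckle.

# Sketch of the speckle mechanism: each resolution cell is the coherent (vectorial) sum of
# many elementary scatterers with random phases, so intensity fluctuates from pixel to pixel.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_scatterers = 10000, 50      # arbitrary assumptions

# random complex contributions with uniform phase inside every resolution cell
phases = rng.uniform(0, 2 * np.pi, size=(n_pixels, n_scatterers))
amplitudes = rng.rayleigh(scale=1.0, size=(n_pixels, n_scatterers))
field = np.sum(amplitudes * np.exp(1j * phases), axis=1)

intensity = np.abs(field) ** 2
# fully developed speckle: intensity is approximately exponentially distributed, so its
# standard deviation is comparable to its mean
print("mean:", intensity.mean(), "std:", intensity.std())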
Figure 2.5: Multilook technique.
A typical speckle reduction technique consists in averaging pixels of several independent
images of the same area; this technique is called multilooking. Images of the same area with a
lower azimuth resolution can be obtained by subdividing the synthetic aperture L into K
subapertures of length L_sc = L/K (K = 3 in Figure 2.5).
We denote by R_01, R_02, …, R_0K the distances between the point 0 and the centres of the K
subapertures, and by x_i the positions of the centres C_i. Processing the data of the K-th
subaperture, focusing on the point 0 with the reference function

H_K(x) = exp( -j 2π (x - x_K)^2 / (R_0K λ) )    (2.9)
we obtain images of the same point 0 (i.e., of the same area) from K different azimuth angles
θ_i and with a resolution δx_az = K D / 2, i.e. reduced by a factor K. The azimuth diversity
makes the K images independent. By averaging the K images we obtain a strong reduction of
the speckle, as can be seen in Figure 2.7, which was obtained by 3-look multilooking of the
image in Figure 2.6.
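A minimal simulation of the multilooking principle follows, assuming K independent, exponentially distributed single-look intensities per pixel: averaging the K looks reduces the relative fluctuation (standard deviation over mean) by roughly a factor √K.

# Sketch of multilooking: averaging K statistically independent looks of the same scene
# reduces the speckle variance by roughly a factor K. Builds on the single-look speckle
# model sketched above; K and the image size are assumptions.
import numpy as np

rng = np.random.default_rng(1)
K, n_pixels = 3, 10000
sigma0 = 1.0                              # constant backscatter of the scene

# K independent single-look intensity images (exponentially distributed speckle)
looks = rng.exponential(scale=sigma0, size=(K, n_pixels))
multilook = looks.mean(axis=0)

print("single-look std/mean:", looks[0].std() / looks[0].mean())        # ~1
print(f"{K}-look     std/mean:", multilook.std() / multilook.mean())    # ~1/sqrt(K)

This variance reduction, at the price of a coarser azimuth resolution, is the trade-off visible when comparing Figure 2.6 with Figure 2.7.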
Figure 2.6: 1-look SAR image of an urban area corrupted by speckle.
Figure 2.7: 3-look SAR image generated from Figure 2.6 (the S/N ratio is enhanced).
3 OVERVIEW OF SAR SENSORS
3.1 SAR REMOTE SENSING
Imaging radars are airborne or spaceborne radars which generate a reflectivity map of an
illuminated area through the transmission and reception of electromagnetic energy. Among
the various types of microwave sensors, special attention has been devoted to the synthetic
aperture radar (SAR) because of its high spatial resolution and rich information content.
The development of the synthetic array radar originated in 1951 with Carl Wiley, who
postulated the use of Doppler information to increase the azimuth resolution of the
conventional side-looking aperture radar (SLAR) [3.1]. Based on this idea, the first SAR
image was produced by researchers at the University of Michigan in 1958, using an optical
processing method [3.2].
Precision optical radar processors and holograms were developed, and fine-resolution strip
maps were obtained by the mid-1960s. Later, in the early 1970s, digital signal processing
methods were introduced to obtain off-line, non-real-time SAR images of high quality [3.3],
[3.4]. Since these early days, SAR systems have evolved into an essential and powerful tool in
geosciences and remote sensing. SAR data are applicable in many scientific fields. Besides
traditional applications in geography and in topographic and thematic mapping, SAR sensors
are nowadays also utilised in areas such as oceanography, forestry, agriculture, urban
planning, environmental monitoring and the prediction and evaluation of natural disasters.
SAR sensors operate in the microwave region of the electromagnetic spectrum, with typical
wavelengths between 1 cm and several metres. Being an active system, a SAR itself emits
microwave radiation towards the ground and measures the electric field backscattered by the