MULTI SENSOR DATA FUSION: Optimal rules for compressing data at each local sensor


About This Presentation

Optimal rules for compressing data at each local sensor


Slide Content

MULTI SENSOR DATA FUSION
Guided by: Mrs. S. Vasuhi
A. Anand Bardwaj  20033203
M. Anandaraj      20033204
K. Kapil          20033223

OBJECTIVE OF THIS WORK
• Optimal rules for compressing data at each local sensor such that the fused estimate is optimal.
• Data fusion with estimated weights.
• Implementation of these methods over a LAN on the Linux platform.

Data Fusion Algorithms
• Linear Estimation Fusion (LEF):
  X = P_j (P_i + P_j)^{-1} X_i + P_i (P_i + P_j)^{-1} X_j
• Weighted Least Squares (WLS):
  X = [H^T P^{-1} H]^{-1} H^T P^{-1} y
(a code sketch of both rules follows below)
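A minimal NumPy sketch of these two fusion rules. The function names, state dimension, and numerical values are illustrative assumptions, not part of the original work:

import numpy as np

def lef_fuse(x_i, P_i, x_j, P_j):
    """Linear Estimation Fusion: X = P_j (P_i+P_j)^-1 X_i + P_i (P_i+P_j)^-1 X_j."""
    S_inv = np.linalg.inv(P_i + P_j)
    return P_j @ S_inv @ x_i + P_i @ S_inv @ x_j

def wls_fuse(y, H, P):
    """Weighted Least Squares: X = [H^T P^-1 H]^-1 H^T P^-1 y."""
    P_inv = np.linalg.inv(P)
    return np.linalg.inv(H.T @ P_inv @ H) @ H.T @ P_inv @ y

# Illustrative two-sensor example with made-up values
x_i, P_i = np.array([1.0, 2.0]), np.diag([4.0, 4.0])
x_j, P_j = np.array([1.2, 1.9]), np.diag([1.0, 1.0])
print(lef_fuse(x_i, P_i, x_j, P_j))            # fused state estimate
y = np.concatenate([x_i, x_j])                 # stacked local estimates as "measurements"
H = np.vstack([np.eye(2), np.eye(2)])          # both blocks observe the same state
P = np.block([[P_i, np.zeros((2, 2))],
              [np.zeros((2, 2)), P_j]])        # block-diagonal joint covariance
print(wls_fuse(y, H, P))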

Constraints
•Limited communication bandwidth
•Limited processing capability at the
fusion center

Data Compression
•Lossless
•Lossy
Advantages of Data Compression
• Mitigates the curse of dimensionality
• Reduces communication overhead

Reduced SVD
• P_c = U S V^T
• Keep only the first k singular values of S, the first k rows of V^T, and the first k columns of U.
• P_c is referred to as the rank-k approximation of P, or the "Reduced SVD" of P (see the sketch below).
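A minimal sketch of this rank-k compression with NumPy; the helper name and the 6x6 example matrix are my own illustrations:

import numpy as np

def reduced_svd(P, k):
    """Rank-k approximation: keep the k largest singular values of S,
    the first k columns of U and the first k rows of V^T."""
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Illustrative 6x6 covariance (made up) compressed to rank 2
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
P = A @ A.T                          # symmetric positive semi-definite
P_c = reduced_svd(P, k=2)
print(np.linalg.matrix_rank(P_c))    # -> 2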

Optimised algorithms
• Optimised LEF (OLEF):
  X_o = P_cj (P_ci + P_cj)^{-1} X_i + P_ci (P_ci + P_cj)^{-1} X_j
• Optimised WLS (OWLS):
  X_o = [H^T P_c^{-1} H]^{-1} H^T P_c^{-1} y
(sketched in code below)
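A sketch of the optimised rules, assuming the compressed covariances P_ci, P_cj, P_c come from the rank-k reduction above. The pseudo-inverse replaces the ordinary inverse because a rank-k matrix is singular; that substitution is my assumption, not something stated on the slides:

import numpy as np

def olef_fuse(x_i, P_ci, x_j, P_cj):
    """OLEF: the LEF rule applied to the compressed covariances P_ci, P_cj."""
    S_pinv = np.linalg.pinv(P_ci + P_cj)   # pseudo-inverse: rank-k matrices are singular
    return P_cj @ S_pinv @ x_i + P_ci @ S_pinv @ x_j

def owls_fuse(y, H, P_c):
    """OWLS: the WLS rule with the compressed covariance P_c."""
    P_pinv = np.linalg.pinv(P_c)
    return np.linalg.pinv(H.T @ P_pinv @ H) @ H.T @ P_pinv @ y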

Assumptions
• Two non-maneuvering targets in a clutter-free environment
• Zero-mean, white Gaussian process noise
• Two radar systems with the same coverage area

RADAR 1:
• Scan rate: 0.5 sec
• Radar frequency: 2 Hz
• Scans: 150
• Location in LLA
  – Latitude: 10.502
  – Longitude: 80.1
  – Altitude: 10
• Standard deviation of errors in
  – Range: 50 m
  – Azimuth: 0.005759 radians
  – Elevation: 0.0171 radians

RADAR 2:
• Scan rate: 1 sec
• Radar frequency: 1 Hz
• Scans: 75
• Location in LLA
  – Latitude: 10.50
  – Longitude: 80.1
  – Altitude: 10
• Standard deviation of errors in
  – Range: 100 m
  – Azimuth: 0.005759 radians
  – Elevation: 0.0171 radians

FUSION CENTER:
• Location in LLA
  – Latitude: 10.501
  – Longitude: 80.1
  – Altitude: 10
• Distributed environment
(the scenario parameters above are restated as data below)
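For reference, the scenario parameters can be collected into a small configuration structure. This is only a restatement of the slide values in Python; the dictionary layout itself is my own choice:

# Scenario parameters from the slides, expressed as plain Python data
RADARS = {
    "radar1": {"scan_rate_s": 0.5, "freq_hz": 2, "scans": 150,
               "lla": (10.502, 80.1, 10),   # latitude, longitude, altitude
               "sigma": {"range_m": 50, "azimuth_rad": 0.005759, "elevation_rad": 0.0171}},
    "radar2": {"scan_rate_s": 1.0, "freq_hz": 1, "scans": 75,
               "lla": (10.50, 80.1, 10),
               "sigma": {"range_m": 100, "azimuth_rad": 0.005759, "elevation_rad": 0.0171}},
}
FUSION_CENTER_LLA = (10.501, 80.1, 10)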

Fusion Method   Comm. Overhead   Comp. Time (sec)   A_rms (m)
LEF             90               0.26               107.73
OLEF            12               0.16               105.4
WLS             90               0.26               108.76
OWLS            12               0.14               107.25

Data Fusion with Estimated Weights
• R_P = TM R_c TM^T

where

  C_R^L = | TM   0    0  |
          |  0   TM   0  |
          |  0   0    TM |

  TM = |  cos(b_m) cos(e_m)    sin(b_m) cos(e_m)    sin(e_m) |
       | -sin(b_m)             cos(b_m)             0        |
       | -cos(b_m) sin(e_m)   -sin(b_m) sin(e_m)    cos(e_m) |

(a code sketch of this transformation follows below)
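A minimal NumPy sketch of this covariance transformation. Only the trigonometric structure of TM is recoverable from the slide; the signs used here are my reconstruction chosen so that TM is a proper rotation, and the example R_c values are made up:

import numpy as np

def tm_matrix(b, e):
    """TM built from measured bearing b and elevation e (radians)."""
    cb, sb, ce, se = np.cos(b), np.sin(b), np.cos(e), np.sin(e)
    return np.array([[ cb * ce,  sb * ce,  se],
                     [-sb,       cb,       0.0],
                     [-cb * se, -sb * se,  ce]])

def transform_cov(R_c, b, e):
    """R_P = TM . R_c . TM^T"""
    TM = tm_matrix(b, e)
    return TM @ R_c @ TM.T

# Illustrative 3x3 measurement covariance (values are placeholders)
R_c = np.diag([50.0**2, 10.0**2, 10.0**2])
print(transform_cov(R_c, b=np.radians(30), e=np.radians(5)))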

Local Sensors
• X_L(k) = C_R^L X_R(k)
• X̂_L(k+1/k+1) = A X̂_L(k/k) + K_L(k+1) {Z_L(k+1) - H A X̂_L(k/k)}
• X̂_R(k+1/k+1) = A X̂_R(k/k) + C_R^L K_L(k+1) TM {Z(k+1) - H A X̂_R(k/k)}
• K_R(k+1) = C_R^L K_L(k+1) TM
(the update step is sketched in code below)
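A minimal sketch of the local update as read from the bullets above; the matrices A, H, K_L, C_R^L and TM are taken as given inputs, and nothing here is specific to the original implementation:

import numpy as np

def local_update(x_hat, A, H, K_L, z):
    """X̂(k+1/k+1) = A X̂(k/k) + K_L(k+1) {Z(k+1) - H A X̂(k/k)}"""
    x_pred = A @ x_hat                       # time update with transition matrix A
    return x_pred + K_L @ (z - H @ x_pred)   # correction with the local gain

def radar_frame_gain(C_RL, K_L, TM):
    """K_R(k+1) = C_R^L K_L(k+1) TM: the local gain expressed in the radar frame."""
    return C_RL @ K_L @ TM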

Fusion centre (CMM)
• X̂^f(k+1/k+1) = Σ_{i=1}^{N} μ_i X̂^i(k+1/k+1)
• μ_l = [ Σ_{i=1}^{N} P_R^i(k+1/k+1) - P_R^l(k+1/k+1) ] / [ (N-1) Σ_{i=1}^{N} P_R^i(k+1/k+1) ]
(a code sketch of the weights follows below)
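A sketch of these convex-combination weights. To keep the division well defined, each P_R^i is reduced to a scalar track-quality measure via its trace; that scalar reduction is my assumption, not something stated on the slide:

import numpy as np

def cmm_weights(P_list):
    """mu_l = (sum_i p_i - p_l) / ((N - 1) * sum_i p_i), with p_i = trace(P_R^i)."""
    p = np.array([np.trace(P) for P in P_list])
    total, N = p.sum(), len(p)
    return (total - p) / ((N - 1) * total)   # weights sum to 1

def cmm_fuse(x_list, P_list):
    """Fused estimate as the weighted sum of the local estimates."""
    mu = cmm_weights(P_list)
    return sum(m * x for m, x in zip(mu, x_list))

# Two equally good local tracks get equal weights
print(cmm_weights([np.eye(3), np.eye(3)]))   # -> [0.5 0.5]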

Implementation in LAN
Distributed Fusion
1. Without Compression Technique
2. With Compression Technique
3. Data Fusion with Estimated Weights

[Block diagram: implementation of distributed fusion in a LAN. Sensors 1..N act as clients; the fusion centre acts as the server.]

Method of Implementation
• Socket programming in the Linux environment
  1. Server (fusion centre)
  2. Client (local sensors)
• The server creates a listening socket; each client creates a socket and connects to it.
• Once connected, the processes can communicate (a minimal sketch follows below).
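A minimal client/server sketch using Python's standard socket module as an analogue of the Linux socket programming described above; the host, port, and message handling are illustrative only:

import socket

HOST, PORT = "127.0.0.1", 5050   # illustrative address for the fusion centre

def fusion_server():
    """Fusion centre: accept a connection from a sensor and read its track data."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        conn, addr = srv.accept()
        with conn:
            data = conn.recv(4096)   # e.g. a serialized (compressed) local estimate
            print(f"received {len(data)} bytes from sensor at {addr}")

def sensor_client(payload: bytes):
    """Local sensor: connect to the fusion centre and send its estimate."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(payload)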

Fusion Method   Computation Time (sec)   Comm. Overhead per Scan
WLS             0.26                     90
CMM             0.17                     12
LEF             0.26                     90
OLEF            0.16                     12
OWLS            0.14                     12

FUTURE SCOPE
•Image fusion
•Objective Of Image Fusion
•Techniques Available

Techniques Available
• Based on the retina model
• Influence factor modification and ANOVA methods
• Multispectral multisensor image fusion using wavelet transforms
• An estimation theory perspective

Techniques Available
• A pixel-level multi-sensor image fusion algorithm based on fuzzy logic
• A region-based image fusion method using the expectation-maximization algorithm
• Target tracking in infrared imagery using a weighted composite reference
• Function-based decision fusion

Military applications
•Automated target recognition
•Battlefield surveillance
•Concealed weapon detection
•Guidance and control of autonomous
vehicles
•Military command and control

Applications (Non-military)
–Robotics
–Remote sensing
–Air traffic control
–Medical diagnostics
–Pattern recognition and environmental
monitoring
–Monitoring of complex machinery
–Artificial intelligence

REFERENCES
• Lei-Wei Fong 2006, 'Multi-sensor data fusion with estimated weights', IEEE Transactions on Information Theory.
• X. Rong Li 2003, 'Optimal Linear Estimation Fusion - Part I: Unified Fusion Rules', IEEE Transactions on Information Theory, Vol. 49, pp. 2192-2207.
• X. Rong Li and Jie Wang 2003, 'Unified Optimal Linear Estimation Fusion - Part II: Discussions and Examples', IEEE Transactions on Information Theory.
• X. Rong Li, Keshu Zhang, Peng Zhang and Haifeng Li 2003, 'Unified Optimal Linear Estimation Fusion - Part VI: Sensor Data Compression', IEEE Transactions on Information Theory.
• Neil Matthew and Richard Stones 2001, 'Beginning Linux Programming', 2nd edition.

REFERENCES
• Samuel S. Blackman 1986, 'Multiple Target Tracking with Radar Applications', Artech House Inc.
• Y. Bar-Shalom and Xiao-Rong Li 1995, 'Multisensor-Multitarget Tracking'.
• Y. Bar-Shalom 1995, 'Multitarget-Multisensor Tracking: Applications and Advances, Volume II'.
• A. Farina and F. A. Studer 1986, 'Radar Data Processing', Research Studies Press Ltd.
• David L. Hall and James Llinas 2001, 'Handbook of Multisensor Data Fusion' (Electrical Engineering and Applied Signal Processing series).