
MAXIMUM LIKELIHOOD JOINT ASSOCIATION, TRACKING, AND FUSION IN STRONG CLUTTER

Leonid Perlovsky
Harvard University and the AF Research Lab

Seminar, Department of Electrical and Computer Engineering, University of Connecticut

Storrs, 6 March 2009

OUTLINE

• Related research

• Combinatorial complexity and logic

• Dynamic logic

• Joint likelihood, math. formulation

• Examples

• Publications, recognition

RELATED RESEARCH

• > 50 publications by Perlovsky and co-authors on concurrent association, tracking, and fusion (+ > 200 other applications)

– Perlovsky, L. I. (1991). Model Based Target Tracker with Fuzzy Logic. 25th Annual Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA.

– Perlovsky, L.I., Schoendorf, W.H., Tye, D.M., Chang, W. (1995). Concurrent Classification and Tracking Using Maximum Likelihood Adaptive Neural System. Journal of Underwater Acoustics, 45(2), pp.399-414.

• Many publications by Bar-Shalom, Streit, Luginbuhl, Willett, Avitzour, and co-authors

• Similarity: algorithms related to EM

• Differences:

– Formulation of the likelihood
– Maximization procedures
– Performance: linear complexity, Cramer-Rao Bound

• Cramer-Rao Bound for joint association and tracking

– Perlovsky, L.I. (1997). Cramer-Rao Bound for Tracking in Clutter and Tracking Multiple Objects. Pattern Recognition Letters, 18(3), pp.283-288.

COMBINATORIAL COMPLEXITY 50 years of difficulties

• Detect signal in noise and clutter at the farthest possible distance

• SP, detection, exploitation, fusion, tracking, etc. in noise/clutter

– Requires association (pixels <-> objects) before detection
– With 1 object and no noise the steps are sequential: (1) detect pixels, (2) detect objects, (3) recognize targets
– Otherwise: joint detection-discrimination-classification…

• Combinatorial Complexity (CC)

– Need to evaluate large numbers of combinations (pixels <-> objects); operations: ~M^N

– A general problem (since the 1950s): SP, detection, recognition, tracking, fusion, exploitation, situational awareness,… across pattern recognition, neural networks, rule systems…

• Combinations of 100 elements: 100^100 (see the short sketch after this list)

– Larger than the number of particles in the known Universe
– Greater than all the elementary events in the Universe during its entire life

• CC affects many SP algorithms

– Our sensors under-utilize signals
– They work much worse than the Cramer-Rao Bound, the information-theoretic limit
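To see the scale of this blow-up, the following minimal Python sketch (not from the slides) compares the exhaustive M^N association count with the roughly M·N work of a single association pass; M = N = 100 are assumed example sizes only.

# Illustrative sizes only: M models, N pixels
M, N = 100, 100

exhaustive = M ** N      # every assignment of N pixels to M models: 100^100 = 10^200
per_pass = M * N         # one pass over all (pixel, model) association pairs

print(f"exhaustive associations: about 10^{len(str(exhaustive)) - 1}")
print(f"operations per association pass: {per_pass}")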

CC vs. LOGIC

• CC is related to formal logic

– Gödel proved that logic is "illogical," "inconsistent" (1930s)
– CC is Gödel's "incompleteness" in a finite system

• Fuzzy logic

– How to select the degree of fuzziness?
– The mind fits fuzziness for every process => CC

• Logic pervades all algorithms and neural networks

– Rule systems, fuzzy systems (degree of fuzziness), pattern recognition, neural networks (training uses logic)

• Probabilistic association (Bar-Shalom)

– Overcame logic in association
– Were all logical steps overcome?

DYNAMIC LOGIC overcame logic limitations

• CC is related to logic

– CC is Gödel's "incompleteness" in a finite system
– Logic pervaded all algorithms and neural networks in the past: rule systems, fuzzy systems (degree of fuzziness), pattern recognition, neural networks (training uses logical statements)

• Dynamic Logic is a process-logic

– "from vague to crisp" (statements, targets, decisions…)

• Overcomes CC

– Fast algorithms

OUTLINE

• Related research

• Combinatorial complexity and logic

• Dynamic logic

• Joint likelihood, math. formulation

• Examples

• Publications, recognition

JOINT LIKELIHOOD for tracks and clutter

• Total likelihood

– L = l({x}) = Π_n l(x(n))

• No assumption of "independence"

• Conditional likelihoods

– l(x(n)) = Σ_m r(m) l(x(n) | M_m(S_m,n))

– l(x(n) | M_m(S_m,n)) is the conditional likelihood for x(n) given model m

• {x(n)} are not independent; M(n) may depend on n'

• CC: L contains M^N items, all associations of pixels and models (LOGIC)
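To make the formulation concrete, here is a minimal Python sketch of the joint log-likelihood: the sum over models is taken inside a log for each pixel, and the product over pixels becomes a sum of logs. The function and argument names are illustrative assumptions, not from the slides.

import numpy as np

def joint_log_likelihood(pixels, rates, cond_likelihoods):
    """log L = sum_n log( sum_m r(m) * l(x(n) | M_m(S_m,n)) ).

    pixels           : sequence of observations x(n)
    rates            : sequence of model rates r(m)
    cond_likelihoods : list of callables; cond_likelihoods[m](x, n)
                       returns l(x(n) | M_m(S_m,n))
    """
    log_L = 0.0
    for n, x in enumerate(pixels):
        mixture = sum(r * l(x, n) for r, l in zip(rates, cond_likelihoods))
        log_L += np.log(mixture)
    return log_L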

EXAMPLES OF MODELS

• Linear track model

– M_m(S_m,n) = X_m + V_m·t;  S_m = (X_m, V_m, r_m, C_m^(-1))

• Gaussian conditional likelihoods

– l(x(n) | M_m(S_m,n)) = (2π)^(-d/2) (det C_m)^(-1/2) exp{ -0.5 [x(n) - M_m(S_m,n)]^T C_m^(-1) [x(n) - M_m(S_m,n)] }

– No "Gaussian" assumption on the data
• only the errors are assumed Gaussian
• a mixture of any pdfs can be used

• Uniform clutter model

– rate r_m; l(x(n) | M_m(S_m,n)) = 1 / volume(x)
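Read concretely, the two model types on this slide can be written as small Python functions, as sketched below; the function names, the use of precomputed det(C_m) and C_m^(-1), and the vector shapes are my assumptions rather than the slides'.

import numpy as np

def linear_track_model(X_m, V_m, t):
    """Linear track model M_m(S_m,n) = X_m + V_m * t, with t the time of pixel n."""
    return X_m + V_m * t

def gaussian_conditional_likelihood(x, model, C_inv, C_det):
    """Gaussian conditional likelihood:
    l(x(n) | M_m) = (2*pi)^(-d/2) * det(C_m)^(-1/2)
                    * exp(-0.5 * (x - M_m)^T C_m^(-1) (x - M_m))."""
    d = x.shape[-1]
    diff = x - model
    return (2.0 * np.pi) ** (-d / 2.0) * C_det ** (-0.5) * np.exp(-0.5 * diff @ C_inv @ diff)

def uniform_clutter_likelihood(window_volume):
    """Uniform clutter model: l(x(n) | clutter) = 1 / volume of the data window."""
    return 1.0 / window_volume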

DYNAMIC LOGIC (DL) non-combinatorial solution

• Start with a set of signals and unknown models

– any parameter values S_m
– associate models with signals (vague)
– (1) f(m|n) = r(m) l(n|m) / Σ_{m'} r(m') l(n|m')

• Improve parameter estimation

– (2) S_m = S_m + Σ_n f(m|n) [∂ln l(n|m)/∂M_m]·[∂M_m/∂S_m]

• Continue iterations (1)-(2). Theorem: DL is a converging system; the likelihood increases on each iteration
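The (1)-(2) loop above is an EM-style update. The sketch below is an assumed concretization for 1-D measurements with Gaussian linear-track models plus one uniform clutter model: step (1) is implemented exactly, while for step (2) a weighted least-squares fit stands in for the slides' gradient-form update of S_m (it maximizes the same f-weighted Gaussian likelihood in closed form).

import numpy as np

def dl_iteration(x, t, rates, cond_likelihoods):
    """One dynamic-logic / EM-style iteration (sketch).

    x                : (N,) array of 1-D measurements x(n)
    t                : (N,) array of measurement times
    rates            : (M,) array of current rates r(m); the last model is
                       taken to be uniform clutter (an assumed convention)
    cond_likelihoods : (M, N) array of current conditional likelihoods l(x(n)|m)
    """
    # (1) association weights: f(m|n) = r(m) l(n|m) / sum_m' r(m') l(n|m')
    weighted = rates[:, None] * cond_likelihoods          # shape (M, N)
    f = weighted / weighted.sum(axis=0, keepdims=True)

    # (2) parameter update: rates from the weights; track parameters (X_m, V_m)
    # from an f-weighted least-squares fit of X_m + V_m * t, which maximizes
    # the weighted Gaussian log-likelihood in closed form.
    new_rates = f.mean(axis=1)
    A = np.column_stack([np.ones_like(t), t])
    new_tracks = []
    for m in range(len(rates) - 1):                       # skip the clutter model
        w = np.sqrt(f[m])
        coef, *_ = np.linalg.lstsq(A * w[:, None], x * w, rcond=None)
        new_tracks.append((coef[0], coef[1]))             # (X_m, V_m)
    return f, new_rates, new_tracks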

OUTLINE

• Related research

• Combinatorial complexity and logic

• Dynamic logic

• Joint likelihood, math. formulation

• Examples

• Publications, recognition

TRACKING AND DETECTION BELOW CLUTTER

DL starts with uncertain knowledge and converges rapidly on the exact solution

Performance achieves joint CRB for association and estimation


TRACKING AND DETECTION BELOW CLUTTER

[Figure: panels (a)-(h); (a) true tracks and detections; axes Range vs. Cross-Range, 0-1 km]

Multiple Hypothesis Testing "logical" complexity ~ 10^1800; DL complexity ~ 10^6; S/C ~ 18 dB improvement

NUMBER OF TARGETS

• Active models and one dormant model

– Only r(m) is estimated for the dormant model
– The dormant model is activated if r(m) > threshold
– An active model is deactivated if r(m) < threshold
– In this example the threshold is 0.001 of the total signal: threshold = 0.001 Σ_n x(n)
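As a small illustration of the activation rule, the hypothetical helper below toggles models by comparing r(m) to the threshold; how r(m) and the 0.001-of-total-signal threshold are put on a common scale is left as an assumption.

def update_active_models(rates, active, threshold):
    """rates: dict {model index: r(m)}; active: set of active model indices."""
    for m, r in rates.items():
        if m in active and r < threshold:
            active.discard(m)    # an active model that lost support is deactivated
        elif m not in active and r > threshold:
            active.add(m)        # the dormant model gained support: activate it
    return active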

LOCAL MAXIMA

• In practice it is not a problem

• Reasons

– Vague initial states smooth out local maxima
– Activation and deactivation eliminate local convergences
– In system applications new data arrive all the time: local maxima come and go, real tracks persist

JOINT FUSION, ASSOCIATION, TRACKING, AND NAVIGATION

• 3 platforms-sensors

• Targets cannot be detected or tracked with one sensor

• All data are processed simultaneously

• GPS is inadequate for triangulation

– Relative platform positions have to be estimated jointly with target tracks (see the sketch after this list)

• Multiple Hypothesis Testing "logical" complexity ~ 10^17000
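One way to read "relative platform positions estimated jointly with target tracks" is to fold an unknown per-platform offset into each model prediction, so the DL iterations update track parameters and platform offsets together; the toy function below illustrates that idea and is not the slides' exact parameterization.

def predicted_measurement(X_m, V_m, t, platform_offset):
    """Linear-track position X_m + V_m * t shifted by the (unknown) relative
    offset of the observing platform; both kinds of parameters sit in the same
    parameter vector S and are estimated jointly by the DL iterations."""
    return X_m + V_m * t + platform_offset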

Sensor 1 (of 3): Model Evolves to Locate Target Tracks in Image Data


[Figure sequence: truth data; initial uncertain model; improved model after a few iterations; a few more iterations; models converged to the truth]

Sensor 2 (of 3): Model Evolves to Locate Target Tracks in Image Data


Sensor 3 (of 3): Model Evolves to Locate Target Tracks in Image Data


NAVIGATION, FUSION, TRACKING, AND DETECTION: the basis for the previous three figures, all fused in x, y, z coordinates

OUTLINE

• Related research

• Combinatorial complexity and logic

• Dynamic logic

• Joint likelihood, math. formulation

• Examples

• Publications, recognition

PUBLICATIONS

300 publications

OXFORD UNIVERSITY PRESS (2001; 3rd printing)

Neurodynamics of High Cognitive Functions, with Prof. Kozma, Springer, 2007

Sapient Systems, with Prof. Mayorga, Springer, 2007

RECOGNITION

• 2007 Gabor Award

– The top engineering award from the International Neural Network Society (INNS)

• Elected to the Board of Governors of INNS

• 2007 John L. McLucas Award

– The top scientific award from the US Air Force

CONCLUSION

• Dynamic Logic: an approach to improving existing algorithms and developing new ones

– In development since the late 1980s
– Proven breakthroughs in several areas

• More can be done
