Kalman and Kalman Bucy @ 50: Distributed and Intermittency
José M. F. Moura, joint work with Soummya Kar
Advanced Network Colloquium, University of Maryland, College Park, MD, November 04, 2011
Acknowledgements: NSF under grants CCF-1011903 and CCF-1018509, and AFOSR grant FA95501010291


José M. F. Moura
Advanced Network Colloquium
University of Maryland
College Park, MD
November 04, 2011
Filtering Then … Filtering Today
Distributed Filtering: Consensus + innovations
Intermittency: Infrastructure failures, Sensor failures
Random protocols: Gossip
Limited Resources: Quantization
Stochastic boundedness
Invariant distribution
Moderate deviation
1939-41: A. N. Kolmogorov, "Interpolation und Extrapolation von stationären zufälligen Folgen," Bull. Acad. Sci. USSR, 1941
Kalman Filter @ 51
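For reference, the textbook discrete-time Kalman filter recursion being celebrated, for the model x_{t+1} = A x_t + w_t, y_t = C x_t + v_t with w_t ~ N(0, Q), v_t ~ N(0, R) (standard equations, not transcribed from the slide):

$$
\begin{aligned}
\hat x_{t|t-1} &= A \hat x_{t-1|t-1}, &\qquad P_{t|t-1} &= A P_{t-1|t-1} A^{\top} + Q,\\
K_t &= P_{t|t-1} C^{\top}\bigl(C P_{t|t-1} C^{\top} + R\bigr)^{-1}, & &\\
\hat x_{t|t} &= \hat x_{t|t-1} + K_t\bigl(y_t - C \hat x_{t|t-1}\bigr), &\qquad P_{t|t} &= (I - K_t C)\, P_{t|t-1}.
\end{aligned}
$$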
Kalman-Bucy Filter @ 50
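Its continuous-time counterpart, the Kalman-Bucy filter for dx_t = A x_t dt + dw_t, dy_t = C x_t dt + dv_t with noise intensities Q and R (again the textbook form, included here for reference):

$$
d\hat x_t = A \hat x_t\, dt + P_t C^{\top} R^{-1}\bigl(dy_t - C \hat x_t\, dt\bigr),\qquad
\dot P_t = A P_t + P_t A^{\top} + Q - P_t C^{\top} R^{-1} C P_t .
$$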
Filtering Then … Filtering Today
Distributed Filtering: Consensus + innovations
Intermittency: Infrastructure failures, Sensor failures
Random protocols: Gossip
Limited Resources: Quantization
Stochastic boundedness
Invariant distribution
Moderate deviation
Optimality: structural conditions – observability/controllability
“Kalman Gain”
Cooperative solution
Cooperation: better understanding/global knowledge
Filtering Then … Filtering Today
Random protocols: Gossip
Limited Resources: Quantization
Two Linear Estimators:
LU (linear unbiased): Stochastic Approximation
Performance Analysis: Asymptotics
(Distributed) Consensus:
Asymptotic agreement: λ2(L) > 0 (algebraic connectivity; see the sketch after this list)
DeGroot, JASA 1974; Tsitsiklis 1984; Tsitsiklis, Bertsekas, Athans, IEEE T-AC 1986
Jadbabaie, Lin, Morse, IEEE T-AC 2003
Distributed architecture: no fusion center, no parallel architecture
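A minimal sketch of plain average consensus on a connected graph, illustrating why λ2(L) > 0 (algebraic connectivity) is the condition for asymptotic agreement; the ring graph and step size below are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Illustrative undirected ring on 5 nodes: adjacency A and Laplacian L = D - A.
A = np.zeros((5, 5))
for i in range(5):
    A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1
L = np.diag(A.sum(axis=1)) - A

# Consensus weights W = I - alpha * L; any 0 < alpha < 2 / lambda_max(L) works.
alpha = 0.3
W = np.eye(5) - alpha * L

x = np.random.randn(5)            # initial local values
average = x.mean()                # the consensus limit when lambda_2(L) > 0 (connected)
for _ in range(200):
    x = W @ x                     # each node replaces its value by a neighborhood average
print(np.allclose(x, average))    # True: all nodes agree on the initial average
```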
Consensus (reinterpreted): a.s. convergence to an unbiased random variable θ:
Consensus in Random Environments
Filtering Then … Filtering Today
Distributed Filtering: Consensus + innovations
Intermittency: Infrastructure failures, Sensor failures
Random protocols: Gossip
Limited Resources: Quantization
Stochastic boundedness
Invariant distribution
Moderate deviation
fast comm. (cooperation) vs slow sensing (exogenous, local)
Consensus + innovations: In and Out balanced interactions
communications and sensing at every time step
Distributed filtering: Consensus + Innovations
Structural failures (random links)/ random protocol (gossip):
Quantization/communication noise
Distributed inference: Generalized linear unbiased (GLU)
Consensus: local avg
Compare distributed to centralized performance
Structural conditions
Distributed connectivity: Network connected in the mean
Estimator:
Assumption A6: weight sequences
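The estimator and the weight-sequence assumption on this slide were equation images lost in extraction. A plausible reconstruction, following the consensus + innovations form used in the authors' papers (details such as noise weighting may differ from the actual slide): each sensor n updates

$$
x_n(t+1) = x_n(t) \;-\; \beta_t \sum_{l \in \Omega_n(t)} \bigl(x_n(t) - x_l(t)\bigr) \;+\; \alpha_t\, H_n^{\top}\bigl(z_n(t) - H_n x_n(t)\bigr),
$$

with consensus weights β_t and innovation weights α_t chosen so that Σ_t α_t = ∞, Σ_t α_t² < ∞, and β_t/α_t → ∞, i.e. communication (consensus) operates on an asymptotically faster scale than sensing (innovations).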
Consistency: sensor n is consistent
Asymptotic normality:
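Stated schematically (the exact normalization and limiting covariance are in the paper), the two properties read:

$$
\mathbb{P}\Bigl(\lim_{t\to\infty} x_n(t) = \theta\Bigr) = 1 \quad\text{(consistency)},\qquad
\sqrt{t}\,\bigl(x_n(t) - \theta\bigr) \;\overset{d}{\longrightarrow}\; \mathcal{N}(0, S_n) \quad\text{(asymptotic normality)}.
$$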
Define
Let
Strong convergence rates: study sample paths more closely
Characterize information flow (consensus): study convergence to the averaged estimate
Study limiting properties of the averaged estimate:
Rate at which the averaged estimate converges to the centralized estimate
Properties of the centralized estimator used to show convergence
Random Riccati Equation: Moderate deviation principle
Rate of decay of probability of rare events
Scalar numerical example
Model:
Intermittent observations:
Optimal Linear Filter (conditioned on path of observations) – Kalman filter with Random Riccati Equation
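A minimal scalar sketch of the intermittent-observation setting: measurements arrive as Bernoulli(γ̄) events, and the conditional (prediction) error variance follows a random Riccati recursion. The model parameters below are illustrative assumptions, not the values used in the talk's numerical example.

```python
import numpy as np

# Scalar model x_{t+1} = a x_t + w_t, y_t = x_t + v_t; y_t is received only when gamma_t = 1.
a, q, r = 1.2, 1.0, 1.0            # illustrative values; |a| > 1 makes intermittency matter
gamma_bar = 0.8                    # arrival probability of an observation

def f0(P):                         # packet lost: prediction only
    return a * P * a + q

def f1(P):                         # packet received: full Riccati update
    return a * P * a + q - (a * P) ** 2 / (P + r)

rng = np.random.default_rng(0)
P, trace = 1.0, []
for t in range(1000):
    P = f1(P) if rng.random() < gamma_bar else f0(P)   # random Riccati equation
    trace.append(P)

print(min(trace), max(trace))      # P_t is a random sequence: excursions above the
                                   # deterministic Riccati limit occur on runs of lost packets
```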
Random Riccati Equation: Moderate deviation principle
Rate of decay of probability of rare events
Scalar numerical example
Define operators f0(X), f1(X) and reexpress Pt:
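The operator definitions here were also equation images lost in extraction; in the notation standard for this problem (and used in [2]), they are presumably

$$
f_0(X) = A X A^{\top} + Q, \qquad
f_1(X) = A X A^{\top} + Q - A X C^{\top}\bigl(C X C^{\top} + R\bigr)^{-1} C X A^{\top},
$$

so that P_{t+1} = f_{γ_t}(P_t): the covariance P_t is obtained by applying to P_0 the random string of operators selected by the Bernoulli observation sequence {γ_s}.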
[2] S. Kar, B. Sinopoli, and J. M. F. Moura, "Kalman filtering with intermittent observations: weak convergence to a stationary distribution," IEEE Trans. Automatic Control, Jan. 2012.
Random Riccati Equation: Moderate deviation principle
Rate of decay of probability of rare events
Scalar numerical example
Stochastic Boundedness:
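The statement behind this bullet (the formula itself did not survive extraction) is the usual notion of stochastic boundedness, i.e. uniform tightness of the family {P_t}:

$$
\lim_{M \to \infty}\; \sup_{t \ge 0}\; \mathbb{P}\bigl(\|P_t\| > M\bigr) = 0 .
$$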
Interested in the probability of rare events:
As γ → 1, it is a rare event for the steady-state covariance to stay away from P* (the deterministic Riccati fixed point)
The RRE satisfies an MDP at a given scale:
Pr(rare event) decays exponentially fast, with a good rate function
Strings (compositions of f0/f1 selected by the observation sequence):
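Schematically, the moderate deviation principle asserted above says that at an appropriate scale λ(γ), with λ(γ) → ∞ as γ → 1 (the precise scale and rate function are in the cited paper), for sets Γ of covariances bounded away from P*,

$$
\mathbb{P}\bigl(P_t \in \Gamma\bigr) \;\approx\; e^{-\lambda(\gamma)\, I(\Gamma)}, \qquad I(\Gamma) = \inf_{X \in \Gamma} I(X) > 0,
$$

where I is a good rate function.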
Soummya Kar and José M. F. Moura, "Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations," IEEE Trans. Automatic Control.
Random Riccati Equation: Moderate deviation principle
Rate of decay of probability of rare events
Scalar numerical example
Support of the invariant measure is 'fractal like':
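One way to see why the support looks fractal like: in the scalar example, apply every length-k string of f0 and f1 to the deterministic fixed point P* and look at the resulting set of covariance values, which is irregular and self-similar. This is an illustrative sketch under the same assumed scalar parameters as above, not a reproduction of the talk's figure.

```python
import itertools

a, q, r = 1.2, 1.0, 1.0                       # illustrative scalar model parameters

def f0(P):                                    # packet lost
    return a * P * a + q

def f1(P):                                    # packet received
    return a * P * a + q - (a * P) ** 2 / (P + r)

# Deterministic Riccati fixed point P* (limit of f1 iterated from any start).
P_star = 1.0
for _ in range(200):
    P_star = f1(P_star)

# Every binary string of length k applied to P* gives a point in the support of the
# invariant distribution of the random Riccati equation (all strings have positive probability).
k = 12
points = set()
for string in itertools.product([f0, f1], repeat=k):
    P = P_star
    for f in string:
        P = f(P)
    points.add(round(P, 6))

print(len(points), min(points), max(points))  # many distinct, irregularly spaced values
```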
Soummya Kar and José M. F. Moura, "Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations," accepted, IEEE Trans. Automatic Control.
Random Riccati Equation: Moderate deviation principle
Rate of decay of probability of rare events
Scalar numerical example
Intermittency: sensors fail; comm links fail
Gossip: random protocol
Limited power: quantization
Mixed scale: can optimize rate of convergence and limiting covariance
Structural conditions: distributed observability + mean connectivity
Asymptotic properties: Distributed as Good as Centralized
Unbiased, consistent, asymptotically normal; the mixed-time-scale estimate converges to the optimal centralized estimate
Stochastically bounded as long as the rate of measurements is strictly positive
Random Riccati Equation: the probability measure of the random covariance is invariant to the initial condition
Support of the invariant measure is 'fractal like'
Moderate Deviation Principle: rate of decay of the probability of 'bad' (rare) events as the rate of measurements grows to 1
All of these quantities are computable