State Estimation and Kalman Filtering

Page 1: State Estimation and Kalman Filtering

State Estimation and Kalman Filtering

Page 2: State Estimation and Kalman Filtering

Sensors and Uncertainty

Real-world sensors are noisy and suffer from missing data (e.g., occlusions, GPS blackouts)

Use sensor models to estimate ground truth, infer unobserved variables, and make forecasts

Page 3: State Estimation and Kalman Filtering

How does this relate to motion?

A robot must:
Know where it is (localization)
Model its environment to avoid collisions (mapping, model building)
Move in order to get a better view of an object (view planning, inspection)
Predict how other agents move (tracking)

Page 4: State Estimation and Kalman Filtering

Intelligent Robot Architecture

[Block diagram: Percept (raw sensor data) -> Sensing algorithms -> Models -> Planning -> Action]

Sensing algorithms: state estimation, mapping, tracking, calibration, object recognition, …

Page 5: State Estimation and Kalman Filtering

State Estimation Framework

Sensor model (aka observation model): state => observation; observations may be partial, noisy

Dynamics model (aka transition model): state => next state; transitions may be noisy and control-dependent

Probabilistic state estimation: given observations, estimate a probability distribution over the state

[Diagram: x_t -> z_t (sensor model); x_t -> x_{t+1} (dynamics model)]

Page 6: State Estimation and Kalman Filtering

Markov Chain

Given an initial distribution and the dynamics model, can predict how the state evolves (as a probability distribution) over time

[Diagram: Markov chain X0 -> X1 -> X2 -> X3]

Page 7: State Estimation and Kalman Filtering

Hidden Markov Model

Use observations to get a better idea of where the robot is at time t

[Diagram: hidden state variables X0 -> X1 -> X2 -> X3, with observed variables z1, z2, z3]

Predict – observe – predict – observe…

Page 8: State Estimation and Kalman Filtering

General Bayesian Filtering

Compute P(x_t | z_t), given z_t and a prior P(x_{t-1})

Bayes rule:
P(x_t | z_t) = P(z_t | x_t) P(x_t) / P(z_t)

Prediction (dynamics model):
P(x_t) = ∫ P(x_t | x_{t-1}) P(x_{t-1}) dx_{t-1}

Normalizer:
P(z_t) = ∫ P(z_t | x_t) P(x_t) dx_t

Nasty, hairy integrals
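
These integrals rarely have a closed form; one way to see the predict-observe recursion concretely is to discretize the state. A minimal sketch of a grid-based Bayes filter in Python (the 1D state space, grid resolution, and Gaussian-shaped models below are assumptions for illustration, not from the slides):

```python
import numpy as np

# Minimal 1D grid-based Bayes filter (hypothetical example, not from the slides).
N = 100
xs = np.linspace(0.0, 10.0, N)          # grid of candidate positions
belief = np.full(N, 1.0 / N)            # uniform prior P(x_0)

def predict(belief, velocity=0.5, dt=1.0, sigma_dyn=0.3):
    """P(x_t) = sum_i P(x_t | x_{t-1}=xs[i]) P(x_{t-1}=xs[i])."""
    new_belief = np.zeros_like(belief)
    for i, x_prev in enumerate(xs):
        # Gaussian-shaped transition model centered at x_prev + velocity*dt
        trans = np.exp(-0.5 * ((xs - (x_prev + velocity * dt)) / sigma_dyn) ** 2)
        new_belief += belief[i] * trans / trans.sum()
    return new_belief

def update(belief, z, sigma_obs=0.5):
    """P(x_t | z_t) ∝ P(z_t | x_t) P(x_t), normalized over the grid."""
    likelihood = np.exp(-0.5 * ((z - xs) / sigma_obs) ** 2)
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = predict(belief)
belief = update(belief, z=1.2)           # fold in a (hypothetical) observation
print("MAP estimate:", xs[np.argmax(belief)])
```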

Page 9: State Estimation and Kalman Filtering

Key Representational Issue

How to represent and perform calculations on probability distributions?

Page 10: State Estimation and Kalman Filtering

Kalman Filtering

In a nutshell:
Efficient filtering in continuous state spaces
Gaussian transition and observation models
Ubiquitous for tracking with noisy sensors, e.g., radar, GPS, cameras

Page 11: State Estimation and Kalman Filtering

Kalman Filtering

Closed-form solution for the HMM, assuming a linear Gaussian process

Transition and observation functions are a linear transformation plus multivariate Gaussian noise:

y = A x + ε,  ε ~ N(μ, Σ)

Page 12: State Estimation and Kalman Filtering

Linear Gaussian Transition Model for a Moving 1D Point

Consider position and velocity x_t, v_t and time step h

Without noise:
x_{t+1} = x_t + h v_t
v_{t+1} = v_t

With Gaussian noise of standard deviation σ₁:
P(x_{t+1} | x_t) ∝ exp( -(x_{t+1} - (x_t + h v_t))² / (2 σ₁²) )

i.e., x_{t+1} ~ N(x_t + h v_t, σ₁²)
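
As a quick sanity check of this model, a short sketch (the values of x_t, v_t, h, and σ₁ below are assumed) that samples the transition and compares the empirical mean and standard deviation with N(x_t + h v_t, σ₁²):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example values (not from the slides)
x_t, v_t, h, sigma1 = 2.0, 1.0, 0.1, 0.05

# Sample x_{t+1} = x_t + h*v_t + noise, noise ~ N(0, sigma1^2)
samples = x_t + h * v_t + sigma1 * rng.standard_normal(100_000)

print("empirical mean:", samples.mean())   # ~ x_t + h*v_t = 2.1
print("empirical std: ", samples.std())    # ~ sigma1 = 0.05
```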

Page 13: State Estimation and Kalman Filtering

Linear Gaussian Transition Model

If the prior on position is Gaussian, the posterior (after the transition) is also Gaussian:

N(μ, σ²)  =>  N(μ + v h, σ² + σ₁²)

Page 14: State Estimation and Kalman Filtering

Linear Gaussian Observation Model

Position observation z_t

Gaussian noise of standard deviation σ₂:

z_t ~ N(x_t, σ₂²)

Page 15: State Estimation and Kalman Filtering

Linear Gaussian Observation Model

If the prior on position is Gaussian, the posterior is also Gaussian:

μ' = (σ² z + σ₂² μ) / (σ² + σ₂²)

σ'² = σ² σ₂² / (σ² + σ₂²)

[Figure: position prior, observation probability, and posterior probability]
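
A minimal numeric sketch of this product-of-Gaussians update (the prior and observation numbers are assumed; sigma is the prior std, sigma2 the observation std):

```python
def gaussian_update_1d(mu, sigma, z, sigma2):
    """Fuse a Gaussian prior N(mu, sigma^2) with an observation z ~ N(x, sigma2^2)."""
    var, var2 = sigma ** 2, sigma2 ** 2
    mu_post = (var * z + var2 * mu) / (var + var2)    # posterior mean
    var_post = var * var2 / (var + var2)              # posterior variance
    return mu_post, var_post ** 0.5

# Example (assumed numbers): prior N(0, 1^2), observation z = 1.0 with std 0.5
print(gaussian_update_1d(mu=0.0, sigma=1.0, z=1.0, sigma2=0.5))
```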

Page 16: State Estimation and Kalman Filtering

Kalman Filter

x_t ~ N(μ_x, Σ_x)

x_{t+1} = F x_t + g + v        (v: dynamics noise)

z_{t+1} = H x_{t+1} + w        (w: observation noise)

v ~ N(0, Σ_v),  w ~ N(0, Σ_w)

Page 17: State Estimation and Kalman Filtering

Two Steps

Maintain μ_t, Σ_t, the Gaussian distribution over the state at time t

Predict: compute the distribution of x_{t+1} using the dynamics model alone

Update: compute x_{t+1} | z_{t+1} with Bayes rule

Page 18: State Estimation and Kalman Filtering

Multivariate Computations

Linear transformations of Gaussians:
If x ~ N(μ, Σ) and y = A x + b, then y ~ N(A μ + b, A Σ Aᵀ)

Consequence:
If x ~ N(μ_x, Σ_x) and y ~ N(μ_y, Σ_y) are independent and z = x + y, then z ~ N(μ_x + μ_y, Σ_x + Σ_y)

Conditional of a Gaussian:
If [x₁, x₂] ~ N([μ₁, μ₂], [Σ₁₁, Σ₁₂; Σ₂₁, Σ₂₂]), then on observing x₂ = z we have
x₁ ~ N( μ₁ + Σ₁₂ Σ₂₂⁻¹ (z − μ₂),  Σ₁₁ − Σ₁₂ Σ₂₂⁻¹ Σ₂₁ )
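
A short numpy sketch of these identities (the example mean and covariance are assumed), applying the linear-transformation rule and the conditioning formula directly:

```python
import numpy as np

# Joint Gaussian over [x1, x2] (assumed example numbers)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

# Linear transformation rule: y = A x + b  =>  y ~ N(A mu + b, A Sigma A^T)
A = np.array([[1.0, 0.5]])
b = np.array([0.3])
y_mean = A @ mu + b
y_cov = A @ Sigma @ A.T

# Conditioning: given x2 = z, x1 ~ N(mu1 + S12 S22^-1 (z - mu2), S11 - S12 S22^-1 S21)
z = 2.5
cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (z - mu[1])
cond_var = Sigma[0, 0] - Sigma[0, 1] / Sigma[1, 1] * Sigma[1, 0]

print("y ~ N(%.3f, %.3f)" % (y_mean[0], y_cov[0, 0]))
print("x1 | x2=z ~ N(%.3f, %.3f)" % (cond_mean, cond_var))
```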

Page 19: State Estimation and Kalman Filtering

Predict Step

Linear transformation rule:
x_{t+1} ~ N( F μ_t + g,  F Σ_t Fᵀ + Σ_v )

Let these be μ' and Σ'
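
A minimal sketch of the predict step (names follow the slide's symbols; the constant-velocity F, g, and Σ_v values are assumed for illustration):

```python
import numpy as np

def kf_predict(mu, Sigma, F, g, Sigma_v):
    """Predict step: push the Gaussian N(mu, Sigma) through x' = F x + g + v."""
    mu_pred = F @ mu + g
    Sigma_pred = F @ Sigma @ F.T + Sigma_v
    return mu_pred, Sigma_pred

# 1D constant-velocity example (assumed numbers): state = [position, velocity]
h = 0.1
F = np.array([[1.0, h], [0.0, 1.0]])
g = np.zeros(2)
Sigma_v = np.diag([1e-3, 1e-3])
mu, Sigma = np.array([0.0, 1.0]), np.eye(2)
print(kf_predict(mu, Sigma, F, g, Sigma_v))
```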

Page 20: State Estimation and Kalman Filtering

Update step

Given μ' and Σ' and a new observation z, compute the final distribution μ_{t+1} and Σ_{t+1} using the conditional-distribution formulas
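
A matching sketch of the update step in the equivalent Kalman-gain form, with C = H Σ' Hᵀ + Σ_w; the derivation two slides ahead shows why this is the same as conditioning the joint Gaussian (the example H and Σ_w below are assumed):

```python
import numpy as np

def kf_update(mu_pred, Sigma_pred, z, H, Sigma_w):
    """Update step: condition the predicted Gaussian on the observation z."""
    C = H @ Sigma_pred @ H.T + Sigma_w          # innovation covariance
    K = Sigma_pred @ H.T @ np.linalg.inv(C)     # Kalman gain  Σ' Hᵀ C⁻¹
    mu_new = mu_pred + K @ (z - H @ mu_pred)
    Sigma_new = Sigma_pred - K @ H @ Sigma_pred
    return mu_new, Sigma_new

# Observe position only (assumed example): z = H x + w
H = np.array([[1.0, 0.0]])
Sigma_w = np.array([[0.25]])
mu_pred, Sigma_pred = np.array([0.1, 1.0]), np.eye(2)
print(kf_update(mu_pred, Sigma_pred, z=np.array([0.2]), H=H, Sigma_w=Sigma_w))
```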

Page 21: State Estimation and Kalman Filtering

Presentation

Page 22: State Estimation and Kalman Filtering

Deriving the Update Rule

(1) Unknowns a, B, C: write the joint distribution
    [x_t; z_t] ~ N( [μ'; a], [Σ', B; Bᵀ, C] )

(2) Assumption: x_t ~ N(μ', Σ')

(3) Assumption: z_t | x_t ~ N(H x_t, Σ_w)

(4) Conditioning (1): z_t | x_t ~ N( a + Bᵀ Σ'⁻¹ (x_t − μ'),  C − Bᵀ Σ'⁻¹ B )

(5) Set mean (4) = (3): H x_t = a + Bᵀ Σ'⁻¹ (x_t − μ')  =>  a = H μ',  Bᵀ = H Σ'

(6) Set covariance (4) = (3): C − Bᵀ Σ'⁻¹ B = Σ_w  =>  C = H Σ' Hᵀ + Σ_w

(7) Conditioning (1): x_t | z_t ~ N( μ' + B C⁻¹ (z_t − a),  Σ' − B C⁻¹ Bᵀ )

(8, 9) Kalman filter:
    μ_t = μ' + Σ' Hᵀ C⁻¹ (z_t − H μ')
    Σ_t = Σ' − Σ' Hᵀ C⁻¹ H Σ'
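
Putting the derived equations together, a self-contained end-to-end sketch that runs predict and the update equations (8, 9) on a simulated 1D constant-velocity track; all numbers are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

h = 0.1
F = np.array([[1.0, h], [0.0, 1.0]])     # dynamics: x_{t+1} = F x_t + v
H = np.array([[1.0, 0.0]])               # observe position only: z = H x + w
Sigma_v = np.diag([1e-4, 1e-4])
Sigma_w = np.array([[0.09]])

# Simulate a true trajectory and noisy position measurements
x_true = np.array([0.0, 1.0])
mu, Sigma = np.array([0.0, 0.0]), np.eye(2)   # filter starts uncertain
for t in range(50):
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Sigma_v)
    z = H @ x_true + rng.multivariate_normal(np.zeros(1), Sigma_w)

    # Predict
    mu, Sigma = F @ mu, F @ Sigma @ F.T + Sigma_v
    # Update (equations (8, 9) above)
    C = H @ Sigma @ H.T + Sigma_w
    K = Sigma @ H.T @ np.linalg.inv(C)
    mu = mu + K @ (z - H @ mu)
    Sigma = Sigma - K @ H @ Sigma

print("true state:", x_true, " estimate:", mu)
```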

Page 23: State Estimation and Kalman Filtering

Filtering Variants

Extended Kalman Filter (EKF)
Unscented Kalman Filter (UKF)
Particle Filtering

Page 24: State Estimation and Kalman Filtering

Extended/Unscented Kalman Filter

Dynamics / sensing model is nonlinear:
x_{t+1} = f(x_t) + v
z_{t+1} = h(x_{t+1}) + w

Gaussian noise, Gaussian state estimate as before

Strategy: propagate means as usual. What about covariances and conditioning?

Page 25: State Estimation and Kalman Filtering

Extended Kalman Filter

Approach: linearize about the current estimate
F = ∂f/∂x (x_t)
H = ∂h/∂x (x_{t+1})

Use Jacobians to propagate covariance matrices and perform conditioning

Jacobian = matrix of partial derivatives
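
A minimal EKF step under these assumptions, with the nonlinear f, h and their Jacobians supplied by the caller (the range-only toy example at the bottom is purely illustrative):

```python
import numpy as np

def ekf_step(mu, Sigma, z, f, h, F_jac, H_jac, Sigma_v, Sigma_w):
    """One EKF step: nonlinear f/h for the means, Jacobians F/H for the covariances."""
    # Predict: propagate the mean through f, the covariance through the Jacobian F
    F = F_jac(mu)
    mu_pred = f(mu)
    Sigma_pred = F @ Sigma @ F.T + Sigma_v

    # Update: linearize h about the predicted mean
    H = H_jac(mu_pred)
    C = H @ Sigma_pred @ H.T + Sigma_w
    K = Sigma_pred @ H.T @ np.linalg.inv(C)
    mu_new = mu_pred + K @ (z - h(mu_pred))
    Sigma_new = Sigma_pred - K @ H @ Sigma_pred
    return mu_new, Sigma_new

# Toy example (assumed): state [x, y], range-only measurement z = sqrt(x^2 + y^2) + w
f = lambda s: s                                          # static dynamics
F_jac = lambda s: np.eye(2)
h = lambda s: np.array([np.hypot(s[0], s[1])])
H_jac = lambda s: np.array([[s[0], s[1]]]) / np.hypot(s[0], s[1])
mu, Sigma = np.array([1.0, 1.0]), np.eye(2)
print(ekf_step(mu, Sigma, np.array([1.5]), f, h, F_jac, H_jac,
               0.01 * np.eye(2), np.array([[0.04]])))
```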

Page 26: State Estimation and Kalman Filtering

Unscented Kalman Filter

Approach: propagate a set of sigma points with the same mean/covariance as the input, then reconstruct a Gaussian from the transformed points
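
A bare-bones sketch of the underlying unscented transform; the symmetric 2n sigma-point scheme with equal weights used here is one common choice and an assumption, not necessarily the slide's exact parameterization:

```python
import numpy as np

def unscented_transform(mu, Sigma, f):
    """Propagate N(mu, Sigma) through a nonlinear f via 2n symmetric sigma points."""
    n = len(mu)
    L = np.linalg.cholesky(n * Sigma)            # scaled matrix square root
    points = np.vstack([mu + L[:, i] for i in range(n)] +
                       [mu - L[:, i] for i in range(n)])
    transformed = np.array([f(p) for p in points])
    mu_out = transformed.mean(axis=0)
    diff = transformed - mu_out
    Sigma_out = diff.T @ diff / (2 * n)
    return mu_out, Sigma_out

# Example (assumed): push a 2D Gaussian through a mildly nonlinear map
f = lambda p: np.array([p[0] + 0.1 * p[1] ** 2, p[1]])
mu, Sigma = np.array([0.0, 1.0]), np.diag([0.5, 0.2])
print(unscented_transform(mu, Sigma, f))
```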

Page 27: State Estimation and Kalman Filtering

Next Class

Particle Filter
Non-Gaussian distributions
Nonlinear observation/transition functions

Readings:
Rekleitis (2004)
P.R.M. Chapter 9

Page 28: State Estimation and Kalman Filtering

Midterm Project Report Schedule (tentative)

Tuesday:
Changsi and Yang: Indoor person following
Roland: Person tracking
Jiaan and Yubin: Indoor mapping
You-wei: Autonomous driving

Thursday:
Adrija: Dynamic collision checking with point clouds
Damien: Netlogo
Santhosh and Yohanand: Robot chess
Jingru and Yajia: Ball collector with robot arm
Ye: UAV simulation