12. Estimation - Correlations



MIT OpenCourseWare
http://ocw.mit.edu

12.540 Principles of Global Positioning Systems
Spring 2008

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms


12.540 Principles of the Global Positioning System

    Lecture 12

    Prof. Thomas Herring


    Estimation

Summary:
• Examine correlations
• Process noise
  - White noise
  - Random walk
  - First-order Gauss-Markov processes
• Kalman filters: estimation in which the parameters to be estimated are changing with time


    Correlations

• Statistical behavior in which random variables tend to behave in related fashions.
• Correlations are calculated from the covariance matrix. Specifically, the parameter estimates from an estimation are typically correlated.
• Any correlated group of random variables can be expressed as a linear combination of uncorrelated random variables by finding the eigenvectors (the linear combinations) and eigenvalues (the variances of the uncorrelated random variables); a small sketch of this follows below.
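As a concrete illustration of the last point (not from the lecture; the covariance values below are made up), here is a minimal numpy sketch showing that rotating a correlated pair of variables into the eigenvector basis of their covariance matrix gives uncorrelated combinations whose variances are the eigenvalues.

```python
import numpy as np

# Hypothetical covariance of two correlated parameter estimates
C = np.array([[2.0, 1.2],
              [1.2, 1.0]])

# Correlation coefficient computed from the covariance matrix
rho = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])
print("correlation:", rho)

# Eigen-decomposition of the symmetric matrix: C = V diag(w) V^T
w, V = np.linalg.eigh(C)

# In the eigenvector basis the covariance is diagonal, i.e. the linear
# combinations V^T x are uncorrelated, with variances equal to w
print(np.round(V.T @ C @ V, 10))
print("variances of the uncorrelated combinations:", w)
```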


    Eigenvectors and Eigenvalues

• The eigenvectors x and eigenvalues λ of a square matrix A satisfy the equation Ax = λx.
• If A is symmetric and positive definite (as a covariance matrix is), then all the eigenvectors are orthogonal and all the eigenvalues are positive.
• Any covariance matrix can therefore be broken down into independent components made up of the eigenvectors, with variances given by the eigenvalues. This gives one method of generating samples of any random process: generate white-noise samples with variances given by the eigenvalues, and transform them using a matrix made up of columns of eigenvectors (sketched below).
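The sample-generation recipe in the last bullet can be sketched in a few lines of numpy. This is an illustration under assumed (made-up) covariance values, not code from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target covariance for the generated samples (made-up numbers)
C = np.array([[2.0, 1.2],
              [1.2, 1.0]])
w, V = np.linalg.eigh(C)

# 1. White-noise samples with variances given by the eigenvalues
n = 100_000
y = rng.standard_normal((2, n)) * np.sqrt(w)[:, None]

# 2. Transform with the matrix whose columns are the eigenvectors
x = V @ y

# The sample covariance of x approaches C as n grows
print(np.round(np.cov(x), 3))
```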


    Error ellipses

• One special case is error ellipses. Normally coordinates (say North and East) are correlated, and we find the linear combinations of North and East that are uncorrelated. Given their covariance matrix we have:

$$\begin{pmatrix} \sigma_n^2 & \sigma_{ne} \\ \sigma_{ne} & \sigma_e^2 \end{pmatrix} \qquad \text{covariance matrix}$$

Eigenvalues $\lambda$ satisfy

$$\lambda^2 - (\sigma_n^2 + \sigma_e^2)\lambda + (\sigma_n^2\sigma_e^2 - \sigma_{ne}^2) = 0$$

Eigenvectors:

$$\mathbf{v}_1 = \begin{pmatrix} \sigma_{ne} \\ \lambda_1 - \sigma_n^2 \end{pmatrix} \qquad \text{and} \qquad \mathbf{v}_2 = \begin{pmatrix} \lambda_2 - \sigma_e^2 \\ \sigma_{ne} \end{pmatrix}$$


    Error ellipses

• These equations are often written explicitly as:

$$\lambda_{1,2} = \tfrac{1}{2}\left[\,\sigma_n^2 + \sigma_e^2 \pm \sqrt{(\sigma_n^2 + \sigma_e^2)^2 - 4(\sigma_n^2\sigma_e^2 - \sigma_{ne}^2)}\,\right]$$

$$\tan 2\phi = \frac{2\sigma_{ne}}{\sigma_n^2 - \sigma_e^2} \qquad \text{(angle the ellipse makes to the N axis)}$$

• The size of the ellipse such that there is probability P (0-1) of being inside is

$$\text{scale} = \sqrt{-2\ln(1-P)} \qquad \text{(in units of sigma)}$$
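For reference, a small numpy sketch of these explicit formulas (not course code; the function name error_ellipse and the test covariance values are illustrative assumptions, and the arctan2 branch choice is one convenient convention):

```python
import numpy as np

def error_ellipse(var_n, var_e, cov_ne):
    """Eigenvalues (squared semi-axis lengths) and orientation of the
    error ellipse from a 2x2 North/East covariance, using the explicit
    formulas above."""
    s = var_n + var_e
    d = np.sqrt(s**2 - 4.0 * (var_n * var_e - cov_ne**2))
    lam1, lam2 = 0.5 * (s + d), 0.5 * (s - d)
    # tan(2*phi) = 2*cov_ne / (var_n - var_e); arctan2 picks the branch
    # for which phi is the angle of the major axis from the N axis.
    phi = 0.5 * np.arctan2(2.0 * cov_ne, var_n - var_e)
    return lam1, lam2, np.degrees(phi)

# Hypothetical covariance: var_N = 2, var_E = 1, cov_NE = 0.5
print(error_ellipse(2.0, 1.0, 0.5))
```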


    Error ellipses

• There is only a 40% chance of being inside the 1-sigma error ellipse (compared with 68% for 1-sigma in one dimension).
• One commonly sees the 95% confidence ellipse, which is 2.45-sigma (versus only 2-sigma in 1-D); see the check below.
• Error ellipses are commonly used for GPS position and velocity results.
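A quick check of the quoted scale factors using the relation scale = sqrt(-2 ln(1 - P)) from the previous slide (a sketch, not course code):

```python
import numpy as np

# Number of sigmas whose 2-D error ellipse contains probability P
for P in (0.40, 0.95, 0.99, 0.999):
    k = np.sqrt(-2.0 * np.log(1.0 - P))
    print(f"P = {P:5.3f}  ->  {k:.2f}-sigma")
# Gives roughly 1.0, 2.45, 3.03 and 3.72 sigma, the values quoted on these slides.
```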


    Example of error ellipse

[Figure: error ellipses plotted in the Var1-Var2 plane (axes from -8 to 8). Ellipses shown: 1-sigma (40%), 2.45-sigma (95%), 3.03-sigma (99%), 3.72-sigma (99.9%). Covariance matrix [[2, 2], [2, 4]]; eigenvalues 0.87 and 3.66; angle -63 degrees.]


    Process noise models

• In many estimation problems there are parameters that need to be estimated but whose values are not fixed (i.e., they are themselves random processes in some way).
• Examples for GPS include:
  - Clock behavior in the receivers and satellites
  - Atmospheric delay parameters
  - Earth orientation parameters
  - Station position behavior after earthquakes


    Process noise models

There are several ways to handle these types of variations:
• Often, new observables can be formed that eliminate the random parameter (e.g., clocks in GPS can be eliminated by differencing data).
• A parametric model can be developed and the parameters of the model estimated (e.g., piece-wise linear functions can be used to represent the variations in the atmospheric delays).
• In some cases, the variations of the parameters are slow enough that over certain intervals of time they can be considered constant or linear functions of time (e.g., EOP are estimated daily).
• In some cases, variations are fast enough that the process can be treated as additional noise.


    Process noise models

Characterization of noise processes:
• First, one needs samples of the process (often not easy to obtain)
• Auto-correlation functions
• Power spectral density functions
• Allan variances (frequency standards)
• Structure functions (atmospheric delays)

See Herring, T. A., J. L. Davis, and I. I. Shapiro, Geodesy by radio interferometry: The application of Kalman filtering to the analysis of VLBI data, J. Geophys. Res., 95, 12561-12581, 1990.


    Characteristics of random processes

• Stationary: the property that the statistical properties do not depend on time.
• Autocorrelation:

$$\varphi_{xx}(t_1,t_2) = \langle x_1 x_2 \rangle = \int\!\!\int x_1 x_2 \, f(x_1,t_1;x_2,t_2)\, dx_1\, dx_2$$

• For a stationary process this depends only on $\tau = t_1 - t_2$:

$$\varphi_{xx}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\, dt$$

• Power spectral density (PSD):

$$\Phi_{xx}(\omega) = \int_{-\infty}^{\infty} \varphi_{xx}(\tau)\, e^{-i\omega\tau}\, d\tau$$
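As a numerical illustration of these definitions (not part of the lecture; the series length and lag range are arbitrary), a short numpy sketch that estimates the sample autocorrelation and a periodogram-style PSD for a white-noise series:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)      # white-noise sample series
x0 = x - x.mean()

# Sample autocorrelation phi_xx(tau) for lags 0..50 (biased estimator)
phi = np.array([np.mean(x0[:x0.size - k] * x0[k:]) for k in range(51)])
print("variance (lag-0 autocorrelation):", phi[0])

# Periodogram estimate of the PSD: roughly flat for white noise
psd = np.abs(np.fft.rfft(x0))**2 / x0.size
print("mean periodogram level:", psd.mean())
```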


    Specific common processes

• White noise: the autocorrelation is a Dirac delta function; the PSD is flat; the integral of the power under the PSD is the variance of the process (true in general).
• First-order Gauss-Markov (FOGM) process (one of the most common in Kalman filtering):

$$\varphi_{xx}(\tau) = \sigma^2 e^{-\beta|\tau|}$$

$$\Phi_{xx}(\omega) = \frac{2\sigma^2\beta}{\omega^2 + \beta^2}$$

where $1/\beta$ is the correlation time.
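The course provides fogm.m for generating FOGM samples (see the Summary slide). As an illustrative stand-in, here is a minimal Python sketch using the standard exact AR(1) discretization of a FOGM process; the function name and parameter values are assumptions, not the course script.

```python
import numpy as np

def fogm(n, sigma, tau, dt, rng=None):
    """Generate n samples of a first-order Gauss-Markov process with
    steady-state variance sigma**2 and correlation time tau (= 1/beta),
    via the exact AR(1) discretization x[k+1] = a*x[k] + w[k]."""
    rng = rng or np.random.default_rng()
    a = np.exp(-dt / tau)                 # e^(-beta*dt)
    q = sigma * np.sqrt(1.0 - a**2)       # driving-noise standard deviation
    x = np.empty(n)
    x[0] = sigma * rng.standard_normal()  # start in the steady state
    for k in range(1, n):
        x[k] = a * x[k - 1] + q * rng.standard_normal()
    return x

# Longer versus shorter correlation time (illustrative values only),
# analogous to the example figures on the following slides
long_tau = fogm(2000, sigma=1.0, tau=200.0, dt=1.0)
short_tau = fogm(2000, sigma=1.0, tau=10.0, dt=1.0)
```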


    Example of FOGM process


    Longer correlation time



    Shorter correlation time


    Summary

• The examples show the behavior of different correlation sequences (see fogm.m on the class web page).
• The standard deviation of rate-of-change estimates will be greatly affected by these correlations (a rough illustration follows below).
• In the next class we examine how to build an estimator that accounts for these correlations.
• Homework #2 is on the class web page (due April 5).
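As a rough illustration of the point about rate uncertainties (not part of the lecture; all numbers are made up), the following sketch compares the formal rate sigma from generalized least squares for white noise versus FOGM noise of the same variance:

```python
import numpy as np

# Formal rate (slope) uncertainty from generalized least squares,
# cov(params) = (A^T C^-1 A)^-1, for white noise versus FOGM noise.
t = np.arange(100.0)                        # epochs
A = np.column_stack([np.ones_like(t), t])   # offset + rate model

sigma, tau = 1.0, 20.0
C_white = sigma**2 * np.eye(t.size)
C_fogm = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / tau)

for name, C in (("white", C_white), ("FOGM", C_fogm)):
    cov = np.linalg.inv(A.T @ np.linalg.solve(C, A))
    print(f"{name:5s} noise: rate sigma = {np.sqrt(cov[1, 1]):.4f}")
# The FOGM (correlated) case gives a noticeably larger rate sigma.
```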