
  • •  Linear / nonlinear time series analysis

    •  Uni- / Bivariate (Synchronization)

    •  Continuous / discrete time series

    •  Exemplary application to medical data - EEG and neuronal recordings - Epilepsy (“window to the brain”)

    6 lectures of 2 hours: Thu, May 12 – Tue, May 31, 2016

    Thomas Kreuz (ISC-CNR) ([email protected]; http://www.fi.isc.cnr.it/users/thomas.kreuz/)

    Time  series  analysis  

  • •  Lecture 1: Example (Epilepsy & spike train synchrony), Data acquisition, Dynamical systems

    •  Lecture 2: Linear measures, Introduction to non-linear dynamics, Non-linear measures I

    •  Lecture 3: Non-linear measures II

    •  Lecture 4: Measures of continuous synchronization

    •  Lecture 5: Measures of discrete synchronization (spike trains)

    •  Lecture 6: Measure comparison & Application to epileptic seizure prediction

    (Preliminary)  Schedule  


  • •  General Introduction

    •  Example: Epileptic seizure prediction

    •  Data acquisition

    •  Introduction to dynamical systems

    First lecture: Introduction

  • Second lecture: Univariate Analysis I

    - (Non-linear) model systems
    - Linear measures
    - Introduction to non-linear dynamics
    - Phase space reconstruction
    - Non-linear measures I (Lyapunov exponent)

  • Logistic map

    r - Control parameter

    •  Model of population dynamics

    •  Classical example of how complex, chaotic behaviour can arise from very simple non-linear dynamical equations

    [R. M. May. Simple mathematical models with very complicated dynamics. Nature, 261:459, 1976]

    r = 4
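Below is a minimal Python sketch (my own addition, not part of the original slides) that iterates the logistic map x(n+1) = r·x(n)·(1 − x(n)); the slide's value r = 4 puts the map in the fully chaotic regime.

```python
import numpy as np

def logistic_trajectory(r=4.0, x0=0.4, n=1000):
    """Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

traj = logistic_trajectory(r=4.0)
print(traj[:5])  # first iterates of a chaotic orbit
```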

  • Hénon  map  

    •  Introduced by Michel Hénon as a simplified model of the

    Poincaré section of the Lorenz model

    •  One of the most studied examples of dynamical systems that exhibit chaotic behavior

    [M. Hénon. A two-dimensional mapping with a strange attractor. Commun. Math. Phys., 50:69, 1976]
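A corresponding sketch for the Hénon map, x(n+1) = 1 − a·x(n)² + y(n), y(n+1) = b·x(n); since the slide only gives the reference, the classical parameter values a = 1.4, b = 0.3 from Hénon (1976) are assumed here.

```python
import numpy as np

def henon_trajectory(a=1.4, b=0.3, x0=0.0, y0=0.0, n=10000):
    """Iterate the Hénon map and return the (x, y) points."""
    xy = np.empty((n, 2))
    x, y = x0, y0
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        xy[i] = (x, y)
    return xy

print(henon_trajectory()[-3:])  # points on the strange attractor
```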

  • Rössler  system    

    •  designed in 1976, for purely theoretical reasons

    •  later found to be useful in modeling equilibrium in chemical reactions

    [O. E. Rössler. An equation for continuous chaos. Phys. Lett. A, 57:397, 1976]

    dx/dt = −ω(y + z)
    dy/dt = ω(x + ay)
    dz/dt = b + z(x − c)

    a = 0.15, b = 0.2, c = 10

  • Lorenz  system    

    •  Developed in 1963 as a simplified mathematical model for atmospheric convection

    •  Appears in simplified models for lasers, dynamos, electric circuits, and chemical reactions

    [E. N. Lorenz. Deterministic non-periodic flow. J. Atmos. Sci., 20:130, 1963]

    dx/dt = σ(y − x)
    dy/dt = −y − xz + Rx
    dz/dt = xy − bz

    R = 28, σ = 10, b = 8 / 3
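Both flows can be integrated numerically. The sketch below is my own (not from the slides) and uses scipy's solve_ivp with the right-hand sides and parameter values quoted above; ω = 1 is assumed for the Rössler system, which reduces it to its standard form.

```python
import numpy as np
from scipy.integrate import solve_ivp

def roessler(t, s, a=0.15, b=0.2, c=10.0, w=1.0):
    x, y, z = s
    return [-w * (y + z), w * (x + a * y), b + z * (x - c)]

def lorenz(t, s, sigma=10.0, R=28.0, b=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), -y - x * z + R * x, x * y - b * z]

t_eval = np.linspace(0, 100, 10000)
sol_r = solve_ivp(roessler, (0, 100), [1.0, 1.0, 1.0], t_eval=t_eval)
sol_l = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], t_eval=t_eval)
print(sol_r.y.shape, sol_l.y.shape)  # two (3, 10000) trajectories on the attractors
```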

  • Linear  measures  

    •  Static measures
       - Moments of the amplitude distribution (1st to 4th)

    •  Dynamic measures
       - Autocorrelation
       - Fourier spectrum
       - Wavelet spectrum
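A short sketch (my own) of the two dynamic linear measures listed above that need no extra library, the autocorrelation function and the Fourier power spectrum; the wavelet spectrum would require an additional package (e.g. PyWavelets) and is omitted.

```python
import numpy as np

def autocorrelation(x, max_lag=100):
    """Normalized autocorrelation function for lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf[:max_lag + 1] / acf[0]

def power_spectrum(x, fs=1.0):
    """One-sided Fourier power spectrum of a real-valued time series."""
    X = np.fft.rfft(np.asarray(x, dtype=float) - np.mean(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, np.abs(X) ** 2

x = np.sin(2 * np.pi * 0.05 * np.arange(2000)) + 0.3 * np.random.randn(2000)
freqs, spec = power_spectrum(x)
print(autocorrelation(x)[:3], freqs[np.argmax(spec[1:]) + 1])  # dominant frequency ~0.05
```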

  • Phase space example: Pendulum

    [Figure: time series of position x(t) and velocity v(t), and the corresponding trajectory in state space]

  • Attractor classification

    Fixed point: point that is mapped to itself

    Limit cycle: periodic orbit of the system that is isolated (i.e., has its own basin of attraction)

    Limit torus: quasi-periodic motion defined by n incommensurate frequencies (n-torus)

    Strange attractor: Attractor with a fractal structure

    (2-torus)

  • Takens' embedding theorem

    Trajectory r(t) of a dynamical system in d-dimensional phase space ℜ^d. One observable x(t) measured via some measurement function M:

    x(t) = M(r(t));  M: ℜ^d → ℜ

    It is possible to reconstruct a topologically equivalent attractor via time delay embedding:

    x(t) = [x(t), x(t − τ), x(t − 2τ), ..., x(t − (m−1)τ)]

    τ - time lag (delay); m - embedding dimension

    [F. Takens. Detecting strange attractors in turbulence. Springer, Berlin, 1980]
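A minimal sketch (my own) of the delay embedding defined above: each reconstructed state vector collects m samples of the observable separated by the lag τ.

```python
import numpy as np

def delay_embedding(x, m=3, tau=10):
    """Return delay vectors [x(t), x(t - tau), ..., x(t - (m-1)*tau)] as rows."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    cols = [x[(m - 1 - j) * tau: (m - 1 - j) * tau + n] for j in range(m)]
    return np.column_stack(cols)

x = np.sin(0.02 * np.arange(5000))            # any scalar observable
print(delay_embedding(x, m=3, tau=25).shape)  # (4950, 3) reconstructed state vectors
```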

  • Topological equivalence

    [Figure: original attractor vs. reconstructed attractor]

  • Characterization of the dynamics in phase space

    - Predictability (Information / Entropy)
    - Density
    - Self-similarity
    - (Dimension)
    - Linearity / Non-linearity
    - Determinism / Stochasticity
    - Stability (sensitivity to initial conditions)

  • Divergence and convergence

    Chaotic trajectories are Lyapunov-unstable:

    •  Divergence: Neighboring trajectories expand such that their distance increases exponentially (Expansion).

    •  Convergence: The finite extent of the attractor limits the expansion, so expansion is followed by a decrease of distance (Folding).

    → Sensitive dependence on initial conditions

    Quantification: Lyapunov exponent

  • Lyapunov exponent

    In m-dimensional phase space:

    Lyapunov spectrum: λi, i = 1, ..., m (expansion rates for the different dimensions)

    Relation to divergence: div f = Σi λi

    Dissipative system: Σi λi < 0

    Largest Lyapunov exponent (LLE) λ1:
    - Stable fixed point: λ1 < 0
    - Regular dynamics: λ1 = 0
    - Chaotic dynamics: λ1 > 0
    - Stochastic dynamics: λ1 → ∞
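For a one-dimensional map the largest Lyapunov exponent can be estimated as the orbit average of log|f′(x_n)|. The sketch below (my own addition) does this for the logistic map, where the exact value at r = 4 is ln 2 ≈ 0.69 > 0.

```python
import numpy as np

def logistic_lle(r=4.0, x0=0.4, n=100000, transient=1000):
    """Largest Lyapunov exponent of the logistic map as <log|f'(x_n)|>."""
    x = x0
    for _ in range(transient):                   # discard transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += np.log(abs(r * (1.0 - 2.0 * x)))  # |f'(x)| = |r (1 - 2x)|
        x = r * x * (1.0 - x)
    return acc / n

print(logistic_lle())  # ~0.693 = ln 2 > 0: chaotic dynamics
```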

  • Non-linear measures - Dimension [ Excursion: Fractals ] - Entropies - Relationships among non-linear measures

    Today’s  lecture  

    [Acknowledgement: K. Lehnertz, University of Bonn, Germany]


  • Dimension  (classical)  

    Number of degrees of freedom necessary to characterize a geometric object

    Euclidean geometry: Integer dimensions

    Object: Dimension
    - Point: 0
    - Line: 1
    - Square (area): 2
    - Cube (volume): 3
    - n-cube: n

    Time series analysis: Number of equations necessary to model a physical system

  • Hausdorff dimension D0

    Generalization to non-Euclidean geometry.

    Dimension of a non-Euclidean object in m-dimensional space:
    - Cover the object with m-dimensional hypercubes of edge length ε
    - N(ε): minimum number of hypercubes needed for a complete cover

    Scaling: N(ε) ∝ ε^(−D0) for ε → 0

    D0 = lim(ε→0) log N(ε) / log(1/ε)

    D0 - Hausdorff dimension (box-counting dimension, fractal dimension)
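A crude box-counting sketch (my own): D0 is estimated as the slope of log N(ε) versus log(1/ε). The Hénon attractor, iterated inline with the standard parameters a = 1.4, b = 0.3, serves as an example.

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate D0 as the slope of log N(eps) versus log(1/eps)."""
    pts = np.asarray(points, dtype=float)
    counts = [len(np.unique(np.floor(pts / eps), axis=0)) for eps in epsilons]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)), np.log(counts), 1)
    return slope

# Example: points on the Hénon attractor
x, y, pts = 0.0, 0.0, []
for i in range(105000):
    x, y = 1.0 - 1.4 * x * x + y, 0.3 * x
    if i >= 5000:                       # discard transient
        pts.append((x, y))
eps = [0.2, 0.1, 0.05, 0.025, 0.0125]
print(box_counting_dimension(np.array(pts), eps))  # roughly 1.2, near the theoretical value
```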

  • Hausdorff dimension D0 of a line

  • Hausdorff dimension D0

    [Wikimedia figure: covering a line, a square, and a cube with boxes of edge length ε = 1, 1/2, 1/3; the number of boxes needed grows linearly, quadratically, and cubically with 1/ε, respectively]

  • Generalized dimensions Dk

    The Hausdorff dimension D0 of high-dimensional systems is difficult to estimate via box-counting (statistical finite-size effects).

    → Generalized dimensions Dk:

    Partition of an m-dimensional phase space with M hypercubes of edge length ε, with ε → 0.

    Probability pi to find a point of the attractor in hypercube i, i = 1, ..., M(ε):

    pi = lim(N→∞) Ni / N

    Ni - number of points in hypercube i; N - overall number of points

    Renyi order k = 0, 1, 2, ..., ∞ (different weighting of the probabilities)
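The defining formula, which the extracted slides lose, is the standard one: D_k = lim(ε→0) [1/(k−1)] · log Σi pi^k / log ε, with the k → 1 case handled via L'Hôpital's rule. A single-scale sketch of this (my own; in practice one fits over a whole scaling range):

```python
import numpy as np

def box_probabilities(points, eps):
    """Relative frequencies p_i of the occupied eps-boxes."""
    _, counts = np.unique(np.floor(np.asarray(points) / eps), axis=0, return_counts=True)
    return counts / counts.sum()

def generalized_dimension(points, eps, k):
    """Crude single-scale estimate of D_k = 1/(k-1) * log(sum p_i^k) / log(eps)."""
    p = box_probabilities(points, eps)
    if np.isclose(k, 1.0):                       # information dimension (L'Hopital limit)
        return np.sum(p * np.log(p)) / np.log(eps)
    return np.log(np.sum(p ** k)) / ((k - 1.0) * np.log(eps))

rng = np.random.default_rng(0)
cloud = rng.random((100000, 2))                  # uniform cloud: D_k ~ 2 for every k
print(generalized_dimension(cloud, 0.02, 0), generalized_dimension(cloud, 0.02, 2))
```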

  • Special case: Hausdorff dimension D0

    Generalized dimensions Dk for k → 0: Hausdorff dimension D0.

    D0 counts the number of non-empty hypercubes.

  • Special case: Information dimension D1

    k → 1: Information dimension D1 (limit taken via L'Hôpital's rule):

    D1 = lim(ε→0) H(ε) / log(1/ε)

    with the Shannon entropy H(ε) = −Σ(i=1..M(ε)) pi log pi

    D1 - dimension of the probability distribution.

    Homogeneous attractor: pi = 1/M(ε) for all i → H(ε) = log M(ε) → D1 = D0

    → |D1 − D0| is a measure of inhomogeneity.

  • Special case: Correlation dimension D2

    k → 2: Correlation dimension D2 (easiest to calculate)

    [Grassberger & Procaccia, Measuring the strangeness of strange attractors. Physica D 1983]

    Based on the correlation sum C(ε): the mean probability that the states at two different times are closer than ε,

    C(ε) = 2/(N(N−1)) Σ(i<j) H(ε − |xi − xj|)

    with H - Heaviside function. D2 is obtained from the scaling C(ε) ∝ ε^(D2) for small ε.

    The partition is centralized around the phase space points (instead of a fixed grid).
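A sketch (my own) of the Grassberger-Procaccia estimate: the correlation sum is approximated from a random subsample of point pairs, and D2 is read off as the slope of log C(ε) versus log ε; for the Hénon map the result should lie near the quoted theoretical value.

```python
import numpy as np

def correlation_sum(points, eps, n_pairs=200000, seed=0):
    """Fraction of (subsampled) point pairs closer than eps."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(pts), n_pairs)
    j = rng.integers(0, len(pts), n_pairs)
    keep = i != j
    dist = np.linalg.norm(pts[i[keep]] - pts[j[keep]], axis=1)
    return np.mean(dist < eps)

def correlation_dimension(points, epsilons):
    """D2 as the slope of log C(eps) versus log eps over the chosen scaling range."""
    C = [correlation_sum(points, e) for e in epsilons]
    slope, _ = np.polyfit(np.log(epsilons), np.log(C), 1)
    return slope

# Example: Hénon attractor (a=1.4, b=0.3), transient discarded
x, y, pts = 0.0, 0.0, []
for i in range(55000):
    x, y = 1.0 - 1.4 * x * x + y, 0.3 * x
    if i >= 5000:
        pts.append((x, y))
print(correlation_dimension(np.array(pts), [0.2, 0.1, 0.05, 0.025, 0.0125]))  # ~1.2
```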

  • Calculation of the correlation dimension D2

    Hénon map: DTheory=1.26

    Scaling range

  • Calculation of the correlation dimension D2

    No scaling range

    White noise: DTheory = ∞

  • Calculation of the correlation dimension D2

    Hénon + noise

    Hénon map: DTheory=1.26 White noise: DTheory = ∞

    Scaling breaks down at noise level

    a: no noise b: low noise c: very noisy

  • Generalized dimensions Dk

    •  In general Dk' ≤ Dk for k' > k (monotonic decrease with k)

    •  Dk' = Dk for homogeneous probability distributions

    •  D1 and D2 are lower bounds for D0

    [Figure: Dk versus k for a homogeneous and a non-homogeneous probability distribution]

  • Generalized dimensions Dk

    •  In general Dk' ≤ Dk for k' > k (monotonic decrease with k)

    •  Dk' = Dk for homogeneous probability distributions

    •  Static measure

    •  Degrees of freedom; measure of system complexity

    Regular dynamics: D integer
    Chaotic dynamics: D fractal
    Stochastic dynamics: D → ∞

  • Non-linear measures - Dimension [ Excursion: Fractals ] - Entropies - Relationships among non-linear measures

    Today’s  lecture  

  • Example of a fractal: Cantor set

    Construction: iterated removal of the middle third of each interval.

    Some properties:
    - Perfect set that is nowhere dense
    - Lebesgue measure = 0
    - Fractal dimension D = ln 2 / ln 3 ≈ 0.63

    Self-similarity

    [Wikimedia]
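As a quick check of the stated value (my own addition): at construction step n the set is covered by N = 2^n intervals of length ε = 3^(−n), so the box-counting definition from above gives

```latex
D_0 = \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log(1/\varepsilon)}
    = \lim_{n \to \infty} \frac{\log 2^{n}}{\log 3^{n}}
    = \frac{\ln 2}{\ln 3} \approx 0.63 .
```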

  • Hausdorff dimension D0 of a Cantor set

  • Example of a fractal: Koch curve

    Some properties: - Infinite length - Continuous everywhere - Differentiable nowhere - Fractal dimension D=log4/log3≈1.26

    Animation

  • Box-counting the Koch curve

  • Example  of  a  fractal:  Sierpinski  triangle  

    Some properties: - Area = 0 - Fractal dimension D=log3/log2≈1.585

    [Wikimedia]

    Animation

  • Example  of  a  fractal:  Menger  sponge  

    Some properties: - Infinite surface - Zero volume - Fractal dimension D=log20/log3≈ 2.7268 [Wikimedia]

  • Example of a non-fractal (!): Hilbert curve

    Some properties: - Space-filling - Dimension D=2 (Integer!)

    [Wikimedia]

    Animation

  • Example:  Fractals  in  nature  

    [Wikimedia]

  • Fractal  dimension  of  a  coastline  

    [Mandelbrot. How Long Is the Coast of Britain? Statistical Self-Similarity and Fractional Dimension, Science 1967]

    Dependence on length of measuring stick?

    Intuitively, if a coastline looks smooth it should have dimension close to 1; and the more irregular the coastline looks the closer its dimension should be to 2.

  • More fractals in nature (?)

  • Fractals  in  art  (?)  

    Jack the Dripper: Jackson Pollock (1912-1956)

    The fractal dimension of Pollock's drip paintings increased from nearly 1.0 in 1943 to 1.72 in 1952

    [Taylor et al. Fractal analysis of Pollock's drip paintings. Nature 1999]

  • Fractals: Definition

    A mathematical set is a fractal if

    -  It has a fine structure

    -  It is irregular

    -  Exhibits self-similarity

    -  The fractal dimension (non-integer) is larger than the topological dimension


  • Example of a fractal: Mandelbrot set [Benoit Mandelbrot, 1924-2010]

    •  Sampling complex numbers c, with real and imaginary parts as image coordinates

    •  Iteration z(n+1) = z(n)² + c with initial condition z(0) = 0

    •  If z(n) tends towards infinity for a given c, then c does not belong to the Mandelbrot set.

    Examples:
    - c = 1 does not belong to the Mandelbrot set, since the sequence 0, 1, 2, 5, 26, … diverges
    - c = i does belong to the Mandelbrot set, since the sequence 0, i, (−1+i), −i, (−1+i), −i, … stays bounded

    Visualization: members of the set are shown in black; other points are color-coded according to how rapidly the sequence diverges.
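A minimal escape-time sketch of the membership test described above (my own; the escape radius 2 and the iteration cap of 100 are conventional choices, not from the slide):

```python
import numpy as np

def mandelbrot_escape(c, max_iter=100):
    """Iteration at which |z| exceeds 2, or max_iter if the orbit stays bounded."""
    z = 0.0 + 0.0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

print(mandelbrot_escape(1.0))  # escapes quickly: 0, 1, 2, 5, 26, ... diverges
print(mandelbrot_escape(1j))   # returns max_iter: 0, i, -1+i, -i, ... stays bounded

# Escape-time image over the complex plane (set members get the value max_iter)
re, im = np.meshgrid(np.linspace(-2, 1, 400), np.linspace(-1.5, 1.5, 400))
image = np.vectorize(mandelbrot_escape)(re + 1j * im)
```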

  • Example  of  a  fractal:  Mandelbrot  set  

    [Wikimedia]

  • Example  of  a  fractal:  Mandelbrot  set  

    [Wikimedia, ~45s]

  • Example  of  a  fractal:  Julia  set  

    •  Julia set of a function: Values for which an arbitrarily small

    perturbation can cause drastic changes in the sequence of iterated function values.

    •  Behavior of the function on the Julia set is 'chaotic'.

    •  Connection to Mandelbrot sets: A point is in the Mandelbrot set exactly when the corresponding Julia set is connected.

    [Gaston Julia, 1893-1978]

  • Example of a fractal: Julia set

    [Images: Sadrain, Wikimedia]

  • Strange attractors are fractals

    [Figure: fractal dimensions of strange attractors]
    - Logistic map
    - Hénon map
    - Rössler system (a = 0.15, b = 0.2, c = 10): D ≈ 2.01

  • Self-similarity of the logistic attractor

  • Self-similarity of the Hénon attractor

  • Poincaré section of the Lorenz attractor

    [Figure: Poincaré section]

  • Non-linear measures - Dimension [ Excursion: Fractals ] - Entropies - Relationships among non-linear measures

    Today’s  lecture  

  • Entropy: History

    Thermodynamics and statistical mechanics:
    Entropy ~ Disorder (Boltzmann, Gibbs, ~1870)

    Information theory:
    Entropy ~ Information content of a probability distribution (Shannon, Renyi, Kolmogorov, ~1950)

  • Entropy: Thermodynamics

    Irreversible process: expansion of an ideal gas from one into two vessels of equal volume.

    N particles that move independently; each particle has equal probability p to be in either vessel.

    Uniform distribution or extreme distributions (all left or all right)?

  • Entropy and probability

    Two extreme states:
    •  State 1: All particles in the left (right) vessel
    •  State 2: Equal distribution over both vessels

    Which state is more probable?

    State 1 has vanishing probability: p = 2^(−N) → 0 for N → ∞
    (N = 10 ⇒ p ≈ 10^(−3); N = 30 ⇒ p ≈ 10^(−9))

    State 2 is almost certain (probability close to 1).

    Probabilities quickly become very small → logarithmic representation

  • From probability to entropy

    Logarithmic representation of the probability of a state:

    S = k log p

    The Boltzmann constant k relates micro- and macroscopic properties.

    Entropy is an additive state function: for independent subsystems p = p1 · p2, so S = S1 + S2.

  • Second law of thermodynamics

    •  The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, the state of maximum entropy.

    •  Equivalent formulation: A perpetuum mobile of the second kind (extracting work from heat) is impossible.

    Isolated system, equilibrium state: Ssys = Smax (ΔSsys = 0)

    Open system, equilibrium state: Ssys + SEnv = Smax (ΔSsys + ΔSEnv = 0)
    → A decrease in the system is compensated by an increase in the environment!

  • Entropy and information theory

    Source of information: observation of a process (measurement).

    Questions:
    - How much can I learn about a system by conducting one measurement?
    - How much information do I get about the future evolution of the system by knowing its complete past?

  • Shannon entropy H

    System with two states: measurement (one yes/no question) → information gain = 1 bit

    System with four states: measurement (two questions) → information gain = 2 bits

    …

    Maximum information gain for a system with N states: H = log2 N

  • Shannon entropy H: Example

    [Tomasz Downarowicz (2007), Scholarpedia, 2(11):3901]

    Expected number of questions:

    H(A) = (1/4)·2 + (1/8)·3 + (1/8)·3 + (1/2)·1 = 7/4
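A quick numerical check (my own): for the distribution {1/2, 1/4, 1/8, 1/8} of the example above the Shannon entropy equals the expected number of yes/no questions, 7/4 bits.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """H = -sum p_i log(p_i); zero probabilities are skipped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(base))

print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits = 7/4
print(shannon_entropy([0.5, 0.5]))                 # 1 bit (fair coin throw)
print(shannon_entropy([1.0]))                      # 0 bits (no uncertainty)
```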

  • Shannon entropy H

    Coin throw (2 states): H = −(1/2 · log2(1/2) + 1/2 · log2(1/2)) = 1 bit

    In general: N states with probabilities pi (normalization: Σ(i=1..N) pi = 1)

    Average information gain per measurement = Shannon entropy:

    H = −Σ(i=1..N) pi log pi

    (Shannon information: I = −H)

  • Shannon entropy H

    Shannon entropy ~ 'Uncertainty':

    H = −Σ(i=1..N) pi log pi

    Binary probabilities: [Figure: binary entropy H(p) as a function of p]
    - H(p) = 0: no uncertainty
    - H(p) = 1: highest uncertainty

    In general: H is minimal when one outcome is certain and maximal for the uniform distribution.

  • Generalization: Renyi entropies Hq

    Information content necessary to determine a value (or the position of a point in phase space) with a certain precision if only the probability distribution is known.

    Partition of an m-dimensional phase space with M hypercubes of edge length ε (ε → 0).

    Probability pi to find a point of the attractor in hypercube i, i = 1, ..., M(ε):

    pi = lim(N→∞) Ni / N

    Ni - number of points in hypercube i; N - overall number of points

    Renyi order q = 0, 1, 2, ..., ∞ (different weighting of the probabilities)

  • Renyi entropy (q = 1): Shannon entropy H1

    q → 1: H1 = H (limit taken via L'Hôpital's rule)

    Additivity (only for q → 1): the entropy of a joint process is the sum of the entropies of the marginal processes.

  • Example: Renyi entropies Hq

    Renyi entropies Hq of the homogeneous probability distribution pi = 1/N ∀ i = 1, ..., N:

    Hq = log2 N for all q (the maximum possible entropy)

    [Figure: Hq versus q for a homogeneous and a non-homogeneous distribution]
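A sketch (my own) of the Renyi entropy in its standard form, Hq = [1/(1−q)] · log Σi pi^q, with the Shannon entropy as the q → 1 limit; it reproduces the statement above that a homogeneous distribution gives Hq = log2 N for every q.

```python
import numpy as np

def renyi_entropy(p, q, base=2.0):
    """H_q = 1/(1-q) * log(sum p_i^q); q -> 1 recovers the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)) / np.log(base))
    return float(np.log(np.sum(p ** q)) / ((1.0 - q) * np.log(base)))

uniform = np.full(8, 1.0 / 8.0)
print([round(renyi_entropy(uniform, q), 3) for q in (0, 1, 2, 5)])  # all 3.0 = log2(8)
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print([round(renyi_entropy(skewed, q), 3) for q in (0, 1, 2, 5)])   # decreases with q
```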

  • Renyi block entropies hq

    Static distributions → generalization to dynamic distributions.

    (Dynamic) Renyi block entropy: probability of a trajectory.

    Partition of phase space with hypercubes of edge length ε → 0.
    → Joint probability p(i1, i2, ..., im) to find a point of the attractor at time t1 in hypercube i1, at time t2 = t1 + Δt in hypercube i2, etc.

    Block length m → ∞: generalized entropy of order q

  • Topological entropy K0

    Generalized entropy of order q: asymptotic entropy per time step (block length m → ∞).

    q → 0: topological entropy h0 (or K0) - no weighting of the probabilities.

    Relation to the Hausdorff dimension D0:
    - D0 counts the number of non-empty cubes
    - K0 counts the number of different trajectories (how homogeneously the space of possible trajectories is covered)

  • Metric entropy (Kolmogorov-Sinai) K1

    Generalized entropy of order q: asymptotic entropy per time step (block length m → ∞).

    q → 1: metric entropy or Kolmogorov-Sinai entropy h1 (or K1).

    Relation to the information dimension D1:
    - D1: how does the average information needed to identify an occupied box scale with the box size?
    - K1: average loss of information per iteration about the state of the system

  • Generalized entropies Kq

    •  In general Kq' ≤ Kq for q' > q (monotonic decrease with q)

    •  Kq' = Kq for homogeneous probability distributions

    •  K1 and K2 are lower bounds for K0

    [Figure: Kq versus q for a homogeneous and a non-homogeneous distribution]

  • Generalized entropies Kq

    •  In general Kq' ≤ Kq for q' > q (monotonic decrease with q)

    •  Average loss of information ~ average prediction time

    •  Dynamic measure

    •  Measure of system disorder (ρ - localization precision of the initial condition):

    Regular dynamics: K = 0
    Chaotic dynamics: K > 0
    Stochastic dynamics: K → ∞

  • Example: Generalized entropy of white noise

    Generalized entropy of order q: asymptotic entropy per time step.

    Partition into M hypercubes; m = 2: joint probability pij for the transition from hypercube i into hypercube j.

    White noise: the joint probabilities factorize and are the same for all possible transitions:

    pij = pi · pj = 1/M²

    ⇒ Hq(m) = m · log M,  hq = log M → ∞ for ε → 0 (independent of q)

    ⇒ Any temporal correlation reduces the entropy!

  • Calculation of generalized entropies Kq

    •  Calculation of (dynamic) entropies from real time series is very difficult, in particular for high-dimensional systems

    •  More data points needed than for Lyapunov exponent or dimension

    •  Limit m → ∞ very difficult to achieve

    •  Box-counting impractical: m-dimensional histograms, very long time series needed, insufficient scaling behavior

  • Calculation of the correlation entropy K2

    •  Alternative: importance sampling - no uniform partition of phase space; instead the partition is centralized around the phase space points.

    [Grassberger & Procaccia, 1983]

    → Correlation entropy K2, based on the correlation sum (for q > 1; H - Heaviside function).

    If a scaling range exists, K2 can be read off from the dependence of the correlation sum on ε and on the embedding dimension m.
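The scaling relation behind this (the standard Grassberger-Procaccia result, not spelled out in the extracted slide) is C_m(ε) ∝ ε^(D2) · exp(−m·Δt·K2), so K2 can be estimated from correlation sums of consecutive embedding dimensions, K2 ≈ (1/Δt) · ln[C_m(ε)/C_(m+1)(ε)]. A sketch under these assumptions:

```python
import numpy as np

def delay_embedding(x, m, tau):
    """Delay vectors of dimension m with lag tau (rows are reconstructed states)."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[(m - 1 - j) * tau: (m - 1 - j) * tau + n] for j in range(m)])

def correlation_sum(emb, eps, n_pairs=200000, seed=0):
    """Fraction of (subsampled) pairs of embedded states closer than eps."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(emb), n_pairs)
    j = rng.integers(0, len(emb), n_pairs)
    keep = i != j
    dist = np.linalg.norm(emb[i[keep]] - emb[j[keep]], axis=1)
    return np.mean(dist < eps)

def k2_estimate(x, eps, m=5, tau=1, dt=1.0):
    """K2 ~ (1/dt) * ln(C_m(eps) / C_{m+1}(eps)); check convergence in eps and m."""
    x = np.asarray(x, dtype=float)
    c_m = correlation_sum(delay_embedding(x, m, tau), eps)
    c_m1 = correlation_sum(delay_embedding(x, m + 1, tau), eps)
    return np.log(c_m / c_m1) / dt
```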

  • Calculation of the correlation entropy K2 - Hénon map: scaling region recognizable

  • Calculation of the correlation entropy K2 - White noise: no scaling region

  • Non-linear measures - Dimension [ Excursion: Fractals ] - Entropies - Relationships among non-linear measures

    Today’s  lecture  

  • Characterization of the dynamics

    Regular dynamics: D integer; λ1, K = 0
    Chaotic dynamics: D fractal; λ1, K > 0
    Stochastic dynamics: D, λ1, K → ∞

    Dimension, Lyapunov exponent, and entropy describe different properties of the dynamics. Are these measures related?

  • Relation between K and λ

    Positive Lyapunov exponents:
    - Exponential divergence of neighboring trajectories
    - Loss of information about the future position in phase space

    Temporal evolution from x(t0) to x(t):
    - Initial state x(t0): ε-ball ~ uncertainty regarding the position (noise)
    - Future state x(ti): the ε-ball is deformed into an ellipsoid; axis i expands / contracts with e^(λi·t)

  • Relation between K and λ

    Entropy: average information loss. Lyapunov exponent: exponential divergence of neighboring trajectories.

    Pesin's theorem [Pesin, 1977]: the Kolmogorov-Sinai entropy equals the sum of the positive Lyapunov exponents:

    K1 = Σ(i: λi > 0) λi

  • Kaplan-Yorke dimension

    [Kaplan & Yorke, 1979]

    D_KY = j + (λ1 + ... + λj) / |λ(j+1)|

    with j the largest index for which λ1 + ... + λj ≥ 0 (and λ1 + ... + λ(j+1) < 0).

    j: integer part of the dimension; (λ1 + ... + λj) / |λ(j+1)|: fractal part of the dimension.

    Kaplan-Yorke conjecture (D_KY is conjectured to equal the information dimension D1):
    •  True for 2-dimensional maps
    •  In other cases not confirmed
    •  Counter-examples exist
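A minimal sketch (my own) of the two relations above: the Kaplan-Yorke dimension from a Lyapunov spectrum sorted in descending order, and Pesin's estimate of K1. The example spectrum is the approximate Lorenz-system spectrum for σ = 10, R = 28, b = 8/3, roughly (0.906, 0, −14.57) as quoted in the literature.

```python
import numpy as np

def kaplan_yorke_dimension(lyapunov_spectrum):
    """D_KY = j + (lambda_1 + ... + lambda_j) / |lambda_{j+1}|."""
    lam = np.sort(np.asarray(lyapunov_spectrum, dtype=float))[::-1]  # descending order
    csum = np.cumsum(lam)
    if not np.any(csum >= 0):
        return 0.0                               # e.g. stable fixed point
    j = int(np.max(np.nonzero(csum >= 0)[0])) + 1
    if j == len(lam):
        return float(j)                          # all partial sums non-negative
    return j + csum[j - 1] / abs(lam[j])

def pesin_entropy(lyapunov_spectrum):
    """Pesin's theorem: K1 = sum of the positive Lyapunov exponents."""
    lam = np.asarray(lyapunov_spectrum, dtype=float)
    return float(lam[lam > 0].sum())

lorenz_spectrum = [0.906, 0.0, -14.57]   # approximate literature values
print(kaplan_yorke_dimension(lorenz_spectrum))  # ~2.06
print(pesin_entropy(lorenz_spectrum))           # ~0.906
```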

  • Non-linear measures - Dimension [ Excursion: Fractals ] - Entropies - Relationships among non-linear measures

    Today’s  lecture  

  • Measures of synchronization for continuous data (time series derived from non-linear model systems / EEG)

    •  Linear measures: Cross correlation, coherence

    •  Mutual information

    •  Phase synchronization (Hilbert transform)

    •  Non-linear interdependences

    Measures of directionality

    •  Granger causality

    •  Transfer entropy

    Next lecture