Overview of SPM


[Figure: the SPM analysis pipeline. An image time-series is realigned, smoothed (with a kernel), and normalised to a template; a general linear model with a design matrix yields parameter estimates and a statistical parametric map (SPM); statistical inference uses Gaussian field theory at p < 0.05.]

The General Linear Model (GLM)

With many thanks for slides & images to: FIL Methods group, Virginia Flanagin, and Klaas Enno Stephan

Frederike Petzschner

Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering, University of Zurich & ETH Zurich

Imagine a very simple experiment…

• One session
• 7 cycles of rest and listening
• Blocks of 6 scans with a 7 s TR

[Figure: single voxel time series over time]

Question: Is there a change in the BOLD response between listening and rest? What we know is the stimulus timing; what we measure is the single voxel time series.

[Diagram: analysis workflow — linear model → effects estimate + error estimate → statistic]

You need a model of your data…

Explain your data as a combination of experimental manipulation, confounds, and error.

Single voxel regression model:

$y = x_1\beta_1 + x_2\beta_2 + e$

The BOLD signal $y$ over time is modelled as a weighted sum of regressors (here $x_1$ for the experimental condition and $x_2$ for the constant) plus an error term $e$.

In matrix form:

$y = X\beta + e$

where $y$ is the $n \times 1$ data vector, $X$ the $n \times p$ design matrix, $\beta$ the $p \times 1$ parameter vector, and $e$ the $n \times 1$ error vector (n: number of scans, p: number of regressors).
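As a concrete illustration, the design matrix for the block design above (7 cycles of rest and listening, blocks of 6 scans) can be sketched in a few lines of NumPy; variable names are illustrative, not SPM's:

```python
import numpy as np

# Block design from the example: 7 cycles of rest and listening,
# blocks of 6 scans each -> n = 7 * 2 * 6 = 84 scans.
n_cycles, block_len = 7, 6
n = n_cycles * 2 * block_len

# Regressor 1: boxcar that is 1 during "listening" blocks, 0 during rest.
listening = np.tile(np.r_[np.zeros(block_len), np.ones(block_len)], n_cycles)

# Regressor 2: constant term modelling the session mean.
constant = np.ones(n)

# Design matrix X (n scans x p regressors).
X = np.column_stack([listening, constant])
print(X.shape)  # (84, 2)
```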

The black-and-white version in SPM

[Figure: design matrix and error displayed as grayscale images]

The design matrix embodies all available knowledge about experimentally controlled factors and potential confounds.

Talk: Experimental Design, Wed 9:45 – 10:45

Model assumptions

You want to estimate your parameters such that you minimise the error. This can be done with ordinary least squares (OLS) estimation, assuming an i.i.d. error.

The GLM assumes identically and independently distributed (i.i.d.) errors, i.e. the error covariance is a scalar multiple of the identity matrix:

$Cov(e) = \sigma^2 I$

For two time points $t_1, t_2$:

$Cov(e) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ (i.i.d.), $\quad \begin{pmatrix} 4 & 0 \\ 0 & 1 \end{pmatrix}$ (non-identity), $\quad \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (non-independence)

How to fit the model and estimate the parameters?

$y = X\beta + e$

"Option 1": by hand.

$X\hat\beta$ is the data predicted by our model, and $e = y - X\hat\beta$ is the error between predicted and actual data. The goal is to determine the betas such that we minimise the quadratic error.

OLS (Ordinary Least Squares)

The goal is to minimise the quadratic error between data and model:

$e^T e = (y - X\beta)^T (y - X\beta)$

Expanding the product:

$e^T e = y^T y - 2\beta^T X^T y + \beta^T X^T X \beta$

(note that $y^T X\beta$ is a scalar, and the transpose of a scalar is a scalar, so $y^T X\beta = \beta^T X^T y$).

You find the extremum of a function by taking its derivative and setting it to zero:

$\frac{\partial e^T e}{\partial \beta} = -2X^T y + 2X^T X \hat\beta = 0$

SOLUTION: the OLS estimate of the parameters:

$\hat\beta = (X^T X)^{-1} X^T y$
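A minimal numerical check of this solution on simulated data (a sketch, not SPM code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for a two-regressor model y = X @ beta + e.
n = 84
X = np.column_stack([rng.standard_normal(n), np.ones(n)])
beta_true = np.array([2.0, 10.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# OLS solution derived above: beta_hat = (X^T X)^{-1} X^T y.
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y

# (In practice np.linalg.lstsq or pinv is numerically preferable.)
print(beta_hat)  # close to [2.0, 10.0]
```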

A geometric perspective on the GLM

[Figure: the data vector $y$ is projected onto the design space defined by $X$ (spanned by $x_1$ and $x_2$); the residual $e$ is orthogonal to the design space]

$\hat y = X\hat\beta$, with OLS estimates $\hat\beta = (X^T X)^{-1} X^T y$
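The projection view can be verified numerically: the hat matrix $P = X(X^TX)^{-1}X^T$ projects $y$ onto the design space, and the residual is orthogonal to every regressor. A small sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
X = np.column_stack([rng.standard_normal(n), rng.standard_normal(n)])
y = X @ np.array([1.0, -1.0]) + rng.standard_normal(n)

# Hat (projection) matrix: projects y onto the space spanned by X.
P = X @ np.linalg.inv(X.T @ X) @ X.T
y_hat = P @ y   # fitted values: the projection of y
e = y - y_hat   # residual: the part of y outside the design space

# The residual is orthogonal to every regressor in X.
print(np.allclose(X.T @ e, 0))  # True
```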

Correlated and orthogonal regressors

[Figure: design space defined by X, with regressors $x_1$, $x_2$ and the orthogonalized regressor $x_2^*$]

Correlated regressors: the explained variance is shared between regressors.

$y = x_1\beta_1 + x_2\beta_2 + e$

When $x_2$ is orthogonalized with regard to $x_1$ (giving $x_2^*$), only the parameter estimate for $x_1$ changes, not that for $x_2$:

$y = x_1\beta_1^* + x_2^*\beta_2^* + e$, with $\beta_2^* = \beta_2$
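This orthogonalization property is easy to demonstrate on simulated data (a sketch, not SPM's implementation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x1 = rng.standard_normal(n)
x2 = 0.7 * x1 + 0.3 * rng.standard_normal(n)   # correlated with x1
y = x1 + x2 + rng.standard_normal(n)

def ols(X, y):
    return np.linalg.inv(X.T @ X) @ X.T @ y

# Orthogonalize x2 with respect to x1 (remove its projection onto x1).
x2_orth = x2 - (x1 @ x2) / (x1 @ x1) * x1

b = ols(np.column_stack([x1, x2]), y)
b_orth = ols(np.column_stack([x1, x2_orth]), y)

# Only the estimate for x1 changes; the estimate for x2 is unchanged.
print(np.isclose(b[1], b_orth[1]))  # True
```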

We are nearly there…

$y = X\beta + e$

What are the problems?

1. BOLD responses have a delayed and dispersed form.

2. The BOLD signal includes substantial amounts of low-frequency noise.

3. The data are serially correlated (temporally autocorrelated); this violates the i.i.d. assumption of the noise model in the GLM.

The response of a linear time-invariant (LTI) system is the convolution of the input with the system's response to an impulse (a delta function):

$(f \ast g)(t) = \int_0^t f(\tau)\, g(t - \tau)\, d\tau$

Problem 1: Shape of the BOLD response

Solution: Convolution model of the BOLD response.

expected BOLD response = input function ⊗ impulse response function (HRF)

[Figure: blue = data; green = predicted response, with the input convolved with the HRF; red = predicted response, NOT taking the HRF into account]
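A sketch of the convolution model: a boxcar input convolved with a double-gamma HRF. The HRF parameters below (peak near 6 s, undershoot near 16 s, undershoot ratio 1/6) are illustrative assumptions, not SPM's canonical HRF settings:

```python
import numpy as np
from math import gamma as gamma_fn

def gamma_pdf(t, shape, scale=1.0):
    """Gamma probability density, evaluated elementwise (zero for t <= 0)."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    pos = t > 0
    out[pos] = (t[pos] ** (shape - 1) * np.exp(-t[pos] / scale)
                / (gamma_fn(shape) * scale ** shape))
    return out

# Illustrative double-gamma HRF sampled at the TR.
TR = 7.0
t = np.arange(0.0, 32.0, TR)
h = gamma_pdf(t, 6.0) - gamma_pdf(t, 16.0) / 6.0

# Boxcar input function: 7 cycles of rest (6 scans) and listening (6 scans).
box = np.tile(np.r_[np.zeros(6), np.ones(6)], 7)

# Expected BOLD response = input function convolved with the HRF,
# truncated to the length of the session.
expected = np.convolve(box, h)[: box.size]
```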

Problem 2: Low-frequency noise

[Figure: blue = data; black = mean + low-frequency drift; green = predicted response, taking the low-frequency drift into account; red = predicted response, NOT taking the drift into account]

Solution 2: High-pass filtering

Low-frequency drifts are modelled by adding a discrete cosine transform (DCT) basis set to the linear model; this acts as a high-pass filter.
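A sketch of such a DCT drift basis, following the standard DCT-II construction; the cutoff handling approximates SPM's default (128 s) but is not taken from SPM's code:

```python
import numpy as np

def dct_basis(n_scans, TR, cutoff=128.0):
    """DCT-II regressors modelling drifts with periods longer than `cutoff` s."""
    n_basis = int(np.floor(2 * n_scans * TR / cutoff)) + 1
    t = np.arange(n_scans)
    return np.column_stack(
        [np.cos(np.pi * k * (2 * t + 1) / (2 * n_scans))
         for k in range(1, n_basis)]
    )

# Drift regressors for the example session: 84 scans, TR = 7 s.
drift = dct_basis(n_scans=84, TR=7.0, cutoff=128.0)
print(drift.shape)  # (84, 9)
```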

Problem 3: Serial correlations

[Figure: the $n \times n$ error covariance matrix (n: number of scans), showing non-identity and non-independence between time points $t_1$ and $t_2$, in contrast to the i.i.d. case]

The errors follow a 1st-order autoregressive process, AR(1):

$e_t = a\, e_{t-1} + \varepsilon_t$, with $\varepsilon_t \sim N(0, \sigma^2)$

[Figure: autocovariance function of the AR(1) process]

Problem 3: Serial correlations

• Pre-whitening:
1. Use an enhanced noise model with multiple error covariance components, i.e. $e \sim N(0, \sigma^2 V)$ instead of $e \sim N(0, \sigma^2 I)$.
2. Use the estimated serial correlation to specify a filter matrix $W$ for whitening the data:

$Wy = WX\beta + We$

This makes the error i.i.d. But how do we define $W$?

• Enhanced noise model: $e \sim N(0, \sigma^2 V)$

• Remember the linear transform of a Gaussian: if $x \sim N(\mu, \sigma^2)$, then $y = ax \sim N(a\mu, a^2\sigma^2)$. Hence for the whitened model $Wy = WX\beta + We$:

$We \sim N(0, \sigma^2 W V W^T)$

• Choose $W$ such that the error covariance becomes spherical: $W V W^T = I$ requires $W = V^{-1/2}$.

• Conclusion: $W$ is a simple function of $V$, so how do we estimate $V$?
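The choice $W = V^{-1/2}$ can be checked numerically for an assumed AR(1) covariance (a sketch, not SPM's ReML-based implementation):

```python
import numpy as np

# AR(1) error covariance: V[i, j] = a^|i - j| (up to a scale factor).
n, a = 50, 0.4
idx = np.arange(n)
V = a ** np.abs(idx[:, None] - idx[None, :])

# Whitening matrix W = V^{-1/2} via eigendecomposition
# (V is symmetric positive definite).
vals, vecs = np.linalg.eigh(V)
W = vecs @ np.diag(vals ** -0.5) @ vecs.T

# After whitening, the error covariance is spherical: W V W^T = I.
print(np.allclose(W @ V @ W.T, np.eye(n)))  # True
```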

Find V: multiple covariance components

$e \sim N(0, \sigma^2 V)$, with $V = Cov(e)$ modelled as a linear combination of covariance components:

$V = \sum_i \lambda_i Q_i$

[Figure: $V = \lambda_1 Q_1 + \lambda_2 Q_2$]

The hyperparameters $\lambda_i$ are estimated with EM (expectation maximisation) or ReML (restricted maximum likelihood), yielding the enhanced noise model: error covariance components $Q$ and hyperparameters $\lambda$.

We are there…

$y = X\beta + e$

• GLM includes all known experimental effects and confounds

• Convolution with a canonical HRF

• High-pass filtering to account for low-frequency drifts

• Estimation of multiple variance components (e.g. to account for serial correlations)

We are there…

To test for an effect of listening, specify a contrast vector, e.g.

$c = [1\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0]^T$

Null hypothesis: $\beta_1 = 0$

$t = \frac{c^T \hat\beta}{\mathrm{Std}(c^T \hat\beta)}$

Talk: Statistical Inference and Design Efficiency (next talk).
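A minimal sketch of the contrast t-statistic on simulated data (the contrast here has only two entries because the toy model has two regressors):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: a listening boxcar (column 0) and a constant (column 1).
n = 84
X = np.column_stack([np.tile(np.r_[np.zeros(6), np.ones(6)], 7), np.ones(n)])
beta_true = np.array([1.5, 100.0])
y = X @ beta_true + rng.standard_normal(n)

# Parameter and error-variance estimates.
beta_hat = np.linalg.pinv(X) @ y
e = y - X @ beta_hat
sigma2 = (e @ e) / (n - np.linalg.matrix_rank(X))

# Contrast testing the null hypothesis beta_1 = 0.
c = np.array([1.0, 0.0])
t = (c @ beta_hat) / np.sqrt(sigma2 * (c @ np.linalg.inv(X.T @ X) @ c))
print(t)  # a large t-value, well above 2, indicating a listening effect
```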

We are there…

[Figure: single voxel time series over time]

• Mass-univariate approach: the GLM is applied to > 100,000 voxels.
• At a threshold of p < 0.05, more than 5,000 voxels are significant by chance alone: a massive multiple-comparisons problem!
• Solution: Gaussian random field theory.
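The arithmetic behind that figure, assuming independent tests under the null:

```python
# Expected number of false-positive voxels at an uncorrected
# threshold of p < 0.05, applied to 100,000 independent tests.
n_voxels, alpha = 100_000, 0.05
expected_false_positives = n_voxels * alpha
print(expected_false_positives)  # 5000.0
```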

Outlook: further challenges

• Correction for multiple comparisons (Talk: Multiple Comparisons, Wed 8:30 – 9:30)

• Variability in the HRF across voxels (Talk: Experimental Design, Wed 9:45 – 10:45)

• Limitations of frequentist statistics (Talk: entire Friday)

• The GLM ignores interactions among voxels (Talk: Multivariate Analysis, Thu 12:30 – 13:30)

Thank you!

• Friston, Ashburner, Kiebel, Nichols, Penny (2007) Statistical Parametric Mapping: The Analysis of Functional Brain Images. Elsevier.

• Christensen R (1996) Plane Answers to Complex Questions: The Theory of Linear Models. Springer.

• Friston KJ et al. (1995) Statistical parametric maps in functional imaging: a general linear approach. Human Brain Mapping 2: 189-210.
