Analysis of Variance | Chapter 1 | Linear Algebra, Matrix Theory and Dist. | Shalabh, IIT Kanpur

Chapter 1

Some Results on Linear Algebra, Matrix Theory and Distributions

We need some basic knowledge to understand the topics in the analysis of variance.

Vectors:

A vector Y is an ordered n-tuple of real numbers. A vector can be expressed as a row vector or a column vector: $Y = (y_1, y_2, \ldots, y_n)'$ is a column vector of order $n \times 1$ and $Y' = (y_1, y_2, \ldots, y_n)$ is a row vector of order $1 \times n$.

If $y_i = 0$ for all $i = 1, 2, \ldots, n$, then Y is called the null vector.

If $X = (x_1, x_2, \ldots, x_n)'$ and $Y = (y_1, y_2, \ldots, y_n)'$, then $X + Y = (x_1 + y_1, x_2 + y_2, \ldots, x_n + y_n)'$ and $kY = (ky_1, ky_2, \ldots, ky_n)'$, where k is a scalar.

Orthogonal vectors:

Two vectors X and Y are said to be orthogonal if $X'Y = Y'X = 0$.

The null vector is orthogonal to every vector X and is the only such vector.

Linear combination:

If $X_1, X_2, \ldots, X_m$ are m vectors and $k_1, k_2, \ldots, k_m$ are scalars, then

$\sum_{i=1}^{m} k_i X_i$

is called a linear combination of $X_1, X_2, \ldots, X_m$.

Linear independence

If $X_1, X_2, \ldots, X_m$ are m vectors, then they are said to be linearly independent if

$\sum_{i=1}^{m} k_i X_i = 0 \;\Rightarrow\; k_i = 0$ for all $i = 1, 2, \ldots, m$.

If there exist scalars $k_1, k_2, \ldots, k_m$, with at least one nonzero, such that $\sum_{i=1}^{m} k_i X_i = 0$, then

$X_1, X_2, \ldots, X_m$ are said to be linearly dependent.

· Any set of vectors containing the null vector is linearly dependent.

· Any set of non-null pair-wise orthogonal vectors is linearly independent.

· If m > 1 vectors are linearly dependent, it is always possible to express at least one of them as a linear combination of the others.
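These facts are easy to check numerically by comparing the rank of the matrix whose columns are the given vectors with the number of vectors — a minimal sketch with NumPy (the example vectors are illustrative):

```python
import numpy as np

def are_independent(vectors):
    """Vectors are linearly independent iff the matrix with the
    vectors as columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
x3 = x1 + 2 * x2          # a linear combination of x1 and x2
null = np.zeros(3)

print(are_independent([x1, x2]))      # independent
print(are_independent([x1, x2, x3]))  # dependent: x3 = x1 + 2 x2
print(are_independent([x1, null]))    # any set containing the null vector is dependent
```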

Linear function:

Let $K = (k_1, k_2, \ldots, k_m)'$ be a vector of scalars and $X = (x_1, x_2, \ldots, x_m)'$ be an $m \times 1$ vector of variables; then $K'X = \sum_{i=1}^{m} k_i x_i$ is called a linear function or linear form. The vector K is called the coefficient vector. For example, the mean of $x_1, x_2, \ldots, x_m$ can be expressed as

$\bar{x} = \frac{1}{m}\sum_{i=1}^{m} x_i = \frac{1}{m}\mathbf{1}'X,$

where $\mathbf{1} = (1, 1, \ldots, 1)'$ is a vector of all elements unity.

Contrast:

The linear function $K'X = \sum_{i=1}^{m} k_i x_i$ is called a contrast in $x_1, x_2, \ldots, x_m$ if

$\sum_{i=1}^{m} k_i = 0.$

For example, the linear functions

$x_1 - x_2$ and $x_1 + x_2 - 2x_3$

are contrasts.

· A linear function $K'X$ is a contrast if and only if it is orthogonal to the linear function $\sum_{i=1}^{m} x_i$ or, equivalently, to the linear function $\bar{x} = \frac{1}{m}\sum_{i=1}^{m} x_i$.

· The contrasts $x_1 - x_j$, $j = 2, 3, \ldots, m$, are linearly independent.

· Every contrast in $x_1, x_2, \ldots, x_m$ can be written as a linear combination of the (m - 1) contrasts $x_1 - x_2, x_1 - x_3, \ldots, x_1 - x_m$.
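The defining property of a contrast — coefficients summing to zero — and its equivalence with orthogonality to the all-ones vector can be verified directly (a sketch; the coefficient vectors below are illustrative):

```python
import numpy as np

def is_contrast(k):
    """A linear function k'x is a contrast iff its coefficients sum to zero."""
    return np.isclose(np.sum(k), 0.0)

k1 = np.array([1.0, -1.0, 0.0])   # x1 - x2
k2 = np.array([1.0, 1.0, -2.0])   # x1 + x2 - 2*x3
k3 = np.array([1.0, 1.0, 1.0])    # not a contrast

ones = np.ones(3)
# contrast <=> orthogonal to the all-ones coefficient vector
print(is_contrast(k1), np.isclose(k1 @ ones, 0.0))
print(is_contrast(k2))
print(is_contrast(k3))
```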

Matrix:

A matrix is a rectangular array of real numbers. For example, $A = (a_{ij})$, $i = 1, 2, \ldots, m$, $j = 1, 2, \ldots, n$, is a matrix of order $m \times n$ with m rows and n columns.

· If m = n, then A is called a square matrix.

· If $a_{ij} = 0$ for all $i \neq j$, then A is a diagonal matrix and is denoted as $A = \mathrm{diag}(a_{11}, a_{22}, \ldots, a_{mm})$.

· If m = n (square matrix) and $a_{ij} = 0$ for i > j, then A is called an upper triangular matrix. On the other hand, if m = n and $a_{ij} = 0$ for i < j, then A is called a lower triangular matrix.

· If A is an $m \times n$ matrix, then the matrix obtained by writing the rows of A as columns and the columns of A as rows is called the transpose of A and is denoted as $A'$.

· If $A' = A$, then A is a symmetric matrix.

· If $A' = -A$, then A is a skew-symmetric matrix.

· A matrix whose all elements are equal to zero is called a null matrix.

· An identity matrix is a square matrix of order p whose diagonal elements are unity (ones) and all off-diagonal elements are zero. It is denoted as $I_p$.

· If A and B are matrices of order $m \times n$, then $(A + B)' = A' + B'$.

· If A and B are matrices of order $m \times n$ and $n \times p$ respectively, and k is any scalar, then $(AB)' = B'A'$ and $(kA)B = A(kB) = k(AB)$.

· If the orders of matrices A, B and C are $m \times n$, $n \times p$ and $n \times p$ respectively, then $A(B + C) = AB + AC$.

· If the orders of matrices A, B and C are $m \times n$, $n \times p$ and $p \times q$ respectively, then $(AB)C = A(BC)$.

· If A is a matrix of order $m \times n$, then $I_m A = A I_n = A$.

Trace of a matrix:

The trace of an $n \times n$ matrix $A = (a_{ij})$, denoted as tr(A) or trace(A), is defined as the sum of the diagonal elements of A, i.e. $tr(A) = \sum_{i=1}^{n} a_{ii}$.

· If A is of order $m \times n$ and B is of order $n \times m$, then

$tr(AB) = tr(BA).$

· If A is an $n \times n$ matrix and P is any nonsingular $n \times n$ matrix, then $tr(A) = tr(P^{-1}AP)$.

· If P is an orthogonal matrix, then $tr(A) = tr(P'AP)$.

· If A and B are $n \times n$ matrices and a and b are scalars, then

$tr(aA + bB) = a\,tr(A) + b\,tr(B).$

· If A is an $m \times n$ matrix, then

$tr(AA') = tr(A'A) = \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}^2,$

and $tr(AA') = tr(A'A) = 0$ if and only if $A = 0$.

· If A is an $n \times n$ matrix, then

$tr(A) = tr(A').$
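A quick numerical check of several of these trace identities (the matrices are random, for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((5, 3))

# tr(AB) = tr(BA), even though AB (3x3) and BA (5x5) have different orders
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# tr(AA') = tr(A'A) = sum of squared elements of A
assert np.isclose(np.trace(A @ A.T), np.sum(A**2))

# linearity: tr(aA + bB) = a tr(A) + b tr(B) for square matrices of the same order
C, D = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
assert np.isclose(np.trace(2*C + 3*D), 2*np.trace(C) + 3*np.trace(D))
print("all trace identities hold")
```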

Rank of matrices

The rank of an $m \times n$ matrix A is the number of linearly independent rows (equivalently, columns) in A.

Let B be any other matrix of order $n \times q$.

· A square matrix of order m is called non-singular if it has full rank m.

· $rank(AB) \leq \min(rank(A), rank(B))$.

· If A and B are matrices of the same order, then $rank(A + B) \leq rank(A) + rank(B)$.

· rank(A) is equal to the maximum order of all nonsingular square sub-matrices of A.

· $rank(A'A) = rank(AA') = rank(A) = rank(A')$.

· A is of full row rank if rank(A) = m < n.

· A is of full column rank if rank(A) = n < m.

Inverse of a matrix

The inverse of a square matrix A of order m is a square matrix of order m, denoted as $A^{-1}$, such that

$AA^{-1} = A^{-1}A = I_m.$

The inverse of A exists if and only if A is non-singular.

· $(A^{-1})^{-1} = A$.

· If A is non-singular, then $(A')^{-1} = (A^{-1})'$.

· If A and B are non-singular matrices of the same order, then their product, if defined, is also nonsingular and $(AB)^{-1} = B^{-1}A^{-1}$.
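These inverse identities can be confirmed numerically (the matrices below are random and shifted to be safely nonsingular, for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)   # well-conditioned, nonsingular
B = rng.standard_normal((3, 3)) + 3 * np.eye(3)

Ainv = np.linalg.inv(A)
assert np.allclose(A @ Ainv, np.eye(3))            # A A^{-1} = I
assert np.allclose(np.linalg.inv(Ainv), A)         # (A^{-1})^{-1} = A
assert np.allclose(np.linalg.inv(A.T), Ainv.T)     # (A')^{-1} = (A^{-1})'
assert np.allclose(np.linalg.inv(A @ B),
                   np.linalg.inv(B) @ Ainv)        # (AB)^{-1} = B^{-1} A^{-1}
print("all inverse identities hold")
```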

Idempotent matrix:

A square matrix A is called idempotent if $A^2 = A$.

If A is an $n \times n$ idempotent matrix with $rank(A) = r \leq n$, then

· the eigenvalues of A are 1 or 0;

· $trace(A) = rank(A) = r$;

· if A is of full rank n, then $A = I_n$.

· If A and B are idempotent and AB = BA, then AB is also idempotent.

· If A is idempotent, then (I - A) is also idempotent and A(I - A) = (I - A)A = 0.
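The projection ("hat") matrix of least squares, $H = X(X'X)^{-1}X'$, is a standard example of an idempotent matrix; a sketch verifying the listed properties (the design matrix is random, for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 3))                 # full column rank (almost surely)
H = X @ np.linalg.inv(X.T @ X) @ X.T             # idempotent projection matrix

assert np.allclose(H @ H, H)                     # H^2 = H
r = np.linalg.matrix_rank(H)
assert r == 3 and np.isclose(np.trace(H), r)     # trace(H) = rank(H) = p

M = np.eye(10) - H                               # I - H is also idempotent
assert np.allclose(M @ M, M)
assert np.allclose(H @ M, np.zeros((10, 10)))    # H(I - H) = 0

eig = np.linalg.eigvalsh((H + H.T) / 2)          # eigenvalues are 0 or 1
assert np.all(np.isclose(eig, 0) | np.isclose(eig, 1))
print("hat matrix checks pass")
```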

Quadratic forms:

If A is a given matrix of order $m \times n$ and X and Y are two given vectors of order $m \times 1$ and $n \times 1$ respectively, then

$X'AY = \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij} x_i y_j,$

where $a_{ij}$ are the nonstochastic elements of A.

If A is a square matrix of order m and X = Y, then

$X'AX = \sum_{i=1}^{m}\sum_{j=1}^{m} a_{ij} x_i x_j = a_{11}x_1^2 + \ldots + a_{mm}x_m^2 + (a_{12} + a_{21})x_1 x_2 + \ldots + (a_{m-1,m} + a_{m,m-1})x_{m-1}x_m.$

If A is symmetric also, then

$X'AX = a_{11}x_1^2 + \ldots + a_{mm}x_m^2 + 2\sum_{i<j} a_{ij}x_i x_j$

is called a quadratic form in m variables $x_1, x_2, \ldots, x_m$, or a quadratic form in X.

· To every quadratic form corresponds a symmetric matrix and vice versa.

· The matrix A is called the matrix of the quadratic form.

The quadratic form $X'AX$, and the matrix A of the form, is called

· positive definite if $X'AX > 0$ for all $X \neq 0$;

· positive semidefinite if $X'AX \geq 0$ for all X;

· negative definite if $X'AX < 0$ for all $X \neq 0$;

· negative semidefinite if $X'AX \leq 0$ for all X.

· If A is a positive semi-definite matrix, then $a_{ii} \geq 0$ for all i, and if $a_{ii} = 0$, then $a_{ij} = 0$ for all j and $a_{ji} = 0$ for all j.

· If P is any nonsingular matrix and A is any positive definite (or positive semi-definite) matrix, then $P'AP$ is also a positive definite (or positive semi-definite) matrix.

· A matrix A is positive definite if and only if there exists a non-singular matrix P such that $A = PP'$.

· A positive definite matrix is a nonsingular matrix.

· If A is an $m \times n$ matrix and $rank(A) = m < n$, then $AA'$ is positive definite and $A'A$ is positive semidefinite.

· If A is an $m \times n$ matrix and $rank(A) = k < m < n$, then both $AA'$ and $A'A$ are positive semidefinite.
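An equivalent and convenient computational test classifies a symmetric matrix by the signs of its eigenvalues (all eigenvalues positive corresponds to positive definite, and so on) — a sketch:

```python
import numpy as np

def definiteness(A, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues,
    an equivalent of the X'AX > 0 (etc.) definitions."""
    w = np.linalg.eigvalsh(A)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 6))   # rank m = 4 < n = 6 (almost surely)
print(definiteness(A @ A.T))      # AA' (4x4, full rank) is positive definite
print(definiteness(A.T @ A))      # A'A (6x6, rank 4) is positive semidefinite
```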

Simultaneous linear equations

The set of m linear equations in n unknowns $x_1, x_2, \ldots, x_n$ with scalars $a_{ij}$ and $b_i$ ($i = 1, 2, \ldots, m$; $j = 1, 2, \ldots, n$), of the form

$a_{11}x_1 + a_{12}x_2 + \ldots + a_{1n}x_n = b_1$
$a_{21}x_1 + a_{22}x_2 + \ldots + a_{2n}x_n = b_2$
$\vdots$
$a_{m1}x_1 + a_{m2}x_2 + \ldots + a_{mn}x_n = b_m,$

can be formulated as AX = b,

where $A = (a_{ij})$ is a real $m \times n$ matrix of known scalars called the coefficient matrix, $X = (x_1, \ldots, x_n)'$ is a real $n \times 1$ vector of variables and $b = (b_1, \ldots, b_m)'$ is a real $m \times 1$ vector of known scalars.

· If A is a nonsingular matrix, then AX = b has a unique solution.

· Let B = [A, b] be the augmented matrix. A solution to AX = b exists if and only if rank(A) = rank(B).

· If A is an $m \times n$ matrix of rank m, then AX = b has a solution.

· The linear homogeneous system AX = 0 has a solution other than X = 0 if and only if rank(A) < n.

· If AX = b is consistent, then AX = b has a unique solution if and only if rank(A) = n.
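The rank condition for consistency and the solution of a nonsingular system can be illustrated as follows (the systems are small examples chosen for illustration):

```python
import numpy as np

def consistent(A, b):
    """AX = b has a solution iff rank(A) = rank([A, b])."""
    aug = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])        # nonsingular => unique solution
b = np.array([3.0, 5.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# an inconsistent system: the second equation contradicts the first
A2 = np.array([[1.0, 1.0],
               [2.0, 2.0]])
b2 = np.array([1.0, 3.0])
print(consistent(A, b))    # True
print(consistent(A2, b2))  # False
```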

Orthogonal matrix

A square matrix A is called an orthogonal matrix if $AA' = A'A = I$, or equivalently, if $A^{-1} = A'$.

· An orthogonal matrix is non-singular.

· If A is orthogonal, then $A'$ is also orthogonal.

· If $a_{ii}$ is the ith diagonal element of an orthogonal matrix, then $-1 \leq a_{ii} \leq 1$.

· Let the matrix A be partitioned columnwise as $A = [a_1, a_2, \ldots, a_n]$, where $a_i$ is the $n \times 1$ vector of the elements of the ith column of A. A necessary and sufficient condition for A to be an orthogonal matrix is that (i) $a_i'a_i = 1$ for $i = 1, 2, \ldots, n$ and (ii) $a_i'a_j = 0$ for $i \neq j$.

· If A is an $n \times n$ matrix and P is an orthogonal matrix, then the determinants of A and $P'AP$ are the same.
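A convenient way to obtain an orthogonal matrix numerically is the QR factorization of a random matrix; a sketch checking the properties above:

```python
import numpy as np

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # Q from QR is orthogonal

assert np.allclose(Q @ Q.T, np.eye(4))             # AA' = I
assert np.allclose(Q.T @ Q, np.eye(4))             # A'A = I (unit, orthogonal columns)
assert np.allclose(np.linalg.inv(Q), Q.T)          # A^{-1} = A'
assert np.all(np.abs(np.diag(Q)) <= 1 + 1e-12)     # diagonal entries lie in [-1, 1]

A = rng.standard_normal((4, 4))
assert np.isclose(np.linalg.det(A),
                  np.linalg.det(Q.T @ A @ Q))      # determinant preserved by P'AP
print("orthogonal matrix checks pass")
```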

Random vectors:

Let $Y_1, Y_2, \ldots, Y_n$ be n random variables; then $Y = (Y_1, Y_2, \ldots, Y_n)'$ is called a random vector.

· The mean vector of Y is $E(Y) = (E(Y_1), E(Y_2), \ldots, E(Y_n))'$.

· The covariance matrix or dispersion matrix of Y is

$Var(Y) = \begin{pmatrix} Var(Y_1) & Cov(Y_1, Y_2) & \cdots & Cov(Y_1, Y_n) \\ Cov(Y_2, Y_1) & Var(Y_2) & \cdots & Cov(Y_2, Y_n) \\ \vdots & \vdots & \ddots & \vdots \\ Cov(Y_n, Y_1) & Cov(Y_n, Y_2) & \cdots & Var(Y_n) \end{pmatrix},$

which is a symmetric matrix.

· If $Y_1, Y_2, \ldots, Y_n$ are independently distributed, then the covariance matrix is a diagonal matrix.

· If, in addition, $Var(Y_i) = \sigma^2$ for all $i = 1, 2, \ldots, n$, then $Var(Y) = \sigma^2 I_n$.

Linear function of random variables:

If $Y_1, Y_2, \ldots, Y_n$ are n random variables and $k_1, k_2, \ldots, k_n$ are scalars, then $\sum_{i=1}^{n} k_i Y_i$ is called a linear function of the random variables $Y_1, Y_2, \ldots, Y_n$.

If $Y = (Y_1, Y_2, \ldots, Y_n)'$ and $K = (k_1, k_2, \ldots, k_n)'$, then $K'Y = \sum_{i=1}^{n} k_i Y_i$.

· Its mean is $E(K'Y) = K'E(Y) = \sum_{i=1}^{n} k_i E(Y_i)$, and

· the variance of $K'Y$ is $Var(K'Y) = K'\,Var(Y)\,K$.
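The mean and variance formulas for $K'Y$ can be checked against a Monte Carlo estimate (the mean vector, covariance matrix and K below are illustrative):

```python
import numpy as np

# Check E(K'Y) = K'E(Y) and Var(K'Y) = K' Var(Y) K by simulation.
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
K = np.array([1.0, -2.0, 1.0])

exact_mean = K @ mu          # K'E(Y)
exact_var = K @ Sigma @ K    # K' Var(Y) K

rng = np.random.default_rng(5)
Y = rng.multivariate_normal(mu, Sigma, size=200_000)
t = Y @ K                    # samples of K'Y
print(exact_mean, np.mean(t))   # should agree closely
print(exact_var, np.var(t))     # should agree closely
```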

Multivariate normal distribution

A random vector $Y = (Y_1, Y_2, \ldots, Y_n)'$ has a multivariate normal distribution with mean vector $\mu$ and dispersion matrix $\Sigma$ if its probability density function is

$f(Y \mid \mu, \Sigma) = \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}} \exp\left[-\frac{1}{2}(Y - \mu)'\Sigma^{-1}(Y - \mu)\right],$

assuming $\Sigma$ is a nonsingular matrix.

Chi-square distribution

· If $Y_1, Y_2, \ldots, Y_k$ are independently distributed normal random variables with common mean 0 and common variance 1, then the distribution of $\sum_{i=1}^{k} Y_i^2$ is called the $\chi^2$-distribution with k degrees of freedom.

· The probability density function of the $\chi^2$-distribution with k degrees of freedom is

$f(x) = \frac{1}{\Gamma(k/2)\,2^{k/2}}\, x^{\frac{k}{2}-1} \exp\left(-\frac{x}{2}\right); \quad 0 < x < \infty.$

· If $Y_1, Y_2, \ldots, Y_k$ are independently distributed following the normal distribution with common mean 0 and common variance $\sigma^2$, then $\frac{1}{\sigma^2}\sum_{i=1}^{k} Y_i^2$ has a $\chi^2$-distribution with k degrees of freedom.

· If the random variables $Y_1, Y_2, \ldots, Y_k$ are independently normally distributed with non-null means $\mu_1, \mu_2, \ldots, \mu_k$ but common variance 1, then the distribution of $\sum_{i=1}^{k} Y_i^2$ is the noncentral $\chi^2$-distribution with k degrees of freedom and noncentrality parameter $\lambda = \sum_{i=1}^{k} \mu_i^2$.

· If $Y_1, Y_2, \ldots, Y_k$ are independently and normally distributed with means $\mu_1, \mu_2, \ldots, \mu_k$ but common variance $\sigma^2$, then $\frac{1}{\sigma^2}\sum_{i=1}^{k} Y_i^2$ has a noncentral $\chi^2$-distribution with k degrees of freedom and noncentrality parameter $\lambda = \frac{1}{\sigma^2}\sum_{i=1}^{k} \mu_i^2$.

· If U has a chi-square distribution with k degrees of freedom, then $E(U) = k$ and $Var(U) = 2k$.

· If U has a noncentral chi-square distribution with k degrees of freedom and noncentrality parameter $\lambda$, then $E(U) = k + \lambda$ and $Var(U) = 2k + 4\lambda$.

· If $U_1, U_2, \ldots, U_k$ are independently distributed random variables with each $U_i$ having a noncentral chi-square distribution with $n_i$ degrees of freedom and noncentrality parameter $\lambda_i$, $i = 1, 2, \ldots, k$, then $\sum_{i=1}^{k} U_i$ has a noncentral chi-square distribution with $\sum_{i=1}^{k} n_i$ degrees of freedom and noncentrality parameter $\sum_{i=1}^{k} \lambda_i$.

· Let $X = (X_1, X_2, \ldots, X_n)'$ have a multivariate normal distribution with mean vector $\mu$ and positive definite covariance matrix $\Sigma$. Then $X'AX$ is distributed as noncentral $\chi^2$ with k degrees of freedom and noncentrality parameter $\mu'A\mu$ if and only if $A\Sigma$ is an idempotent matrix of rank k.

· Let $X = (X_1, X_2, \ldots, X_n)'$ have a multivariate normal distribution with mean vector $\mu$ and positive definite covariance matrix $\Sigma$. Let the two quadratic forms

· $X'A_1X$ be distributed as $\chi^2$ with $n_1$ degrees of freedom and noncentrality parameter $\mu'A_1\mu$, and

· $X'A_2X$ be distributed as $\chi^2$ with $n_2$ degrees of freedom and noncentrality parameter $\mu'A_2\mu$.

Then $X'A_1X$ and $X'A_2X$ are independently distributed if $A_1\Sigma A_2 = 0$.
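The moment formulas for the noncentral chi-square can be checked by simulation, building U as a sum of squared independent normals with unit variance (the means below are illustrative):

```python
import numpy as np

# U = sum Y_i^2 with Y_i ~ N(mu_i, 1) has E(U) = k + lambda and
# Var(U) = 2k + 4*lambda, where lambda = sum mu_i^2.
rng = np.random.default_rng(6)
k = 5
mu = np.array([1.0, 0.5, 0.0, -1.0, 2.0])
lam = np.sum(mu**2)                            # noncentrality parameter

Y = rng.standard_normal((400_000, k)) + mu     # rows of independent N(mu_i, 1) draws
U = np.sum(Y**2, axis=1)

print(k + lam, np.mean(U))        # E(U) = k + lambda
print(2*k + 4*lam, np.var(U))     # Var(U) = 2k + 4*lambda
```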

t-distribution

· If

· X has a normal distribution with mean 0 and variance 1,

· Y has a $\chi^2$-distribution with n degrees of freedom, and

· X and Y are independent random variables,

then the distribution of the statistic $T = \frac{X}{\sqrt{Y/n}}$ is called the t-distribution with n degrees of freedom. The probability density function of t is

$f(t) = \frac{\Gamma\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\left(\frac{n}{2}\right)}\left(1 + \frac{t^2}{n}\right)^{-\frac{n+1}{2}}; \quad -\infty < t < \infty.$

· If the mean of X is nonzero, say $\lambda$, then the distribution of $\frac{X}{\sqrt{Y/n}}$ is called the noncentral t-distribution with n degrees of freedom and noncentrality parameter $\lambda$.

F-distribution

· If X and Y are independent random variables having $\chi^2$-distributions with m and n degrees of freedom respectively, then the distribution of the statistic $F = \frac{X/m}{Y/n}$ is called the F-distribution with m and n degrees of freedom. The probability density function of F is

$f(f) = \frac{\Gamma\left(\frac{m+n}{2}\right)}{\Gamma\left(\frac{m}{2}\right)\Gamma\left(\frac{n}{2}\right)}\left(\frac{m}{n}\right)^{m/2} f^{\frac{m}{2}-1}\left(1 + \frac{m}{n}f\right)^{-\frac{m+n}{2}}; \quad 0 < f < \infty.$

· If X has a noncentral chi-square distribution with m degrees of freedom and noncentrality parameter $\lambda$, Y has a $\chi^2$-distribution with n degrees of freedom, and X and Y are independent random variables, then the distribution of $F = \frac{X/m}{Y/n}$ is the noncentral F-distribution with m and n degrees of freedom and noncentrality parameter $\lambda$.

Linear model:

Suppose there are n observations. In the linear model, we assume that these observations are the values taken by n random variables $Y_1, Y_2, \ldots, Y_n$ satisfying the following conditions:

1. $E(Y_i)$ is a linear combination of p unknown parameters $\beta_1, \beta_2, \ldots, \beta_p$,

$E(Y_i) = \beta_1 x_{i1} + \beta_2 x_{i2} + \ldots + \beta_p x_{ip}, \quad i = 1, 2, \ldots, n,$

where the $x_{ij}$'s are known constants.

2. $Y_1, Y_2, \ldots, Y_n$ are uncorrelated and normally distributed with variance $Var(Y_i) = \sigma^2$.

The linear model can be rewritten by introducing independent normal random variables $\varepsilon_i$ following $N(0, \sigma^2)$ as

$Y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \ldots + \beta_p x_{ip} + \varepsilon_i, \quad i = 1, 2, \ldots, n.$

These equations can be written using the matrix notation as

$Y = X\beta + \varepsilon,$

where Y is an $n \times 1$ vector of observations, X is an $n \times p$ matrix of n observations on each of the p variables $X_1, X_2, \ldots, X_p$, $\beta$ is a $p \times 1$ vector of parameters and $\varepsilon$ is an $n \times 1$ vector of random error components with $\varepsilon \sim N(0, \sigma^2 I_n)$. Here Y is called the study or dependent variable, $X_1, X_2, \ldots, X_p$ are called explanatory or independent variables and $\beta_1, \beta_2, \ldots, \beta_p$ are called the regression coefficients.

Alternatively, since $E(\varepsilon) = 0$, the linear model can also be expressed in the expectation form as a normal random variable Y with

$E(Y) = X\beta, \quad Var(Y) = \sigma^2 I_n, \quad\text{i.e.}\quad Y \sim N(X\beta, \sigma^2 I_n).$

Note that $\beta$ and $\sigma^2$ are unknown but X is known.

Estimable functions:

A linear parametric function $\lambda'\beta$ of the parameter $\beta$ is said to be an estimable parametric function (or estimable) if there exists a linear function $L'Y$ of the random variables $Y = (Y_1, Y_2, \ldots, Y_n)'$ such that

$E(L'Y) = \lambda'\beta,$

with $L = (l_1, l_2, \ldots, l_n)'$ and $\lambda = (\lambda_1, \lambda_2, \ldots, \lambda_p)'$ being vectors of known scalars.

Best linear unbiased estimates (BLUE)

The unbiased minimum variance linear estimate of an estimable function $\lambda'\beta$ is called the best linear unbiased estimate of $\lambda'\beta$.

· Suppose $L_1'Y$ and $L_2'Y$ are the BLUEs of $\lambda_1'\beta$ and $\lambda_2'\beta$ respectively. Then $(a_1 L_1 + a_2 L_2)'Y$ is the BLUE of $(a_1\lambda_1 + a_2\lambda_2)'\beta$.

· If $\lambda'\beta$ is estimable, its best estimate is $\lambda'\hat\beta$, where $\hat\beta$ is any solution of the equations

$X'X\beta = X'Y.$

Least squares estimation

The least-squares estimate of $\beta$ is the value $\hat\beta$ of $\beta$ which minimizes the error sum of squares $\varepsilon'\varepsilon$.

Let

$S = \varepsilon'\varepsilon = (Y - X\beta)'(Y - X\beta) = Y'Y - 2\beta'X'Y + \beta'X'X\beta.$

Minimizing S with respect to $\beta$ involves

$\frac{\partial S}{\partial \beta} = 0 \;\Rightarrow\; X'X\beta = X'Y,$

which is termed the normal equation. This normal equation has a unique solution given by

$\hat\beta = (X'X)^{-1}X'Y,$

assuming $rank(X) = p$. Note that $\frac{\partial^2 S}{\partial\beta\,\partial\beta'} = 2X'X$ is a positive definite matrix. So $\hat\beta$ is the value of $\beta$ which minimizes $\varepsilon'\varepsilon$ and is termed the ordinary least squares estimator of $\beta$.

· In this case, $\beta_1, \beta_2, \ldots, \beta_p$ are estimable and consequently all the linear parametric functions $\lambda'\beta$ are estimable.

· $E(\hat\beta) = (X'X)^{-1}X'E(Y) = (X'X)^{-1}X'X\beta = \beta$.

· $Var(\hat\beta) = (X'X)^{-1}X'\,Var(Y)\,X(X'X)^{-1} = \sigma^2 (X'X)^{-1}$.

· If $\lambda'\hat\beta$ and $\mu'\hat\beta$ are the estimates of $\lambda'\beta$ and $\mu'\beta$ respectively, then

· $Var(\lambda'\hat\beta) = \sigma^2\,\lambda'(X'X)^{-1}\lambda$, and

· $Cov(\lambda'\hat\beta, \mu'\hat\beta) = \sigma^2\,\mu'(X'X)^{-1}\lambda$.

· $Y - X\hat\beta$ is called the residual vector, and

· $E(Y - X\hat\beta) = 0$.
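The normal equations and the properties of the residual vector can be illustrated numerically; a sketch with simulated data (the true coefficients and noise level are illustrative):

```python
import numpy as np

# Ordinary least squares: beta_hat = (X'X)^{-1} X'Y solves the normal equations.
rng = np.random.default_rng(7)
n, p = 50, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + 0.1 * rng.standard_normal(n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)   # solve X'X beta = X'Y directly

# agrees with the general-purpose least-squares routine
lstsq_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(beta_hat, lstsq_hat)

# the residual vector Y - X beta_hat is orthogonal to the columns of X
resid = Y - X @ beta_hat
assert np.allclose(X.T @ resid, np.zeros(p))
print(beta_hat)
```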

Linear model with correlated observations:

In the linear model

$Y = X\beta + \varepsilon,$

with $E(\varepsilon) = 0$, $Var(\varepsilon) = \Sigma$ and $\varepsilon$ normally distributed, we find

$E(Y) = X\beta, \quad Var(Y) = \Sigma.$

Assuming $\Sigma$ to be positive definite, we can write

$\Sigma^{-1} = P'P,$

where P is a nonsingular matrix. Premultiplying $Y = X\beta + \varepsilon$ by P, we get

$PY = PX\beta + P\varepsilon \quad\text{or}\quad Y^* = X^*\beta + \varepsilon^*,$

where $Y^* = PY$, $X^* = PX$ and $\varepsilon^* = P\varepsilon$. Note that in this model

$E(\varepsilon^*) = 0 \quad\text{and}\quad Var(\varepsilon^*) = P\Sigma P' = I.$

Distribution of $\lambda'Y$:

In the linear model $Y = X\beta + \varepsilon$, $\varepsilon \sim N(0, \sigma^2 I)$, consider a linear function $\lambda'Y$, which is normally distributed with

$E(\lambda'Y) = \lambda'X\beta, \quad Var(\lambda'Y) = \sigma^2\,\lambda'\lambda.$

Then

$\frac{\lambda'Y}{\sigma\sqrt{\lambda'\lambda}} \sim N\left(\frac{\lambda'X\beta}{\sigma\sqrt{\lambda'\lambda}},\; 1\right).$

Further, $\frac{(\lambda'Y)^2}{\sigma^2\,\lambda'\lambda}$ has a noncentral chi-square distribution with one degree of freedom and noncentrality parameter $\frac{(\lambda'X\beta)^2}{\sigma^2\,\lambda'\lambda}$.

Degrees of freedom:

A linear function $\lambda'Y$ of the observations, with $\lambda \neq 0$, is said to carry one degree of freedom. A set of linear functions $L'Y$, where L is an $r \times n$ matrix, is said to have M degrees of freedom if there exist M linearly independent functions in the set and no more. Alternatively, the degrees of freedom carried by the set equal $rank(L)$. When the set $L'Y$ are the estimates of $\Lambda'\beta$, the degrees of freedom of the set will also be called the degrees of freedom for the estimates of $\Lambda'\beta$.

Sum of squares:

If $\lambda'Y$ is a linear function of the observations, then the projection of Y on $\lambda$ is the vector $\frac{\lambda'Y}{\lambda'\lambda}\lambda$. The squared length of this projection, $\frac{(\lambda'Y)^2}{\lambda'\lambda}$, is called the sum of squares (SS) due to $\lambda'Y$. Since $\lambda'Y$ has one degree of freedom, the SS due to $\lambda'Y$ has one degree of freedom.

The sums of squares and the degrees of freedom arising out of mutually orthogonal sets of functions can be added together to give the sum of squares and degrees of freedom for the set of all the functions together, and vice versa.
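The projection definition of a sum of squares, and the additivity over orthogonal functions, can be illustrated as follows (the data vector and directions are illustrative):

```python
import numpy as np

Y = np.array([3.0, 1.0, 4.0, 1.0])
l1 = np.ones(4)                          # overall-mean direction
l2 = np.array([1.0, -1.0, 1.0, -1.0])    # a contrast orthogonal to l1

def ss(l, Y):
    """Sum of squares due to l'Y: the squared projection (l'Y)^2 / (l'l)."""
    return (l @ Y)**2 / (l @ l)

print(ss(l1, Y), ss(l2, Y))

# orthogonal directions: the individual SS add to the SS of the joint projection
P = np.column_stack([l1, l2])
proj = P @ np.linalg.solve(P.T @ P, P.T @ Y)   # projection of Y onto span{l1, l2}
assert np.isclose(ss(l1, Y) + ss(l2, Y), proj @ proj)
print(proj @ proj)
```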


Fisher-Cochran theorem

If $X = (X_1, X_2, \ldots, X_n)'$ has a multivariate normal distribution with mean vector $\mu$ and positive definite covariance matrix $\Sigma$, and let

$X'\Sigma^{-1}X = Q_1 + Q_2 + \ldots + Q_k,$

where $Q_i = X'A_iX$ with $rank(A_i) = N_i$, $i = 1, 2, \ldots, k$, then $Q_1, Q_2, \ldots, Q_k$ are independently distributed as noncentral chi-square with $N_i$ degrees of freedom and noncentrality parameter $\mu'A_i\mu$ if and only if $\sum_{i=1}^{k} N_i = n$, in which case $\sum_{i=1}^{k} A_i = \Sigma^{-1}$.

Derivatives of quadratic and linear forms:

Let $X = (x_1, x_2, \ldots, x_n)'$ and let f(X) be any function of the n independent variables $x_1, x_2, \ldots, x_n$; then

$\frac{\partial f(X)}{\partial X} = \left(\frac{\partial f(X)}{\partial x_1}, \frac{\partial f(X)}{\partial x_2}, \ldots, \frac{\partial f(X)}{\partial x_n}\right)'.$

If $K = (k_1, k_2, \ldots, k_n)'$ is a vector of constants, then

$\frac{\partial K'X}{\partial X} = K.$

If A is an $n \times n$ matrix, then

$\frac{\partial X'AX}{\partial X} = (A + A')X,$

which reduces to $2AX$ when A is symmetric.
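The rule $\partial(X'AX)/\partial X = (A + A')X$ can be confirmed with a central-difference gradient check (A and x are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

analytic = (A + A.T) @ x          # reduces to 2 A x when A is symmetric

# central-difference approximation of the gradient of f(x) = x'Ax
h = 1e-6
numeric = np.empty(n)
for i in range(n):
    e = np.zeros(n)
    e[i] = h
    f_plus = (x + e) @ A @ (x + e)
    f_minus = (x - e) @ A @ (x - e)
    numeric[i] = (f_plus - f_minus) / (2 * h)

assert np.allclose(analytic, numeric, atol=1e-5)
print("d(X'AX)/dX = (A + A')X verified numerically")
```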

Independence of linear and quadratic forms:

· Let Y be an $n \times 1$ vector having a multivariate normal distribution $N(\mu, I)$ and let B be an $m \times n$ matrix. Then the linear form BY is independent of the quadratic form $Y'AY$ if BA = 0, where A is a symmetric matrix of known elements.

· Let Y be a vector having a multivariate normal distribution with $Var(Y) = \Sigma$. If $B\Sigma A = 0$, then the quadratic form $Y'AY$ is independent of the linear form BY, where B is an $m \times n$ matrix.
