
Vector Space


Study Materials for 1st Year B.Tech Students

Paper Name: Mathematics

Paper Code: M201

Teacher Name: Rahuldeb Das

Lecture 1

Definition

DEFINITION (Vector Space)   A vector space V over a field F, denoted V(F), is a non-empty set satisfying the following axioms:

1. VECTOR ADDITION: To every pair u, v ∈ V there corresponds a unique element u ⊕ v in V such that

1. u ⊕ v = v ⊕ u (Commutative law).

2. (u ⊕ v) ⊕ w = u ⊕ (v ⊕ w) (Associative law).

3. There is a unique element 0 in V (the zero vector) such that u ⊕ 0 = u for every u ∈ V (called the additive identity).

4. For every u ∈ V there is a unique element -u ∈ V such that u ⊕ (-u) = 0 (called the additive inverse).

⊕ is called VECTOR ADDITION.

2. SCALAR MULTIPLICATION: For each u ∈ V and α ∈ F there corresponds a unique element α ⊙ u in V such that

1. α ⊙ (β ⊙ u) = (αβ) ⊙ u for every α, β ∈ F and u ∈ V, and

2. 1 ⊙ u = u for every u ∈ V, where 1 ∈ F is the multiplicative identity.

3. DISTRIBUTIVE LAWS: RELATING VECTOR ADDITION WITH SCALAR MULTIPLICATION

For any α, β ∈ F and u, v ∈ V, the following distributive laws hold:

1. α ⊙ (u ⊕ v) = (α ⊙ u) ⊕ (α ⊙ v).

2. (α + β) ⊙ u = (α ⊙ u) ⊕ (β ⊙ u).


Note: the number 0 is the zero element of the field F, whereas the symbol 0 in V denotes the zero vector.

Remark   The elements of F are called SCALARS, and those of V are called VECTORS. If F = R, the vector space is called a REAL VECTOR SPACE. If F = C, the vector space is called a complex vector space.

We may sometimes write V for a vector space V(F) if F is understood from the context.

THEOREM   Let V be a vector space over F. Then

1. u + v = u + w implies v = w.

2. α · u = 0 if and only if either u is the zero vector or α = 0.

3. (-1) · u = -u for every u ∈ V.

Proof. Proof of Part 1.

For u ∈ V, by Axiom 1d there exists -u ∈ V such that u + (-u) = 0.

Hence, u + v = u + w is equivalent to

-u + (u + v) = -u + (u + w) ⟺ (-u + u) + v = (-u + u) + w ⟺ 0 + v = 0 + w ⟺ v = w.

Proof of Part 2.

As 0 = 0 + 0, using the distributive law, we have

α · 0 = α · (0 + 0) = α · 0 + α · 0.

Thus, for any α ∈ F, the first part implies α · 0 = 0. In the same way,

0 · u = (0 + 0) · u = 0 · u + 0 · u.

Hence, using the first part, one has 0 · u = 0 for any u ∈ V.

Now suppose α · u = 0. If α = 0 then the proof is over. Therefore, let us assume α ≠ 0 (note that α is a real or complex number, hence 1/α exists), and

u = 1 · u = ((1/α) · α) · u = (1/α) · (α · u) = (1/α) · 0 = 0,

as α · 0 = 0 for every vector. Thus we have shown that if α ≠ 0 and α · u = 0, then u = 0.

Proof of Part 3.

We have 0 = 0 · u = (1 + (-1)) · u = u + (-1) · u, and hence (-1) · u = -u. ∎

Examples

1. The set R of real numbers, with the usual addition and multiplication (i.e., ⊕ = + and ⊙ = ·), forms a vector space over R.

2. Consider the set R^2 = {(x1, x2) : x1, x2 ∈ R}. For x = (x1, x2), y = (y1, y2) ∈ R^2 and α ∈ R, define

x + y = (x1 + y1, x2 + y2) and α · x = (αx1, αx2).

Then R^2 is a real vector space.

3. Let R^n = {(a1, a2, ..., an) : ai ∈ R, 1 ≤ i ≤ n} be the set of n-tuples of real numbers. For u = (a1, ..., an), v = (b1, ..., bn) in R^n and α ∈ R we define

u + v = (a1 + b1, ..., an + bn) and α · u = (αa1, ..., αan)

(called component wise or coordinate wise operations). Then R^n is a real vector space with addition and scalar multiplication defined as above. This vector space is denoted by R^n, called the real vector space of n-tuples.

4. Let V = R+ (the set of positive real numbers). This is NOT A VECTOR SPACE under the usual operations of addition and scalar multiplication (why?). We now define a new vector addition and scalar multiplication as

u ⊕ v = u·v and α ⊙ u = u^α for all u, v ∈ R+ and α ∈ R.

Then R+ is a real vector space with 1 as its zero vector.
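The exotic operations on R+ can be checked numerically. Below is a minimal sketch in Python; it assumes the standard version of this example, namely u ⊕ v = uv and α ⊙ u = u^α, and verifies the axioms on a few sample points.

```python
# Numerical check that R+ with u (+) v = u*v and a (.) u = u**a
# satisfies the vector-space axioms.  The operations are the ones
# assumed from the standard version of this example.

def vadd(u, v):      # "vector addition" on positive reals
    return u * v

def smul(a, u):      # "scalar multiplication"
    return u ** a

u, v, w = 2.0, 5.0, 0.5
a, b = 3.0, -1.5
eps = 1e-9

assert abs(vadd(u, v) - vadd(v, u)) < eps                      # commutativity
assert abs(vadd(vadd(u, v), w) - vadd(u, vadd(v, w))) < eps    # associativity
assert abs(vadd(u, 1.0) - u) < eps                             # 1 acts as the zero vector
assert abs(vadd(u, 1.0 / u) - 1.0) < eps                       # additive inverse of u is 1/u
assert abs(smul(a, vadd(u, v)) - vadd(smul(a, u), smul(a, v))) < eps  # first distributive law
assert abs(smul(a + b, u) - vadd(smul(a, u), smul(b, u))) < eps       # second distributive law
assert abs(smul(1.0, u) - u) < eps                             # unit scalar
print("all axioms hold on the sample points")
```

Note how the roles shift: multiplication of positive reals plays the role of addition, and exponentiation plays the role of scaling, so the "zero vector" is the number 1.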


Subspaces

DEFINITION (Vector Subspace)   Let S be a NON-EMPTY SUBSET of V. S(F) is said to be a subspace of V(F) if α·u + β·v ∈ S whenever α, β ∈ F and u, v ∈ S; where the vector addition and scalar multiplication are the same as that of V(F).

Remark   Any subspace is a vector space in its own right with respect to the vector addition and scalar multiplication that is defined for V(F).

EXAMPLE

1. Let V be a vector space. Then

1. the set {0} consisting of the zero vector, and 2. the whole space V

are vector subspaces of V. These are called trivial subspaces.

2. Let S = {(x, y, z) ∈ R^3 : ax + by + cz = 0} for fixed reals a, b, c. Then S is a subspace of R^3. (S is a plane in R^3 passing through the origin.)

3. Let S = {(x, y, z) ∈ R^3 : ax + by + cz = d} with d ≠ 0. Then S is not a subspace of R^3. (S is again a plane in R^3, but it doesn't pass through the origin.)

4. Let Then is a subspace of

5. The vector space is a subspace of the vector space

6. Consider the vector space given in Example 5. Let Then check that

Hence is not a subspace of .


But if we think of as a subset of the real vector space (component wise addition and scalar multiplication), then is a subspace. Check that

is a subspace of ( represents a line passing through

the point , the zero vector of ).

7. Consider the set given in Example 9. Then is a real as well as a

complex vector space Let be a subset of . Then

verify that is a vector subspace of the real vector space . The reason

being the following: the condition implies that for any scalar , as
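Testing whether a candidate set is a subspace comes down to checking closure under the combination αu + βv. A small sketch, using a hypothetical plane x + y - z = 0 through the origin (a subspace) and its shifted copy x + y - z = 1 (not a subspace):

```python
# Closure check for S = {(x, y, z) : x + y - z = 0}, a plane through
# the origin (a hypothetical concrete choice), versus
# T = {(x, y, z) : x + y - z = 1}, which misses the origin.

def in_S(p):
    x, y, z = p
    return abs(x + y - z) < 1e-9

def in_T(p):
    x, y, z = p
    return abs(x + y - z - 1.0) < 1e-9

def comb(a, u, b, v):
    """Return a*u + b*v componentwise."""
    return tuple(a * ui + b * vi for ui, vi in zip(u, v))

u, v = (1.0, 2.0, 3.0), (4.0, -1.0, 3.0)    # both lie on S
assert in_S(u) and in_S(v)
assert in_S(comb(2.5, u, -7.0, v))          # S is closed: a subspace

p, q = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)     # both lie on T
assert in_T(p) and in_T(q)
assert not in_T(comb(1.0, p, 1.0, q))       # p + q leaves T: not a subspace
print("S closed under linear combinations; T is not")
```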

Assignment

1. Let be the set of all real numbers. Define the addition in by

and the scalar multiplication by Prove that is a real vector space with respect to the operations defined above.

2. Which of the following are correct statements?

1. Let Then is a subspace of

2. Let be a vector space. Let Then the set forms a vector subspace of

3. Let Then is a subspace of the

real vector space,

Multiple Choice Questions:

1. If in a vector space c·α = θ, then

a) c = 0 and α = θ   b) only c = 0

c) only α = θ   d) c = 0 or α = θ

2. If S and T are two subspaces of a vector space V, then which one of the following is also a subspace of V?

a) S ∪ T   b) S ∩ T

c) S - T   d) T - S

3. If a finite set has 5 elements, then its power set has

a) 5!   b) 2^5

c) 10   d) None

Lecture 2

Linear Combinations

DEFINITION (Linear Span)   Let V(F) be a vector space and let S = {u1, u2, ..., un} be a non-empty subset of V. The linear span of S is the set L(S) defined by

L(S) = {α1·u1 + α2·u2 + ... + αn·un : αi ∈ F, 1 ≤ i ≤ n}.

If S is an empty set, we define L(S) = {0}.

EXAMPLE

1. Is a linear combination of and

Solution: We want to find such that

(3.1.1)


2. Verify that is not a linear combination of the vectors and

?

3. The linear span of over is

 

   

   

4.

as , and if , take and

.
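Deciding whether a vector is a linear combination of two independent vectors v1, v2 in R^3 can be done without solving the system explicitly: b lies in span{v1, v2} exactly when the 3x3 determinant with rows v1, v2, b vanishes. A sketch with hypothetical vectors (not the missing ones from the examples above):

```python
# b lies in span{v1, v2} in R^3 (v1, v2 independent) exactly when
# the 3x3 determinant with rows v1, v2, b is zero.

def det3(u, v, w):
    """Determinant of the 3x3 matrix with rows u, v, w."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

v1, v2 = (1, 0, 1), (0, 1, 1)
b_in  = (2, 3, 5)     # equals 2*v1 + 3*v2, so it lies in the span
b_out = (1, 1, 3)     # breaks the pattern z = x + y forced by v1, v2

assert det3(v1, v2, b_in) == 0    # a linear combination exists
assert det3(v1, v2, b_out) != 0   # not a linear combination
print("b_in is in span{v1, v2}; b_out is not")
```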

LEMMA (Linear Span is a subspace)   Let V(F) be a vector space and let S be a non-empty subset of V. Then L(S) is a subspace of V(F).

Proof. By definition, S ⊆ L(S) and hence L(S) is a non-empty subset of V. Let u, w ∈ L(S). Then, for 1 ≤ i ≤ n, there exist vectors ui, wi ∈ S and scalars αi, βi ∈ F such that u = α1·u1 + ... + αn·un and w = β1·w1 + ... + βn·wn. Hence, for any a, b ∈ F,

a·u + b·w = (aα1)·u1 + ... + (aαn)·un + (bβ1)·w1 + ... + (bβn)·wn ∈ L(S).

Thus, L(S) is a vector subspace of V(F).

Remark   Let V(F) be a vector space and W ⊆ V be a subspace. If S ⊆ W, then L(S) ⊆ W is a subspace of W, as W is a vector space in its own right.

THEOREM   Let S be a non-empty subset of a vector space V. Then L(S) is the smallest subspace of V containing S.

Proof. For every u ∈ S, u = 1·u ∈ L(S), and therefore S ⊆ L(S). To show L(S) is the smallest subspace of V containing S, consider any subspace W of V containing S. Then by the previous Remark, L(S) ⊆ W, and hence the result follows.

DEFINITION   Let A be an m × n matrix with real entries. Then using the rows a1, ..., am and columns b1, ..., bn of A, we define

1. RowSpace(A) = L(a1, ..., am),

2. ColumnSpace(A) = L(b1, ..., bn),

3. NullSpace(A) = {x ∈ R^n : Ax = 0}, denoted N(A), and

4. Range(A) = {y ∈ R^m : y = Ax for some x ∈ R^n}, denoted Im(A).

Note that the "column space" of a matrix A consists of all b such that Ax = b has a solution. Hence, ColumnSpace(A) = Range(A).

LEMMA   Let A be a real m × n matrix. Suppose B = EA for some elementary matrix E. Then RowSpace(A) = RowSpace(B).

THEOREM   Let A be an m × n matrix with real entries. Then

1. RowSpace(A) is a subspace of R^n; 2. the non-zero row vectors of a matrix in row-reduced form form a basis for the row-space.

Proof. Part 1 can be easily proved. For part 2, let B be the row-reduced form of A with non-zero rows b1, ..., br. Then B = Ek ... E2 E1 A for some elementary matrices E1, E2, ..., Ek. Then, a repeated application of the Lemma implies RowSpace(A) = RowSpace(B). That is, if the rows of A are a1, ..., am, then

L(a1, ..., am) = L(b1, ..., br).

Hence the required result follows.
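The row-reduction in this theorem can be carried out mechanically; a hand-rolled Gaussian-elimination sketch over exact rationals, applied to a hypothetical matrix whose second row duplicates the first:

```python
# Row-reduce a small rational matrix; the surviving non-zero rows
# form a basis of the row space.  The matrix A is a hypothetical example.
from fractions import Fraction as Fr

def row_reduce(M):
    """Return the row-reduced form of M (entries become Fractions)."""
    M = [[Fr(x) for x in row] for row in M]
    pivot_row = 0
    for col in range(len(M[0])):
        # find a pivot in this column at or below pivot_row
        pr = next((r for r in range(pivot_row, len(M)) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        p = M[pivot_row][col]
        M[pivot_row] = [x / p for x in M[pivot_row]]       # scale pivot to 1
        for r in range(len(M)):                            # clear the column
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

A = [[1, 2, 3],
     [2, 4, 6],     # 2 * row 1, so it vanishes under reduction
     [1, 0, 1]]
R = row_reduce(A)
nonzero = [row for row in R if any(x != 0 for x in row)]
print(len(nonzero))   # prints 2: the dimension of the row space
```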

Linear Independence

DEFINITION (Linear Independence and Dependence)   Let S = {u1, u2, ..., um} be any non-empty subset of V. If there exist some non-zero αi's, 1 ≤ i ≤ m, such that

α1·u1 + α2·u2 + ... + αm·um = 0,

then the set S is called a linearly dependent set. Otherwise, the set S is called linearly independent.

EXAMPLE  

1. Let Then check that

Since and

is a solution of, so the set is a linearly dependent subset of

2. Let Suppose there exists such that

Then check that in this case we

necessarily have which shows that the set

is a linearly independent subset of

In other words, if S = {u1, u2, ..., um} is a non-empty subset of a vector space, then to check whether the set S is linearly dependent or independent, one needs to consider the equation

α1·u1 + α2·u2 + ... + αm·um = 0.   (1)

In case α1 = α2 = ... = αm = 0 is THE ONLY SOLUTION of (1), the set S becomes a linearly independent subset of V. Otherwise, the set S becomes a linearly dependent subset of V.
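For three vectors in R^3, equation (1) has only the trivial solution exactly when the determinant of the matrix with those rows is non-zero. A quick sketch with hypothetical sample vectors:

```python
# Independence test for three vectors in R^3: they are linearly
# independent iff the 3x3 determinant with those rows is nonzero.

def det3(u, v, w):
    """Determinant of the 3x3 matrix with rows u, v, w."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

indep = [(1, 0, 0), (1, 1, 0), (1, 1, 1)]
dep   = [(1, 2, 3), (2, 4, 6), (0, 1, 1)]   # second row = 2 * first row

assert det3(*indep) != 0    # only the trivial solution of (1) exists
assert det3(*dep) == 0      # a non-trivial dependence relation exists
print("independence detected correctly")
```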

PROPOSITION   Let V be a vector space.

1. Then the zero vector cannot belong to a linearly independent set.

2. If S is a linearly independent subset of V, then every subset of S is also linearly independent.

3. If S is a linearly dependent subset of V, then every set containing S is also linearly dependent.

Proof. We give the proof of the first part. The reader is required to supply the proof of the other parts.

Let S = {0 = u1, u2, ..., un} be a set consisting of the zero vector. Then for any γ ≠ 0,

γ·u1 + 0·u2 + ... + 0·un = 0.

Hence, for the system α1·u1 + α2·u2 + ... + αn·un = 0, we have a non-zero solution α1 = γ and α2 = ... = αn = 0.

Therefore, the set S is linearly dependent.

THEOREM   Let S = {u1, u2, ..., uk} be a linearly independent subset of a vector space V.

Suppose there exists a vector v ∈ V such that the set S ∪ {v} is linearly dependent. Then v is a linear combination of u1, u2, ..., uk.

Proof. Since the set S ∪ {v} is linearly dependent, there exist scalars α1, ..., αk, c, NOT ALL ZERO, such that

α1·u1 + α2·u2 + ... + αk·uk + c·v = 0.   (2)

CLAIM: c ≠ 0.

Let if possible c = 0. Then equation (2) gives α1·u1 + ... + αk·uk = 0 with the αi not all zero. Hence, by the definition of linear independence, the set {u1, ..., uk} is linearly dependent, which is contradictory to our hypothesis. Thus, c ≠ 0 and we get

v = -(1/c)·(α1·u1 + ... + αk·uk).

Note that -(αi/c) ∈ F for every i, 1 ≤ i ≤ k, and hence v is a linear combination of u1, ..., uk. Hence the result follows.


We now state two important corollaries of the above theorem. We don't give their proofs as they are easy consequence of the above theorem.

COROLLARY   Let S = {u1, u2, ..., un} be a linearly dependent subset of a vector space V. Then there exists a smallest k, 2 ≤ k ≤ n, such that L(u1, ..., uk) = L(u1, ..., u(k-1)).

COROLLARY   Let S = {u1, u2, ..., uk} be a linearly independent subset of a vector space V.

Suppose there exists a vector v ∈ V such that v ∉ L(S). Then the set S ∪ {v} is also a linearly independent subset of V.

Assignment

1. Show that any two row-equivalent matrices have the same row space. Give

examples to show that the column space of two row-equivalent matrices need not be same.

2. Find all the vector subspaces of

3. Find the conditions on the real numbers so that the set

is a subspace of .

4. Show that the vector space is a subspace of . Further show that if

consists of all polynomials of degree , then is not a subspace. 5. Consider the vector space given in Example 5. Determine all its vector

subspaces.

6. Let and be two subspaces of a vector space Show that is a

subspace of Also show that need not be a subspace of When is

a subspace of

7. Let and be two subspaces of a vector space Define

Show that is a subspace of Also

show that


8. Let where

Determine

all such that

9. Consider the vector space Let Find all choices for the vector

such that the set is linear independent subset of Does there exist

choices for vectors and such that the set is linearly independent subset of ?

10. If none of the elements appearing along the principal diagonal of a lower triangular matrix is zero, show that the row vectors are linearly independent in

The same is true for column vectors.

11. Let Determine whether or not

the vector

12. Show that is linearly dependent in

13. Show that is a linearly independent set in In

general if is a linearly independent set then

is also a linearly independent set.

14. In give an example of vectors and such that is linearly

dependent but any set of vectors from is linearly independent. 15. What is the maximum number of linearly independent vectors in

Multiple Choice Questions

1. If α, β are two vectors of a vector space, then indicate which one of the following is not a linear combination of α and β

a) 3α - 4β   b) α

c) None   d) α·β


2. The maximum number of independent vectors among the four (1,2,3,4), (0,4,3,1), (0,0,7,1) and (0,0,0,0) is

a) 4 b) 3

c) 2 d) 1

3. The maximum number of independent vectors in

V = {(x1, x2, x3, x4, x5) : xi ∈ R} is

a) 4 b) 3

c) 5 d) None

Lecture 3

Bases

DEFINITION (Basis of a Vector Space)

1. A non-empty subset B of a vector space V is called a basis of V if

1. B is a linearly independent set, and

2. L(B) = V, i.e., every vector in V can be expressed as a linear combination of the elements of B.

2. A vector in B is called a basis vector.

Remark   Let B = {u1, u2, ..., un} be a basis of a vector space V. Then any v ∈ V is a unique linear combination of the basis vectors.

Observe that if there exists a v ∈ V such that v = α1·u1 + ... + αn·un and v = β1·u1 + ... + βn·un, then

0 = v - v = (α1 - β1)·u1 + ... + (αn - βn)·un.

But the set {u1, ..., un} is linearly independent, and therefore the scalars αi - βi for 1 ≤ i ≤ n must all be equal to zero. Hence, αi = βi for 1 ≤ i ≤ n, and we have the uniqueness.


By convention, the linear span of an empty set is {0}. Hence, the empty set is a basis of the vector space {0}.

EXAMPLE 

1. Check that if then or

or or are bases of

2. For 1 ≤ i ≤ n, let ei = (0, ..., 0, 1, 0, ..., 0) be the n-tuple with 1 in the i-th place and 0 elsewhere. Then, the set B = {e1, e2, ..., en} forms a basis of R^n. This set is called the standard basis of R^n.

That is, if n = 3, then the set {(1, 0, 0), (0, 1, 0), (0, 0, 1)} forms the standard basis of R^3.

3. Let be a vector subspace of

Then It can be easily verified that the

vector and

Then by Remark, cannot be a basis of

A basis of can be obtained by the following method:

The condition is equivalent to we replace the value of

with to get

Hence, forms a basis of


4. Let V = {a + ib : a, b ∈ R} = C and F = C. That is, V is a complex vector space. Note that any element a + ib ∈ V can be written as a + ib = (a + ib)·1. Hence, a basis of V is {1}.

5. Let V = {a + ib : a, b ∈ R} = C and F = R. That is, V is a real vector space. Any element a + ib ∈ V is expressible as a·1 + b·i. Hence a basis of V is {1, i}.

Observe that i is a vector in the real vector space C. Also, i is not a real scalar, and hence the scalar multiple i·v is not defined in this space.

6. Recall the vector space P(R), the vector space of all polynomials with real coefficients. A basis of this vector space is the set

{1, x, x^2, ...}.

This basis has an infinite number of vectors as the degree of the polynomial can be any positive integer.
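Once a basis is fixed, every vector has unique coordinates with respect to it. A sketch computing those coordinates for a hypothetical basis {b1, b2} of R^2 via Cramer's rule, then confirming that they reproduce the vector:

```python
# Coordinates of v with respect to a (hypothetical) basis {b1, b2}
# of R^2, found by solving x1*b1 + x2*b2 = v with Cramer's rule.
from fractions import Fraction as Fr

def coords(b1, b2, v):
    """Return (x1, x2) with x1*b1 + x2*b2 = v; b1, b2 must be independent."""
    det = b1[0] * b2[1] - b1[1] * b2[0]
    assert det != 0, "b1, b2 do not form a basis"
    x1 = Fr(v[0] * b2[1] - v[1] * b2[0], det)
    x2 = Fr(b1[0] * v[1] - b1[1] * v[0], det)
    return x1, x2

b1, b2 = (1, 1), (1, -1)
v = (5, 1)
x1, x2 = coords(b1, b2, v)
# the coordinates really do rebuild v, and they are unique for this basis
assert (x1 * b1[0] + x2 * b2[0], x1 * b1[1] + x2 * b2[1]) == v
print(x1, x2)   # prints 3 2
```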

DEFINITION (Finite Dimensional Vector Space)   A vector space V is said to be finite dimensional if there exists a basis consisting of a finite number of elements. Otherwise, the vector space V is called infinite dimensional.

Remark   We can use the above results to obtain a basis of any finite dimensional vector space V as follows:

Step 1: Choose a non-zero vector, say, u1 ∈ V. Then the set {u1} is linearly independent.

Step 2: If V = L(u1), we have got a basis of V. Else there exists a vector, say, u2 ∈ V, such that u2 ∉ L(u1). Then by the Corollary above, the set {u1, u2} is linearly independent.

Step 3: If V = L(u1, u2), then {u1, u2} is a basis of V. Else there exists a vector, say, u3 ∈ V, such that u3 ∉ L(u1, u2). So, by the Corollary, the set {u1, u2, u3} is linearly independent.

At the i-th step, either V = L(u1, ..., ui) or V ≠ L(u1, ..., ui).

In the first case, we have {u1, ..., ui} as a basis of V.

In the second case, L(u1, ..., ui) is a proper subspace of V. So, we choose a vector, say, u(i+1) ∈ V, such that u(i+1) ∉ L(u1, ..., ui). Therefore, by the Corollary, the set {u1, ..., u(i+1)} is linearly independent.

This process will finally end as V is a finite dimensional vector space.
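The stepwise procedure above can be sketched as a greedy scan: keep a candidate vector whenever it is not already in the span of those kept so far (detected by a rank computation). The spanning set below is a hypothetical example with built-in redundancies.

```python
# Greedy basis building in R^n: keep v whenever it enlarges the span,
# i.e. whenever appending v increases the rank.
from fractions import Fraction as Fr

def rank(rows):
    """Rank of a list of vectors via Gaussian elimination."""
    M = [[Fr(x) for x in r] for r in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def build_basis(candidates):
    basis = []
    for v in candidates:
        if rank(basis + [v]) > rank(basis):   # v not in the current span
            basis.append(v)
    return basis

# hypothetical spanning set with redundancies
S = [(1, 0, 0), (2, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 5)]
B = build_basis(S)
print(B)   # keeps (1, 0, 0), (0, 1, 0), (0, 0, 5)
```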

Assignment

1. Let be a subset of a vector space Suppose

but is not a linearly independent set. Then prove that each vector in can be expressed in more than one way as a linear combination of vectors from

2. Show that the set is a basis of 3. Let be a matrix of rank Then show that the non-zero rows in the row-

reduced echelon form of are linearly independent and they form a basis of the row space of

Multiple Choice Questions

1. The value of k for which the two vectors (k, 6) and (2, k) form a basis of V2 is

a) 2√3   b) -2√3

c) any value   d) any value except ±2√3

2. The vectors (1, 2, 3) and (4, -2, 7) are

a) linearly independent b) linearly dependent

c) form a basis of V3 d) None

3. The dimension of the subspace {(x1, x2, x3) : xi ∈ R and 2x1 + x2 - x3 = 0} is


a) 2 b) 1

c) 3 d) None

Lecture 4

Basic Properties

DEFINITION (Linear Transformation)   Let V and W be vector spaces over F. A map T : V → W is called a linear transformation if

T(α·u + β·v) = α·T(u) + β·T(v) for all α, β ∈ F and u, v ∈ V.

We now give a few examples of linear transformations.

EXAMPLE

1. Define by for all Then is a linear transformation as

2. Verify that the maps given below from to are linear transformations. Let

1. Define

2. For any define

3. For a fixed vector define

Note that examples and can be obtained by assigning particular values for the vector

3. Define by

Then is a linear transformation with and
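Linearity can be spot-checked numerically: T(au + bv) must equal aT(u) + bT(v) for every choice of scalars and vectors. A sketch for the hypothetical map T(x, y) = (x + y, 2x - y) on random integer samples:

```python
# Linearity check T(a*u + b*v) == a*T(u) + b*T(v) on random samples
# for the hypothetical map T(x, y) = (x + y, 2x - y).
import random

def T(p):
    x, y = p
    return (x + y, 2 * x - y)

def comb(a, u, b, v):
    """Return a*u + b*v componentwise."""
    return tuple(a * ui + b * vi for ui, vi in zip(u, v))

random.seed(0)
for _ in range(100):
    u = (random.randint(-9, 9), random.randint(-9, 9))
    v = (random.randint(-9, 9), random.randint(-9, 9))
    a, b = random.randint(-9, 9), random.randint(-9, 9)
    assert T(comb(a, u, b, v)) == comb(a, T(u), b, T(v))
print("T respects linear combinations on all samples")
```

Such a random check cannot prove linearity, but a single failing sample disproves it, which makes it a useful filter for exercises like the ones below.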


DEFINITION (Zero Transformation)   Let V be a vector space and let T : V → W be the map defined by T(v) = 0 for every v ∈ V. Then T is a linear transformation. Such a linear transformation is called the zero transformation and is denoted by 0.

DEFINITION (Identity Transformation)   Let V be a vector space and let T : V → V be the map defined by T(v) = v for every v ∈ V. Then T is a linear transformation. Such a linear transformation is called the Identity transformation and is denoted by I.

THEOREM   Let T : V → W be a linear transformation and B = {u1, u2, ..., un} be an ordered basis of V. Then for each v ∈ V, the vector T(v) is a linear combination of the vectors T(u1), T(u2), ..., T(un).

In other words, T is determined by T(u1), ..., T(un).

Proof. Since B is a basis of V, for any v ∈ V there exist scalars α1, ..., αn such that v = α1·u1 + ... + αn·un. So, by the definition of a linear transformation,

T(v) = T(α1·u1 + ... + αn·un) = α1·T(u1) + ... + αn·T(un).

Observe that, given v ∈ V, we know the scalars α1, ..., αn. Therefore, to know T(v), we just need to know the vectors T(u1), ..., T(un) in W.

That is, for every v ∈ V, T(v) is determined by the coordinates α1, ..., αn of v with respect to the ordered basis B and the vectors T(u1), ..., T(un).

DEFINITION (Inverse Linear Transformation)   Let T : V → W be a linear transformation. If the map T is one-one and onto, then the map T⁻¹ : W → V defined by

T⁻¹(w) = v whenever T(v) = w

is called the inverse of the linear transformation T.

EXAMPLE  


1. Define by Then is defined by

Note that

 

   

   

Hence, the identity transformation. Verify that Thus,

the map is indeed the inverse of the linear transformation

2. Recall the vector space and the linear transformation

defined by

for Then is defined as

for Verify that Hence, conclude that the map is indeed the inverse of the linear transformation

Assignment  1. Which of the following are linear transformations? Justify your

answers.


1. Let and with

2. Let with

3. Let with 4. Let and with

Lecture 5

Rank-Nullity Theorem

DEFINITION (Range and Null Space)   Let V, W be finite dimensional vector spaces over the same set of scalars and T : V → W be a linear transformation. We define

1. Range(T) = {T(v) : v ∈ V}, and

2. NullSpace(T) = {v ∈ V : T(v) = 0}.

We now prove some results associated with the above definitions.

PROPOSITION   Let V and W be finite dimensional vector spaces and let T : V → W be a linear transformation. Suppose that (u1, u2, ..., un) is an ordered basis of V. Then

1.

1. Range(T) is a subspace of W.

2. Range(T) = L(T(u1), T(u2), ..., T(un)).

2.

1. NullSpace(T) is a subspace of V.

3. T is one-one ⟺ NullSpace(T) is the zero subspace of V ⟺ {T(u1), ..., T(un)} is a basis of Range(T).

4. dim(Range(T)) = dim(V) if and only if NullSpace(T) = {0}.

Proof. The results about Range(T) and NullSpace(T) can be easily proved. We thus leave the proof for the readers.

We now assume that T is one-one. We need to show that NullSpace(T) = {0}.

Let u ∈ NullSpace(T). Then by definition, T(u) = 0. Also, for any linear transformation, T(0) = 0.

Thus T(u) = T(0). So, T is one-one implies u = 0. That is, NullSpace(T) = {0}.

Now let NullSpace(T) = {0}. We need to show that T is one-one. So, let us assume that T(u) = T(v) for some u, v ∈ V.

Then, by linearity of T, T(u - v) = 0. This implies u - v ∈ NullSpace(T) = {0}.

This in turn implies u = v. Hence, T is one-one.

The other parts can be similarly proved.

Remark

1. The space Range(T) is called the RANGE SPACE of T, and NullSpace(T) is called the NULL SPACE of T.

2. We write R(T) = Range(T) and N(T) = NullSpace(T).

3. dim(R(T)) is called the rank of the linear transformation T, and dim(N(T)) is called the nullity of T.

EXAMPLE  Determine the range and null space of the linear transformation T.

Solution: By definition, Range(T) = {T(v) : v ∈ V} and NullSpace(T) = {v ∈ V : T(v) = 0}; both are computed directly from the rule defining T.

THEOREM (Rank Nullity Theorem)   Let T : V → W be a linear transformation and V be a finite dimensional vector space. Then

dim(Range(T)) + dim(NullSpace(T)) = dim(V),

or equivalently, rank(T) + nullity(T) = dim(V).

Proof. Let dim(V) = n and dim(NullSpace(T)) = r. Suppose {u1, u2, ..., ur} is a basis of NullSpace(T).

Since {u1, ..., ur} is a linearly independent set in V, we can extend it to form a basis of V. So, there exist vectors u(r+1), ..., un such that {u1, ..., ur, u(r+1), ..., un} is a basis of V. Therefore,

Range(T) = L(T(u1), ..., T(un)) = L(T(u(r+1)), ..., T(un)),

since T(ui) = 0 for 1 ≤ i ≤ r.

We now prove that the set {T(u(r+1)), ..., T(un)} is linearly independent. Suppose the set is not linearly independent. Then, there exist scalars α(r+1), ..., αn, not all zero, such that

α(r+1)·T(u(r+1)) + ... + αn·T(un) = 0.

That is, T(α(r+1)·u(r+1) + ... + αn·un) = 0.

So, by definition of NullSpace(T), α(r+1)·u(r+1) + ... + αn·un ∈ NullSpace(T).

Hence, there exist scalars α1, ..., αr such that

α(r+1)·u(r+1) + ... + αn·un = α1·u1 + ... + αr·ur.

That is, α1·u1 + ... + αr·ur - α(r+1)·u(r+1) - ... - αn·un = 0.

But the set {u1, ..., un} is a basis of V and so linearly independent. Thus, by definition of linear independence, αi = 0 for all i, a contradiction.

In other words, we have shown that {T(u(r+1)), ..., T(un)} is a basis of Range(T). Hence,

dim(Range(T)) + dim(NullSpace(T)) = (n - r) + r = n = dim(V). ∎
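The theorem can be checked numerically for the map x ↦ Ax: the rank of A is dim(Range), and the number of free columns in the reduced system, n - rank(A), is dim(NullSpace). A sketch with a hypothetical 3 × 4 matrix:

```python
# Rank-nullity check for the map x -> A x on R^4, with a hypothetical
# 3x4 matrix A: rank(A) plus the count of free variables equals n = 4.
from fractions import Fraction as Fr

def rank(rows):
    """Rank via Gaussian elimination over exact rationals."""
    M = [[Fr(x) for x in r] for r in rows]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]   # row 3 = row 1 + row 2, so the rank is 2

n = len(A[0])          # dimension of the domain
rk = rank(A)           # dim Range = 2
nullity = n - rk       # free variables in Ax = 0, i.e. dim NullSpace
assert rk == 2 and nullity == 2
print(rk, nullity)     # prints 2 2
```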

COROLLARY   Let T : V → V be a linear transformation on a finite dimensional vector space V. Then

T is one-one ⟺ T is onto ⟺ T is invertible.

Proof. By the Proposition, T is one-one if and only if NullSpace(T) = {0}. By the rank-nullity Theorem, NullSpace(T) = {0} is equivalent to the condition dim(Range(T)) = dim(V), or equivalently, T is onto.

By definition, T is invertible if T is one-one and onto. But we have shown that T is one-one if and only if T is onto. Thus, we have the last equivalent condition.

Remark   Let V be a finite dimensional vector space and let T : V → V be a linear transformation. If either T is one-one or T is onto, then T is invertible.

The following are some of the consequences of the rank-nullity theorem. The proof is left as an exercise for the reader.

COROLLARY  The following are equivalent for an m × n real matrix A of rank r:

1. rank(A) = r. 2. There exist exactly r rows of A that are linearly independent. 3. There exist exactly r columns of A that are linearly independent.

4. There is an r × r submatrix of A with non-zero determinant, and every (r+1) × (r+1) submatrix of A has zero determinant.

5. The dimension of the range space of A is r. 6. There is a subset of R^m consisting of exactly r linearly independent vectors b such that the system Ax = b is consistent.

7. The dimension of the null space of A is n - r.

Assignment

1. Let be a linear transformation and let be

linearly independent in Prove that is linearly independent.

2. Let be defined by

Then the vectors and are linearly independent whereas and

are linearly dependent.

3. Is there a linear transformation

4. Let be defined by


1. Find for

2. Find and Also calculate and 3. Show that and find the matrix of the linear transformation with

respect to the standard basis.

5. Let be a linear transformation.

1. If is finite dimensional then show that the null space and the range space of are also finite dimensional.

2. If and are both finite dimensional then show that

1. if then is onto.

2. if then is not one-one.

6. Let be an real matrix. Then

3. if then the system has infinitely many solutions,

4. if then there exists a non-zero vector such that the system does not have any solution.

7. Let be a vector space of dimension and let be an

ordered basis of . Suppose and let

. Put . Then prove that is a basis of if and only if the matrix is invertible.

8. Let be an matrix. Prove that

Lecture 6

Definition


In R^n, given two vectors x = (x1, ..., xn) and y = (y1, ..., yn), we know the inner product x·y = x1y1 + ... + xnyn. Note that for any x, y, z ∈ R^n and α ∈ R, this inner product satisfies the conditions

x·(y + z) = x·y + x·z,  (αx)·y = α(x·y),  x·y = y·x,  and  x·x ≥ 0,

and x·x = 0 if and only if x = 0. Thus, we are motivated to define an inner product on an arbitrary vector space.

DEFINITION (Inner Product)   Let V(F) be a vector space over F. An inner product over V, denoted by ⟨ , ⟩, is a map

⟨ , ⟩ : V × V → F

such that for u, v, w ∈ V and a, b ∈ F:

1. ⟨a·u + b·v, w⟩ = a⟨u, w⟩ + b⟨v, w⟩,

2. ⟨u, v⟩ equals the complex conjugate of ⟨v, u⟩, and

3. ⟨u, u⟩ ≥ 0 for all u ∈ V, and equality holds if and only if u = 0.

DEFINITION (Inner Product Space)   Let V be a vector space with an inner product ⟨ , ⟩. Then (V, ⟨ , ⟩) is called an inner product space, in short denoted by IPS.

EXAMPLE   The first two examples given below are called the STANDARD INNER PRODUCT or the DOT PRODUCT on R^n and C^n respectively.

1. Let V = R^n be the real vector space of dimension n. Given two vectors u = (u1, ..., un) and v = (v1, ..., vn) of V, we define

⟨u, v⟩ = u1v1 + ... + unvn.

Verify that ⟨ , ⟩ is an inner product.

2. Let V = C^n be a complex vector space of dimension n. Then for u = (u1, ..., un) and v = (v1, ..., vn) in V, check that

⟨u, v⟩ = u1·conj(v1) + ... + un·conj(vn)

is an inner product.

3. Let and let Define Check that is

an inner product. Hint: Note that

4. let Show that

is an inner

product in 5. Consider the real vector space . In this example, we define three products that

satisfy two conditions out of the three conditions for an inner product. Hence the three products are not inner products.

1. Define Then it is easy to verify that the third condition is not valid whereas the first two conditions are valid.

2. Define Then it is easy to verify that the first condition is not valid whereas the second and third conditions are valid.

3. Define Then it is easy to verify that the second condition is not valid whereas the first and third conditions are valid.

DEFINITION (Length/Norm of a Vector)   For u ∈ V, we define the length (norm) of u, denoted ‖u‖, by ‖u‖ = √⟨u, u⟩, the positive square root.

A very useful and fundamental inequality concerning the inner product is due to Cauchy and Schwarz. The next theorem gives the statement and a proof of this inequality.

THEOREM (Cauchy-Schwarz inequality)   Let V(F) be an inner product space. Then for any u, v ∈ V,

|⟨u, v⟩| ≤ ‖u‖ ‖v‖.


The equality holds if and only if the vectors u and v are linearly dependent.

DEFINITION 5.1.8 (Angle between two vectors)   Let V be a real vector space. Then for every pair of non-zero vectors u, v ∈ V, by the Cauchy-Schwarz inequality, we have

-1 ≤ ⟨u, v⟩ / (‖u‖ ‖v‖) ≤ 1.

We know that cos : [0, π] → [-1, 1] is a one-one and onto function. Therefore, for every real number t ∈ [-1, 1] there exists a unique θ ∈ [0, π] such that cos θ = t.

1. The real number θ with 0 ≤ θ ≤ π and satisfying cos θ = ⟨u, v⟩ / (‖u‖ ‖v‖) is called the angle between the two vectors u and v in V.

2. The vectors u and v in V are said to be orthogonal if ⟨u, v⟩ = 0.

3. A set of vectors {u1, u2, ..., un} is called mutually orthogonal if ⟨ui, uj⟩ = 0 for all 1 ≤ i ≠ j ≤ n.
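The Cauchy-Schwarz inequality and its equality case can be checked numerically for the standard inner product on R^3; a minimal sketch (the sample vectors are hypothetical):

```python
# Numerical check of |<u,v>| <= ||u|| ||v|| for the standard inner
# product on R^3, plus the equality case for linearly dependent vectors.
import math
import random

def ip(u, v):
    """Standard inner product on R^3."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(ip(u, u))

random.seed(1)
for _ in range(1000):
    u = [random.uniform(-5, 5) for _ in range(3)]
    v = [random.uniform(-5, 5) for _ in range(3)]
    assert abs(ip(u, v)) <= norm(u) * norm(v) + 1e-9

# equality case: u, v linearly dependent
u = [1.0, 2.0, -2.0]
v = [-3.0, -6.0, 6.0]          # v = -3 * u
assert abs(abs(ip(u, v)) - norm(u) * norm(v)) < 1e-9
print("Cauchy-Schwarz verified on samples; equality for a dependent pair")
```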

DEFINITION (Orthogonal Complement)   Let S be a subspace of a vector space V with inner product ⟨ , ⟩. Then the subspace

S⊥ = {v ∈ V : ⟨v, s⟩ = 0 for all s ∈ S}

is called the orthogonal complement of S in V.

THEOREM   Let V be an inner product space. Let {u1, u2, ..., un} be a set of non-zero, mutually orthogonal vectors of V.

1. Then the set {u1, u2, ..., un} is linearly independent.

2. ‖α1·u1 + ... + αn·un‖^2 = |α1|^2 ‖u1‖^2 + ... + |αn|^2 ‖un‖^2.

3. Let dim(V) = n and also let ‖ui‖ = 1 for 1 ≤ i ≤ n. Then for any v ∈ V,

v = ⟨v, u1⟩·u1 + ... + ⟨v, un⟩·un.

In particular, ⟨v, ui⟩ = 0 for all 1 ≤ i ≤ n if and only if v = 0.

Therefore, we have obtained the required result.

DEFINITION (Orthonormal Set)   Let V be an inner product space. A set {u1, u2, ..., un} of non-zero, mutually orthogonal vectors in V is called an orthonormal set if ‖ui‖ = 1 for 1 ≤ i ≤ n.

If the set {u1, ..., un} is also a basis of V, then the set of vectors is called an orthonormal basis of V.

EXAMPLE   1. Consider the vector space R^2 with the standard inner product. Then the standard ordered basis {(1, 0), (0, 1)} is an orthonormal set. Also, the basis

{(1/√2)(1, 1), (1/√2)(1, -1)}

is an orthonormal set.

Assignment

1. Recall the following inner product on for and

1. Find the angle between the vectors and

2. Let Find such that

3. Find two vectors such that and 2. Find an inner product in such that the following conditions hold:


[Hint: Consider a symmetric matrix Define and

solve a system of equations for the unknowns .]

3. Let Find with respect to the standard inner product.

4. Let be a subspace of a finite dimensional inner product space . Prove that

5. Let be the real vector space of all continuous functions with domain

That is, Then show that is an inner product space with inner

product

For different values of and find the angle between the functions

and

6. Let be an inner product space. Prove that

This inequality is called the TRIANGLE INEQUALITY.

7. Let Use the Cauchy-Schwartz inequality to prove that

When does the equality hold?

Lecture 7

Definitions


In this chapter, the linear transformations are from a given finite dimensional vector space V to itself. Observe that in this case, the matrix of the linear transformation is a square matrix. So, in this chapter, all the matrices are square matrices, and a vector x means x ∈ F^n for some positive integer n.

Let A be a matrix of order n. In general, we ask the question:

For what values of λ ∈ F do there exist a non-zero vector x such that

Ax = λx?   (1)

Here, F^n stands for either the vector space R^n over R or C^n over C. Equation (1) is equivalent to the equation

(A - λI)x = 0.

This system of linear equations has a non-zero solution if det(A - λI) = 0.

So, to solve (1), we are forced to choose those values of λ ∈ F for which det(A - λI) = 0. Observe that det(A - λI) is a polynomial in λ of degree n. We are therefore led to the following definition.

DEFINITION (Characteristic Polynomial)   Let A be a matrix of order n. The polynomial det(A - λI) is called the characteristic polynomial of A and is denoted by p(λ).

The equation p(λ) = 0 is called the characteristic equation of A. If λ ∈ F is a solution of the characteristic equation, then λ is called a characteristic value of A.

Some books use the term EIGEN VALUE in place of characteristic value.

THEOREM   Let A be a matrix of order n. Suppose λ0 ∈ F is a root of the characteristic equation. Then there exists a non-zero v ∈ F^n such that Av = λ0·v.

Proof. Since λ0 is a root of the characteristic equation, det(A - λ0 I) = 0. This shows that the matrix A - λ0 I is singular, and therefore the linear system

(A - λ0 I)x = 0

has a non-zero solution.

Remark   Observe that the linear system Ax = λx has the solution x = 0 for every λ ∈ F. So, we consider only those x ∈ F^n that are non-zero and are solutions of the linear system Ax = λx.

DEFINITION (Eigen value and Eigenvector)   If the linear system Ax = λx has a non-zero solution x ∈ F^n for some λ ∈ F, then

1. λ ∈ F is called an eigen value of A,

2. x ∈ F^n is called an eigenvector corresponding to the eigen value λ of A, and

3. the tuple (λ, x) is called an eigen pair.

Remark   To understand the difference between a characteristic value and an eigen value, we give the following example.

Consider, for instance, the matrix A = [[0, 1], [-1, 0]]. Then the characteristic polynomial of A is p(λ) = λ^2 + 1.

Given the matrix A, recall the linear transformation T defined by T(x) = Ax.

1. If F = C, that is, if A is considered a COMPLEX matrix, then the roots of p(λ) = 0 in C are ±i. So, A has (i, (1, i)) and (-i, (i, 1)) as eigen pairs.

2. If F = R, that is, if A is considered a REAL matrix, then p(λ) = 0 has no solution in R. Therefore, if F = R, then A has no eigen value, but it has ±i as characteristic values.
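For a 2 × 2 matrix, the characteristic values can be computed directly with the quadratic formula applied to p(t) = t^2 - trace(A)·t + det(A). A sketch assuming the standard example matrix [[0, 1], [-1, 0]], whose characteristic values are ±i and hence not real:

```python
# Characteristic values of a 2x2 matrix via the quadratic formula on
# p(t) = t^2 - trace*t + det; non-real roots mean no REAL eigen value.
import cmath

def char_roots_2x2(A):
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

A = [[0, 1], [-1, 0]]     # assumed example matrix with p(t) = t^2 + 1
r1, r2 = char_roots_2x2(A)
assert {r1, r2} == {1j, -1j}                    # characteristic values +/- i
assert all(abs(r.imag) > 0 for r in (r1, r2))   # neither root is real
print(r1, r2)
```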


Remark   Note that if (λ, x) is an eigen pair for an n × n matrix A, then for any non-zero c ∈ F, (λ, c·x) is also an eigen pair for A. Similarly, if x1, x2, ..., xr are eigenvectors of A corresponding to the eigen value λ, then for any non-zero c1, c2, ..., cr ∈ F, it is easily seen that if c1·x1 + ... + cr·xr ≠ 0, then c1·x1 + ... + cr·xr is also an eigenvector of A corresponding to the eigen value λ. Hence, when we talk of eigenvectors corresponding to an eigen value λ, we mean linearly independent eigenvectors.

Suppose λ0 ∈ F is a root of the characteristic equation det(A - λ0 I) = 0. Then A - λ0 I is singular. Suppose rank(A - λ0 I) = r < n. Then by the Corollary, the linear system (A - λ0 I)x = 0 has n - r linearly independent solutions. That is, A has n - r linearly independent eigenvectors corresponding to the eigen value λ0 whenever rank(A - λ0 I) = r < n.

EXAMPLE  

1. Let Then Hence, the characteristic

equation has roots That is is a repeated eigen value. Now check that the

equation for is equivalent to the equation

And this has the solution Hence, from the above remark, is a representative for the eigenvector. Therefore, here we have two eigen values but only one eigenvector.

2. Let A = I, the 2 × 2 identity matrix. Then p(λ) = (1 - λ)^2. The characteristic equation

has roots 1, 1. Here, the matrix that we have is the identity matrix, and we know that Ix = x for

every x ∈ F^2, so we can choose any two linearly independent vectors x, y

from F^2 to get (1, x) and (1, y) as the two eigen pairs.


In general, if x1, x2, ..., xn are linearly independent vectors in F^n, then

(1, x1), (1, x2), ..., (1, xn) are eigen pairs for the identity matrix I.

3. Let Then The characteristic

equation has roots Now check that the eigen pairs are and

In this case, we have two distinct eigen values and the corresponding eigenvectors are also linearly independent. The reader is required to prove the linear independence of the two eigenvectors.

4. Let Then The characteristic

equation has roots Hence, over the matrix has no eigen value.

Over the reader is required to show that the eigen pairs are and

Assignment

1. Find the eigen values of a triangular matrix.

2. Find eigen pairs over for each of the following matrices:

and

3. Prove that the matrices A and Aᵗ have the same set of eigen values. Construct a

matrix A such that the eigenvectors of A and Aᵗ are different. 4. Let A be a matrix such that A^2 = A (A is called an idempotent matrix). Then

prove that its eigen values are either 0 or 1 or both.


Lecture 8

THEOREM 6.1.11   Let A be an n × n matrix with eigen values λ1, λ2, ..., λn,

not necessarily distinct. Then det(A) = λ1 λ2 ... λn and trace(A) = λ1 + λ2 + ... + λn.

EXERCISE  

1. Let A be a skew symmetric matrix of odd order. Then prove that 0 is an eigen value of A.

2. Let A be an orthogonal matrix. If det(A) = -1, then prove that

there exists a non-zero vector v such that Av = -v.

Let A be an n × n matrix. Then in the proof of the above theorem, we observed that the characteristic equation det(A - λI) = 0 is a polynomial equation of degree n in λ. Also, for some numbers a0, a1, ..., a(n-1) ∈ F, it has the form

λ^n + a(n-1) λ^(n-1) + ... + a1 λ + a0 = 0.

Note that, in the expression det(A - λI) = 0, λ is an element of F. Thus, we can only substitute λ by elements of F.

It turns out that the expression

A^n + a(n-1) A^(n-1) + ... + a1 A + a0 I = 0

holds true as a matrix identity. This is a celebrated theorem called the Cayley Hamilton Theorem. We state this theorem without proof and give some implications.

THEOREM (Cayley Hamilton Theorem)   Let A be a square matrix of order n. Then A satisfies its characteristic equation. That is,

A^n + a(n-1) A^(n-1) + ... + a1 A + a0 I = 0

holds true as a matrix identity.

Some of the implications of Cayley Hamilton Theorem are as follows.


Remark

1. Let A = [[0, 1], [0, 0]]. Then its characteristic polynomial is p(λ) = λ^2. Also, for

the function f(x) = x, we have f(A) = A ≠ 0 while f(λ) = 0 for each eigen value λ of A. This shows that the

condition f(λ) = 0 for each eigen value of A does not imply that f(A) = 0. 2. Suppose we are given a square matrix A of order n and we are interested in

calculating A^m, where m is large compared to n. Then we can use the division

algorithm to find numbers α0, α1, ..., α(n-1) and a polynomial f(λ) such that

λ^m = f(λ)(λ^n + a(n-1) λ^(n-1) + ... + a1 λ + a0) + α(n-1) λ^(n-1) + ... + α1 λ + α0.

Hence, by the Cayley Hamilton Theorem,

A^m = α(n-1) A^(n-1) + ... + α1 A + α0 I.

5. Let be a non-singular matrix of order Then note that and

This matrix identity can be used to calculate the inverse.

Note that the vector (as an element of the vector space of all matrices) is a linear

combination of the vectors
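This inverse formula turns into a short computation. A sketch with numpy, assuming the monic characteristic polynomial p(t) = t^n + a_{n-1} t^{n-1} + ... + a_0; the matrix A is an illustrative non-singular choice.

```python
import numpy as np

# A sample non-singular matrix (illustrative choice).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
n = A.shape[0]

# Monic characteristic polynomial coefficients [1, a_{n-1}, ..., a_1, a_0].
coeffs = np.poly(A)
a0 = coeffs[-1]              # a_0 = (-1)^n det(A), non-zero since A is non-singular

# From p(A) = 0:  A (A^{n-1} + a_{n-1} A^{n-2} + ... + a_1 I) = -a_0 I,
# so A^{-1} = -(1/a_0) (A^{n-1} + a_{n-1} A^{n-2} + ... + a_1 I).
B = np.zeros_like(A)
for c in coeffs[:-1]:        # Horner's scheme, dropping the constant term
    B = B @ A + c * np.eye(n)
A_inv = -B / a0

print(np.allclose(A_inv @ A, np.eye(n)))  # True
```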

EXERCISE   Find the inverse of each of the following matrices by using the Cayley Hamilton Theorem.


THEOREM   If λ_1, λ_2, ..., λ_k are distinct eigen values of a matrix A with corresponding eigenvectors x_1, x_2, ..., x_k, then the set {x_1, x_2, ..., x_k} is linearly independent.

Proof. The proof is by induction on the number k of eigen values. The result is obviously true if k = 1, as the corresponding eigenvector is non-zero, and we know that any set containing exactly one non-zero vector is linearly independent.

Let the result be true for k - 1. We prove the result for k. We consider the equation

c_1 x_1 + c_2 x_2 + ... + c_k x_k = 0    (1)

for the unknowns c_1, c_2, ..., c_k. Applying A to both sides of (1) and using A x_i = λ_i x_i, we have

c_1 λ_1 x_1 + c_2 λ_2 x_2 + ... + c_k λ_k x_k = 0.    (2)

From Equations (1) and (2) (multiply (1) by λ_k and subtract (2)), we get

c_1 (λ_k - λ_1) x_1 + c_2 (λ_k - λ_2) x_2 + ... + c_{k-1} (λ_k - λ_{k-1}) x_{k-1} = 0.

This is an equation in the k - 1 eigenvectors x_1, ..., x_{k-1}. So, by the induction hypothesis, we have

c_i (λ_k - λ_i) = 0   for 1 ≤ i ≤ k - 1.

But the eigen values are distinct, so λ_k - λ_i ≠ 0 for 1 ≤ i ≤ k - 1. We therefore get c_i = 0 for 1 ≤ i ≤ k - 1. Also, x_k ≠ 0, and therefore Equation (1) gives c_k = 0.

Thus, we have the required result.

We are thus led to the following important corollary.


COROLLARY   The eigenvectors corresponding to distinct eigen values of an n x n matrix are linearly independent.
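The corollary can be illustrated numerically: for a matrix with distinct eigen values, the matrix whose columns are the computed eigenvectors has full rank. A minimal sketch with numpy, on an illustrative matrix:

```python
import numpy as np

# A matrix with distinct eigen values (illustrative choice).
A = np.array([[2.0, 1.0],
              [0.0, 5.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# The eigen values 2 and 5 are distinct, so the eigenvector matrix
# must have full rank, i.e. the eigenvectors are linearly independent.
print(np.linalg.matrix_rank(eigvecs) == len(eigvals))  # True
```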

Assignment

1. For an n x n matrix A, prove the following.

   1. A and A^T have the same set of eigen values.
   2. If λ is an eigen value of an invertible matrix A, then 1/λ is an eigen value of A^{-1}.
   3. If λ is an eigen value of A, then λ^m is an eigen value of A^m for any positive integer m.
   4. If A and B are n x n matrices with A nonsingular, then AB and BA have the same set of eigen values.

   In each case, what can you say about the eigenvectors?

2. Let A and B be 2 x 2 matrices with tr(A) = tr(B) and det(A) = det(B).

   1. Do A and B have the same set of eigen values?
   2. Give examples to show that the matrices A and B need not be similar.

3. Let (λ, x) be an eigen pair for a matrix A, and let (μ, x) be an eigen pair, with the same eigenvector x, for another matrix B.

   1. Then prove that (λ + μ, x) is an eigen pair for the matrix A + B.
   2. Give an example to show that if λ, μ are eigen values of A and B respectively, then λ + μ need not be an eigen value of A + B.

Lecture 9

Diagonalisation


DEFINITION (Matrix Diagonalisation)   A matrix A is said to be diagonalisable if there exists a non-singular matrix P such that P^{-1} A P is a diagonal matrix.

Remark   Let A be an n x n diagonalisable matrix with eigen values λ_1, λ_2, ..., λ_n. By definition, A is similar to a diagonal matrix D. Observe that D = diag(λ_1, λ_2, ..., λ_n), as similar matrices have the same set of eigen values, and the eigen values of a diagonal matrix are its diagonal entries.

EXAMPLE   Consider the given 2 x 2 real matrix A. Then we have the following:

1. Over the field of real numbers, A has no real eigen value and hence doesn't have eigenvectors that are vectors in R^2. Hence, there does not exist any non-singular real matrix P such that P^{-1} A P is a diagonal matrix.

2. Over the field of complex numbers, the two complex eigen values of A are conjugates of each other, and the corresponding eigenvectors are linearly independent. Hence, the two eigenvectors can be taken as a basis of C^2. Define a complex matrix U whose columns are these two eigenvectors. Then U^{-1} A U is a diagonal matrix with the eigen values of A on the diagonal.

THEOREM   Let A be an n x n matrix. Then A is diagonalisable if and only if A has n linearly independent eigenvectors.

COROLLARY 6.2.5   Let A be an n x n matrix. Suppose that the n eigen values of A are distinct. Then A is diagonalisable.
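Combining the theorem and the corollary: when the eigen values are distinct, the matrix P whose columns are the eigenvectors is non-singular, and P^{-1} A P is diagonal. A minimal numpy sketch, on an illustrative matrix:

```python
import numpy as np

# A matrix with distinct eigen values is diagonalisable (illustrative choice).
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

# Since the eigen values 1 and 3 are distinct, P is non-singular and
# P^{-1} A P is the diagonal matrix of eigen values.
D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(eigvals)))  # True
```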

EXAMPLE  

1. For the first given matrix, the characteristic polynomial has a repeated root, so the matrix has a repeated eigen value. It is easily seen that, up to scalar multiples, there is only one eigen pair; that is, the matrix has exactly one linearly independent eigenvector corresponding to the repeated eigen value. Hence, by the theorem above, the matrix is not diagonalisable.

2. For the second given matrix, the characteristic polynomial has one repeated root and one simple root. It can be easily verified that two linearly independent eigenvectors correspond to the repeated eigen value, and one eigenvector corresponds to the remaining eigen value.

Note that the set consisting of the two eigenvectors corresponding to the repeated eigen value is not orthogonal. This set can be replaced (for instance, by the Gram-Schmidt process) by an orthogonal set which still consists of eigenvectors corresponding to the repeated eigen value. Also, taken together, the eigenvectors form a basis of the whole space. So, by the theorem above, the matrix is diagonalisable. Moreover, if U is the corresponding unitary matrix, whose columns are the normalised eigenvectors, then U* A U is the diagonal matrix of eigen values.

Observe that the matrix is a symmetric matrix. In this case, the eigenvectors are mutually orthogonal. In general, for any real symmetric matrix of order n, there always exist n mutually orthogonal eigenvectors. This result will be proved later.
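For a real symmetric matrix, mutually orthogonal eigenvectors can be obtained numerically with numpy's `eigh`, which returns an orthogonal matrix of eigenvectors. The matrix below is an illustrative choice (note that it has a repeated eigen value):

```python
import numpy as np

# A real symmetric matrix with eigen values 1, 1, 4 (illustrative choice).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# For real symmetric matrices, eigh returns real eigen values and an
# orthogonal matrix U of eigenvectors, so that U^T A U is diagonal.
eigvals, U = np.linalg.eigh(A)

print(np.allclose(U.T @ U, np.eye(3)))             # True: columns are orthonormal
print(np.allclose(U.T @ A @ U, np.diag(eigvals)))  # True: A is diagonalised
```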

Assignment

1. By finding the eigen values of the following matrices, justify whether or not A = P D P^{-1} for some real non-singular matrix P and a real diagonal matrix D, for any value of the parameter appearing in the matrices.

2. Are the following two matrices diagonalisable?


3. Find the eigen values and eigenvectors of the matrix A = (a_ij), where a_ij takes one fixed value if i = j and another fixed value otherwise.

4. Let A be an n x n matrix and B an m x m matrix, and let C be the block diagonal matrix with diagonal blocks A and B. Then show that C is diagonalisable if and only if both A and B are diagonalisable.

5. Let T be a linear transformation whose action is specified by the given conditions. Then

1. determine the eigen values of T;
2. find the number of linearly independent eigenvectors corresponding to each eigen value;
3. decide whether T is diagonalisable, and justify your answer.

6. Let A be a non-zero square matrix such that A^2 = 0. Show that A cannot be diagonalised.

7. Are the following matrices diagonalisable?