
Chapter 4

Vector Spaces

4.1 Vectors in Rn

4.2 Vector Spaces

4.3 Subspaces of Vector Spaces

4.4 Spanning Sets and Linear Independence

4.5 Basis and Dimension

4.6 Rank of a Matrix and Systems of Linear Equations

4.7 Coordinates and Change of Basis

4.8 Applications of Vector Spaces

4.2

4.1 Vectors in Rn

An ordered n-tuple ( 有序的 n 項 ): a sequence of n real numbers (x1, x2, …, xn)

Rn-space ( n 維空間 ): the set of all ordered n-tuples

n = 1: R1-space = set of all real numbers

n = 2: R2-space = set of all ordered pairs of real numbers (x1, x2)

n = 3: R3-space = set of all ordered triples of real numbers (x1, x2, x3)

n = 4: R4-space = set of all ordered quadruples of real numbers (x1, x2, x3, x4)

(R1-space can be represented geometrically by the x-axis)

(R2-space can be represented geometrically by the xy-plane)

(R3-space can be represented geometrically by the xyz-space)

4.3

Notes:

(1) An n-tuple (x1, x2, …, xn) can be viewed as a point in Rn with the xi's as its coordinates ( 座標 )

(2) An n-tuple (x1, x2, …, xn) can also be viewed as a vector x = (x1, x2, …, xn) in Rn with the xi's as its components ( 分量 )

Ex: the pair (x1, x2) may be drawn either as a point (x1, x2) or as a vector from (0, 0) to (x1, x2)

※ A vector on the plane is expressed geometrically by a directed line segment whose initial point is the origin and whose terminal point is the point (x1, x2)

4.4

Let u = (u1, u2, …, un) and v = (v1, v2, …, vn) be two vectors in Rn

Equality: u = v if and only if u1 = v1, u2 = v2, …, un = vn

Vector addition (the sum of u and v): u + v = (u1 + v1, u2 + v2, …, un + vn)

Scalar multiplication (the scalar multiple of u by c): cu = (cu1, cu2, …, cun)

Notes:
The sum of two vectors and the scalar multiple of a vector in Rn are called the standard operations in Rn
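※ As a quick illustration of the standard operations (a sketch in Python, assuming NumPy is available; vectors in Rn map directly to 1-D arrays):

import numpy as np

u = np.array([2.0, -1.0, 5.0, 0.0])   # a vector in R4
v = np.array([4.0, 3.0, 1.0, -1.0])

print(u + v)                          # vector addition: [ 6.  2.  6. -1.]
print(3 * u)                          # scalar multiplication: [ 6. -3. 15.  0.]
print(np.array_equal(u + v, v + u))   # commutativity of vector addition: True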

4.5

Difference between u and v:
u – v = u + (–1)v = (u1 – v1, u2 – v2, u3 – v3, …, un – vn)

Zero vector ( 零向量 ): 0 = (0, 0, …, 0)

4.6

Notes:

A vector u = (u1, u2, …, un) in Rn can be viewed as

a 1×n row matrix (row vector): u = [u1 u2 … un]

or

an n×1 column matrix (column vector): u = [u1 u2 … un]^T

(Use a comma to separate the components of an n-tuple; use a blank space to separate the entries of a matrix)

※ Therefore, the operations of matrix addition and scalar multiplication generate the same results as the corresponding vector operations (see the next slide)

4.7

Vector addition and scalar multiplication give the same results in all three representations:

Regarded as n-tuples:
u + v = (u1, u2, …, un) + (v1, v2, …, vn) = (u1 + v1, u2 + v2, …, un + vn)
cu = c(u1, u2, …, un) = (cu1, cu2, …, cun)

Regarded as n×1 column matrices:
u + v = [u1 u2 … un]^T + [v1 v2 … vn]^T = [u1 + v1  u2 + v2  …  un + vn]^T
cu = c[u1 u2 … un]^T = [cu1 cu2 … cun]^T

Regarded as 1×n row matrices:
u + v = [u1 u2 … un] + [v1 v2 … vn] = [u1 + v1  u2 + v2  …  un + vn]
cu = c[u1 u2 … un] = [cu1 cu2 … cun]

4.8

Theorem 4.2: Properties of vector addition and scalar multiplication

Let u, v, and w be vectors in Rn, and let c and d be scalars
(1) u+v is a vector in Rn (closure ( 封閉性 ) under vector addition)
(2) u+v = v+u (commutative ( 交換律 ) property of vector addition)
(3) (u+v)+w = u+(v+w) (associative ( 結合律 ) property of vector addition)
(4) u+0 = u (additive identity property)
(5) u+(–u) = 0 (additive inverse property)
(6) cu is a vector in Rn (closure ( 封閉性 ) under scalar multiplication)
(7) c(u+v) = cu+cv (distributive property ( 分配律 ) of scalar multiplication over vector addition)
(8) (c+d)u = cu+du (distributive property ( 分配律 ) of scalar multiplication over real-number addition)
(9) c(du) = (cd)u (associative ( 結合律 ) property of multiplication)
(10) 1(u) = u (multiplicative identity property)

※ Except Properties (1) and (6), these properties of vector addition and scalar multiplication actually inherit the properties of matrix addition and scalar multiplication in Ch 2 because we can regard vectors in Rn as special cases of matrices

(Note that –u is just the notation of the additive inverse of u, and –u = (–1)u will be proved in Thm. 4.4 )

4.9

Ex 5: Practice standard vector operations in R4

Let u = (2, –1, 5, 0), v = (4, 3, 1, –1), and w = (–6, 2, 0, 3) be vectors in R4. Solve for x in each of the following cases.
(a) x = 2u – (v + 3w)
(b) 3(x + w) = 2u – v + x

Sol: (a)
x = 2u – (v + 3w)
  = 2u + (–1)(v + 3w)
  = 2u – v – 3w    (distributive property of scalar multiplication over vector addition)
  = (4, –2, 10, 0) – (4, 3, 1, –1) – (–18, 6, 0, 9)
  = (4 – 4 + 18, –2 – 3 – 6, 10 – 1 – 0, 0 + 1 – 9)
  = (18, –11, 9, –8)

4.10

(b)
3(x + w) = 2u – v + x
3x + 3w = 2u – v + x    (distributive property of scalar multiplication over vector addition)
3x – x = 2u – v – 3w    (subtract (3w + x) from both sides)
2x = 2u – v – 3w
x = u – (1/2)v – (3/2)w    (multiply both sides by the scalar 1/2)
  = (2, –1, 5, 0) – (2, 3/2, 1/2, –1/2) – (–9, 3, 0, 9/2)
  = (9, –11/2, 9/2, –4)
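※ A quick numerical check of (a) and (b) (a Python sketch, assuming NumPy):

import numpy as np

u = np.array([2, -1, 5, 0], dtype=float)
v = np.array([4, 3, 1, -1], dtype=float)
w = np.array([-6, 2, 0, 3], dtype=float)

x_a = 2*u - (v + 3*w)                 # case (a)
x_b = u - 0.5*v - 1.5*w               # case (b), after solving 3(x + w) = 2u - v + x for x
print(x_a)                            # [ 18. -11.   9.  -8.]
print(np.allclose(3*(x_b + w), 2*u - v + x_b))   # True: x_b satisfies the original equation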

4.11

Theorem 4.3: (Properties of additive identity and additive inverse)

Let v be a vector in Rn and c be a scalar. Then the following properties are true

(1) The additive identity is unique, i.e., if v+u = v, u must be 0

(2) The additive inverse of v is unique, i.e., if v+u = 0, u must be –v

(3) 0v = 0

(4) c0 = 0

(5) If cv = 0, either c = 0 or v = 0

(6) –(–v) = v

Notes:

(1) The zero vector 0 in Rn is called the additive identity ( 加法單位元素 ) in Rn (see Property 4)

(2) The vector –u is called the additive inverse ( 加法反元素 ) of u (see Property 5)

These three properties are valid for any vector

space and will be proved on Slides 4.22-4.23

(Since –v + v = 0, the additive inverse of –v is v, i.e., v can be expressed as –(–v). Note that v and –v are the additive inverses for each other)

4.12

Linear combination ( 線性組合 ) in Rn:

The vector x is called a linear combination of v1, v2, …, vn if it can be expressed in the form
x = c1v1 + c2v2 + … + cnvn, where c1, c2, …, cn are real numbers

Ex 6:
Given x = (–1, –2, –2), u = (0, 1, 4), v = (–1, 1, 2), and w = (3, 1, 2) in R3, find a, b, and c such that x = au + bv + cw.

Sol:
   –b + 3c = –1
  a + b + c = –2
  4a + 2b + 2c = –2

⇒ a = 1, b = –2, c = –1

Thus x = u – 2v – w
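※ The coefficients can also be found numerically (a Python sketch, assuming NumPy); the columns of the coefficient matrix are u, v, and w:

import numpy as np

u = np.array([0, 1, 4], dtype=float)
v = np.array([-1, 1, 2], dtype=float)
w = np.array([3, 1, 2], dtype=float)
x = np.array([-1, -2, -2], dtype=float)

A = np.column_stack([u, v, w])     # 3x3 matrix whose columns are u, v, w
a, b, c = np.linalg.solve(A, x)    # solve A [a b c]^T = x
print(a, b, c)                     # 1.0 -2.0 -1.0 (up to floating-point rounding)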

4.13

Keywords in Section 4.1:

ordered n-tuple: 有序的 n 項
Rn-space: n 維空間
equal: 相等
vector addition: 向量加法
scalar multiplication: 純量乘法
zero vector: 零向量
additive identity: 加法單位元素
additive inverse: 加法反元素
linear combination: 線性組合

4.14

4.2 Vector Spaces

Vector space ( 向量空間 ):

Let V be a set on which two operations (addition and scalar multiplication) are defined. If the following ten axioms are satisfied for every element u, v, and w in V and every scalar (real number) c and d, then V is called a vector space, and the elements in V are called vectors

Addition:
(1) u+v is in V

(2) u+v = v+u

(3) u+(v+w) = (u+v)+w

(4) V has a zero vector 0 such that for every u in V, u+0 = u

(5) For every u in V, there is a vector in V denoted by –u such that u+(–u) = 0

4.15

Scalar multiplication:

(6) cu is in V

(7) c(u+v) = cu+cv

(8) (c+d)u = cu+du

(9) c(du) = (cd)u

(10) 1(u) = u

※ This type of definition is called an abstraction definition because you abstract a collection of properties from Rn to form the axioms for defining a more general space V

※ Thus, we can conclude that Rn is of course a vector space

4.16

Notes:

A vector space consists of four entities: a set of vectors, a set of real-number scalars, and two operations

  V: a nonempty set of vectors
  c: a scalar
  (u, v) → u + v: vector addition
  (c, u) → cu: scalar multiplication

(V, +, ·) is called a vector space

※ The set V together with the definitions of vector addition and scalar multiplication satisfying the above ten axioms is called a vector space

4.17

Four examples of vector spaces are shown as follows. (It is straightforward to show that these vector spaces satisfy the above ten axioms)

(1) n-tuple space: Rn
    (u1, u2, …, un) + (v1, v2, …, vn) = (u1 + v1, u2 + v2, …, un + vn)    (standard vector addition)
    k(u1, u2, …, un) = (ku1, ku2, …, kun)    (standard scalar multiplication for vectors)

(2) Matrix space ( 矩陣空間 ): V = Mm×n (the set of all m×n matrices with real-number entries)
    Ex: (m = n = 2)

    [u11 u12]   [v11 v12]   [u11+v11  u12+v12]
    [u21 u22] + [v21 v22] = [u21+v21  u22+v22]    (standard matrix addition)

      [u11 u12]   [ku11 ku12]
    k [u21 u22] = [ku21 ku22]    (standard scalar multiplication for matrices)

4.18

(3) n-th-degree-or-less polynomial space ( 多項式空間 ): V = Pn (the set of all real-valued polynomials of degree n or less)
    p(x) + q(x) = (a0 + b0) + (a1 + b1)x + … + (an + bn)x^n    (standard polynomial addition)
    kp(x) = ka0 + ka1x + … + kanx^n    (standard scalar multiplication for polynomials)

    ※ By the fact that the set of real numbers is closed under addition and multiplication, it is straightforward to show that Pn satisfies the ten axioms and thus is a vector space

(4) Continuous function space ( 函數空間 ): V = C(–∞, ∞) (the set of all real-valued continuous functions defined on the entire real line)
    (f + g)(x) = f(x) + g(x)    (standard addition for functions)
    (kf)(x) = kf(x)    (standard scalar multiplication for functions)

    ※ By the fact that the sum of two continuous functions is continuous and the product of a scalar and a continuous function is still a continuous function, C(–∞, ∞) is a vector space

4.19

Summary of important vector spaces

R          = set of all real numbers
R2         = set of all ordered pairs
R3         = set of all ordered triples
Rn         = set of all n-tuples
C(–∞, ∞)   = set of all continuous functions defined on the real number line
C[a, b]    = set of all continuous functions defined on a closed interval [a, b]
P          = set of all polynomials
Pn         = set of all polynomials of degree ≤ n
Mm,n       = set of all m×n matrices
Mn,n       = set of all n×n square matrices

※ The standard addition and scalar multiplication operations are considered if there is no other specification
※ Each element in a vector space is called a vector, so a vector can be a real number, an n-tuple, a matrix, a polynomial, a continuous function, etc.

4.20

Notes: To show that a set is not a vector space, you need only find one axiom that is not satisfied

Ex 6: The set of all integers ( 整數 ) is not a vector space

Pf: 1 ∈ V, and 1/2 is a real-number scalar
    (1/2)(1) = 1/2 ∉ V    (it is not closed under scalar multiplication: a scalar multiple of an integer need not be an integer)

Ex 7: The set of all (exact) second-degree polynomial functions is not a vector space

Pf: Let p(x) = x^2 and q(x) = –x^2 + x + 1
    ⇒ p(x) + q(x) = x + 1 ∉ V    (it is not closed under vector addition)

4.21

Ex 8: Verify V is not a vector space

V = R2 = the set of all ordered pairs of real numbers

vector addition: (u1, u2) + (v1, v2) = (u1 + v1, u2 + v2)
scalar multiplication: c(u1, u2) = (cu1, 0)    (nonstandard definition)

Sol:
1(1, 1) = (1, 0) ≠ (1, 1)
⇒ the set (together with the two given operations) is not a vector space

This kind of setting can satisfy the first nine axioms of the definition of a vector space (you can try to show that), but it violates the tenth axiom

4.22

Theorem 4.4: Properties of scalar multiplication

Let v be any element of a vector space V, and let c be any scalar. Then the following properties are true

(1) 0v = 0
(2) c0 = 0
(3) If cv = 0, either c = 0 or v = 0
(4) (–1)v = –v

※ The first three properties are extensions of Theorem 4.3, which simply considers the space Rn. In fact, these four properties are not only valid for Rn but also for any vector space, e.g., for all vector spaces mentioned on Slide 4.19.

Pf:
(1) 0v = (c + (–c))v = cv + (–c)v = cv + (–(cv)) = 0
    (the second equality uses axiom (8); the third uses axiom (9) to identify (–c)v with the additive inverse of cv; the last uses axiom (5))

4.23

(2) c0 = c(0 + 0) = c0 + c0    (by axioms (4) and (7))
    Add (–c0) to both sides:
    c0 + (–c0) = (c0 + c0) + (–c0)
    0 = c0 + (c0 + (–c0))    (by axioms (5) and (3))
    0 = c0 + 0    (by axiom (5))
    0 = c0    (by axiom (4))

(3) Prove by contradiction: suppose that cv = 0, but c ≠ 0 and v ≠ 0

    v = 1v = ((1/c)c)v = (1/c)(cv) = (1/c)0 = 0    (by axioms (10) and (9), and by the second property, c0 = 0)

    This contradicts v ≠ 0, so if cv = 0, either c = 0 or v = 0

(4) 0 = 0v = (1 + (–1))v = 1v + (–1)v = v + (–1)v    (by the first property 0v = 0, and axioms (8) and (10))

    ⇒ v + (–1)v = 0, so by comparing with Axiom (5), (–1)v is the additive inverse of v, i.e., (–1)v = –v

※ The proofs are valid as long as they are logical. It is not necessary to follow the same proofs in the textbook or the solution manual

4.24

Keywords in Section 4.2:

vector space: 向量空間
n-space: n 維空間
matrix space: 矩陣空間
polynomial space: 多項式空間
function space: 函數空間

4.25

4.3 Subspaces of Vector Spaces

Subspace ( 子空間 ):

(V, +, ·): a vector space
W ⊆ V, W ≠ ∅: a nonempty subset of V
(W, +, ·): the nonempty subset W is called a subspace of V if W is itself a vector space under the operations of vector addition and scalar multiplication defined on V

Trivial subspace ( 顯然子空間 ): Every vector space V has at least two subspaces

(1) Zero vector space {0} is a subspace of V

(2) V is a subspace of V

※ Any subspaces other than these two are called proper (or nontrivial) subspaces

(It satisfies the ten axioms)

4.26

Examination of whether W is a subspace
– Since the vector operations defined on W are the same as those defined on V, and most of the ten axioms are inherited from the properties of those operations on V, it is not necessary to verify those axioms one by one
– Therefore, the following theorem tells us it is sufficient to test only the closure conditions under vector addition and scalar multiplication to identify whether a nonempty subset of a vector space is a subspace

Theorem 4.5: Test for a subspace

If W is a nonempty subset of a vector space V, then W is a subspace of V if and only if the following conditions hold

(1) If u and v are in W, then u+v is in W

(2) If u is in W and c is any scalar, then cu is in W

4.27

Pf:
1. Note that if u, v, and w are in W, then they are also in V. Furthermore, W and V share the same operations. Consequently, vector space axioms 2, 3, 7, 8, 9, and 10 are satisfied automatically

2. Suppose that the closure conditions hold in Theorem 4.5, i.e., axioms 1 and 6 for vector spaces are satisfied

3. Since axiom 6 is satisfied (i.e., cu is in W if u is in W), we can obtain

   3.1. for the scalar c = 0: cu = 0u = 0 ∈ W ⇒ the zero vector is in W ⇒ axiom 4 is satisfied

   3.2. for the scalar c = –1: (–1)u = –u ∈ W, so that u + (–u) = u + (–1)u = 0 ⇒ axiom 5 is satisfied

4.28

Ex 2: A subspace of M2×2

Let W be the set of all 2×2 symmetric matrices. Show that W is a subspace of the vector space M2×2, with the standard operations of matrix addition and scalar multiplication
(The definition of a symmetric matrix A is that A^T = A)

Sol:
First, we know that W, the set of all 2×2 symmetric matrices, is a nonempty subset of the vector space M2×2

Second,
A1, A2 ∈ W ⇒ (A1 + A2)^T = A1^T + A2^T = A1 + A2    (so A1 + A2 ∈ W)
c ∈ R, A ∈ W ⇒ (cA)^T = cA^T = cA    (so cA ∈ W)

Thus, Thm. 4.5 is applied to obtain that W is a subspace of M2×2
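※ The two closure conditions can be spot-checked numerically (a Python sketch, assuming NumPy):

import numpy as np

A1 = np.array([[1.0, 2.0], [2.0, 3.0]])    # symmetric
A2 = np.array([[0.0, -1.0], [-1.0, 4.0]])  # symmetric
c = 2.5

S = A1 + A2
print(np.array_equal(S, S.T))              # True: the sum is still symmetric
print(np.array_equal(c*A1, (c*A1).T))      # True: a scalar multiple is still symmetric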

4.29

Ex 3: The set of singular matrices is not a subspace of M2×2

Let W be the set of singular (noninvertible) matrices of order 2. Show that W is not a subspace of M2×2 with the standard matrix operations

Sol:
    [1 0]             [0 0]
A = [0 0] ∈ W,    B = [0 1] ∈ W

        [1 0]
A + B = [0 1] = I ∉ W    (W is not closed under vector addition)

⇒ W is not a subspace of M2×2

4.30

Ex 4: The set of first-quadrant vectors is not a subspace of R2

Show that W = {(x1, x2): x1 ≥ 0 and x2 ≥ 0}, with the standard operations, is not a subspace of R2

Sol:
Let u = (1, 1) ∈ W

(–1)u = (–1)(1, 1) = (–1, –1) ∉ W    (W is not closed under scalar multiplication)

⇒ W is not a subspace of R2

4.31

Ex 6: Identify subspaces of R2

Which of the following two subsets is a subspace of R2?
(a) The set of points on the line given by x + 2y = 0
(b) The set of points on the line given by x + 2y = 1

Sol:
(a) W = {(x, y): x + 2y = 0} = {(–2t, t): t ∈ R}
    (Note: the zero vector (0, 0) is on this line)

    Let v1 = (–2t1, t1) ∈ W and v2 = (–2t2, t2) ∈ W

    v1 + v2 = (–2(t1 + t2), t1 + t2) ∈ W    (closed under vector addition)
    cv1 = (–2(ct1), ct1) ∈ W    (closed under scalar multiplication)

    ⇒ W is a subspace of R2

4.32

(b) W = {(x, y): x + 2y = 1}
    (Note: the zero vector (0, 0) is not on this line)

    Consider v = (1, 0) ∈ W

    (–1)v = (–1, 0) ∉ W

    ⇒ W is not a subspace of R2

Note: Subspaces of R2
(1) W consists of the single point {(0, 0)}    (trivial subspace)
(2) W consists of all points on a line passing through the origin
(3) R2    (trivial subspace)

4.33

Ex 8: Identify subspaces of R3

Which of the following subsets is a subspace of R3?
(a) W = {(x1, x2, 1): x1, x2 ∈ R}
(b) W = {(x1, x1 + x3, x3): x1, x3 ∈ R}

Sol:
(a) (Note: the zero vector (0, 0, 0) is not in W)

    Consider v = (0, 0, 1) ∈ W

    (–1)v = (0, 0, –1) ∉ W

    ⇒ W is not a subspace of R3

4.34

(b) (Note: the zero vector (0, 0, 0) is in W)

    Consider v = (v1, v1 + v3, v3) ∈ W and u = (u1, u1 + u3, u3) ∈ W

    v + u = (v1 + u1, (v1 + u1) + (v3 + u3), v3 + u3) ∈ W    (closed under vector addition)
    cv = (cv1, cv1 + cv3, cv3) ∈ W    (closed under scalar multiplication)

    W is closed under vector addition and scalar multiplication, so W is a subspace of R3

4.35

Note: Subspaces of R3
(1) W consists of the single point {(0, 0, 0)}    (trivial subspace)
(2) W consists of all points on a line passing through the origin
(3) W consists of all points on a plane passing through the origin
    (The W in problem (b) is a plane passing through the origin)
(4) R3    (trivial subspace)

※ According to Ex. 6 and Ex. 8, we can infer that if W is a subspace of a vector space V, then both W and V must contain the same zero vector 0

4.36

Theorem 4.6: The intersection of two subspaces is a subspace

If V and W are both subspaces of a vector space U, then the intersection of V and W (denoted by V ∩ W) is also a subspace of U

※ However, the union of two subspaces is not necessarily a subspace (see an exercise in Section 4.3)

Pf:
(1) For v1 and v2 in V ∩ W: since v1 and v2 are in V (and in W), v1 + v2 is in V (and in W). Therefore, v1 + v2 is in V ∩ W

(2) For v1 in V ∩ W: since v1 is in V (and in W), cv1 is in V (and in W). Therefore, cv1 is in V ∩ W

Consequently, we can conclude that the intersection of V and W (V ∩ W) is also a subspace of U


4.37

Keywords in Section 4.3:

subspace: 子空間 trivial subspace: 顯然子空間

4.38

4.4 Spanning Sets and Linear Independence

This section introduces the spanning set, linear independence, and linear dependence

The above three notions are associated with the representation of any vector in a vector space as a linear combination of a selected set of vectors in that vector space

Linear combination ( 線性組合 ):

A vector u in a vector space V is called a linear combination of the vectors v1, v2, …, vk in V if u can be written in the form

u = c1v1 + c2v2 + … + ckvk,  where c1, c2, …, ck are real-number scalars

4.39

Ex 2 and 3: Finding a linear combination

v1 = (1, 2, 3), v2 = (0, 1, 2), v3 = (–1, 0, 1)
Prove (a) w = (1, 1, 1) is a linear combination of v1, v2, v3
      (b) w = (1, –2, 2) is not a linear combination of v1, v2, v3

Sol:
(a) w = c1v1 + c2v2 + c3v3

    (1, 1, 1) = c1(1, 2, 3) + c2(0, 1, 2) + c3(–1, 0, 1) = (c1 – c3, 2c1 + c2, 3c1 + 2c2 + c3)

    ⇒  c1       –  c3 = 1
       2c1 + c2       = 1
       3c1 + 2c2 + c3 = 1

4.40

    [ 1  0 –1 | 1 ]                  [ 1  0 –1 |  1 ]
    [ 2  1  0 | 1 ]   → (G.-J. E.)   [ 0  1  2 | –1 ]
    [ 3  2  1 | 1 ]                  [ 0  0  0 |  0 ]

⇒ c1 = 1 + t, c2 = –1 – 2t, c3 = t    (this system has infinitely many solutions)

⇒ w = (1 + t)v1 + (–1 – 2t)v2 + tv3

   e.g., t = 2:  w = 3v1 – 5v2 + 2v3

4.41

(b) w = c1v1 + c2v2 + c3v3

    [ 1  0 –1 |  1 ]                  [ 1  0 –1 |  1 ]
    [ 2  1  0 | –2 ]   → (G.-J. E.)   [ 0  1  2 | –4 ]
    [ 3  2  1 |  2 ]                  [ 0  0  0 |  7 ]

    This system has no solution since the third row means 0c1 + 0c2 + 0c3 = 7

    ⇒ w cannot be expressed as c1v1 + c2v2 + c3v3
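※ Whether w lies in the span of {v1, v2, v3} can also be tested numerically by comparing ranks (a Python sketch, assuming NumPy): w is a linear combination of the columns exactly when appending w does not increase the rank.

import numpy as np

V = np.array([[1, 0, -1],
              [2, 1,  0],
              [3, 2,  1]], dtype=float)     # columns are v1, v2, v3

def in_span(V, w):
    # w is in the column span of V iff rank([V | w]) == rank(V)
    return np.linalg.matrix_rank(np.column_stack([V, w])) == np.linalg.matrix_rank(V)

print(in_span(V, np.array([1.0, 1.0, 1.0])))    # True  (case (a))
print(in_span(V, np.array([1.0, -2.0, 2.0])))   # False (case (b))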

4.42

The span of a set: span(S)

If S = {v1, v2, …, vk} is a set of vectors in a vector space V, then the span of S is the set of all linear combinations of the vectors in S,

span(S) = {c1v1 + c2v2 + … + ckvk : c1, c2, …, ck ∈ R}    (the set of all linear combinations of vectors in S)

Definition of a spanning set of a vector space:

If every vector in a given vector space V can be written as a linear combination of vectors in a set S, then S is called a spanning set ( 生成集合 ) of the vector space V

4.43

Note: The statement span(S) = V can be expressed as follows
  S spans (generates) V
  V is spanned (generated) by S
  S is a spanning set of V

Ex 4:
(a) The set S = {(1, 0, 0), (0, 1, 0), (0, 0, 1)} spans R3 because any vector u = (u1, u2, u3) in R3 can be written as
    u = u1(1, 0, 0) + u2(0, 1, 0) + u3(0, 0, 1)

(b) The set S = {1, x, x^2} spans P2 because any polynomial function p(x) = a + bx + cx^2 in P2 can be written as
    p(x) = a(1) + b(x) + c(x^2)

4.44

Ex 5: A spanning set for R3

Show that the set S = {(1, 2, 3), (0, 1, 2), (–2, 0, 1)} spans R3

Sol:
We must examine whether any vector u = (u1, u2, u3) in R3 can be expressed as a linear combination of v1 = (1, 2, 3), v2 = (0, 1, 2), and v3 = (–2, 0, 1)

If u = c1v1 + c2v2 + c3v3

⇒  c1       – 2c3 = u1
   2c1 + c2       = u2
   3c1 + 2c2 + c3 = u3

The above problem thus reduces to determining whether this system is consistent for all values of u1, u2, and u3

4.45

          [ 1  0 –2 ]
Since A = [ 2  1  0 ]  and  det(A) ≠ 0,
          [ 3  2  1 ]

⇒ Ax = u has exactly one solution for every u

⇒ span(S) = R3

※ From Thm. 2.11, if A is an invertible matrix, then the system of linear equations Ax = b has a unique solution (x = A^(–1)b) given any b
※ From Thm. 3.7, a square matrix A is invertible (nonsingular) if and only if det(A) ≠ 0

Note:
※ For any set S1 containing the set S, if S can span R3, S1 can span R3 as well (e.g., S1 = {v1, v2, v3, v4} = {(1, 2, 3), (0, 1, 2), (–2, 0, 1), (1, 0, 0)})
※ Actually, in this case, what S1 can span is only R3. Since v1, v2, and v3 span R3, v4 must be a linear combination of v1, v2, and v3. So, adding v4 will not generate more combinations. As a consequence, v1, v2, v3, and v4 can only span R3
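※ The determinant test can be carried out numerically (a Python sketch, assuming NumPy):

import numpy as np

A = np.array([[1, 0, -2],
              [2, 1,  0],
              [3, 2,  1]], dtype=float)   # columns are v1, v2, v3

print(np.linalg.det(A))   # -1.0 (up to rounding): nonzero, so S spans R3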

4.46

Theorem 4.7: span(S) is a subspace of V

If S = {v1, v2, …, vk} is a set of vectors in a vector space V, then

(a) span(S) is a subspace of V

(b) span(S) is the smallest subspace of V that contains S, i.e., every other subspace of V containing S must contain span(S)

(Figure: the vectors v1, v2, …, vk lie inside span(S), which in turn lies inside any subspace of V containing S)

※ For example, take V = R5 and S = {(1, 0, 0, 0, 0), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0)}; then span(S) = {(x1, x2, x3, 0, 0)} (which can be identified with R3), and the subspace U = {(x1, x2, x3, x4, 0)} (identified with R4) contains S and hence contains span(S) as well

4.47

Pf:
(a) Consider any two vectors u and v in span(S), that is,
    u = c1v1 + c2v2 + … + ckvk and v = d1v1 + d2v2 + … + dkvk

    Then u + v = (c1 + d1)v1 + (c2 + d2)v2 + … + (ck + dk)vk ∈ span(S), and
    cu = (cc1)v1 + (cc2)v2 + … + (cck)vk ∈ span(S),
    because they can be written as linear combinations of vectors in S

    So, according to Theorem 4.5, we can conclude that span(S) is a subspace of V

4.48

(b) Let U be another subspace of V that contains S; we want to show span(S) ⊆ U

    Consider any u ∈ span(S), i.e., u = c1v1 + c2v2 + … + ckvk, where each ci ∈ R

    Since U contains S, each vi ∈ U, and therefore u = c1v1 + c2v2 + … + ckvk also belongs to U
    (because U is closed under vector addition and scalar multiplication, and any linear combination can be evaluated with finitely many vector additions and scalar multiplications)

    We can conclude that span(S) ⊆ U, and therefore span(S) is the smallest subspace of V that contains S = {v1, v2, …, vk}

4.49

Definitions of Linear Independence (L.I.) ( 線性獨立 ) and Linear Dependence (L.D.) ( 線性相依 ):

S = {v1, v2, …, vk}: a set of vectors in a vector space V

For the equation c1v1 + c2v2 + … + ckvk = 0,

(1) If the equation has only the trivial solution (c1 = c2 = … = ck = 0), then S (or v1, v2, …, vk) is called linearly independent

(2) If the equation has a nontrivial solution (i.e., not all ci's are zero), then S (or v1, v2, …, vk) is called linearly dependent
    (The name "linear dependence" comes from the fact that in this case there exists at least one vi which can be represented as a linear combination of the other vectors {v1, …, vi–1, vi+1, …, vk}. This statement will be proved in Theorem 4.8 on Slide 4.54)

4.50

Ex 8: Testing for linear independence

Determine whether the following set of vectors in R3 is L.I. or L.D.
S = {v1, v2, v3} = {(1, 2, 3), (0, 1, 2), (–2, 0, 1)}

Sol:
c1v1 + c2v2 + c3v3 = 0

⇒  c1       – 2c3 = 0
   2c1 + c2       = 0
   3c1 + 2c2 + c3 = 0

   [ 1  0 –2 | 0 ]                  [ 1  0  0 | 0 ]
   [ 2  1  0 | 0 ]   → (G.-J. E.)   [ 0  1  0 | 0 ]
   [ 3  2  1 | 0 ]                  [ 0  0  1 | 0 ]

⇒ c1 = c2 = c3 = 0 (only the trivial solution)
   (or: det(A) = –1 ≠ 0, so there is only the trivial solution)

⇒ S is (or v1, v2, v3 are) linearly independent

4.51

Ex 9: Testing for linear independence

Determine whether the following set of vectors in P2 is L.I. or L.D.
S = {v1, v2, v3} = {1 + x – 2x^2, 2 + 5x – x^2, x + x^2}

Sol:
c1v1 + c2v2 + c3v3 = 0
i.e., c1(1 + x – 2x^2) + c2(2 + 5x – x^2) + c3(x + x^2) = 0 + 0x + 0x^2

⇒   c1 + 2c2      = 0
    c1 + 5c2 + c3 = 0
  –2c1 –  c2 + c3 = 0

   [ 1  2  0 | 0 ]               [ 1  2   0  | 0 ]
   [ 1  5  1 | 0 ]   → (G. E.)   [ 0  1  1/3 | 0 ]
   [–2 –1  1 | 0 ]               [ 0  0   0  | 0 ]

This system has infinitely many solutions (i.e., this system has nontrivial solutions, e.g., c1 = 2, c2 = –1, c3 = 3)

⇒ S is (or v1, v2, v3 are) linearly dependent
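※ A nontrivial solution can also be found symbolically (a Python sketch, assuming SymPy): column j of the coefficient matrix below holds the coefficients (constant, x, x^2) of the polynomial vj.

from sympy import Matrix

A = Matrix([[ 1, 2, 0],
            [ 1, 5, 1],
            [-2, -1, 1]])

print(A.nullspace())
# expected: [Matrix([[2/3], [-1/3], [1]])]; scaling by 3 gives (c1, c2, c3) = (2, -1, 3)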

4.52

Ex 10: Testing for linear independence

Determine whether the following set of vectors in the 2×2 matrix space is L.I. or L.D.

                     [2 1]  [3 0]  [1 0]
S = {v1, v2, v3} = { [0 1], [2 1], [2 0] }

Sol:
c1v1 + c2v2 + c3v3 = 0

   [2 1]      [3 0]      [1 0]   [0 0]
c1 [0 1] + c2 [2 1] + c3 [2 0] = [0 0]

4.53

⇒ 2c1 + 3c2 +  c3 = 0
   c1             = 0
        2c2 + 2c3 = 0
   c1 +  c2       = 0

   [ 2  3  1 | 0 ]                  [ 1  0  0 | 0 ]
   [ 1  0  0 | 0 ]   → (G.-J. E.)   [ 0  1  0 | 0 ]
   [ 0  2  2 | 0 ]                  [ 0  0  1 | 0 ]
   [ 1  1  0 | 0 ]                  [ 0  0  0 | 0 ]

⇒ c1 = c2 = c3 = 0    (this system has only the trivial solution)

⇒ S is linearly independent

4.54

Theorem 4.8: A property of linearly dependent sets

A set S = {v1, v2, …, vk}, for k ≥ 2, is linearly dependent if and only if at least one of the vectors vi in S can be written as a linear combination of the other vectors in S

Pf:
(⇒)
S is linearly dependent ⇒ there exists a nontrivial solution of c1v1 + c2v2 + … + civi + … + ckvk = 0

⇒ ci ≠ 0 for some i

⇒ vi = (–c1/ci)v1 + … + (–ci–1/ci)vi–1 + (–ci+1/ci)vi+1 + … + (–ck/ci)vk

Pf:

4.55

(⇐)
Let vi = d1v1 + … + di–1vi–1 + di+1vi+1 + … + dkvk

⇒ d1v1 + … + di–1vi–1 + (–1)vi + di+1vi+1 + … + dkvk = 0

⇒ c1 = d1, c2 = d2, …, ci = –1, …, ck = dk    (there exists at least this nontrivial solution)

⇒ S is linearly dependent

Corollary to Theorem 4.8:
Two vectors u and v in a vector space V are linearly dependent (for k = 2 in Theorem 4.8) if and only if one is a scalar multiple of the other
(A corollary is a must-be-true result based on the already proved theorem)

4.56

Keywords in Section 4.4:

linear combination: 線性組合
spanning set: 生成集合
trivial solution: 顯然解
linear independence: 線性獨立
linear dependence: 線性相依

4.57

4.5 Basis and Dimension

Basis ( 基底 ):

V: a vector space
S = {v1, v2, …, vn} ⊆ V

S is called a basis for V if
(1) S spans V (i.e., span(S) = V)
(2) S is linearly independent

(Diagram: bases are exactly the sets that are both spanning sets and linearly independent sets)

Notes:
A basis S must have enough vectors to span V, but not so many vectors that one of them could be written as a linear combination of the other vectors in S

(For (1): for any u ∈ V, c1v1 + … + cnvn = u, i.e., Ax = u, has at least one solution, which happens if there is exactly one solution (det(A) ≠ 0) or if there are infinitely many solutions (det(A) = 0))
(For (2): for c1v1 + … + cnvn = 0, i.e., Ax = 0, there is only the trivial solution (det(A) ≠ 0). See the definition on Slide 4.49 and Ex 8 on Slide 4.50)
(For a basis, both conditions must hold, so det(A) ≠ 0)


4.58

Notes:

(1) the standard basis for R3:

{i, j, k}, for i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1)

(2) the standard basis for Rn :

{e1, e2, …, en}, for e1=(1,0,…,0), e2=(0,1,…,0),…, en=(0,0,…,1)

Ex: For R4, {(1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1)}

※Express any vector in Rn as the linear combination of the vectors in the

standard basis: the coefficient for each vector in the standard basis is the value of the corresponding component of the examined vector, e.g., (1, 3, 2) can be expressed as 1·(1, 0, 0) + 3·(0, 1, 0) + 2·(0, 0, 1)

4.59

(3) the standard basis for the m×n matrix space:
    { Eij | 1 ≤ i ≤ m, 1 ≤ j ≤ n }, where in Eij the entry aij = 1 and the other entries are zero

    Ex: 2×2 matrix space:
      [1 0]  [0 1]  [0 0]  [0 0]
    { [0 0], [0 0], [1 0], [0 1] }

(4) the standard basis for Pn(x):
    {1, x, x^2, …, x^n}

    Ex: P3(x): {1, x, x^2, x^3}

4.60

Ex 2: A nonstandard basis for R2

Show that S = {v1, v2} = {(1, 1), (1, –1)} is a basis for R2

(1) For any u = (u1, u2) ∈ R2, c1v1 + c2v2 = u  ⇒  c1 + c2 = u1
                                                   c1 – c2 = u2

    Because the coefficient matrix of this system has a nonzero determinant, the system has a unique solution for each u. Thus you can conclude that S spans R2

(2) For c1v1 + c2v2 = 0  ⇒  c1 + c2 = 0
                            c1 – c2 = 0

    Because the coefficient matrix of this system has a nonzero determinant, you know that the system has only the trivial solution. Thus you can conclude that S is linearly independent

According to the above two arguments, we can conclude that S is a (nonstandard) basis for R2

4.61

Theorem 4.9: Uniqueness of basis representation for any vector

If S = {v1, v2, …, vn} is a basis for a vector space V, then every vector in V can be written in one and only one way as a linear combination of vectors in S

Pf:
S is a basis ⇒ (1) span(S) = V, and (2) S is linearly independent

span(S) = V: let v = c1v1 + c2v2 + … + cnvn,
and suppose also v = b1v1 + b2v2 + … + bnvn

⇒ v + (–1)v = 0 = (c1 – b1)v1 + (c2 – b2)v2 + … + (cn – bn)vn

S is linearly independent ⇒ this equation has only the trivial solution ⇒ the coefficients of the vi are all zero

⇒ c1 = b1, c2 = b2, …, cn = bn    (i.e., the basis representation is unique)

4.62

Theorem 4.10: Bases and linear dependence

If S = {v1, v2, …, vn} is a basis for a vector space V, then every set containing more than n vectors in V is linearly dependent (In other words, every linearly independent set contains at most n vectors)

Pf:
Let S1 = {u1, u2, …, um}, m > n

span(S) = V ⇒ each ui ∈ V can be written as

u1 = c11v1 + c21v2 + … + cn1vn
u2 = c12v1 + c22v2 + … + cn2vn
⋮
um = c1mv1 + c2mv2 + … + cnmvn

4.63

Consider k1u1 + k2u2 + … + kmum = 0

⇒ d1v1 + d2v2 + … + dnvn = 0,  where di = ci1k1 + ci2k2 + … + cimkm

S is L.I. ⇒ di = 0 for every i, i.e.,

c11k1 + c12k2 + … + c1mkm = 0
c21k1 + c22k2 + … + c2mkm = 0
⋮
cn1k1 + cn2k2 + … + cnmkm = 0

Theorem 1.1: If a homogeneous system has fewer equations (here n equations) than variables (k1, k2, …, km), then it must have infinitely many solutions

m > n ⇒ k1u1 + k2u2 + … + kmum = 0 has nontrivial (nonzero) solutions    (if the ki's are not all zero, S1 is linearly dependent)

⇒ S1 is linearly dependent

4.64

Theorem 4.11: Number of vectors in a basis

If a vector space V has one basis with n vectors, then every basis for V has n vectors

Pf:
Let S = {v1, v2, …, vn} and S' = {u1, u2, …, um} be two bases for V

S is a basis spanning V, and S' is a set of L.I. vectors  ⇒  m ≤ n
S' is a basis spanning V, and S is a set of L.I. vectors  ⇒  n ≤ m

⇒ n = m

※ According to Thm. 4.10, every linearly independent set contains at most n vectors in a vector space if there is a basis of n vectors spanning that vector space

※ For R3, since the standard basis {(1, 0, 0), (0, 1, 0), (0, 0, 1)} can span this vector space, you can infer any basis that can span R3 must have exactly 3 vectors ※ For example, S = {(1, 2, 3), (0, 1, 2), (–2, 0, 1)} in Ex 5 on Slide 4.44 is another basis of R3 (because S can span R3 and S is linearly independent), and S has 3 vectors


4.65

Finite dimensional:

A vector space V is finite dimensional if it has a basis consisting of a finite number of elements

Infinite dimensional:

If a vector space V is not finite dimensional, then it is called infinite dimensional

Dimension:

V: a vector space S: a basis for V

dim(V) = #(S) (the number of vectors in a basis S)

The dimension of a vector space V is defined to be the

number of vectors in a basis for V

4.66

Notes:

(1) dim({0}) = 0    (If V consists of the zero vector alone, the dimension of V is defined as zero)

(2) Given dim(V) = n, for S ⊆ V
    S: a spanning set  ⇒  #(S) ≥ n    (from Ex 5 on Slides 4.44 and 4.45)
    S: a L.I. set      ⇒  #(S) ≤ n    (from Theorem 4.10)
    S: a basis         ⇒  #(S) = n    (Since a basis is defined to be a set of L.I. vectors that spans V, #(S) = dim(V) = n)

(3) Given dim(V) = n, if W is a subspace of V  ⇒  dim(W) ≤ n

(Diagram: with dim(V) = n, spanning sets have #(S) ≥ n, linearly independent sets have #(S) ≤ n, and bases have #(S) = n)

※ For example, if V = R3, you can infer that dim(V) is 3, which is the number of vectors in the standard basis
※ Considering W = {(x1, x2, 0)}, a subspace of R3 that can be identified with R2, the number of vectors in its standard basis shows that dim(W) = 2, which is smaller than dim(V) = 3

4.67

Ex: Find the dimension of a vector space according to the standard basis

(1) Vector space Rn:     standard basis {e1, e2, …, en}               ⇒ dim(Rn) = n
(2) Vector space Mm×n:   standard basis {Eij | 1 ≤ i ≤ m, 1 ≤ j ≤ n}  ⇒ dim(Mm×n) = mn
                         (where in Eij the entry aij = 1 and the other entries are zero)
(3) Vector space Pn(x):  standard basis {1, x, x^2, …, x^n}           ⇒ dim(Pn(x)) = n + 1
(4) Vector space P(x):   standard basis {1, x, x^2, …}                ⇒ dim(P(x)) = ∞

※ The simplest way to find the dimension of a vector space is to count the number of vectors in the "standard" basis for that vector space

4.68

Ex 9: Determining the dimension of a subspace of R3

(a) W = {(d, c – d, c): c and d are real numbers}
(b) W = {(2b, b, 0): b is a real number}

Sol: (Hint: find a set of L.I. vectors that spans the subspace, i.e., find a basis for the subspace.)

(a) (d, c – d, c) = c(0, 1, 1) + d(1, –1, 0)

    ⇒ S = {(0, 1, 1), (1, –1, 0)}    (S is L.I. and S spans W)

    ⇒ S is a basis for W

    ⇒ dim(W) = #(S) = 2

(b) (2b, b, 0) = b(2, 1, 0)

    ⇒ S = {(2, 1, 0)} spans W and S is L.I.

    ⇒ S is a basis for W

    ⇒ dim(W) = #(S) = 1

4.69

Ex 11: Finding the dimension of a subspace of M2×2

Let W be the subspace of all symmetric matrices in M2×2. What is the dimension of W?

Sol:
      [a b]
W = { [b c] : a, b, c ∈ R }

[a b]     [1 0]     [0 1]     [0 0]
[b c] = a [0 0] + b [1 0] + c [0 1]

        [1 0]  [0 1]  [0 0]
⇒ S = { [0 0], [1 0], [0 1] } spans W and S is L.I.

⇒ S is a basis for W  ⇒  dim(W) = #(S) = 3

4.70

Theorem 4.12: Methods to identify a basis in an n-dimensional space

Let V be a vector space of dimension n

(1) If S = {v1, v2, …, vn} is a linearly independent set of n vectors in V, then S is a basis for V

(2) If S = {v1, v2, …, vn} spans V, then S is a basis for V

(Both results are due to the fact that #(S) = n)

(Diagram: when #(S) = dim(V) = n, a set of n vectors that either spans V or is linearly independent is automatically a basis)

4.71

Keywords in Section 4.5:

basis: 基底
dimension: 維度
finite dimension: 有限維度
infinite dimension: 無限維度

4.72

4.6 Rank of a Matrix and Systems of Linear Equations

In this section, three vector spaces associated with a matrix A are investigated

Row space: the vector space spanned by the row vectors of a matrix A

Column space: the vector space spanned by column vectors of a matrix A

Nullspace: the vector space consisting of all solutions of the homogeneous system of linear equations Ax = 0

Next, discuss the basis and the dimension of each vector space

Finally, the relationship between the solutions of Ax = b and Ax = 0 will be discussed

To begin the introduction, we consider the row and column vectors of a matrix A with the size m×n on the next slide

4.73

     [ a11 a12 … a1n ]
     [ a21 a22 … a2n ]
A =  [  ⋮   ⋮      ⋮  ]
     [ am1 am2 … amn ]

row vectors of A (each of size 1×n):
A(1) = [a11 a12 … a1n]
A(2) = [a21 a22 … a2n]
⋮
A(m) = [am1 am2 … amn]

※ So, the row vectors are vectors in Rn

column vectors of A (each of size m×1):
A = [A^(1)  A^(2)  …  A^(n)],  where A^(j) = [a1j a2j … amj]^T

※ So, the column vectors are vectors in Rm

4.74

Let A be an m×n matrix

Row space ( 列空間 ):
The row space of A is a subspace of Rn spanned by the row vectors of A
RS(A) = { α1A(1) + α2A(2) + … + αmA(m) | α1, α2, …, αm ∈ R }

Column space ( 行空間 ):
The column space of A is a subspace of Rm spanned by the column vectors of A
CS(A) = { β1A^(1) + β2A^(2) + … + βnA^(n) | β1, β2, …, βn ∈ R }

Note:
dim(RS(A)) (or dim(CS(A))) equals the number of linearly independent row (or column) vectors of A
(If A(1), A(2), …, A(m) are linearly independent, they can form a basis for RS(A))
(If A^(1), A^(2), …, A^(n) are linearly independent, they can form a basis for CS(A))

4.75

In Ex 5 of Section 4.4, S = {(1, 2, 3), (0, 1, 2), (–2, 0, 1)} spans R3. Use these vectors as row vectors to construct A:

    [ 1  2  3 ]
A = [ 0  1  2 ]   ⇒  RS(A) = R3
    [–2  0  1 ]

(Since (1, 2, 3), (0, 1, 2), (–2, 0, 1) are linearly independent, they can form a basis for RS(A))

Since S1 = {(1, 2, 3), (0, 1, 2), (–2, 0, 1), (1, 0, 0)} also spans R3,

     [ 1  2  3 ]
A1 = [ 0  1  2 ]   ⇒  RS(A1) = R3
     [–2  0  1 ]
     [ 1  0  0 ]

(Since (1, 2, 3), (0, 1, 2), (–2, 0, 1), (1, 0, 0) are not linearly independent, they cannot be a basis for RS(A1))

Note: dim(RS(A)) = 3 and dim(RS(A1)) = 3


4.76

Theorem 4.13: Row-equivalent matrices have the same row space

If an m×n matrix A is row equivalent to an m×n matrix B, then the row space of A is equal to the row space of B

Pf:
(1) Since B can be obtained from A by elementary row operations, the row vectors of B can be expressed as linear combinations of the row vectors of A ⇒ the linear combinations of row vectors of B must be linear combinations of row vectors of A ⇒ any vector in RS(B) lies in RS(A) ⇒ RS(B) ⊆ RS(A)

(2) Since A can be obtained from B by elementary row operations, the row vectors of A can be written as linear combinations of the row vectors of B ⇒ the linear combinations of row vectors of A must be linear combinations of row vectors of B ⇒ any vector in RS(A) lies in RS(B) ⇒ RS(A) ⊆ RS(B)

⇒ RS(A) = RS(B)

4.77

Notes:
(1) The row space of a matrix is not changed by elementary row operations
    RS(r(A)) = RS(A),  where r is any elementary row operation
(2) But elementary row operations will change the column space

Theorem 4.14: Basis for the row space of a matrix

If a matrix A is row equivalent to a matrix B in (reduced) row-echelon form, then the nonzero row vectors of B form a basis for the row space of A

1. The row space of A is the same as the row space of B (Thm. 4.13), spanned by all row vectors of B
2. The row space of B can be constructed from linear combinations of only the nonzero row vectors, since taking the zero row vectors into consideration generates no additional combinations (i.e., the nonzero row vectors span the row space of B)
3. Since it is impossible to express a nonzero row vector of a row-echelon-form matrix as a linear combination of the other nonzero row vectors (see Ex. 2 on the next slide), according to Thm. 4.8 the nonzero row vectors of B are linearly independent
4. As a consequence, since the nonzero row vectors of B are linearly independent and span the row space of B, according to the definition on Slide 4.57 they form a basis for the row space of B, and hence for the row space of A as well

4.78

Ex 2: Finding a basis for a row space

Find a basis for the row space of

    [ 1  3  1  3 ]
    [ 0  1  1  0 ]
A = [–3  0  6 –1 ]
    [ 3  4 –2  1 ]
    [ 2  0 –4 –2 ]

Sol:

    [ 1  3  1  3 ]                  [ 1  3  1  3 ]  w1
    [ 0  1  1  0 ]                  [ 0  1  1  0 ]  w2
A = [–3  0  6 –1 ]   → (G. E.)  B = [ 0  0  0  1 ]  w3
    [ 3  4 –2  1 ]                  [ 0  0  0  0 ]
    [ 2  0 –4 –2 ]                  [ 0  0  0  0 ]
     a1 a2 a3 a4                     b1 b2 b3 b4

4.79

a basis for RS(A) = {the nonzero row vectors of B}    (Thm 4.14)
                  = {w1, w2, w3} = {(1, 3, 1, 3), (0, 1, 1, 0), (0, 0, 0, 1)}

(Check: w1, w2, w3 are linearly independent, i.e., aw1 + bw2 + cw3 = 0 has only the trivial solution; equivalently, it is impossible to express any one of them as a linear combination of the others (Theorem 4.8))

Notes:
(1) {b1, b2, b4} is L.I. (because these columns contain the leading 1's)  ⇒  {a1, a2, a4} is also L.I.
(2) b3 = –2b1 + b2  ⇒  a3 = –2a1 + a2

Although row operations can change the column space of a matrix (mentioned on Slide 4.77), they do not change the dependency relationships among columns
(The linear combination relationships among column vectors of B still hold for the column vectors of A)
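※ The row reduction above can be reproduced with a computer algebra system (a Python sketch, assuming SymPy is available):

from sympy import Matrix

A = Matrix([[ 1, 3,  1,  3],
            [ 0, 1,  1,  0],
            [-3, 0,  6, -1],
            [ 3, 4, -2,  1],
            [ 2, 0, -4, -2]])

R, pivots = A.rref()      # reduced row-echelon form and the pivot-column indices
print(R)                  # its nonzero rows give a basis for RS(A)
print(pivots)             # (0, 1, 3): columns a1, a2, a4 give a basis for CS(A)
print(A.rank())           # 3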

4.80

Ex 3: Finding a basis for a subspace using Thm. 4.14

Find a basis for the subspace of R3 spanned by
S = {v1, v2, v3} = {(1, –2, –5), (3, 0, 3), (5, 1, 8)}

Sol:
    [ v1 ]   [ 1 –2 –5 ]                  [ 1 –2 –5 ]  w1
A = [ v2 ] = [ 3  0  3 ]   → (G. E.)  B = [ 0  1  3 ]  w2
    [ v3 ]   [ 5  1  8 ]                  [ 0  0  0 ]

(Construct A such that RS(A) = span(S))

a basis for span({v1, v2, v3})
= a basis for RS(A)
= {the nonzero row vectors of B}    (Thm 4.14)
= {w1, w2} = {(1, –2, –5), (0, 1, 3)}

4.81

Ex 4: Finding a basis for the column space of a matrix

Find a basis for the column space of the matrix A given in Ex 2

    [ 1  3  1  3 ]
    [ 0  1  1  0 ]
A = [–3  0  6 –1 ]
    [ 3  4 –2  1 ]
    [ 2  0 –4 –2 ]

Sol. 1:
Since CS(A) = RS(A^T), finding a basis for the column space of A is equivalent to finding a basis for the row space of A^T

      [ 1  0 –3  3  2 ]                  [ 1  0 –3  3  2 ]  w1
A^T = [ 3  1  0  4  0 ]   → (G. E.)  B = [ 0  1  9 –5 –6 ]  w2
      [ 1  1  6 –2 –4 ]                  [ 0  0  1 –1 –1 ]  w3
      [ 3  0 –1  1 –2 ]                  [ 0  0  0  0  0 ]

4.82

a basis for CS(A)
= a basis for RS(A^T)
= {the nonzero row vectors of B}
= {w1, w2, w3}
= { (1, 0, –3, 3, 2)^T, (0, 1, 9, –5, –6)^T, (0, 0, 1, –1, –1)^T }

(a basis for the column space of A, with each vector written as an m×1 column)

4.83

Sol. 2:

    [ 1  3  1  3 ]                  [ 1  3  1  3 ]
    [ 0  1  1  0 ]                  [ 0  1  1  0 ]
A = [–3  0  6 –1 ]   → (G. E.)  B = [ 0  0  0  1 ]
    [ 3  4 –2  1 ]                  [ 0  0  0  0 ]
    [ 2  0 –4 –2 ]                  [ 0  0  0  0 ]
     a1 a2 a3 a4                     b1 b2 b3 b4

Leading 1’s  ⇒  {b1, b2, b4} is a basis for CS(B)    (not for CS(A))

⇒ {a1, a2, a4} is a basis for CS(A)

※ This method uses the fact that B has the same dependency relationships among columns as A (mentioned on Slides 4.77 and 4.79), which does NOT mean CS(B) = CS(A)

Notes: The bases for the column space derived from Sol. 1 and Sol. 2 are different. However, both bases span the same CS(A), which is a subspace of R5

4.84

Theorem 4.16: The definition of the nullspace ( 零空間 )

If A is an m×n matrix, then the set of all solutions of the homogeneous system of linear equations Ax = 0 is a subspace of Rn called the nullspace of A, which is denoted as

NS(A) = { x ∈ Rn | Ax = 0 }

Pf:
NS(A) is not empty    (because A0 = 0, i.e., 0 ∈ NS(A))

Let x1, x2 ∈ NS(A)    (i.e., Ax1 = 0 and Ax2 = 0)

Then (1) A(x1 + x2) = Ax1 + Ax2 = 0 + 0 = 0    (closure under addition)
     (2) A(cx1) = c(Ax1) = c0 = 0              (closure under scalar multiplication)

Thus NS(A) is a subspace of Rn

Notes: The nullspace of A is also called the solution space ( 解空間 ) of the homogeneous system Ax = 0

4.85

Ex 6: Finding the solution space (or the nullspace) of a homogeneous system with the following coefficient matrix A

    [ 1  2 –2  1 ]
A = [ 3  6 –5  4 ]
    [ 1  2  0  3 ]

Sol: The nullspace of A is the solution space of Ax = 0

                   [ 1  2 –2  1 | 0 ]                   [ 1  2  0  3 | 0 ]
augmented matrix:  [ 3  6 –5  4 | 0 ]   → (G.-J. E.)    [ 0  0  1  1 | 0 ]
                   [ 1  2  0  3 | 0 ]                   [ 0  0  0  0 | 0 ]

⇒ x1 = –2s – 3t, x2 = s, x3 = –t, x4 = t    (s and t are free parameters)

    [ x1 ]     [ –2 ]     [ –3 ]
x = [ x2 ] = s [  1 ] + t [  0 ] = sv1 + tv2
    [ x3 ]     [  0 ]     [ –1 ]
    [ x4 ]     [  0 ]     [  1 ]

⇒ NS(A) = { sv1 + tv2 | s, t ∈ R }
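※ The same basis for the nullspace can be computed directly (a Python sketch, assuming SymPy):

from sympy import Matrix

A = Matrix([[1, 2, -2, 1],
            [3, 6, -5, 4],
            [1, 2,  0, 3]])

for v in A.nullspace():      # basis vectors of NS(A)
    print(v.T)               # expected: [-2, 1, 0, 0] and [-3, 0, -1, 1]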

4.86

Theorem 4.15: Row and column space have equal dimensions

If A is an m×n matrix, then the row space and the column space of A have the same dimension

dim(RS(A)) = dim(CS(A))

Rank ( 秩 ): The dimension of the row (or column) space of a matrix A is called the rank of A
rank(A) = dim(RS(A)) = dim(CS(A))

※ You can verify this result numerically by comparing Ex 2 (a basis for the row space) with Ex 4 (a basis for the column space) in this section. In these two examples, A is a 5×4 matrix, dim(RS(A)) = #(basis for RS(A)) = 3, and dim(CS(A)) = #(basis for CS(A)) = 3

Nullity ( 核次數 ): The dimension of the nullspace of A is called the nullity of A
nullity(A) = dim(NS(A))

Page 87: Chapter 4 Vector Spaces 4.1 Vectors in R n 4.2 Vector Spaces 4.3 Subspaces of Vector Spaces 4.4 Spanning Sets and Linear Independence 4.5 Basis and Dimension

4.87

Theorem 4.17: Dimension of the solution space

If A is an mn matrix of rank r, then the dimension of

the solution space of Ax = 0 (the nullsapce of A) is n – r, i.e., n = rank(A) + nullity(A) = r + (n – r)

(n is the number of columns or the number of unknown variables)

Pf:

Since rank(A) = r, the reduced row-echelon form of [A | 0] after G.-J. E. should be

    [ 1  0  ...  0   c11    c12    ...  c1,n–r   | 0 ]
    [ 0  1  ...  0   c21    c22    ...  c2,n–r   | 0 ]    } r nonzero rows
    [ ...                                          ...]
    [ 0  0  ...  1   cr1    cr2    ...  cr,n–r   | 0 ]
    [ 0  0  ...  0    0      0     ...    0      | 0 ]    } m – r zero rows
    [ ...                                          ...]
      (r leading columns)  (n – r remaining columns)

1. From Thm 4.14, the nonzero row vectors of B, which is the (reduced) row-echelon form of A, form a basis for RS(A)
2. From the definition of rank, rank(A) = dim(RS(A)) = r
※ According to the above two facts, the reduced row-echelon form should have r nonzero rows, as shown above


4.88

Therefore, the corresponding system of linear equations is

    x1 + c11 xr+1 + c12 xr+2 + … + c1,n–r xn = 0
    x2 + c21 xr+1 + c22 xr+2 + … + c2,n–r xn = 0
    ...
    xr + cr1 xr+1 + cr2 xr+2 + … + cr,n–r xn = 0

Solving for the first r variables in terms of the last n – r parametric variables produces n – r vectors in the basis of the solution space. Consequently, the dimension of the solution space of Ax = 0 is n – r because there are n – r free parametric variables. (For instance, in Ex 6, there are two parametric variables, so dim(NS(A)) = 2)

Notes:

(1) rank(A) can be viewed as the number of leading ones (or nonzero rows) in the reduced row-echelon form for solving Ax = 0

(2) nullity(A) can be viewed as the number of free variables in the reduced row-echelon form for solving Ax = 0
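A quick numerical check of n = rank(A) + nullity(A), reusing the matrix of Ex 6 (a SymPy sketch, not part of the slides):

    from sympy import Matrix

    A = Matrix([[1, 2, -2, 1],
                [3, 6, -5, 4],
                [1, 2,  0, 3]])       # the 3x4 coefficient matrix of Ex 6

    rank    = A.rank()                # number of leading 1's = 2
    nullity = len(A.nullspace())      # number of free variables = 2
    assert rank + nullity == A.shape[1]   # n = 4 columns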


4.89

Ex 7: Rank and nullity of a matrix

Let the column vectors of the matrix A be denoted by a1, a2,

a3, a4, and a5

        [ 1  0  2  1   0 ]
    A = [ 0  1  3  1   3 ]
        [ 2 -1  1  1  -3 ]
        [ 0  3  9  0  12 ]
          a1 a2 a3 a4 a5

(a) Find the rank and nullity of A
(b) Find a subset of the column vectors of A that forms a basis for the column space of A
(c) If possible, write the third column of A as a linear combination of the first two columns


4.90

Sol: Derive B, the reduced row-echelon form of A, by G.-J. E.

        [ 1  0  2  1   0 ]               [ 1  0  2  0   1 ]
    A = [ 0  1  3  1   3 ]   G.-J. E.    [ 0  1  3  0   4 ]  =  B
        [ 2 -1  1  1  -3 ]  --------->   [ 0  0  0  1  -1 ]
        [ 0  3  9  0  12 ]               [ 0  0  0  0   0 ]
          a1 a2 a3 a4 a5                   b1 b2 b3 b4 b5

(a) rank(A) = 3 (by Theorems 4.13 and 4.14, rank(A) = dim(RS(A)) = dim(RS(B)) = the number of nonzero rows in B)

    nullity(A) = n – rank(A) = 5 – 3 = 2   (by Thm. 4.17)


4.91

(b) The leading 1's of B appear in columns 1, 2, and 4, so

    {b1, b2, b4} is a basis for CS(B)

    ⇒ {a1, a2, a4} = { (1, 0, 2, 0), (0, 1, –1, 3), (1, 1, 1, 0) } is a basis for CS(A)
      (written as column vectors)

(c) b3 = 2b1 + 3b2  ⇒  a3 = 2a1 + 3a2
    (due to the fact that elementary row operations do not change the dependency relationships among columns)
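The results of Ex 7 can be checked with SymPy. The minus signs of A were lost when these slides were extracted, so the sign pattern below is one consistent reading and should be treated as an assumption; the rank, nullity, and pivot columns are what the example asserts:

    from sympy import Matrix

    # one sign-consistent reading of the 4x5 matrix in Ex 7 (signs are assumed)
    A = Matrix([[1,  0, 2, 1,  0],
                [0,  1, 3, 1,  3],
                [2, -1, 1, 1, -3],
                [0,  3, 9, 0, 12]])

    B, pivots = A.rref()
    print(A.rank(), len(A.nullspace()))          # 3 2  -> rank 3, nullity 2
    print(pivots)                                 # (0, 1, 3) -> {a1, a2, a4}
    print(A.col(2) == 2*A.col(0) + 3*A.col(1))    # True: a3 = 2a1 + 3a2 for this reading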


4.92

Theorem 4.18: Solutions of a nonhomogeneous linear system

If xp is a particular solution of the nonhomogeneous system Ax = b, then every solution of this system can be written in the form x = xp + xh, where xh is a solution of the corresponding homogeneous system Ax = 0

Pf:
Let x be any solution of Ax = b other than xp
Then A(x – xp) = Ax – Axp = b – b = 0
⇒ (x – xp) is a solution of Ax = 0
Let xh = x – xp; then xh is a solution of Ax = 0 and x = xp + xh


4.93

Ex 8: Finding the solution set of a nonhomogeneous system

Find the set of all solution vectors of the system of linear equations

    x1        – 2x3 +  x4 =  5
    3x1 +  x2 – 5x3       =  8
    x1  + 2x2       – 5x4 = –9

Sol:

                 [ 1  0 -2  1   5 ]  G.-J. E.  [ 1  0 -2  1   5 ]
    [A | b]  =   [ 3  1 -5  0   8 ] ---------> [ 0  1  1 -3  -7 ]
                 [ 1  2  0 -5  -9 ]            [ 0  0  0  0   0 ]

    Let x3 = s and x4 = t


4.94

        [ x1 ]   [ 2s –  t + 5 ]     [  2 ]     [ –1 ]   [  5 ]
    x = [ x2 ] = [ –s + 3t – 7 ] = s [ –1 ] + t [  3 ] + [ –7 ] = s u1 + t u2 + xp
        [ x3 ]   [      s      ]     [  1 ]     [  0 ]   [  0 ]
        [ x4 ]   [      t      ]     [  0 ]     [  1 ]   [  0 ]

    i.e., xp = (5, –7, 0, 0) (written as a column vector) is a particular solution vector of Ax = b,
    and xh = s u1 + t u2 is a solution of Ax = 0 (you can replace the constant vector with a zero vector to check this result)
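A SymPy check of the decomposition x = xp + xh for Ex 8 (a sketch, not part of the slides; A and b are taken from the system above):

    from sympy import Matrix, linsolve, symbols

    x1, x2, x3, x4 = symbols('x1 x2 x3 x4')
    A = Matrix([[1, 0, -2,  1],
                [3, 1, -5,  0],
                [1, 2,  0, -5]])
    b = Matrix([5, 8, -9])

    # general solution; the free symbols x3, x4 play the roles of s and t
    print(linsolve((A, b), x1, x2, x3, x4))   # {(2*x3 - x4 + 5, -x3 + 3*x4 - 7, x3, x4)}

    # xh alone: a basis of NS(A)
    print([v.T for v in A.nullspace()])       # (2, -1, 1, 0) and (-1, 3, 0, 1)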


4.95

For any n×n matrix A with det(A) ≠ 0

※ If det(A) ≠ 0, Ax = b has a unique solution (xp) and Ax = 0 has only the trivial solution (xh = 0). According to Theorem 4.18, every solution of Ax = b can be written as x = xp + xh. Since xh = 0, there is only one solution for Ax = b, namely x = xp + 0 = xp

※ In this scenario, nullity(A) = dim(NS(A)) = dim({0}) = 0. Furthermore, since Theorem 4.17 gives n = rank(A) + nullity(A), we can conclude that rank(A) = n

※ Finally, according to the definition of rank(A) = dim(RS(A)) = dim(CS(A)), we can further obtain that dim(RS(A)) = dim(CS(A)) = n, which implies that there are n rows (and n columns) of A which are linearly independent (see the definitions on Slide 4.74)


4.96

※ The relationship between the solutions of Ax = b and Ax = 0 for an n×n matrix A
   (from Thm. 4.18, the solution of Ax = b is x = xp + xh)

    det(A) ≠ 0:   For Ax = 0, only the trivial solution xh = 0
                  For Ax = b, xp must exist and there is only one solution x = xp + 0 = xp

    det(A) = 0:   For Ax = 0, infinitely many solutions xh
                  For Ax = b, either xp exists and there are infinitely many solutions x = xp + xh,
                  or xp does not exist and the solution x does not exist
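A tiny SymPy illustration of the two cases above, using two arbitrary 2×2 matrices of my own choosing (not from the slides):

    from sympy import Matrix

    A = Matrix([[2, 1],
                [1, 1]])     # det = 1, so Ax = 0 has only the trivial solution
    S = Matrix([[1, 2],
                [2, 4]])     # det = 0, so Sx = 0 has infinitely many solutions

    print(A.det(), A.nullspace())                    # 1 []
    print(S.det(), [v.T for v in S.nullspace()])     # 0 [Matrix([[-2, 1]])]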


4.97

Theorem 4.19: Solution of a system of linear equations

The system of linear equations Ax = b is consistent if and only

if b is in the column space of A (i.e., b can be expressed as a

linear combination of the column vectors of A)

Pf:
Let

        [ a11  a12  ...  a1n ]        [ x1 ]        [ b1 ]
    A = [ a21  a22  ...  a2n ]    x = [ x2 ]    b = [ b2 ]
        [ ...                ]        [ ...]        [ ...]
        [ am1  am2  ...  amn ]        [ xn ]        [ bm ]

be the coefficient matrix, the unknown vector, and the constant-term vector, respectively, of the system Ax = b


4.98

Then

         [ a11  a12  ...  a1n ] [ x1 ]   [ a11 x1 + a12 x2 + ... + a1n xn ]
    Ax = [ a21  a22  ...  a2n ] [ x2 ] = [ a21 x1 + a22 x2 + ... + a2n xn ]
         [ ...                ] [ ...]   [ ...                            ]
         [ am1  am2  ...  amn ] [ xn ]   [ am1 x1 + am2 x2 + ... + amn xn ]

            [ a11 ]      [ a12 ]            [ a1n ]
       = x1 [ a21 ] + x2 [ a22 ] + ... + xn [ a2n ]
            [ ... ]      [ ... ]            [ ... ]
            [ am1 ]      [ am2 ]            [ amn ]

       = x1 A(1) + x2 A(2) + ... + xn A(n)   (where A(i) denotes the i-th column vector of A)

Hence, Ax = b is consistent (x has solutions) if and only if b is a linear combination of the column vectors of A. In other words, the system is consistent if and only if b is in the subspace of Rm spanned by the column vectors of A
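A one-line numerical check of this column-combination identity, using an arbitrary 2×2 example of my own (not from the slides):

    from sympy import Matrix

    A = Matrix([[1, 2],
                [3, 4]])
    x = Matrix([5, -1])

    # A x equals x1*(first column) + x2*(second column)
    assert A * x == x[0]*A.col(0) + x[1]*A.col(1)   # both are (3, 11)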


4.99

Ex 9: Consistency of a system of linear equations depending on whether b is in the column space of A

    x1  +  x2 –  x3 = –1
    x1        +  x3 =  3
    3x1 + 2x2 –  x3 =  1

Sol:

                [ 1  1 -1  -1 ]  G.-J. E.  [ 1  0  1   3 ]
    [A | b] =   [ 1  0  1   3 ] ---------> [ 0  1 -2  -4 ]
                [ 3  2 -1   1 ]            [ 0  0  0   0 ]
                  a1 a2 a3  b                w1 w2 w3  v

    v = 3w1 – 4w2
    ⇒ b = 3a1 – 4a2 + 0a3
    (due to the fact that elementary row operations do not change the dependency relationships among columns)

    ⇒ The system of linear equations is consistent
    (In other words, b is in the column space of A)


4.100

A property that can be inferred:

If rank(A) = rank([A|b]), then the system Ax = b is

consistent

The above property can be analyzed as follows:

(1) By Theorem 4.19, Ax = b is consistent if and only if b is a linear combination of the column vectors of A, i.e., if and only if appending b to the right of A does NOT increase the number of linearly independent columns, which means dim(CS(A)) = dim(CS([A|b]))

(2) By definition of the rank on Slide 4.86, rank(A) = dim(CS(A)) and rank([A|b]) = dim(CS([A|b]))

(3) By combining (1) and (2), we can obtain rank(A) = rank([A|b]) if and only if Ax = b is consistent

Check for Ex. 9: rank(A) = rank([A|b]) = 2
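A short SymPy check of the rank test for Ex 9 (not part of the slides; the signs of A and b follow the system as written above):

    from sympy import Matrix

    A = Matrix([[1, 1, -1],
                [1, 0,  1],
                [3, 2, -1]])
    b = Matrix([-1, 3, 1])

    Ab = A.row_join(b)             # the augmented matrix [A | b]
    print(A.rank(), Ab.rank())     # 2 2 -> consistent
    print(Ab.rref()[0])            # last column (3, -4, 0): b = 3a1 - 4a2 + 0a3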


4.101

Summary of equivalent conditions for square matrices:

If A is an n×n matrix, then the following conditions are equivalent

(1) A is invertible

(2) Ax = b has a unique solution for any n×1 matrix b

(3) Ax = 0 has only the trivial solution

(4) A is row-equivalent to In

(5) det(A) ≠ 0

(6) rank(A) = n

(7) There are n row vectors of A which are linearly independent

(8) There are n column vectors of A which are linearly independent

(The above five statements are from Slide 3.39)

(The last three statements are from the arguments on Slide 4.95)
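These equivalences are easy to spot-check numerically; the sketch below uses an arbitrary invertible 3×3 matrix of my own choosing (not from the slides):

    from sympy import Matrix, eye

    A = Matrix([[ 1, 0, 2],
                [-1, 1, 0],
                [ 0, 2, 1]])

    n = A.shape[0]
    print(A.det() != 0)            # True -> A is invertible (det = -3)
    print(A.rank() == n)           # True -> rank(A) = n
    print(A.rref()[0] == eye(n))   # True -> A is row-equivalent to I_n
    print(A.nullspace() == [])     # True -> Ax = 0 has only the trivial solution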


4.102

Keywords in Section 4.6:

row space: 列空間 column space: 行空間 nullspace: 零空間 solution space: 解空間 rank: 秩 nullity: 核次數


4.103

4.7 Coordinates and Change of Basis

Coordinate representation relative to a basis

Let B = {v1, v2, …, vn} be an ordered basis for a vector space V

and let x be a vector in V such that x = c1v1 + c2v2 + … + cnvn.

The scalars c1, c2, …, cn are called the coordinates of x relative

to the basis B (x 於基底 B 的座標 ). The coordinate matrix of x relative to B is a real-number column matrix whose components are the coordinates of x

           [ c1 ]
    [x]B = [ c2 ]
           [ ...]
           [ cn ]

※ The “ordered” basis means the sequence of the vectors in the basis is specified ( 被指定的 )


4.104

Ex 1: Coordinates and components in Rn

Find the coordinate matrix of x = (–2, 1, 3) in R3

relative to the standard basis

S = {(1, 0, 0), ( 0, 1, 0), (0, 0, 1)}

Sol: x = (–2, 1, 3) = –2(1, 0, 0) + 1(0, 1, 0) + 3(0, 0, 1)

           [ –2 ]
    [x]S = [  1 ]
           [  3 ]

※ For the standard basis in Rn, the coordinates of a vector are the same as the components of that vector


4.105

Ex 3: Finding a coordinate matrix relative to a nonstandard basis

Find the coordinate matrix of x = (1, 2, –1) in R3 relative to the

following (nonstandard) basis

B' = {u1, u2, u3}={(1, 0, 1), (0, – 1, 2), (2, 3, – 5)}

Sol:
    x = c1u1 + c2u2 + c3u3
    (1, 2, –1) = c1(1, 0, 1) + c2(0, –1, 2) + c3(2, 3, –5)

    c1        + 2c3 =  1            [ 1  0  2 ] [ c1 ]   [  1 ]
        – c2  + 3c3 =  2    i.e.    [ 0 -1  3 ] [ c2 ] = [  2 ]
    c1 + 2c2  – 5c3 = –1            [ 1  2 -5 ] [ c3 ]   [ –1 ]

    ⇒ c1 = 5, c2 = –8, c3 = –2

             [  5 ]
    [x]B' =  [ –8 ]
             [ –2 ]
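Ex 3 amounts to solving a 3×3 linear system; a short SymPy check (not part of the slides):

    from sympy import Matrix

    # columns are the basis vectors of B' = {(1, 0, 1), (0, -1, 2), (2, 3, -5)}
    U = Matrix([[1,  0,  2],
                [0, -1,  3],
                [1,  2, -5]])
    x = Matrix([1, 2, -1])

    print((U.inv() * x).T)    # (5, -8, -2) = [x]_B'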


4.106

Change of basis problem ( 基底變換問題 ):

You were given the coordinates of a vector relative to one

basis B' and were asked to find the coordinates relative to

another basis B

Ex: Change of basis
Consider two bases for a vector space V:  B = {u1, u2} and B' = {u1', u2'}

    If [u1']B = [ a ]  and  [u2']B = [ c ]
                [ b ]                [ d ]

    i.e., u1' = a u1 + b u2  and  u2' = c u1 + d u2

(To represent the basis vectors in B’ by the coordinate matrices relative to B)

(B’ is the original basis and B is the target basis)


4.107

Consider any v ∈ V with [v]B' = [ k1 ]
                                [ k2 ]
(v is expressed as a coordinate matrix relative to the basis B')

    v = k1 u1' + k2 u2'
      = k1(a u1 + b u2) + k2(c u1 + d u2)
      = (k1 a + k2 c) u1 + (k1 b + k2 d) u2

    ⇒ [v]B = [ k1 a + k2 c ] = [ a  c ] [ k1 ] = [ [u1']B  [u2']B ] [v]B'
             [ k1 b + k2 d ]   [ b  d ] [ k2 ]

(v can be expressed as a coordinate matrix relative to the basis B)


4.108

Transition matrix from B' to B:

Let B = {u1, u2, ..., un} and B' = {u1', u2', ..., un'} be two bases for a vector space V

If [v]B is the coordinate matrix of v relative to B and [v]B' is the coordinate matrix of v relative to B', then

    [v]B = P [v]B' ,   where   P = [ [u1']B  [u2']B  ...  [un']B ]

P is called the transition matrix from B' to B ( 從 B' 到 B 的轉移矩陣 ), which is constructed by the coordinate matrices of the ordered vectors in B' relative to B

(the coordinate matrix relative to the target basis B is derived by multiplying the transition matrix P to the left of the coordinate matrix relative to the original basis B')


4.109

Theorem 4.20: The inverse of a transition matrix

If P is the transition matrix from a basis B' to a basis B in Rn,

then

(1) P is invertible

(2) The transition matrix from B to B' is P–1

Pf:
Let B = {u1, u2, ..., un} and B' = {u1', u2', ..., un'}

    P = [ [u1']B  [u2']B  ...  [un']B ] is the transition matrix from B' to B:    [v]B = P [v]B'
    Q = [ [u1]B'  [u2]B'  ...  [un]B' ] is the transition matrix from B to B'
        (derived by interchanging the roles of B and B'):    [v]B' = Q [v]B

Replacing [v]B' in the first equation with the second equation gives [v]B = PQ [v]B for every v, so PQ = I
⇒ P is invertible and P–1 = Q = [ [u1]B'  [u2]B'  ...  [un]B' ]
⇒ P–1 is the transition matrix from B to B'


4.110

Theorem 4.21: Deriving the transition matrix by G.-J. E.

Let B = {u1, u2, ..., un} and B' = {u1', u2', ..., un'} be two bases for Rn. Then the transition matrix P from B' to B can be found by using Gauss-Jordan elimination on the n×2n matrix [B  B'] as follows

    [ B  B' ]  --G.-J. E.-->  [ In  P ]

Similarly, the transition matrix P–1 from B to B' can be found via

    [ B'  B ]  --G.-J. E.-->  [ In  P–1 ]

Construct the matrices B and B' by using the ordered basis vectors as column vectors
Note that the target basis is always on the left; the resulting right half is the transition matrix from the original basis to the target basis
(The next slide uses the case of n = 2 to show why [B  B'] --G.-J. E.--> [In  P] works)


4.111

Consider two bases for a vector space V:  B = {u1, u2} and B' = {u1', u2'}, with

    [u1']B = [ a ]  and  [u2']B = [ c ] ,   i.e.,  u1' = a u1 + b u2  and  u2' = c u1 + d u2
             [ b ]                [ d ]

The transition matrix from B' to B is P = [ a  c ]  (see Slide 4.107),
                                          [ b  d ]
so finding P is equivalent to solving for (a, b) and (c, d)

Writing u1 = (u11, u12), u2 = (u21, u22), u1' = (û11, û12), u2' = (û21, û22) in components, the relation u1' = a u1 + b u2 means

    [ u11  u21 ] [ a ] = [ û11 ] ,  i.e.,  B [ a ] = u1' ,  so  [ B | u1' ]  --G.-J. E.-->  [ 1  0 | a ]
    [ u12  u22 ] [ b ]   [ û12 ]             [ b ]                                          [ 0  1 | b ]

and similarly u2' = c u1 + d u2 gives  [ B | u2' ]  --G.-J. E.-->  [ 1  0 | c ]
                                                                   [ 0  1 | d ]

Performing the two eliminations at once:

    [ B  B' ] = [ u11  u21  û11  û21 ]  --G.-J. E.-->  [ 1  0  a  c ] = [ I2  P ]
                [ u12  u22  û12  û22 ]                 [ 0  1  b  d ]

※ This method is very similar to the method for solving A–1 through G.-J. E., i.e.

    [ A | I ]  --G.-J. E.-->  [ I | A–1 ]


4.112

Ex 5: (Finding a transition matrix)

B = {(–3, 2), (4,–2)} and B' = {(–1, 2), (2,–2)} are two bases for R2

(a) Find the transition matrix from B' to B

(b) Let [v]B' = [ 1 ] , find [v]B
                [ 2 ]

(c) Find the transition matrix from B to B'

    For the original basis B':  B' = [ -1  2 ]        For the target basis B:  B = [ -3  4 ]
                                     [  2 -2 ]                                     [  2 -2 ]


4.113

Sol:
(a)
    [ B  B' ] = [ -3  4  -1  2 ]  --G.-J. E.-->  [ 1  0  3  -2 ] = [ I2  P ]
                [  2 -2   2 -2 ]                 [ 0  1  2  -1 ]

    P = [ 3  -2 ]   (the transition matrix from B' to B)
        [ 2  -1 ]

(b)
    [v]B = P [v]B' = [ 3  -2 ] [ 1 ] = [ -1 ]
                     [ 2  -1 ] [ 2 ]   [  0 ]


4.114

(c)
    [ B'  B ] = [ -1  2  -3  4 ]  --G.-J. E.-->  [ 1  0  -1  2 ] = [ I2  P–1 ]
                [  2 -2   2 -2 ]                 [ 0  1  -2  3 ]

    P–1 = [ -1  2 ]   (the transition matrix from B to B')
          [ -2  3 ]

Check:
    P P–1 = [ 3  -2 ] [ -1  2 ] = [ 1  0 ] = I2
            [ 2  -1 ] [ -2  3 ]   [ 0  1 ]
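The whole of Ex 5 can be reproduced with SymPy's rref, mirroring [B  B'] --G.-J. E.--> [I2  P] (a sketch, not part of the slides):

    from sympy import Matrix

    B  = Matrix([[-3,  4],
                 [ 2, -2]])      # columns: target-basis vectors
    Bp = Matrix([[-1,  2],
                 [ 2, -2]])      # columns: original-basis vectors (B')

    P = B.row_join(Bp).rref()[0][:, 2:]   # right half of [I2 | P]
    print(P)                      # Matrix([[3, -2], [2, -1]])
    print(P * Matrix([1, 2]))     # Matrix([[-1], [0]]) = [v]_B
    print(P.inv())                # Matrix([[-1, 2], [-2, 3]]) = transition matrix from B to B'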


4.115

Ex 6: Coordinate representation in P3(x)

Find the coordinate matrix of p = 3x3 – 2x2 + 4 relative to the

nonstandard basis in P3(x), S = {1, 1 + x, 1 + x2, 1 + x3}

Sol:
Solve p = a(1) + b(1 + x) + c(1 + x2) + d(1 + x3)

    p = 3(1) + 0(1 + x) + (–2)(1 + x2) + 3(1 + x3)

           [  3 ]
    [p]S = [  0 ]
           [ –2 ]
           [  3 ]
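The coefficient matching can be automated; a small SymPy sketch (not part of the slides):

    from sympy import symbols, Poly, linsolve

    x = symbols('x')
    a, b, c, d = symbols('a b c d')
    p = 3*x**3 - 2*x**2 + 4
    combo = a*1 + b*(1 + x) + c*(1 + x**2) + d*(1 + x**3)

    # every coefficient of (combo - p) must vanish
    eqs = Poly(combo - p, x).all_coeffs()
    print(linsolve(eqs, a, b, c, d))    # {(3, 0, -2, 3)} = [p]_S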


4.116

Ex: Coordinate representation in M2x2

Find the coordinate matrix of x = [ 5  6 ] relative to the standard basis in M2x2:
                                  [ 7  8 ]

    B = { [ 1  0 ] , [ 0  1 ] , [ 0  0 ] , [ 0  0 ] }
          [ 0  0 ]   [ 0  0 ]   [ 1  0 ]   [ 0  1 ]

Sol:
    [ 5  6 ] = 5 [ 1  0 ] + 6 [ 0  1 ] + 7 [ 0  0 ] + 8 [ 0  0 ]
    [ 7  8 ]     [ 0  0 ]     [ 0  0 ]     [ 1  0 ]     [ 0  1 ]

           [ 5 ]
    [x]B = [ 6 ]
           [ 7 ]
           [ 8 ]


4.117

Keywords in Section 4.7:

coordinates of x relative to B: x 於基底 B 的座標 coordinate matrix: 座標矩陣 change of basis problem: 基底變換問題 transition matrix from B' to B: 從 B' 到 B 的轉移矩陣


4.118

4.8 Applications of Vector Spaces

Conic Sections ( 二次曲線 ) and Rotation (see Application in

Ch4.pdf)