
Lecture 12 Orthogonality - 6.1 6.2 6.3


Chapter 6: Orthogonality

6.1 INNER PRODUCT, LENGTH, AND ORTHOGONALITY

© 2012 Pearson Education, Inc.


INNER PRODUCT

If u and v are vectors in Rn, then we regard u and v as n×1 matrices.

The number uᵀv is called the inner product of u and v, and it is written as u ∙ v.

Also known as the dot product; the result is a scalar.


INNER PRODUCT

If u = (u1, u2, …, un) and v = (v1, v2, …, vn), then the inner product of u and v is

u ∙ v = uᵀv = u1v1 + u2v2 + ⋯ + unvn


Inner Product - Example

Find u∙v and v∙u for:

u = (2, −5, −1), v = (3, 2, 3)
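As a quick check (a NumPy sketch, not part of the original slides, assuming the vectors are u = (2, −5, −1) and v = (3, 2, 3) as printed), the inner product can be computed both ways:

```python
import numpy as np

u = np.array([2, -5, -1])
v = np.array([3, 2, 3])

uv = u @ v   # 2*3 + (-5)*2 + (-1)*3 = -7
vu = v @ u   # same value: the inner product is symmetric

print(uv, vu)
```

The two results agree, illustrating property (a) of Theorem 1 below.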



INNER PRODUCT - Properties

Theorem 1: Let u, v, and w be vectors in Rn, and let c

be a scalar. Then

a. u•v = v•u

b. (u + v)•w = u•w + v•w

c. (cu)•v = c(u•v) = u•(cv)

d. u•u ≥ 0, and u•u = 0 if and only if u = 0

Properties (b) and (c) can be combined repeatedly to produce the following useful rule:

(c1u1 + c2u2 + … + cpup)•v = c1(u1•v) + … + cp(up•v)


THE LENGTH OF A VECTOR

Definition: The length (or norm) of v is

||v|| = √(v ∙ v) = √(v1² + v2² + ⋯ + vn²)

||cv|| = |c| ||v||

A unit vector has length 1.

Dividing a nonzero vector by its length gives a unit vector in the same direction; this is called normalizing.


Normalization – Example 1

Example 1: Let v = (1, -2, 2, 0). Find a unit vector

u in the same direction as v.
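Normalizing can be sketched numerically (a NumPy check, not part of the original slides): here ||v|| = √(1 + 4 + 4 + 0) = 3, so u = v/3.

```python
import numpy as np

v = np.array([1.0, -2.0, 2.0, 0.0])
norm_v = np.linalg.norm(v)   # sqrt(1 + 4 + 4 + 0) = 3
u = v / norm_v               # unit vector in the same direction as v
print(u)
```

The resulting u has length 1 by construction.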


Normalization – Example 2

Example 2: Let W be a subspace of R2 spanned by x = (2/3, 1). Find a unit vector z that is a basis for W.


DISTANCE IN Rn

Definition: For u and v in Rn, the distance between u and v, written as dist(u, v), is the length of the vector u − v; i.e.,

dist(u, v) = ||u − v||

Example: Find the distance between u=(7,1) & v=(3,2)
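For this example (a NumPy sketch, not part of the original slides), u − v = (4, −1), so the distance is √17:

```python
import numpy as np

u = np.array([7.0, 1.0])
v = np.array([3.0, 2.0])
dist = np.linalg.norm(u - v)   # ||(4, -1)|| = sqrt(17)
print(dist)
```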


Orthogonal Vectors

Definition: Two vectors u and v are orthogonal if u•v = 0.

Theorem 2 (Pythagorean Theorem): Two vectors u and v are orthogonal if and only if

||u + v||² = ||u||² + ||v||²


Orthogonal Complement

If a vector z is orthogonal to every vector in a

subspace W of Rn, then z is said to be

orthogonal to W.

The set of all vectors z that are orthogonal to W

is called the orthogonal complement of W and

is denoted by W┴ (and read as “W

perpendicular” or simply “W perp”).


ORTHOGONAL COMPLEMENTS

Theorem 3: Let A be an m×n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of Aᵀ:

(Row A)⊥ = Nul A,  (Col A)⊥ = Nul Aᵀ

Proof: Let aᵢ be the ith row of A.

Row(A) = Span{rows of A}

Nul(A) = {x : Ax = 0}

Ax = 0 means aᵢ • x = 0 for every i, i.e., x is orthogonal to each row of A.
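The relationship (Row A)⊥ = Nul A can be illustrated numerically. This is a sketch (not part of the original slides) using a hypothetical 2×3 matrix whose null space happens to be spanned by n = (1, −2, 1):

```python
import numpy as np

# Hypothetical example matrix; its null space is spanned by n = (1, -2, 1)
A = np.array([[1, 2, 3],
              [4, 5, 6]])
n = np.array([1, -2, 1])

An = A @ n                          # zero vector, so n is in Nul(A)
row_dots = [row @ n for row in A]   # n is orthogonal to every row of A
print(An, row_dots)
```

A vector in Nul(A) is automatically orthogonal to each row, hence to all of Row(A).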


ANGLES & Inner Products

u•v = ||u|| ||v|| cos θ


ANGLES & Inner Products - Example

Find the angle between u = (1, 1, 0) and v = (0, 1, 1).
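Here u•v = 1 and ||u|| ||v|| = √2 · √2 = 2, so cos θ = 1/2 and θ = 60°. A NumPy check (not part of the original slides):

```python
import numpy as np

u = np.array([1.0, 1.0, 0.0])
v = np.array([0.0, 1.0, 1.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))  # = 1/2
theta = np.degrees(np.arccos(cos_theta))                       # = 60 degrees
print(theta)
```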


6.2 Orthogonal Sets


ORTHOGONAL SETS

A set of vectors {u1,…,up} in Rn is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, i.e., if

uᵢ • uⱼ = 0 whenever i ≠ j.


Orthogonal Sets - Example

Show that S={u1,u2,u3} is an orthogonal set.

u1 = (3, 1, 1), u2 = (−1, 2, 1), u3 = (−1/2, −2, 7/2)

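Checking the definition means computing all three pairwise dot products. A NumPy sketch (not part of the original slides):

```python
import numpy as np
from itertools import combinations

u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])

# Each pair of distinct vectors must have zero dot product
dots = [a @ b for a, b in combinations([u1, u2, u3], 2)]
print(dots)
```

All three dot products are zero, so S is an orthogonal set.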


Theorem 4 – Orthogonal Set as a Basis

Theorem 4: If S = {u1, …, up} is an

orthogonal set of nonzero vectors in Rn ,

then S is linearly independent and hence is

a basis for the subspace spanned by S.


ORTHOGONAL Basis

Definition: An orthogonal basis for a subspace W

of Rn is a basis for W that is also an orthogonal set.

Theorem 5: Let {u1,…,up} be an orthogonal basis for a subspace W of Rn. For each y in W, the weights in the linear combination

y = c1u1 + … + cpup

are given by

cⱼ = (y ∙ uⱼ)/(uⱼ ∙ uⱼ)   (j = 1, …, p)


Orthogonal Basis – Example 2

Same S as in Example 1:

u1 = (3, 1, 1), u2 = (−1, 2, 1), u3 = (−1/2, −2, 7/2)

Write y = (6, 1, −8) as a linear combination of the vectors in S.
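The weights come straight from Theorem 5. A NumPy sketch (not part of the original slides):

```python
import numpy as np

u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])
y  = np.array([6.0, 1.0, -8.0])

# Theorem 5: c_j = (y . u_j) / (u_j . u_j)
c = [(y @ u) / (u @ u) for u in (u1, u2, u3)]
recon = c[0] * u1 + c[1] * u2 + c[2] * u3   # should reproduce y
print(c)
```

The weights are c1 = 1, c2 = −2, c3 = −2, so y = u1 − 2u2 − 2u3. No row reduction is needed; this is the point of an orthogonal basis.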


AN ORTHOGONAL PROJECTION

Decompose y into two vectors:

One in the direction of u: ŷ = αu

One orthogonal to u: z = y − ŷ


Orthogonal Projections

ŷ is the projection of y onto u.

If L is the subspace spanned by u, ŷ is the projection of y onto L:

ŷ = proj_L y = ((y ∙ u)/(u ∙ u)) u


ORTHOGONAL PROJECTION - Example

Example 1: y = (7, 6), u = (4, 2)

a) Find the orthogonal projection of y onto u.

b) Write y as the sum of two orthogonal vectors,

one in Span {u} and one orthogonal to u.
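The projection formula above gives both pieces directly. A NumPy sketch (not part of the original slides):

```python
import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = ((y @ u) / (u @ u)) * u   # projection of y onto u: (40/20) u = (8, 4)
z = y - y_hat                     # component orthogonal to u: (-1, 2)
print(y_hat, z)
```

Note y_hat • z = 0, confirming the decomposition is orthogonal.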


ORTHOGONAL PROJECTION - Example

y = ŷ + (y − ŷ):

(7, 6) = (8, 4) + (−1, 2)


AN ORTHOGONAL PROJECTION

Check: ŷ•z = 0?

Distance from y to L?

Distance from y to u?


ORTHONORMAL SETS

A set {u1,…,up} is an orthonormal set if it is an orthogonal set of unit vectors (||uᵢ|| = 1 for each i).

If W is the subspace spanned by such a set, then

{u1,…,up} is an orthonormal basis for W, since

the set is automatically linearly independent, by

Theorem 4.

The simplest example of an orthonormal set is the

standard basis {e1,…,en} for Rn.


ORTHONORMAL SETS - Example

Example 2: Show that {v1, v2, v3} is an

orthonormal basis of R3, where

v1 = (3/√11, 1/√11, 1/√11)

v2 = (−1/√6, 2/√6, 1/√6)

v3 = (−1/√66, −4/√66, 7/√66)


ORTHONORMAL SETS

When the vectors in an orthogonal set of

nonzero vectors are normalized to have unit

length, the new vectors will still be orthogonal,

and hence the new set will be an orthonormal

set.


ORTHONORMAL SETS

Theorem 6: An m×n matrix U has orthonormal columns if and only if UᵀU = I.
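Theorem 6 is easy to verify numerically. A NumPy sketch (not part of the original slides) using as columns the orthonormal vectors v1, v2, v3 from Example 2 above:

```python
import numpy as np

# Columns: the orthonormal vectors v1, v2, v3 from the previous example
U = np.column_stack([
    np.array([3.0, 1.0, 1.0]) / np.sqrt(11),
    np.array([-1.0, 2.0, 1.0]) / np.sqrt(6),
    np.array([-1.0, -4.0, 7.0]) / np.sqrt(66),
])

check = U.T @ U   # should be the 3x3 identity, up to rounding
print(check)
```

The (i, j) entry of UᵀU is uᵢ • uⱼ, which is exactly why orthonormal columns force UᵀU = I.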


ORTHONORMAL SETS

Theorem 7: Let U be an m×n matrix with

orthonormal columns, and let x and y be in Rn.

Then

a. ||Ux|| = ||x||

b. (Ux)•(Uy) = x•y

c. (Ux)•(Uy) = 0 iff x•y=0

Properties (a) and (c) say that the linear mapping x ↦ Ux preserves lengths and orthogonality.


Orthonormal Sets - Example

U =
[ 1/√2   2/3 ]
[ 1/√2  −2/3 ]
[  0     1/3 ]

x = (√2, 3)
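A NumPy check of Theorem 7(a) for this example (a sketch, not part of the original slides, assuming the matrix is U = [[1/√2, 2/3], [1/√2, −2/3], [0, 1/3]] and x = (√2, 3) as reconstructed above):

```python
import numpy as np

U = np.array([[1 / np.sqrt(2),  2 / 3],
              [1 / np.sqrt(2), -2 / 3],
              [0.0,             1 / 3]])
x = np.array([np.sqrt(2), 3.0])

Ux = U @ x   # = (3, -1, 1)
print(np.linalg.norm(Ux), np.linalg.norm(x))   # both equal sqrt(11)
```

Even though U maps R2 into R3, the length of x is preserved.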


Orthogonal Matrix

If U is square and U⁻¹ = Uᵀ, then U is called an "orthogonal matrix."

It really should be called an orthonormal matrix: it has orthonormal columns, and orthonormal rows as well.


6.3 Orthogonal Projections


ORTHOGONAL PROJECTIONS

Extend ideas of previous section from R2 to Rn

Given a vector y and a subspace W in Rn, there is a vector ŷ in W such that

ŷ is the unique vector in W for which y − ŷ is orthogonal to W, and

ŷ is the unique vector in W closest to y.


THE ORTHOGONAL DECOMPOSITION THEOREM

Theorem 8: Let W be a subspace of Rn . Then

each y in Rn can be written uniquely in the form

y = ŷ + z

where ŷ is in W and z is in W┴ .

If {u1,…,up} is any orthogonal basis of W, then

ŷ = ((y ∙ u1)/(u1 ∙ u1)) u1 + ⋯ + ((y ∙ up)/(up ∙ up)) up

and

z = y - ŷ


THE ORTHOGONAL DECOMPOSITION THEOREM

The vector ŷ is called the orthogonal projection of y onto W and is often written as proj_W y.


THE ORTHOGONAL DECOMPOSITION THEOREM - Example

Example 1: Write y as the sum of a vector in W =

span{u1,u2}, and a vector orthogonal to W.

u1 = (2, 5, −1), u2 = (−2, 1, 1), y = (1, 2, 3)
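Theorem 8 reduces this to two projection terms, one per basis vector of W. A NumPy sketch (not part of the original slides):

```python
import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y  = np.array([1.0, 2.0, 3.0])

# Theorem 8 with the orthogonal basis {u1, u2} of W
y_hat = ((y @ u1) / (u1 @ u1)) * u1 + ((y @ u2) / (u2 @ u2)) * u2
z = y - y_hat   # lies in W-perp
print(y_hat, z)
```

Here ŷ = (−2/5, 2, 1/5) and z = (7/5, 0, 14/5); z is orthogonal to both u1 and u2, hence to all of W.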


PROPERTIES OF ORTHOGONAL PROJECTIONS

If {u1,…,up} is an orthogonal basis for W and if y

happens to be in W, then the formula for projWy

is exactly the same as the representation of y

given in Theorem 5 in Section 6.2.

If y is in W = Span{u1,…,up}, then proj_W y = y.


THE BEST APPROXIMATION THEOREM

Theorem 9: Let W be a subspace of Rn, let y be any

vector in Rn, and let ŷ be the orthogonal projection of

y onto W. Then ŷ is the closest point in W to y, in the

sense that

||y – ŷ|| < ||y – v||

for all v in W distinct from ŷ.


The Best Approximation Theorem

ŷ in Theorem 9 is called the best

approximation to y by elements of W.

The distance from y to v, given by ||y – v||,

can be regarded as the “error” of using v in

place of y.

Theorem 9 says that this error is minimized

when v=ŷ

Note: the projection does not depend on which orthogonal basis of W is used.


Best Approximation - Example

Example 2: Find the distance from y to W = Span

{u1,u2} for:

u1 = (5, −2, 1), u2 = (1, 2, −1), y = (−1, −5, 10)
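By the Best Approximation Theorem, the distance from y to W is ||y − ŷ||. A NumPy sketch (not part of the original slides):

```python
import numpy as np

u1 = np.array([5.0, -2.0, 1.0])
u2 = np.array([1.0, 2.0, -1.0])
y  = np.array([-1.0, -5.0, 10.0])

# Project y onto W = Span{u1, u2} using the orthogonal basis
y_hat = ((y @ u1) / (u1 @ u1)) * u1 + ((y @ u2) / (u2 @ u2)) * u2
dist = np.linalg.norm(y - y_hat)   # distance from y to W
print(y_hat, dist)
```

Here ŷ = (−1, −8, 4) and y − ŷ = (0, 3, 6), so the distance is √45 = 3√5.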