Linear Algebra Ch04: Orthogonality
8/19/2019
National United University, Department of Electrical Engineering, Taiwan
Chapter 4
Orthogonality
Jen-Chieh Liu
Outline
Length and Dot Product of Vectors
Orthogonality of the Four Subspaces
Projections
Least Squares Approximations
Orthogonal Bases and Gram-Schmidt
Length and Dot Product in Rⁿ
Length of a vector v = (v₁, v₂, …, vₙ) in Rⁿ is ‖v‖ = √(v₁² + v₂² + … + vₙ²).
Properties of the dot product :
Length of a Vector in Rⁿ
In R⁵, the length of v = (0, −2, 1, 4, −2) is ‖v‖ = √(0 + 4 + 1 + 16 + 4) = √25 = 5.
In R³, the length of v = (2/√17, −2/√17, 3/√17) is ‖v‖ = √(4/17 + 4/17 + 9/17) = 1 (a unit vector).
Orthogonal (1/3)
Def : Dot product (inner product) over Rⁿ:
v · w = vᵀw = [v₁ v₂ … vₙ] [w₁ ; w₂ ; … ; wₙ] = v₁w₁ + v₂w₂ + … + vₙwₙ (a real number)
Length of a vector :
‖v‖ = √(vᵀv) = √(v₁² + v₂² + … + vₙ²) (norm)
Def : Two vectors v and w are orthogonal if vᵀw = 0.
Orthogonal (2/3)
In R², are v = (1, 1) and w = (1, −1) orthogonal?
Sol: vᵀw = (1)(1) + (1)(−1) = 0, so v and w are orthogonal.
Remark :
Orthogonal (3/3)
Remark : vᵀw = 0 ⇔ ‖v + w‖² = ‖v‖² + ‖w‖²
Proof
Def : Two subspaces V and W are orthogonal if vᵀw = 0 for all v ∈ V and all w ∈ W.
Find Dot Products
u = (2, −2), v = (5, 8), w = (−4, 3)
Find each of the following:
(a) u · v ; (b) (u · v)w ; (c) u · (2v) ;
(d) ‖w‖² ; (e) u · (v − 2w)
Sol: (a) u · v = (2)(5) + (−2)(8) = −6
(b) (u · v)w = −6(−4, 3) = (24, −18)
(c) u · (2v) = 2(u · v) = −12
(d) ‖w‖² = w · w = 16 + 9 = 25
(e) v − 2w = (13, 2), so u · (v − 2w) = 26 − 4 = 22
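These dot products can be checked numerically; a minimal NumPy sketch with the vectors from the example:

```python
import numpy as np

u = np.array([2, -2])
v = np.array([5, 8])
w = np.array([-4, 3])

a = u @ v              # (a) u . v = 2*5 + (-2)*8 = -6
b = (u @ v) * w        # (b) (u . v) w = -6 * (-4, 3) = (24, -18)
c = u @ (2 * v)        # (c) u . (2v) = 2 (u . v) = -12
d = w @ w              # (d) ||w||^2 = 16 + 9 = 25
e = u @ (v - 2 * w)    # (e) u . (v - 2w) = (2, -2) . (13, 2) = 22
```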
Nullspace and Left Nullspace
Ax = 0 ⇔
[ row 1 ; row 2 ; … ; row m ] x = [ 0 ; 0 ; … ; 0 ] ⇔
(1) (row 1) · x = 0
(2) (row 2) · x = 0
⋮
(m) (row m) · x = 0
∴ The nullspace N(A) and the row space C(Aᵀ) are orthogonal subspaces of Rⁿ.
Similarly, Aᵀy = 0 ⇔
(1) (column 1)ᵀ y = 0
(2) (column 2)ᵀ y = 0
⋮
(n) (column n)ᵀ y = 0
∴ The left nullspace N(Aᵀ) and the column space C(A) are orthogonal subspaces of Rᵐ.
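A small numerical check of this orthogonality; the matrix below is an arbitrary rank-1 example, not from the slides:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])      # rank 1, so N(A) is 2-dimensional

# Two independent solutions of Ax = 0, spanning the nullspace:
x1 = np.array([2., -1., 0.])
x2 = np.array([3., 0., -1.])

# Every row of A (the row space) is orthogonal to every nullspace vector
dots = [row @ x for row in A for x in (x1, x2)]
```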
Orthogonal Complement (1/3)
Def : The orthogonal complement V⊥ of a subspace V contains every vector that is orthogonal to V, i.e.,
V⊥ = { w : wᵀv = 0 for all v ∈ V }
Remark : V⊥ is also a subspace.
(1) N(A) ⊥ C(Aᵀ) ; (2) N(Aᵀ) ⊥ C(A)
Claim : (C(Aᵀ))⊥ = N(A)
x ∈ (C(Aᵀ))⊥ ⇔ x is orthogonal to every row of A ⇔ Ax = 0 ⇔ x ∈ N(A)
Orthogonal Complement (2/3)
Claim : ( N(A))⊥ = C(AT )
Proof
Orthogonal Complement (3/3)
(C(Aᵀ))⊥ = N(A), (N(A))⊥ = C(Aᵀ)
Similarly, with Aᵀ in place of A: (C(A))⊥ = N(Aᵀ), (N(Aᵀ))⊥ = C(A)
Fundamental theorem of linear algebra Part II.
Bases from Subspace (1/3)
Claim : v₁, v₂, …, vₙ are independent in Rⁿ if and only if v₁, v₂, …, vₙ span Rⁿ.
Proof
Bases from Subspace (2/3)
Claim : If V ⊥ W, then V ∩ W = {0}.
Proof
Claim : For any vector x ∈ Rⁿ, we can have x = x_r + x_n, where x_r ∈ C(Aᵀ) and x_n ∈ N(A).
Proof
Bases from Subspace (3/3)
Orthogonality of the Four Subspaces (1/2)
Remark : The decomposition x = x_r + x_n is unique.
Remark : For every x ∈ Rᵐ, x = x_c + x_ln, where x_c ∈ C(A) (column space) and x_ln ∈ N(Aᵀ) (left nullspace).
Orthogonality of the Four Subspaces (2/2)
Bases from Subspace
Claim : Every b in the column space comes from one and only one vector x_r in the row space.
Proof
For example, with
A = [1 2 ; 3 6] and x = (4, 3),
we can have x = x_r + x_n = (2, 4) + (2, −1), with x_r in the row space and x_n in the nullspace.
Projections (1/2)
What are the projections of b = (2, 3, 4) onto the z axis and the x-y plane?
Sol: onto the z axis, p₁ = (0, 0, 4); onto the x-y plane, p₂ = (2, 3, 0). Note that p₁ + p₂ = b.
Similarly, for b = (1, 2, 5): p₁ = (0, 0, 5) and p₂ = (1, 2, 0).
Projections (2/2)
Projections onto a Line (1/2)
We want to find the projection p of b onto the line L in the direction of a, with error e = b − p, where
b = (b₁, b₂, …, b_m) and a = (a₁, a₂, …, a_m).
Projections onto a Line (2/2)
Projection and Error Matrix (1/3)
b = (1, 1, 1) projects onto a = (1, 2, 2).
x̂ = aᵀb / aᵀa = 5/9, so the projection is p = x̂ a = (5/9, 10/9, 10/9).
Projection matrix P = a aᵀ / (aᵀa) = (1/9) [1 2 2 ; 2 4 4 ; 2 4 4]
Projection and Error Matrix (2/3)
Check: e = b − p = (4/9, −1/9, −1/9) and aᵀe = 4/9 − 2/9 − 2/9 = 0.
Note : b = p + e, with p on the line L through a and e perpendicular to it.
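The numbers in this example can be reproduced with a short NumPy sketch:

```python
import numpy as np

a = np.array([1., 2., 2.])
b = np.array([1., 1., 1.])

x_hat = (a @ b) / (a @ a)          # 5/9
p = x_hat * a                      # projection (5/9, 10/9, 10/9)
e = b - p                          # error (4/9, -1/9, -1/9)
P = np.outer(a, a) / (a @ a)       # projection matrix a a^T / (a^T a)
```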
Projection and Error Matrix (3/3)
In general, the projection matrix onto the line through a is P = a aᵀ / (aᵀa).
Projection onto a subspace (1/3)
Start with n independent vectors a₁, a₂, …, aₙ in Rᵐ. We want to find
p = x̂₁a₁ + x̂₂a₂ + … + x̂ₙaₙ
as a projection of a given vector b.
Let A = [a₁ a₂ … aₙ]. Then
p = [a₁ a₂ … aₙ] [x̂₁ ; x̂₂ ; … ; x̂ₙ] = A x̂
The error vector e = b − A x̂ should be orthogonal to the subspace spanned by a₁, a₂, …, aₙ (the column space of A), which gives the normal equations Aᵀ(b − A x̂) = 0, i.e. AᵀA x̂ = Aᵀb.
Projection onto a subspace (2/3)
The matrix AᵀA is square and symmetric, and it is invertible
iff a₁, a₂, …, aₙ are independent (to be proved later).
Projection onto a subspace (3/3)
Find the Projection Vector and Matrix (1/2)
A = [1 0 ; 1 1 ; 1 2], b = (6, 0, 0); b projects onto C(A).
Sol: AᵀA = [3 3 ; 3 5], Aᵀb = (6, 0). Solving AᵀA x̂ = Aᵀb gives x̂ = (5, −3), so
p = A x̂ = (5, 2, −1), and the projection matrix is P = A(AᵀA)⁻¹Aᵀ.
Find the Projection Vector and Matrix (2/2)
Check : e = b − p = (1, −2, 1) and Aᵀe = 0.
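A numerical sketch of this example (NumPy, solving the normal equations AᵀA x̂ = Aᵀb):

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([6., 0., 0.])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations
p = A @ x_hat                               # projection of b onto C(A)
e = b - p                                   # error, orthogonal to C(A)
P = A @ np.linalg.inv(A.T @ A) @ A.T        # projection matrix
```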
Rank for Projection (1/2)
Claim : rank(AᵀA) = rank(A) = rank(Aᵀ)
Proof
(A is m×n, so AᵀA is n×n and AAᵀ is m×m)
Rank for Projection (2/2)
Therefore N(AᵀA) = N(A), which implies n − rank(AᵀA) = n − rank(A), i.e. rank(AᵀA) = rank(A).
Similarly, putting Aᵀ in place of A, we can get
rank(AAᵀ) = rank(Aᵀ) = rank(A)
Remark : AᵀA is invertible iff A has independent columns
Proof
Least Squares Approximations
It often happens that Ax = b has no solution.
We cannot always get the error e = b − Ax down to zero.
In this case, we may want the length of e, i.e. ‖e‖ or ‖e‖², as small
as possible: the least squares solution x̂.
Find the best line to the points (0, 6), (1, 0) and (2, 0).
Sol:
By Geometry
Sol: Project b = (6, 0, 0) onto the column space of A = [1 0 ; 1 1 ; 1 2]; the projection is p = (5, 2, −1), so the best line is b = 5 − 3t with errors e = (1, −2, 1).
By Linear Algebra (1/2)
Sol: The normal equations AᵀA x̂ = Aᵀb give x̂ = (C, D) = (5, −3), so the best line is b = 5 − 3t.
By Linear Algebra (2/2)
By Calculus (1/2)
Sol: Minimize E = ‖Ax − b‖² = (C − 6)² + (C + D)² + (C + 2D)².
∂E/∂C = 2(C − 6) + 2(C + D) + 2(C + 2D) = 0
∂E/∂D = 2(C + D) + 2(C + 2D)(2) = 0
These are the same normal equations, giving C = 5, D = −3.
By Calculus (2/2)
Least Squares Approximations - Best Line (1/3)
Find the best line to the points (1, 1), (2, 2) and (3, 2)
Sol:
(figure: the points (1, 1), (2, 2), (3, 2) and the line y = C + Dt, with projections p₁, p₂ and errors e)
With A = [1 1 ; 1 2 ; 1 3] and b = (1, 2, 2), the normal equations give C = 2/3, D = 1/2, so the best line is y = 2/3 + t/2.
Least Squares Approximations - Best Line (2/3)
Least Squares Approximations - Best Line (3/3)
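The best line for these three points can be sketched numerically (same normal equations as above):

```python
import numpy as np

# points (1, 1), (2, 2), (3, 2); line y = C + D t
t = np.array([1., 2., 3.])
b = np.array([1., 2., 2.])
A = np.column_stack([np.ones_like(t), t])

C, D = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x = A^T b
```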
Fitting a straight Line (1/2)
Fit heights b₁, b₂, …, b_m at times t₁, t₂, …, t_m by a
straight line C + Dt.
Sol:
C + Dt₁ = b₁
C + Dt₂ = b₂
⋮
C + Dt_m = b_m
⇒ [1 t₁ ; 1 t₂ ; ⋮ ; 1 t_m] [C ; D] = [b₁ ; b₂ ; ⋮ ; b_m]
This is not solvable. Best fit : AᵀA x̂ = Aᵀb
AᵀA = [1 1 ⋯ 1 ; t₁ t₂ ⋯ t_m] [1 t₁ ; 1 t₂ ; ⋮ ; 1 t_m] = [m Σtᵢ ; Σtᵢ Σtᵢ²]
Fitting a straight Line (2/2)
Aᵀb = [1 1 ⋯ 1 ; t₁ t₂ ⋯ t_m] [b₁ ; b₂ ; ⋮ ; b_m] = [Σbᵢ ; Σtᵢbᵢ]
Solve :
[m Σtᵢ ; Σtᵢ Σtᵢ²] [C ; D] = [Σbᵢ ; Σtᵢbᵢ] for C and D.
The best x̂ = (C, D) minimizes
‖A x̂ − b‖² = ‖e‖² = Σᵢ₌₁ᵐ (C + Dtᵢ − bᵢ)² = min
Fitting a Parabola
Fit the points by a parabola C + Dt + Et²:
C + Dt₁ + Et₁² = b₁
C + Dt₂ + Et₂² = b₂
⋮
C + Dt_m + Et_m² = b_m
A = [1 t₁ t₁² ; 1 t₂ t₂² ; ⋮ ; 1 t_m t_m²], x = [C ; D ; E], b = [b₁ ; b₂ ; ⋮ ; b_m]
Solve AᵀA x̂ = Aᵀb for x̂.
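The same normal-equations recipe fits the parabola; a sketch with made-up sample points (the slide gives no specific data, so the values below are hypothetical):

```python
import numpy as np

# hypothetical data lying exactly on b = 1 + 2 t^2
t = np.array([0., 1., 2., 3.])
b = np.array([1., 3., 9., 19.])

A = np.column_stack([np.ones_like(t), t, t**2])   # columns 1, t, t^2
C, D, E = np.linalg.solve(A.T @ A, A.T @ b)       # A^T A x = A^T b
```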
Orthogonal Bases and Gram-Schmidt
Def : The vectors q₁, q₂, …, qₙ are orthogonal if qᵢᵀqⱼ = 0 whenever i ≠ j.
Claim : If nonzero vectors q₁, q₂, …, qₙ are orthogonal, then they are independent.
Proof
Orthogonal Matrix
Def : The vectors q₁, q₂, …, qₙ are orthonormal if
qᵢᵀqⱼ = 0 if i ≠ j, and qᵢᵀqⱼ = 1 if i = j.
A matrix Q (m×n) with orthonormal columns satisfies
QᵀQ = [q₁ᵀ ; q₂ᵀ ; ⋮ ; qₙᵀ] [q₁ q₂ ⋯ qₙ] = I
When Q is a square matrix, QᵀQ = I means that Qᵀ = Q⁻¹. In this case, Q is called an orthogonal matrix.
Angle Between Two Vectors
cos θ = (u · v) / (‖u‖ ‖v‖), 0 ≤ θ ≤ π
Sol:
Rotation Matrix
Sol:
Q = [cos θ −sin θ ; sin θ cos θ]
Reflection Matrix
Q₁ = [−1 0 ; 0 1] reflects (x, y) to (−x, y).
Q₂ = [0 1 ; 1 0] reflects (x, y) to (y, x) (across the 45° line).
Orthonormal Columns
Claim : If Q has orthonormal columns, i.e. QᵀQ = I, then
(i) ‖Qx‖ = ‖x‖
(ii) (Qx)ᵀ(Qy) = xᵀy
Proof
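Both claims can be spot-checked numerically; a sketch using the rotation matrix from the earlier slide (the angle and the vectors are arbitrary choices, not from the slides):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthonormal columns

x = np.array([3., 4.])
y = np.array([-1., 2.])

len_preserved = np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # claim (i)
dot_preserved = np.isclose((Q @ x) @ (Q @ y), x @ y)                  # claim (ii)
```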
Projection Using Orthonormal (1/3)
Bases : Suppose the basis vectors are orthonormal:
Q = [q₁, q₂, …, qₙ] with QᵀQ = I
The least squares solution of Qx = b:
QᵀQ x̂ = Qᵀb ⇒ I x̂ = Qᵀb ⇒ x̂ = Qᵀb
The projection vector
p = Q x̂ = QQᵀ b = [q₁ q₂ ⋯ qₙ] [q₁ᵀ ; q₂ᵀ ; ⋮ ; qₙᵀ] b
Projection Using Orthonormal (2/3)
p = [q₁ q₂ ⋯ qₙ] [q₁ᵀb ; q₂ᵀb ; ⋮ ; qₙᵀb] (each qᵢᵀb is a real number)
= q₁(q₁ᵀb) + q₂(q₂ᵀb) + … + qₙ(qₙᵀb)
Compare with the projection onto a line:
p = a (aᵀb)/(aᵀa) = (a/‖a‖)(aᵀb/‖a‖) = q(qᵀb)
and qᵀb = ‖q‖ ‖b‖ cos θ = ‖b‖ cos θ
(figure: b, its projection p, and the angle θ to a)
Projection Using Orthonormal (3/3)
When q₁, q₂, …, qₙ are orthonormal:
p = q₁(q₁ᵀb) + q₂(q₂ᵀb) + … + qₙ(qₙᵀb)
The projection matrix P = QQᵀ.
When Q is a square matrix (m = n), the subspace C(Q) is the whole vector space Rⁿ and Qᵀ = Q⁻¹, so x̂ = Qᵀb = Q⁻¹b is the exact solution to Qx = b.
In this case, P = QQᵀ = QQ⁻¹ = I and the projection of b is b itself, i.e., p = b. Therefore,
b = q₁(q₁ᵀb) + q₂(q₂ᵀb) + … + qₙ(qₙᵀb)
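This expansion of b in an orthonormal basis can be sketched numerically (the basis below is an arbitrary rotation of the standard basis of R², not from the slides):

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 0.3 rad
q1 = np.array([np.cos(0.3), np.sin(0.3)])
q2 = np.array([-np.sin(0.3), np.cos(0.3)])
b = np.array([2., 5.])

# b = q1 (q1^T b) + q2 (q2^T b)
b_rebuilt = q1 * (q1 @ b) + q2 * (q2 @ b)
```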
The Gram-Schmidt (Orthogonalization) Process (1/2)
Given n independent vectors a₁, a₂, …, aₙ. We want to
find n orthonormal vectors q₁, q₂, …, qₙ with the same span.
1. A₁ = a₁, then q₁ = A₁/‖A₁‖
2. A₂ = a₂ − (q₁ᵀa₂)q₁, then q₂ = A₂/‖A₂‖
(figure: a₂, its projection p onto q₁, and A₂)
The Gram-Schmidt (Orthogonalization) Process (2/2)
3. A₃ = a₃ − (q₁ᵀa₃)q₁ − (q₂ᵀa₃)q₂, then q₃ = A₃/‖A₃‖
In general, Aᵢ = aᵢ − Σⱼ₌₁ⁱ⁻¹ (qⱼᵀaᵢ)qⱼ for i = 1, 2, …, n
(figure: a₃, q₁, q₂ and A₃)
q₁ and q₂ span the same plane as a₁ and a₂, so A₃ is
orthogonal to that plane: it is orthogonal to both q₁ and q₂.
Independent Non-orthogonal Vectors (1/2)
a₁ = (1, −1, 0), a₂ = (2, 0, −2), a₃ = (3, −3, 3)
Sol:
A₁ = a₁ = (1, −1, 0), q₁ = (1, −1, 0)/√2
A₂ = a₂ − (q₁ᵀa₂)q₁ = (2, 0, −2) − (1, −1, 0) = (1, 1, −2), q₂ = (1, 1, −2)/√6
Independent Non-orthogonal Vectors (2/2)
A₃ = a₃ − (q₁ᵀa₃)q₁ − (q₂ᵀa₃)q₂ = (3, −3, 3) − (3, −3, 0) − (−1, −1, 2) = (1, 1, 1), q₃ = (1, 1, 1)/√3
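The general formula above can be implemented directly; a minimal sketch (classical Gram-Schmidt, applied to the example vectors as reconstructed):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: A_i = a_i - sum_j (q_j^T a_i) q_j, then q_i = A_i / ||A_i||."""
    qs = []
    for a in vectors:
        A = np.array(a, dtype=float)
        for q in qs:
            A -= (q @ a) * q        # subtract the projection onto each earlier q
        qs.append(A / np.linalg.norm(A))
    return qs

a1 = np.array([1., -1., 0.])
a2 = np.array([2., 0., -2.])
a3 = np.array([3., -3., 3.])
q1, q2, q3 = gram_schmidt([a1, a2, a3])
```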
The Factorization A = QR (1/4)
Given independent vectors a₁, a₂, a₃,
Gram-Schmidt constructs q₁, q₂, q₃:
1. a₁, A₁ and q₁ span the same subspace.
2. a₁, a₂ and A₁, A₂ and q₁, q₂ span the same subspace.
3. a₁, a₂, a₃ and A₁, A₂, A₃ and q₁, q₂, q₃ span the same subspace.
Therefore, we can have A = QR.
The Factorization A = QR (2/4)
A = [a₁, a₂, a₃], Q = [q₁, q₂, q₃]
[a₁, a₂, a₃] = [q₁, q₂, q₃] [q₁ᵀa₁ q₁ᵀa₂ q₁ᵀa₃ ; 0 q₂ᵀa₂ q₂ᵀa₃ ; 0 0 q₃ᵀa₃]
where the last factor is R (upper triangular).
In general, from independent vectors a₁, a₂, …, aₙ,
Gram-Schmidt constructs q₁, q₂, …, qₙ and we can have A = QR:
[a₁, a₂, …, aₙ] = [q₁, q₂, …, qₙ] R
The Factorization A = QR (3/4)
A = QR ⇒ QᵀA = QᵀQR = IR = R
A₁ = a₁ = ‖A₁‖ q₁ ⇒ q₁ᵀa₁ = ‖A₁‖ > 0
A₂ = a₂ − (q₁ᵀa₂)q₁ ⇒ a₂ = (q₁ᵀa₂)q₁ + ‖A₂‖ q₂ ∴ q₂ᵀa₂ = ‖A₂‖ > 0
A₃ = a₃ − (q₁ᵀa₃)q₁ − (q₂ᵀa₃)q₂ ⇒ a₃ = (q₁ᵀa₃)q₁ + (q₂ᵀa₃)q₂ + ‖A₃‖ q₃ ∴ q₃ᵀa₃ = ‖A₃‖ > 0
Therefore, the diagonal elements of R are
q₁ᵀa₁ = ‖A₁‖, q₂ᵀa₂ = ‖A₂‖, …, qₙᵀaₙ = ‖Aₙ‖
The Factorization A = QR (4/4)
R is upper-triangular with positive diagonal elements, so
R is invertible. (n nonzero pivots)
The least squares solution to Ax = b is the x̂ satisfying AᵀA x̂ = Aᵀb; with A = QR this becomes RᵀR x̂ = RᵀQᵀb, i.e. R x̂ = Qᵀb.
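With the example vectors from the Gram-Schmidt slides as columns, NumPy's QR factorization reproduces this (np.linalg.qr may return negative diagonal entries in R, so the signs are normalized here to match the positive-diagonal convention above):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [-1., 0., -3.],
              [0., -2., 3.]])     # columns a1, a2, a3 from the example

Q, R = np.linalg.qr(A)
signs = np.sign(np.diag(R))      # flip signs so that diag(R) > 0
Q, R = Q * signs, (R.T * signs).T
```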
Analysis of a Network (1/2)
Set up a system of linear equations to represent the network.
Then solve the system.
Sol:
(figure: a network with external flows 20, 10 and 10 and unknown branch flows x₁, x₂, x₃, x₄, x₅)
Analysis of a Network (2/2)
Analysis of an Electrical Network
Determine the currents I₁, I₂ and I₃ for the electrical network.
Sol:
Forming Uncoded Row Matrices
Write the uncoded row matrices of size 1×3 for the message
“MEET ME MONDAY”.
Sol:
[13 5 5], [20 0 13], [5 0 13], [15 14 4], [1 25 0]
M E E | T _ M | E _ M | O N D | A Y _
Encoding a Message
Use the following invertible matrix
Sol:
A = [ 1 −2 2 ; −1 1 3 ; 1 −1 −4 ]
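A sketch of the encoding step (multiplying each uncoded 1×3 row matrix by A on the right):

```python
import numpy as np

A = np.array([[1, -2, 2],
              [-1, 1, 3],
              [1, -1, -4]])

uncoded = np.array([[13, 5, 5],    # M E E
                    [20, 0, 13],   # T _ M
                    [5, 0, 13],    # E _ M
                    [15, 14, 4],   # O N D
                    [1, 25, 0]])   # A Y _

coded = uncoded @ A                # each 1x3 row matrix times A
```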
Decoding a Message
Use the inverse matrix A⁻¹ to decode the message.
Sol: Row-reduce [A  I] to [I  A⁻¹]:
[A  I] = [ 1 −2 2 | 1 0 0 ; −1 1 3 | 0 1 0 ; 1 −1 −4 | 0 0 1 ]
→ [I  A⁻¹] with A⁻¹ = [ −1 −10 −8 ; −1 −6 −5 ; 0 −1 −1 ]
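A sketch of the decoding step, using NumPy's inverse in place of hand row-reduction; the hand-computed A⁻¹ above can be checked the same way:

```python
import numpy as np

A = np.array([[1., -2., 2.],
              [-1., 1., 3.],
              [1., -1., -4.]])
A_inv = np.linalg.inv(A)

# Decoding a coded row recovers the original numbers, e.g. [13 -26 21] -> M E E
coded = np.array([13., -26., 21.])
decoded = coded @ A_inv
```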