2.III. Basis and Dimension

1. Basis
2. Dimension
3. Vector Spaces and Linear Systems
4. Combining Subspaces



2.III.1. Basis

Definition 1.1: Basis

A basis of a vector space V is an ordered set of linearly independent (non-zero) vectors that spans V.

Notation: B = ⟨ β1, …, βn ⟩

Example 1.2:

B = ⟨ (2 4)^T, (1 1)^T ⟩ is a basis for R^2.

B is L.I.:

  a (2 4)^T + b (1 1)^T = (0 0)^T

  → 2a + b = 0
    4a + b = 0

  → a = 0, b = 0

B spans R^2:

  a (2 4)^T + b (1 1)^T = (x y)^T

  → 2a + b = x
    4a + b = y

  → a = (y - x)/2
    b = 2x - y

L.I. → Minimal

Span → Complete
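The two checks above can be replayed numerically. This is an illustrative sketch (not part of the original slides) in plain Python with exact fractions; the closed forms a = (y - x)/2 and b = 2x - y come from the elimination in Example 1.2.

```python
from fractions import Fraction

# Basis from Example 1.2: B = <(2 4)^T, (1 1)^T>.
b1 = (Fraction(2), Fraction(4))
b2 = (Fraction(1), Fraction(1))

def coords(x, y):
    """Solve a*b1 + b*b2 = (x, y)^T using the formulas obtained by
    elimination in the example: a = (y - x)/2, b = 2x - y."""
    x, y = Fraction(x), Fraction(y)
    return (y - x) / 2, 2 * x - y

def combine(a, b):
    return (a * b1[0] + b * b2[0], a * b1[1] + b * b2[1])

# Spanning: every target (x, y)^T is reached.
for x, y in [(0, 0), (1, 0), (0, 1), (3, -5)]:
    a, b = coords(x, y)
    assert combine(a, b) == (x, y)

# Linear independence: the zero vector has only the trivial coordinates.
assert coords(0, 0) == (0, 0)
```

Because the coordinates exist for every (x, y) and are unique, the two properties (spanning and independence) hold at once, matching Theorem 1.12 below.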

Example 1.3:

B′ = ⟨ (1 1)^T, (2 4)^T ⟩

is a basis for R^2 that differs from B only in order.

Definition 1.5: Standard / Natural Basis for R^n

  En = ⟨ e1, e2, …, en ⟩ = ⟨ (1 0 … 0)^T, (0 1 … 0)^T, …, (0 0 … 1)^T ⟩

kth component of ei:

  (ei)k = 1 for i = k
          0 for i ≠ k

Example 1.6:

For the function space { a cos θ + b sin θ | a, b ∈ R },

a natural basis is ⟨ cos θ, sin θ ⟩.

Another basis is ⟨ cos θ - sin θ, 2 cos θ + 3 sin θ ⟩.

Proof is straightforward.

Example 1.7:

For the function space of cubic polynomials P3,

a natural basis is ⟨ 1, x, x^2, x^3 ⟩.

Other choices can be ⟨ x^3, 3x^2, 6x, 6 ⟩

or ⟨ 1, 1 + x, 1 + x + x^2, 1 + x + x^2 + x^3 ⟩.

Proof is again straightforward.

Rule: A set of linear combinations of vectors from a L.I. set is itself L.I. provided each combination contains (with nonzero coefficient) a vector that appears in none of the other combinations.

Example 1.8:

The trivial space { 0 } has only one basis, the empty one ⟨ ⟩.

Note: By convention, 0 does not count as a basis vector. ( Any set of vectors containing 0 is linearly dependent. )

Example 1.9: The space of all finite-degree polynomials has a basis with infinitely many elements: ⟨ 1, x, x^2, … ⟩.

Example 1.10: Solution Set of Homogeneous Systems

The solution set of

  x + y     - w = 0
          z + w = 0

is

  { y (-1 1 0 0)^T + w (1 0 -1 1)^T | y, w ∈ R } = Span{ (-1 1 0 0)^T, (1 0 -1 1)^T }

( Proof of L.I. is left as exercise )
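As a quick sanity check (a Python sketch, assuming the system as reconstructed above: x + y - w = 0 and z + w = 0), the two spanning vectors solve the system, and the parameters y, w can be read back off any combination, which gives independence:

```python
# Reconstructed homogeneous system from Example 1.10:
#   x + y - w = 0
#   z + w     = 0
def satisfies(v):
    x, y, z, w = v
    return x + y - w == 0 and z + w == 0

# The two proposed basis vectors of the solution set.
v1 = (-1, 1, 0, 0)   # coefficient vector of the free variable y
v2 = (1, 0, -1, 1)   # coefficient vector of the free variable w
assert satisfies(v1) and satisfies(v2)

# Every combination y*v1 + w*v2 solves the system, and its 2nd and 4th
# components recover the parameters, so v1 and v2 are linearly independent.
for y, w in [(2, 3), (-1, 5)]:
    v = tuple(y * a + w * b for a, b in zip(v1, v2))
    assert satisfies(v)
    assert (v[1], v[3]) == (y, w)
```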

Example 1.11: Matrices

Find a basis for this subspace of M2×2 (rows separated by semicolons):

  S = { [a b; c 0] | a + b - 2c = 0 }

Solution: Setting a = -b + 2c,

  S = { [-b+2c b; c 0] | b, c ∈ R } = { b [-1 1; 0 0] + c [2 0; 1 0] | b, c ∈ R }

∴ Basis is ⟨ [-1 1; 0 0], [2 0; 1 0] ⟩

( Proof of L.I. is left as exercise )

Theorem 1.12: In any vector space, a subset is a basis if and only if each vector in the space can be expressed as a linear combination of elements of the subset in a unique way.

Proof: A basis is by definition spanning

→ every vector can be expressed as a linear combination of the basis vectors.

Let v = Σi ci βi = Σi di βi. Then Σi (ci - di) βi = 0,

and L.I. forces ci = di for all i.

∴ L.I. → uniqueness. (Conversely, unique expressibility gives spanning, and applying uniqueness to the expressions of 0 gives L.I.)

Definition 1.13: Representation wrt a Basis

Let B = ⟨ β1, …, βn ⟩ be a basis of vector space V and

  v = Σ_{i=1}^{n} ci βi

Then the representation of v wrt B is

  Rep_B(v) = (c1, c2, …, cn)^T_B

The cj are called the coordinates (components) of v wrt B.

The subscript B is often omitted.

Example 1.14: In P3, let

  B = ⟨ 1, 2x, 2x^2, 2x^3 ⟩   and   D = ⟨ 1 + x, 1 - x, x + x^2, x + x^3 ⟩

Then

  Rep_B(x + x^2) = (0, 1/2, 1/2, 0)^T_B

  Rep_D(x + x^2) = (0, 0, 1, 0)^T_D
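Finding a representation wrt a basis amounts to solving a linear system whose columns are the basis vectors. The sketch below (plain Python, not from the slides) recomputes Example 1.14 with exact fractions:

```python
from fractions import Fraction

# Bases from Example 1.14, as coefficient lists [c0, c1, c2, c3]
# standing for c0 + c1*x + c2*x^2 + c3*x^3.
B = [[1, 0, 0, 0], [0, 2, 0, 0], [0, 0, 2, 0], [0, 0, 0, 2]]   # <1, 2x, 2x^2, 2x^3>
D = [[1, 1, 0, 0], [1, -1, 0, 0], [0, 1, 1, 0], [0, 1, 0, 1]]  # <1+x, 1-x, x+x^2, x+x^3>

def rep(v, basis):
    """Coordinates of v wrt basis: solve sum_j c_j * basis[j] = v
    by Gauss-Jordan elimination with exact fractions."""
    n = len(basis)
    # Augmented matrix whose columns are the basis vectors, last column v.
    m = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(v[i])]
         for i in range(n)]
    for col in range(n):
        # A pivot always exists because a basis gives an invertible matrix.
        piv = next(r for r in range(col, n) if m[r][col] != 0)
        m[col], m[piv] = m[piv], m[col]
        m[col] = [e / m[col][col] for e in m[col]]
        for r in range(n):
            if r != col and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [m[r][n] for r in range(n)]

# Rep_B(x + x^2) and Rep_D(x + x^2), as in the example:
assert rep([0, 1, 1, 0], B) == [0, Fraction(1, 2), Fraction(1, 2), 0]
assert rep([0, 1, 1, 0], D) == [0, 0, 1, 0]
```

The unique solvability of this system for every v is exactly the content of Theorem 1.12.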

Exercises 2.III.1

1. Find a basis for each.
(a) The subspace { a2 x^2 + a1 x + a0 | a2 - 2a1 = a0 } of P2.
(b) The space of three-wide row vectors whose 1st and 2nd components add to zero.
(c) This subspace of the 2×2 matrices: { [a b; 0 c] | c - 2b = 0 }.

2. A square matrix is symmetric if for all indices i and j, entry i, j equals entry j, i.
(a) Find a basis for the vector space of symmetric 2×2 matrices.
(b) Find a basis for the space of symmetric 3×3 matrices.
(c) Find a basis for the space of symmetric n×n matrices.

3. One of the exercises in the Subspaces subsection shows that the set

  { (x y z)^T | x + y + z = 1 }

is a vector space under these operations:

  (x1 y1 z1)^T + (x2 y2 z2)^T = (x1 + x2 - 1, y1 + y2, z1 + z2)^T

  r (x y z)^T = (rx - r + 1, ry, rz)^T

Find a basis.

2.III.2. Dimension

Definition 2.1 A vector space is finite-dimensional if it has a basis with only finitely many vectors.

To be proved: all bases for a vector space have the same number of elements.

→ Dimension = number of vectors in any basis.

→ A basis is a minimal spanning set and a maximal L.I. set.

Lemma 2.2: Exchange Lemma

Assume that B = ⟨ β1, …, βn ⟩ is a basis for a vector space, and that for the vector v the relationship

  v = c1 β1 + … + cj βj + … + cn βn,  where cj ≠ 0,

holds. Then exchanging βj for v yields another basis for the space.

Proof: See Hefferon p.120.

Theorem 2.3: In any finite-dimensional vector space, all of the bases have the same number of elements.

Proof:

Let B = ⟨ β1, …, βn ⟩ be a basis of n elements, and let D = ⟨ δ1, …, δm ⟩ be any other basis.

Write δ1 = c1 β1 + … + ck βk + … + cn βn with ck ≠ 0.

By Lemma 2.2, B1 = ⟨ β1, …, βk-1, δ1, βk+1, …, βn ⟩ is a basis.

Next, replacing some βj in B1 by δ2 begets

  B2 = ⟨ β1, …, βk-1, δ1, βk+1, …, βj-1, δ2, βj+1, …, βn ⟩

Repeating the process n times results in a basis Bn = ⟨ δ1, …, δn ⟩ that spans V.

If m > n, then we can write

  δ_{n+1} = c1 δ1 + … + cn δn with at least one ck ≠ 0,

which contradicts the assumption that D is L.I. Hence m ≤ n.

Exchanging the roles of B and D gives n ≤ m. Hence m = n.

Definition 2.4: Dimension — The dimension of a vector space is the number of vectors in any of its bases.

Example 2.5: R^n — Any basis for R^n has n vectors, since the standard basis En has n vectors. → R^n is n-D.

Example 2.6: Pn

dim Pn = n + 1, since its natural basis, ⟨ 1, x, x^2, …, x^n ⟩, has n + 1 elements.

Example 2.7: A trivial space is 0-D since its basis is empty.

Comments:

All results in this book are applicable to finite-D vector spaces.

Most of them are also applicable to countably infinite-D vector spaces.

For uncountably infinite-D vector spaces, e.g., Hilbert spaces, convergence must be taken into account.

Corollary 2.8: No L.I. set can have a size greater than the dimension of the enclosing space.

Example 2.9: The only subspaces of R^3 (besides R^3 itself):

2-D: planes thru 0

1-D: lines thru 0

0-D: { 0 }

Corollary 2.10: Any L.I. set can be expanded to make a basis.

Corollary 2.11: Any spanning set can be shrunk to a basis.

Corollary 2.12: In an n-D space, a set of n vectors is L.I. iff it spans the space.

Remark 2.13: The statement "any infinite-dimensional vector space has a basis" is known to be equivalent to a statement called the Axiom of Choice.

Mathematicians differ philosophically on whether to accept or reject this statement as an axiom on which to base mathematics (although the great majority seem to accept it).

Exercises 2.III.2.

1. What is the dimension of the span of the set { cos^2 θ, sin^2 θ, cos 2θ, sin 2θ }? This span is a subspace of the space of all real-valued functions of one real variable.

2. Observe that, where S is a set, the functions f : S → R form a vector space under the natural operations: (f + g)(s) = f(s) + g(s) and (r·f)(s) = r·f(s). What is the dimension of the space resulting for each domain?
(a) S = { 1 }  (b) S = { 1, 2 }  (c) S = { 1, 2, …, n }

3. Prove that if U and W are both three-dimensional subspaces of R^5 then U ∩ W is non-trivial. Generalize.

2.III.3. Vector Spaces and Linear Systems

Definition 3.1: Row Space & Row Rank — The row space of a matrix is the span of the set of its rows. The row rank is the dimension of the row space, i.e., the number of L.I. rows.

Example 3.2:

  A = [ 2  3 ]
      [ 4  6 ]

→ RowSpace(A) = { c1 (2 3) + c2 (4 6) | c1, c2 ∈ R } = { c (2 3) | c ∈ R } = Span{ (2 3) }

Lemma 3.3: Row-equivalent matrices have the same row space and hence the same row rank.

Proof: Let A & B be row-equivalent matrices.

Each row of B is a linear combination of rows of A → RowSpace(B) ⊆ RowSpace(A)

Each row of A is a linear combination of rows of B → RowSpace(A) ⊆ RowSpace(B)

Hence, RowSpace(A) = RowSpace(B)

Lemma 3.4: The nonzero rows of an echelon form matrix make up a L.I. set.

Proof:

This is just a re-statement of Lemma III.2.5, which states that, in an echelon form matrix, no nonzero row is a linear combination of the other rows.

Gaussian reduction ~ Finding a basis for the row space.

Example 3.5:

  [ 1  3  1 ]     [ 1  3  1 ]     [ 1  3  1 ]
  [ 1  4  1 ]  ~  [ 0  1  0 ]  ~  [ 0  1  0 ]
  [ 2  0  5 ]     [ 0 -6  3 ]     [ 0  0  3 ]

Basis for the row space is { (1 3 1), (0 1 0), (0 0 3) }.
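The reduction in Example 3.5 can be sketched in a few lines of Python (an illustration with exact fractions, not Hefferon's code): reduce to echelon form, then keep the nonzero rows as a row-space basis, per Lemma 3.4.

```python
from fractions import Fraction

def echelon(rows):
    """Reduce to (not fully reduced) echelon form with exact arithmetic."""
    m = [[Fraction(e) for e in row] for row in rows]
    lead = 0
    for col in range(len(m[0])):
        piv = next((r for r in range(lead, len(m)) if m[r][col] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        m[lead], m[piv] = m[piv], m[lead]  # swap the pivot row up
        for r in range(lead + 1, len(m)):
            if m[r][col] != 0:
                f = m[r][col] / m[lead][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[lead])]
        lead += 1
    return m

A = [[1, 3, 1],
     [1, 4, 1],
     [2, 0, 5]]
E = echelon(A)
basis = [row for row in E if any(e != 0 for e in row)]
assert basis == [[1, 3, 1], [0, 1, 0], [0, 0, 3]]
```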

Definition 3.6: Column Space & Column Rank — The column space of a matrix is the span of the set of its columns. The column rank is the dimension of the column space, i.e., the number of L.I. columns.

A linear system is equivalent to a vector equation: a linear combination of the columns of the coefficient matrix equals the vector of constants.

   c1 + 3c2 + 7c3 = d1
  2c1 + 3c2 + 8c3 = d2
         c2 + 2c3 = d3
  4c1       + 4c3 = d4

  ~

  c1 (1 2 0 4)^T + c2 (3 3 1 0)^T + c3 (7 8 2 4)^T = (d1 d2 d3 d4)^T

Basis for the column space can be found by applying the Gaussian reduction to the transpose of the corresponding coefficient matrix.

Definition 3.8: Transpose — The transpose of a matrix is the result of interchanging the rows and columns of that matrix, i.e., (A^T)ij = Aji.

Example 3.7: Find a basis for the column space of

  A = [ 1  3  7 ]
      [ 2  3  8 ]
      [ 0  1  2 ]
      [ 4  0  4 ]

  A^T = [ 1  2  0  4 ]     [ 1  0   2/3  -4 ]
        [ 3  3  1  0 ]  ~  [ 0  1  -1/3   4 ]  =  B
        [ 7  8  2  4 ]     [ 0  0    0    0 ]

  B^T = [  1     0    0 ]
        [  0     1    0 ]
        [ 2/3  -1/3   0 ]
        [ -4     4    0 ]

→ Basis is ⟨ (1 0 2/3 -4)^T, (0 1 -1/3 4)^T ⟩
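The transpose-and-reduce recipe of Example 3.7 can be checked mechanically. This is a Python sketch (the `rref` helper is illustrative, not from the slides): transpose, run Gauss-Jordan, and read off the nonzero rows.

```python
from fractions import Fraction

def rref(rows):
    """Gauss-Jordan reduction to reduced echelon form, exact arithmetic."""
    m = [[Fraction(e) for e in row] for row in rows]
    lead = 0
    for col in range(len(m[0])):
        piv = next((r for r in range(lead, len(m)) if m[r][col] != 0), None)
        if piv is None:
            continue
        m[lead], m[piv] = m[piv], m[lead]
        m[lead] = [e / m[lead][col] for e in m[lead]]   # scale pivot to 1
        for r in range(len(m)):
            if r != lead and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[lead])]
        lead += 1
    return m

A = [[1, 3, 7],
     [2, 3, 8],
     [0, 1, 2],
     [4, 0, 4]]
At = [list(col) for col in zip(*A)]    # transpose: columns become rows
B = rref(At)
basis = [row for row in B if any(e != 0 for e in row)]
# Nonzero rows of rref(A^T), read back as columns, give a column-space basis.
assert basis == [[1, 0, Fraction(2, 3), -4], [0, 1, Fraction(-1, 3), 4]]
```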

Example 3.9: Get a basis for S = span{ x^2 + x^4, 2x^2 + 3x^4, -x^2 - 3x^4 }.

Solution: Let T = span{ (0 0 1 0 1), (0 0 2 0 3), (0 0 -1 0 -3) }.

Vector spaces T & S are isomorphic.

  [ 0  0   1  0   1 ]     [ 0  0  1  0  1 ]
  [ 0  0   2  0   3 ]  ~  [ 0  0  0  0  1 ]
  [ 0  0  -1  0  -3 ]     [ 0  0  0  0  0 ]

→ Basis is ⟨ x^2 + x^4, x^4 ⟩

Lemma 3.10: Row operations do not change the column rank.

Proof: This is the rationale for Gaussian reduction: row operations do not affect the linear relationships among the columns.

Example: Reading basis from reduced echelon form

  [ 1  3  1   6 ]     [ 1  3  0  2 ]
  [ 2  6  3  16 ]  ~  [ 0  0  1  4 ]
  [ 1  3  1   6 ]     [ 0  0  0  0 ]

Basis for row space (rows with leading entries):

  { (1 3 0 2), (0 0 1 4) }

Basis for column space (columns containing leading entries):

  { (1 0 0)^T, (0 1 0)^T }

Theorem 3.11: The row rank and column rank of a matrix are equal.

Proof:

Lemmas 3.3 & 3.10 → row rank of matrix = row rank of its echelon form.

In reduced echelon form: Row rank = Number of leading entries = Column rank.

Definition 3.12: Rank — The rank of a matrix is its row rank or, equally, its column rank. → dim(RowSpace) = dim(ColumnSpace)

Theorem 3.13: For linear systems with n unknowns and matrix of coefficients A, the following statements are equivalent:
(1) the rank of A is r
(2) the space of solutions of the associated homogeneous system has dimension n - r

Proof: rank A = r ⇔ the reduced echelon form has r nonzero rows (L.I. equations) ⇔ there are n - r free variables.
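Theorem 3.13 can be illustrated numerically: counting the pivots of an echelon form gives the rank r, and n - r free variables remain. A Python sketch (not from the slides), using the 3×4 matrix from the reading-a-basis example above:

```python
from fractions import Fraction

def rank(rows):
    """Rank = number of nonzero pivot rows after Gaussian reduction
    (Lemma 3.4: those rows are L.I.)."""
    m = [[Fraction(e) for e in row] for row in rows]
    lead = 0
    for col in range(len(m[0])):
        piv = next((r for r in range(lead, len(m)) if m[r][col] != 0), None)
        if piv is None:
            continue
        m[lead], m[piv] = m[piv], m[lead]
        for r in range(lead + 1, len(m)):
            if m[r][col] != 0:
                f = m[r][col] / m[lead][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[lead])]
        lead += 1
    return lead

A = [[1, 3, 1, 6],
     [2, 6, 3, 16],
     [1, 3, 1, 6]]
n = 4                  # number of unknowns
r = rank(A)
assert r == 2
assert n - r == 2      # dimension of the homogeneous solution space
```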

Corollary 3.15: Where the matrix A is n×n, the following statements are equivalent:
(1) the rank of A is n
(2) A is nonsingular
(3) the rows of A form a linearly independent set
(4) the columns of A form a linearly independent set
(5) any linear system with matrix of coefficients A has one and only one solution
Proof: Trivial (see Hefferon p.129).

Exercises 2.III.3.

1. Find a basis for the span of each set.

(a) { (1 3), (-1 3), (1 4), (2 1) } ⊆ M1×2

(b) { (1 2 1)^T, (3 1 -1)^T, (1 -3 -3)^T } ⊆ R^3

(c) { 1 + x, 1 - x^2, 3 + 2x - x^2 } ⊆ P3

(d) { [1 0 1; 3 1 -1], [1 0 3; 2 1 4], [-1 0 5; -1 1 -9] } ⊆ M2×3

2. (a) Show that the following set of column vectors is a subspace of R^3:

  { (d1 d2 d3)^T | the system  3x + 2y + 4z = d1
                                x       -  z = d2
                               2x + 2y + 5z = d3
    has solution(s) for x, y, and z }

(b) Find a basis.

2.III.4. Combining Subspaces

Definition 4.1: Sum of Vector Subspaces — Where W1, …, Wk are subspaces of a vector space, their sum is the span of their union:

  W1 + W2 + … + Wk = span( W1 ∪ W2 ∪ … ∪ Wk )

Example 4.2: In R^3,

  v = (x y z)^T = (x 0 0)^T + (0 y 0)^T + (0 0 z)^T

→ R^3 = x-axis + y-axis + z-axis

Example 4.3: A sum of subspaces can be less than the entire space.

Let L = { a + bx | a, b ∈ R } and C = { cx^3 | c ∈ R } be subspaces of P4.

Then L + C = { a + bx + cx^3 | a, b, c ∈ R } ≠ P4.

Example 4.4: In R^3, a space can be described as a combination of subspaces in more than one way:

  v = (x y z)^T = (x y1 0)^T + (0 y2 z)^T,  where y1 + y2 = y

→ R^3 = xy-plane + yz-plane

This decomposition of v is obviously not unique.

Definition 4.7: The concatenation of the sequences B1 = ⟨ β1,1, …, β1,n1 ⟩, …, Bk = ⟨ βk,1, …, βk,nk ⟩ is their adjoinment:

  B1 ⌢ B2 ⌢ … ⌢ Bk = ⟨ β1,1, …, β1,n1, β2,1, …, βk,nk ⟩

Lemma 4.8:

Let V be a vector space that is the sum of some of its subspaces

V = W1 + … + Wk.

Let B1, …, Bk be any bases for these subspaces.

Then the following are equivalent.

(1) For every v ∈ V, the expression v = w1 + … + wk (with wi ∈ Wi) is unique.

(2) The concatenation B1 ⌢ … ⌢ Bk is a basis for V.

(3) The nonzero members of every set { w1, …, wk } (with wi ∈ Wi) form a linearly independent set — among nonzero vectors from different Wi's, every linear relationship is trivial.

Proof: See Hefferon, p.134.

Definition 4.9: Independent Set of Subspaces — A collection of subspaces { W1, …, Wk } is independent if no nonzero vector from any Wi is a linear combination of vectors from the other subspaces W1, …, Wi-1, Wi+1, …, Wk.

Definition 4.10: (Internal) Direct Sum — A vector space V is the direct sum of its subspaces W1, …, Wk if

V = W1 + … + Wk and { W1, …, Wk } is independent. We write

  V = W1 ⊕ … ⊕ Wk

Example 4.11: R^3 = x-axis ⊕ y-axis ⊕ z-axis.

Example 4.12:

  M2×2 = { [a 0; 0 d] | a, d ∈ R } ⊕ { [0 b; c 0] | b, c ∈ R }

Corollary 4.13: The dimension of a direct sum is the sum of the dimensions of its summands.

Proof. Follows directly from Lemma 4.8

Definition 4.14: Complement Subspaces — When a vector space is the direct sum of two of its subspaces, then they are said to be complements.

Lemma 4.15:

A vector space V is the direct sum of two of its subspaces W1 and W2 iff

  V = W1 + W2  and  W1 ∩ W2 = { 0 }

Caution: this fails if there are more than 2 subspaces. See Example 4.19 below.

Proof (→): V = W1 ⊕ W2 → { W1, W2 } is independent → W1 ∩ W2 = { 0 }

Proof (←): W1 ∩ W2 = { 0 } → { W1, W2 } is independent → V = W1 ⊕ W2

Example 4.16: Direct Sum Decomposition is not Unique

In R^2, the x-axis and the y-axis are complements, i.e., R^2 = x-axis ⊕ y-axis.

So are the lines y = x and y = 2x.

Example 4.17: Complement to a Subspace is not Unique

In the space F = { a cos θ + b sin θ | a, b ∈ R }, the subspaces

W1 = { a cos θ | a ∈ R } and W2 = { b sin θ | b ∈ R } are complements.

Another complement of W1 is W3 = { b cos θ + b sin θ | b ∈ R }.

Example 4.18:

In R^3, the xy-plane and the yz-plane are not complements.

Example 4.19: In R^3, let W1 = x-axis, W2 = y-axis, and

  W3 = { (q q r)^T | q, r ∈ R }

If there are more than two subspaces, then having pairwise trivial intersections is not enough to guarantee unique decomposition.

→ W1 + W2 + W3 = R^3,  with  W1 ∩ W2 = W2 ∩ W3 = W1 ∩ W3 = { 0 }

Decomposition is not unique:

  (x y z)^T = (x 0 0)^T + (0 y 0)^T + (0 0 z)^T

  (x y z)^T = (x-y 0 0)^T + (0 0 0)^T + (y y z)^T

Reason: a vector from W3 can be a L.C. of vectors from W1 & W2:

  (q q 0)^T = (q 0 0)^T + (0 q 0)^T

→ R^3 ≠ W1 ⊕ W2 ⊕ W3
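The failure of uniqueness in Example 4.19 is easy to verify concretely. A Python sketch (the membership tests encode the three subspaces; the sample vector is an illustrative choice):

```python
# Example 4.19: W1 = x-axis, W2 = y-axis, W3 = {(q q r)^T} in R^3.
def in_W1(v):
    return v[1] == 0 and v[2] == 0

def in_W2(v):
    return v[0] == 0 and v[2] == 0

def in_W3(v):
    return v[0] == v[1]

def add3(u, v, w):
    return tuple(a + b + c for a, b, c in zip(u, v, w))

target = (4, 7, 2)
# Two different decompositions of the same vector, as in the example:
d1 = ((4, 0, 0), (0, 7, 0), (0, 0, 2))    # (x,0,0) + (0,y,0) + (0,0,z)
d2 = ((-3, 0, 0), (0, 0, 0), (7, 7, 2))   # (x-y,0,0) + (0,0,0) + (y,y,z)
for w1, w2, w3 in (d1, d2):
    assert in_W1(w1) and in_W2(w2) and in_W3(w3)
    assert add3(w1, w2, w3) == target
assert d1 != d2   # two distinct decompositions: the sum is not direct
```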

Exercises 2.III.4.

1. Let W1, W2 be subspaces of a vector space.

(a) Assume that the set S1 spans W1, and that the set S2 spans W2. Can S1 ∪ S2 span W1 + W2? Must it?

(b) Assume that S1 is a linearly independent subset of W1 and that S2 is a linearly independent subset of W2. Can S1 ∪ S2 be a linearly independent subset of W1 + W2? Must it?

2. The example of the x-axis and the y-axis in R^2 shows that W1 ⊕ W2 = V does not imply that W1 ∪ W2 = V. Can W1 ⊕ W2 = V and W1 ∪ W2 = V happen?

3. Let W1, W2, W3 be subspaces of a vector space. Prove that (W1 ∩ W2) + (W1 ∩ W3) ⊆ W1 ∩ (W2 + W3). Does the inclusion reverse?

4. Let V and W be vector spaces. Use Wikipedia to find out the meanings of their

• Direct sum, V ⊕ W

• Direct product, V × W

• Tensor product, V ⊗ W