Here V is a vector space over some field F and B a finite subset of V.
I will try this off the top of my head; I do not have a linear algebra book handy and I am putting together this statement from random facts, so be wary!
TFAE (the following are equivalent):
1. B is a basis for V (i.e., B spans V and B is linearly independent).
2. Every vector in V can be written as a unique linear combination of elements of B.
3. B is a maximal linearly independent subset of V.
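Before the proofs, here is a small numerical sanity check (not part of the argument), using a hypothetical example over F = R: take B = {(1, 0), (1, 1)} in R^2, stacked as the columns of a matrix. A nonzero determinant confirms the columns are linearly independent and span R^2, so B is a basis in the sense of statement 1.

```python
import numpy as np

# Hypothetical example over F = R: B = {(1, 0), (1, 1)} as columns of a matrix.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A square matrix with nonzero determinant has linearly independent columns
# that span R^2, so its columns form a basis (statement 1).
print(np.linalg.det(B) != 0)  # True: B is a basis of R^2
```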
1 => 2
Say 1 holds. Since B spans V, we can write every vector in V as a linear combination of elements of B with coefficients coming from F. Take x in V and suppose we have two such representations, using the same list v_1, ..., v_n in B (pad either one with zero coefficients if necessary):
x = a_1v_1 + ... + a_nv_n
x = b_1v_1 + ... + b_nv_n.
Then 0 = x - x = (a_1 - b_1)v_1 + ... + (a_n - b_n)v_n. The independence of B implies a_i - b_i = 0, so a_i = b_i for all i. This shows uniqueness.
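For the same hypothetical B in R^2, statement 2 can be illustrated numerically: since the matrix with the basis vectors as columns is invertible, np.linalg.solve returns the one and only coefficient vector for any given x. This is just an illustration of the uniqueness argument, not a replacement for it.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns are the basis vectors
x = np.array([3.0, 2.0])     # an arbitrary vector in R^2

# Because B is invertible, the system B @ a = x has exactly one solution,
# i.e. x has a unique representation as a combination of the columns.
a = np.linalg.solve(B, x)
print(a)                      # the unique coefficients, here [1. 2.]
print(np.allclose(B @ a, x))  # True: the coefficients reconstruct x
```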
2 => 1
Assume 2 holds. In particular, this means B spans V. Now suppose a_1v_1 + ... + a_nv_n = 0 for some a_i in F, v_i in B. Now 0 is in V, so 0 has a unique representation as a linear combination of elements of B with coefficients in F. But 0 = 0*v_1 + ... + 0*v_n, so uniqueness implies a_i = 0 for all i, and we get independence.
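The step from 2 to 1 hinges on the zero vector having only the all-zero representation. For the hypothetical B above, this is the statement that the homogeneous system B @ a = 0 has only the trivial solution, which is exactly what full column rank means; the check below is only an illustration.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Full column rank <=> the only solution of B @ a = 0 is a = 0, i.e. the only
# way to write 0 as a combination of the columns is with all coefficients 0,
# which is the independence argument above.
print(np.linalg.matrix_rank(B) == B.shape[1])  # True
print(np.linalg.solve(B, np.zeros(2)))         # [0. 0.]
```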
3 => 1
Assume B is a maximal linearly independent subset. We only need to show B spans V. So take x in V. If x is already in B, it is certainly in the span of B, so assume x is not in B and consider the set B u {x}. Now B u {x} properly contains B, and since B is maximal, B u {x} cannot be independent. Hence there are a_1, ..., a_m in F and v_1, ..., v_(m-1) in B such that a_1v_1 + ... + a_(m-1)v_(m-1) + a_mx = 0 where not all a_i are 0. Now a_m != 0. (If a_m were 0, then a_1v_1 + ... + a_(m-1)v_(m-1) = 0 with not all a_i zero, contradicting the independence of B.)
Then x = -(a_m)^(-1)*a_1v_1 - ... - (a_m)^(-1)*a_(m-1)v_(m-1) is in the span of B, so B spans V.
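The formula at the end of this step can be illustrated numerically: take a hypothetical maximal independent set B in R^2, append some x, find a nonzero vector (a_1, a_2, a_3) in the null space of the matrix with columns v_1, v_2, x, and divide through by -a_3 to recover x as a combination of v_1 and v_2. The SVD-based null-space computation below is just one convenient way to find such a dependency.

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])          # B = {v1, v2} is maximal independent in R^2
x  = np.array([3.0, 2.0])          # any further vector makes the set dependent

M = np.column_stack([v1, v2, x])   # columns v1, v2, x

# A nonzero null-space vector (a1, a2, a3) gives a1*v1 + a2*v2 + a3*x = 0.
_, _, Vt = np.linalg.svd(M)
a = Vt[-1]                         # direction with singular value 0
print(np.allclose(M @ a, 0))       # True: a genuine dependency relation

a1, a2, a3 = a                     # a3 != 0 here, as in the proof
x_rec = -(a1 * v1 + a2 * v2) / a3  # x = -(a3)^(-1) * (a1*v1 + a2*v2)
print(np.allclose(x_rec, x))       # True: x is in the span of B
```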
1 => 3
Suppose B is a basis. Suppose there is some linearly independent set S which properly contains B. Then there is some x in S\B. Since B is a basis, x = a_1v_1 + ... + a_nv_n for some a_i in F and v_i in B. Then 1*x - a_1v_1 - ... - a_nv_n = 0, and all of these vectors are in S since S contains B. By the independence of S, all the coefficients are 0, in particular 1 = 0, a contradiction. Thus no such S can exist, and so B is a maximal linearly independent subset.
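As a final numerical illustration of maximality (again with the hypothetical basis from the earlier sketches), appending any vector of R^2 to B never increases the rank, so the enlarged set is always dependent; no independent set can properly contain B.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # columns form a basis of R^2

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(2)      # an arbitrary extra vector
    M = np.column_stack([B, x])     # the columns of M are B u {x}
    # rank stays 2 < 3 columns, so B u {x} is linearly dependent:
    print(np.linalg.matrix_rank(M)) # always 2
```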
So we showed 1 <=> 2 and 1 <=> 3, hence 1 <=> 2 <=> 3, and that's it.
I hope there are no mistakes!
 