Internal structure of a vector space: part 3


Proof of the Comparison Theorem

This is what we are after: $$|{\rm spanning \hspace{3pt} set}| \geq |{\rm linearly \hspace{3pt} independent \hspace{3pt} set}|$$

Given

  • 1. ${\rm span}\{w_1,\ldots,w_n\} = V$
  • 2. $\{v_1,\ldots,v_m\}$ is linearly independent

(1.) $\Rightarrow$

(*) $\begin{array}{} v_1=a_{11}w_1+a_{12}w_2+\ldots+a_{1n}w_n \\ v_2=a_{21}w_1+a_{22}w_2+\ldots+a_{2n}w_n \\ \vdots \\ v_m=a_{m1}w_1+a_{m2}w_2+\ldots+a_{mn}w_n \end{array}$

Let's use it.

Assume, for contradiction, that $n < m$. We will find $c_1, \ldots, c_m$, not all zero, such that $c_1v_1 + c_2v_2 + \ldots + c_mv_m = 0$ (linear dependence).

Now form a system of linear equations based on (*). Its matrix is the transpose of the one above:

$\left[ \begin{array}{} a_{11} & a_{21} & \ldots & a_{m1} \\ a_{12} & a_{22} & \ldots & a_{m2} \\ \vdots \\ a_{1n} & a_{2n} & \ldots & a_{mn} \end{array} \right]$

Consider the homogeneous system of linear equations with this matrix.

Good news: $n<m$, so we have fewer equations than variables. Then, by the theorem about homogeneous systems, there is at least one non-trivial (non-zero) solution. We have $c_1,\ldots,c_m \in {\bf R}$, not all zero, and

(**) $\left\{ \begin{array}{} a_{11}c_1+a_{21}c_2+\ldots+a_{m1}c_m=0 \\ a_{12}c_1+a_{22}c_2+\ldots+a_{m2}c_m=0 \\ \vdots \\ a_{1n}c_1 + a_{2n}c_2 + \ldots + a_{mn}c_m = 0 \end{array} \right.$ (derive!)
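Here is a minimal numpy sketch of the fact the proof leans on: a homogeneous system with fewer equations than unknowns always has a non-trivial solution. The $2 \times 3$ coefficient matrix is made up for illustration.

import numpy as np

# A made-up coefficient matrix: n=2 equations, m=3 unknowns.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])

# SVD: the rows of Vt beyond rank(A) span the null space of A.
_, _, Vt = np.linalg.svd(A)
c = Vt[-1]                    # a non-trivial solution of Ac = 0

print(np.allclose(A @ c, 0))  # True, and c is not the zero vector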

We need to show that these $c_i$'s indeed give us the desired linear dependence:

(***) $c_1v_1 + \ldots + c_mv_m = 0$.

Substitute (*) into (***): $$c_1(a_{11}w_1+a_{12}w_2+\ldots+a_{1n}w_n) $$ $$+c_2(a_{21}w_1+a_{22}w_2+\ldots+a_{2n}w_n) $$ $$+ \ldots $$ $$+ c_m(a_{m1}w_1+ a_{m2}w_2+\ldots+a_{mn}w_n)$$

Now use (**): each of its equations corresponds to one of $w_1,\ldots,w_n$.

Collect similar terms:

$$=w_1(c_1a_{11}+c_2a_{21}+\ldots+c_ma_{m1})$$ $$ + w_2(c_1a_{12}+c_2a_{22}+\ldots+c_ma_{m2})$$ $$+ \ldots $$ $$+ w_n(c_1a_{1n}+c_2a_{2n}+\ldots+c_ma_{mn})$$ $$= w_1 \cdot 0 + w_2 \cdot 0 + \ldots + w_n \cdot 0$$ $$=0$$

So, $\{v_1,\ldots,v_m\}$ is linearly dependent, contrary to assumption (2.). A contradiction. $\blacksquare$

Conclusions...

Summarize:

  • $V$ vector space,
  • $S,T \subset V$,
  • $|S|, |T| < \infty$,
  • ${\rm span \hspace{3pt}} S = V$,
  • $T$ is linearly independent.

Then

  • $|S| \geq |T|$.

The point is to be able to compare the size of bases.

This is easy now.

Corollary: All finite bases have the same number of elements.

Proof: Suppose $S$ and $T$ are both finite bases of $V$. Then $S$ spans and $T$ is linearly independent, so $|S| \geq |T|$ by the Comparison Theorem. Now reverse the roles of $S$ and $T$: $T$ spans and $S$ is linearly independent, so $|T| \geq |S|$.

So, $|S|=|T|$. $\blacksquare$

The above is about finite dimensional spaces, i.e., ones with a finite basis. What if $V$ has an infinite basis $S$?

Is it possible to have another, finite, basis $T$? No.

Why? Here $S$ is linearly independent, so every subset $S'$ of $S$ is linearly independent, including the finite ones.

Then by the Comparison Theorem, since $T$ spans $V$, $n = |T| \geq |S'|$ for every finite subset $S'$ of $S$.

So, every finite subset of $S$ has at most $n$ elements. Hence $S$ itself has at most $n$ elements and isn't infinite. A contradiction.

Facts:

  • ${\rm dim \hspace{3pt}} V < \infty$ implies all bases have the same number of elements, $n={\rm dim \hspace{3pt}} V$;
  • ${\rm dim \hspace{3pt}} V = \infty$ implies all bases have infinitely many elements.

Conclusion: ${\rm dim \hspace{3pt}} V$ is well defined.


This has been an entirely "existential" discussion...

Where do bases come from?

How to build bases

Suppose $V$ is our space. Let's construct a basis.

Outline:

  • Start with one, any one, vector, non-zero. Call it $v_1$.
  • Let $B=\{v_1\}$, does it span the space?
  • Suppose not, ${\rm span \hspace{3pt}} B \neq V$.
  • Then add one more vector, $v_2$ such that $\{v_1, v_2\}$ is still linearly independent. We can choose $v_2 \in V \setminus {\rm span \hspace{3pt}} B$.
  • Let $B = \{v_1, v_2\}$, does it span the space?
  • Suppose not, ${\rm span \hspace{3pt}} B \neq V$.
  • Then add one more vector, $v_3 \in V \setminus {\rm span \hspace{3pt}} B$.
  • etc

That's the idea.
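Here is a minimal computational sketch of this outline, with the candidates drawn from a made-up finite pool in ${\bf R}^3$ rather than from all of $V$: keep a vector exactly when it lies outside the span of those kept so far (detected as a rank increase).

import numpy as np

def build_basis(vectors):
    # Keep each vector that is not in the span of those kept so far.
    basis = []
    for v in vectors:
        candidate = basis + [v]
        # v enlarges the span exactly when the rank goes up
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            basis.append(v)
    return basis

# Made-up pool: the third vector is the sum of the first two.
vs = [np.array([1., 0., 0.]),
      np.array([1., 1., 0.]),
      np.array([2., 1., 0.]),
      np.array([1., 1., 1.])]
print(build_basis(vs))   # keeps the 1st, 2nd, and 4th vectors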

Example: Take ${\bf R}^3$. A basis looks like this:

[Illustration: a basis of ${\bf R}^3$.]

But we don't know it yet.

Start with $v_1$ arbitrary, not equal to $0$.

[Illustration: the vectors $v_1$, $v_2$, $v_3$ picked one at a time.]

Continue with $v_2$ arbitrary, not in the span of $v_1$.

Here we have a sequence:

  • span of $v_1$ is a line,
  • span $\{v_1,v_2\}$ is a plane,
  • span $\{v_1,v_2,v_3\}$ is the whole space.

We justify each step of this construction by...

Expansion Lemma: Given a linearly independent set $S = \{v_1, \ldots, v_n\}$ in a (finite-dimensional) space $V$, suppose $v_{n+1} \in V \setminus {\rm span \hspace{3pt}} S$. Then $S \cup \{v_{n+1}\} = \{v_1,\ldots,v_n,v_{n+1}\}$ is linearly independent.

Proof: Use the characterization of linear dependence: a set is linearly dependent if and only if one of its elements is a linear combination of the rest. In a non-trivial relation $c_1v_1+\ldots+c_{n+1}v_{n+1}=0$ we must have $c_{n+1} \neq 0$ (otherwise $S$ would be linearly dependent), so $v_{n+1}$ would be a linear combination of $v_1,\ldots,v_n$, i.e., $v_{n+1} \in {\rm span \hspace{3pt}} S$, a contradiction. Review the earlier proof. $\blacksquare$

Expansion Theorem: Given a linearly independent set $S = \{v_1, \ldots, v_n\}$ in a finite dimensional space $V$, there are vectors $\{ v_{n+1}, \ldots, v_k\} \subset V$ such that $B = \{v_1,\ldots,v_n, v_{n+1}, \ldots, v_k\}$ is a basis of $V$.

Proof: By induction.

Statement: for each $s = n, \ldots, k$, where $k = {\rm dim \hspace{3pt}} V$, there is a linearly independent set $S_s = \{v_1, \ldots, v_s\} \subset V$ containing the original $n$ vectors.

Base of induction: $s=n$. True, by the condition of the theorem.

Assume that the statement holds for $s$. Prove that the statement holds for $s+1$.

$S_s=\{v_1,\ldots,v_s\}$ is linearly independent. Pick any element $v_{s+1} \in V \setminus {\rm span \hspace{3pt}} S_s$. What if there is no such $v_{s+1}$? Then ${\rm span \hspace{3pt}} S_s = V$, hence $S_s$ is a basis of $V$, and we are done.

If such a $v_{s+1}$ exists, define $S_{s+1}=S_s \cup \{v_{s+1}\}$. Then by the Expansion Lemma, it is linearly independent.

The step is proven. Hence the statement holds for $s=k$, i.e., $S_k$ is linearly independent.

Now need to show that $S_k$ spans $V$. Suppose it doesn't, then there is another $v_{k+1} \in V \setminus {\rm span \hspace{3pt}} S_k$. Again $S_{k+1} = \{v_1,\ldots,v_{k+1}\}$ is linearly independent.

This is impossible by the Comparison Theorem. $\blacksquare$

Note: At each step, we need to determine whether a given $u$ belongs to ${\rm span}\{v_1,\ldots,v_n\}$. How? Try to solve $u=a_1v_1+\ldots+a_nv_n$ for $a_1,\ldots,a_n$, which is a (non-homogeneous) system of linear equations.
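A minimal numpy sketch of this test, with made-up vectors in ${\bf R}^3$: solve the system in the least-squares sense and check whether the residual vanishes.

import numpy as np

# Made-up data: is u in span{v1, v2}?
v1 = np.array([1., 0., 0.])
v2 = np.array([1., 1., 0.])
u  = np.array([3., 2., 0.])

# Solve a1*v1 + a2*v2 = u in the least-squares sense.
A = np.column_stack([v1, v2])
a, *_ = np.linalg.lstsq(A, u, rcond=None)
print(a, np.allclose(A @ a, u))   # [1. 2.] True: u = 1*v1 + 2*v2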


Above: Expand while preserving linear independence.

Below: Contract while preserving the span.

Reduction Lemma: Suppose $S = \{v_1,\ldots,v_n\} \subset V$ and ${\rm span \hspace{3pt}} S = V$. If $v_n$ is a linear combination of $v_1,\ldots,v_{n-1}$, then ${\rm span}\{v_1,\ldots,v_{n-1}\}=V$.

Proof: Refer to the previous discussion.

If $$v_n=a_1v_1+\ldots+a_{n-1}v_{n-1},$$ then $$c_1v_1+\ldots+c_{n-1}v_{n-1}+c_nv_n$$ $$=c_1v_1+\ldots+c_{n-1}v_{n-1}+c_n(a_1v_1+\ldots+a_{n-1}v_{n-1})$$ $$=(c_1+c_na_1)v_1+\ldots+(c_{n-1}+c_na_{n-1})v_{n-1}.$$ So $${\rm span}\{v_1,\ldots,v_{n-1}\}={\rm span}\{v_1,\ldots,v_n\}.$$ Proving the remaining details is an exercise. $\blacksquare$

Reduction Theorem: Given a set $S=\{v_1,\ldots,v_n\} \subset V$ such that ${\rm span \hspace{3pt}} S = V$, there is a subset $B \subset S$ which is a basis of $V$.

Proof: Idea: reduce the number of elements in the spanning set one at a time. The proof is by induction, with the induction step supplied by the Reduction Lemma. The process stops when no vector is a linear combination of the rest, i.e., the remaining vectors are linearly independent. $\blacksquare$

Note: Each step requires solving a system of linear equations. Solve $a_1v_1+\ldots+a_nv_n=0$. This vector equation is a homogeneous system.
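Here is a minimal numpy sketch of one reduction step, with a made-up spanning set of ${\bf R}^2$: a non-trivial solution of the homogeneous system exposes a redundant vector.

import numpy as np

# Made-up spanning set of R^2, listed as rows; v3 = v1 + v2.
vs = np.array([[1., 0.],
               [0., 1.],
               [1., 1.]])

# Solve a1*v1 + a2*v2 + a3*v3 = 0: the columns of vs.T are the vectors.
_, _, Vt = np.linalg.svd(vs.T)
a = Vt[-1]
print(a)   # proportional to (1, 1, -1): a linear dependence

# Any vector with a non-zero coefficient may be dropped
# without shrinking the span.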

Corollary 1: A subspace $T$ of a finite dimensional space $V$ is finite dimensional and ${\rm dim \hspace{3pt}} T \leq {\rm dim \hspace{3pt}} V$.

Proof idea: Expansion theorem. Expand $B_T$ to $B_V$. Exercise. $\blacksquare$

What if ${\rm dim \hspace{3pt}} T = {\rm dim \hspace{3pt}} V$? Then $T=V$.

Proof? "$\subset$" obvious. "$\supset$" By contradiction, suppose $v \in V \setminus T$. Suppose $B_T$ is a basis of $T$, then expand $B_T$ to $B_V$. Then $|B_T| < |B_V|$, or ${\rm dim \hspace{3pt}} T < {\rm dim \hspace{3pt}} V$.


The dimension is a measure of the space. The corollary shows that this analogy makes sense.

Corollary 2: Suppose ${\rm dim \hspace{3pt}} V = n$. Then

  • (a) If $\{v_1,\ldots,v_n\}$ is linearly independent, then it's a basis.
  • (b) If $\{v_1,\ldots,v_n\}$ spans $V$, then it's a basis.


Recall, $B=\{v_1,\ldots,v_n\}$ is a basis if it's both

  • linearly independent, and
  • a spanning set of $V$.

So, as it turns out, we need to verify only one of the conditions, provided $|B|={\rm dim \hspace{3pt}} V$.
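For $V={\bf R}^n$, this means the whole basis test collapses to a single rank computation; here is a minimal numpy sketch with made-up vectors:

import numpy as np

def is_basis(vectors):
    # n vectors in R^n form a basis iff their square matrix has full rank.
    A = np.array(vectors)
    n, m = A.shape
    return n == m and np.linalg.matrix_rank(A) == n

print(is_basis([[1., 0., 0.], [1., 1., 0.], [1., 1., 1.]]))  # True
print(is_basis([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.]]))  # False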


What about infinite dimensional spaces?

Take ${\bf P}$. Its standard basis is the powers: $B= \{1, x, x^2, \ldots, x^n, \ldots\}$. Can we say that if $D =\{p_1,\ldots,p_n,\ldots\}$ (infinitely many elements) is linearly independent, then it's a basis?

Does $D$ have to be a basis? Is ${\rm span \hspace{3pt}} D ={\bf P}$?

No. Example? Try $D=B \setminus \{1\}$. Then $1 \not\in {\rm span \hspace{3pt}} D$: the span of $D$ consists of the polynomials with $0$ free term.

Using a basis to represent elements, coordinates

Observation: A vector space $V$ with ${\rm dim \hspace{3pt}} V=n$ is "similar" to ${\bf R}^n$.

In what sense? $V$ has a basis of $n$ elements (in fact all of them).

What's so special about ${\bf R}^n$? $${\bf R}^n = \{v = (x_1,\ldots,x_n) \colon x_1,\ldots,x_n \in {\bf R}\}$$

Recall $x_i$ is the $i^{\rm th}$ coordinate of $v$.

What else is $x_i$?

Observe: $$v = x_1(1,0,0,\ldots,0)+x_2(0,1,0,\ldots)+\ldots+x_n(0,0,\ldots,0,1) $$ $$= x_1e_1+x_2e_2+\ldots+x_ne_n.$$ So, $x_i$'s are the coefficients of the representation of $v$ as a linear combination of the elements of $B_0=\{e_1,\ldots,e_n\}$, the standard basis of ${\bf R}^n$.

But what about other bases of ${\bf R}^n$?

Example: In ${\bf R}^3$, choose $B=\{u_1,u_2,u_3\}$, where $u_1=(1,0,0)$, $u_2=(1,1,0)$, $u_3=(1,1,1)$. Then

  • $(2,1,1) = 2e_1 + 1e_2 + 1e_3$,

but

  • $(2,1,1) \neq 2u_1+1u_2+1u_3$!

So, the coordinates aren't absolute!

They depend on the choice of basis.

Observe, we can find a similar representation of $v$ in terms of $B=\{u_1,u_2,u_3\}$: $$v=(2,1,1)$$ $$=a_1u_1+a_2u_2+a_3u_3 $$ $$= a_1(1,0,0)+a_2(1,1,0)+a_3(1,1,1) $$ $$= (a_1+a_2+a_3,a_2+a_3,a_3)$$

Hence we have a system:

$\left\{ \begin{array}{} a_1+a_2+a_3 &= 2 \\ a_2+a_3 &= 1 \\ a_3 &= 1 \end{array} \right.$

Solve it, $a_3=1$, $a_2=0$, $a_1=1$.

Then we say that $1,0,1$ are the coordinates of $v=(2,1,1)$ with respect to the basis $B=\{u_1,u_2,u_3\}$.
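A minimal numpy check of this computation, with the basis vectors $u_1, u_2, u_3$ as the columns of a matrix:

import numpy as np

# Columns are the basis vectors u1, u2, u3 from the example.
U = np.array([[1., 1., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
v = np.array([2., 1., 1.])

print(np.linalg.solve(U, v))   # [1. 0. 1.]: the coordinates of v in B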

There may be many bases, some of them better than others -- depending on the circumstances...

[Illustration: one of many possible bases of ${\bf R}^3$.]

Linear algebra should be independent of a particular choice of basis!

[Illustration: a vector sum.]


Fix $B$ as the "official" basis. Then, given a vector, we simply write $v=(1,0,1)$, etc., and the same for all other vectors. But to pass from one basis to another, we have to do a "change of basis"!


Back to abstract vector spaces...

Given a vector space $V$ and a basis $B$, ${\rm dim \hspace{3pt}} V=n$.

Then, as we know, any vector $v \in V$ can be represented by a linear combination of the elements of $B$:

  • $B=\{v_1,\ldots,v_n\}$ then
  • $v=a_1v_1+\ldots+a_nv_n$

for some $a_1,\ldots,a_n \in {\bf R}$.

Definition: Then $a_1,\ldots,a_n$ are called the coordinates of $v$ with respect to $B$.

Always the same question, is it well defined?

  • existence: OK, because $v \in V = {\rm span \hspace{3pt}} B$.
  • uniqueness: OK too but we need to prove it.

Theorem: For a given basis, coordinates are unique (up to the order of elements of the basis).

Proof: Suppose not. Suppose $a_1,\ldots,a_n$ and $b_1,\ldots,b_n$ are two sets of coordinates of $v \in V$ with respect to the same basis $B$:

  • $v=a_1v_1+\ldots+a_nv_n$
  • $= b_1v_1+\ldots+b_nv_n$.

Now, move all to the left and factor. Then $$(a_1-b_1)v_1 + \ldots + (a_n-b_n)v_n = 0,$$ a linear combination of $B$.

Since $v_1,\ldots,v_n$ are linearly independent, then $$a_1-b_1=0, \ldots, a_n-b_n=0.$$ $\blacksquare$

To summarize...

For an ordered basis $B=\{v_1,\ldots,v_n\}$ and a given vector $v$, its coordinate vector $\left[ \begin{array}{} a_1 \\ \vdots \\ a_n \end{array} \right]$ with respect to $B$ consists of the ordered coefficients of the representation of $v$ as a linear combination of the elements of $B$: $$v=a_1v_1+\ldots+a_nv_n.$$

It's important to observe that $\left[ \begin{array}{} a_1 \\ \vdots \\ a_n \end{array} \right] \in {\bf R}^n$.


How do we deal with several bases at the same time?

Given basis $B$, the coordinate vector of $v$ may be specified with a subscript:

$\left[ \begin{array}{} a_1 \\ \vdots \\ a_n \end{array} \right]_B$

Example: A vector expressed in terms of the standard basis $B$, $v=\left[ \begin{array}{} 2 \\ 1 \\ 1 \end{array} \right]_B$ can be re-written with respect to another basis $D = \left\{ \left[ \begin{array}{} 1 \\ 0 \\ 0 \end{array} \right], \left[ \begin{array}{} 1 \\ 1 \\ 0 \end{array} \right], \left[ \begin{array}{} 1 \\ 1 \\ 1 \end{array} \right] \right\}$ as $v = \left[ \begin{array}{} 1 \\ 0 \\ 1 \end{array} \right]_D.$

To find the new representation we had to solve a system of linear equations.

Example: Consider ${\bf P}_2$, choose $\{1,x,x^2\}$ as a basis. Take $p(x)=3+5x-7x^2$, then its coordinate vector is $\left[ \begin{array}{} 3 \\ 5 \\ -7 \end{array} \right]$.

Indeed $p(x)=3 \cdot 1 + 5 \cdot x + (-7) \cdot x^2$.

Now, choose $B=\{1,x-1,(x-1)^2\}$ as a new basis.

Solve $$3+5x-7x^2=a_0+a_1(x-1)+a_2(x-1)^2 = (a_0-a_1+a_2)+(a_1-2a_2)x+a_2x^2$$ for $a_0, a_1, a_2$.

Match the coefficients:

$\begin{array}{} 3 &= a_0-a_1+a_2 \\ 5 &= a_1 - 2a_2 \\ -7 &= a_2 \end{array} \rightarrow \left[ \begin{array}{ccc|c} 1 & -1 & 1 & 3 \\ 0 & 1 & -2 & 5 \\ 0 & 0 & 1 & -7 \end{array} \right]$

Substitution: $a_2=-7$.

$5=a_1-2(-7)=a_1+14 \rightarrow a_1=-9$.

Substitute: $3=a_0-(-9)+(-7)=a_0+2 \rightarrow a_0=1$.

Answer: $\left[ \begin{array}{} 1 \\ -9 \\ -7 \end{array} \right]_B$; verify.
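A quick sympy check of the "verify" step:

import sympy as sp

x = sp.symbols('x')
p = 3 + 5*x - 7*x**2
q = 1 - 9*(x - 1) - 7*(x - 1)**2   # a0 + a1*(x-1) + a2*(x-1)^2

print(sp.expand(q - p) == 0)       # True: the coordinates check out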

Example: Let $f={\rm sinh \hspace{3pt}}x$ in $V={\rm span}\{e^x,e^{-x}\}$ (a subspace of ${\bf C}({\bf R})$). The set $B=\{e^x,e^{-x}\}$ is a basis: indeed, $e^x \neq ke^{-x}$ for any constant $k$, since that would mean $e^{2x}=k$, which cannot happen.

Since $B= \{e^x,e^{-x}\}$ and ${\rm sinh \hspace{3pt}}x = \frac{e^x-e^{-x}}{2} = \frac{1}{2}e^x - \frac{1}{2}e^{-x}$, we can read off the coordinates.

Answer: $f = \left[ \begin{array}{} \frac{1}{2} \\ -\frac{1}{2} \end{array} \right]_B.$
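A quick numerical check of this representation:

import numpy as np

xs = np.linspace(-2., 2., 9)
print(np.allclose(np.sinh(xs), 0.5*np.exp(xs) - 0.5*np.exp(-xs)))  # True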

Exercise: Work this out for infinite-dimensional spaces. (What if $B$ is uncountable, like ${\bf R}$?)