Linear algebra of Euclidean space
Linear subspaces in ${\bf R}^3$
Example. Suppose $u = (0,1,1)$ and $S = {\rm span}\{u \}$. Is there a vector $v$ such that $v \in {\bf R}^3 \setminus S$?
Yes, for example $v = (1,1,2)$.
Next compute $$\begin{array}{} S_2 &= {\rm span}(u, v) \\ &= \{ \alpha u + \beta v \colon \alpha, \beta \in {\bf R} \} \\ &= \{\alpha(0,1,1) + \beta(1,1,2) \colon \alpha, \beta \in {\bf R} \} \\ &= \{(\beta,\alpha+\beta,\alpha+2\beta): \alpha, \beta \in {\bf R} \} \end{array}$$
The result is a "parametric" description of the span. It is a plane in $3$-space.
Theorem 1.4.3 The solution set of a homogeneous linear equation $$a_1x_1 + a_2x_2 + a_3x_3 = 0,$$
with at least one of $a_1, a_2, a_3$ not equal to $0$, is a plane passing through the origin.
If $a_1=a_2=a_3=0$, then the equation turns into $0 = 0$, and the solution set is ${\bf R}^3$.
Example. Given: $$u = (1,1,0), v = (1,0,1), w = (3,1,2).$$
Do they all lie in a $2$-dimensional subspace?
Is $w \in {\rm span}\{u, v \}$? In other words, are there numbers $\alpha, \beta$ such that $$w = \alpha u + \beta v?$$
Find $\alpha, \beta$.
Rewrite the equation: $$\begin{array}{} (3,1,2) = \alpha(1,1,0) + \beta(1,0,1) \\ (3,1,2) = (\alpha + \beta, \alpha, \beta) \end{array}$$
Examining each coordinate of this vector equation produces three equations for these numbers: $$\begin{array}{} 3 &= \alpha + \beta & (1) \\ 1 &= \alpha + 0 & (2) \\ 2 &= 0 + \beta & (3) \end{array}$$
Solve it.
There are $2$ unknowns but $3$ equations, so a solution exists only if the solution of two of the equations also satisfies the third. From (2) and (3) we have $\alpha = 1$ and $\beta = 2$; substituting into (1) shows that (1) is satisfied.
Hence, the solution is $w = u + 2v$. In other words, $u = w - 2v$ and $v = \frac{w-u}{2}$. The solution set is the plane $x - y - z = 0$.
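This can be checked with a short computation; here is a sketch in Python (numpy assumed, not part of the original notes):

```python
import numpy as np

u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, 0.0, 1.0])
w = np.array([3.0, 1.0, 2.0])

# Solve w = alpha*u + beta*v in the least-squares sense;
# a (near-)zero residual means w lies in span{u, v}.
A = np.column_stack([u, v])
coeffs, residual, rank, _ = np.linalg.lstsq(A, w, rcond=None)
print(coeffs)  # [1. 2.], i.e. w = u + 2v

# The normal of the plane span{u, v} is u x v,
# and w must be orthogonal to it.
n = np.cross(u, v)
print(n, n @ w)  # [ 1. -1. -1.] 0.0
```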
Example. Given $$\begin{array}{} x_1 = (1,2,3) \\ x_2 = (1,2,-3) {\rm \hspace{3pt} [follows \hspace{3pt} that \hspace{3pt}} x_2 \not\in {\rm span}\{x_1\}] \\ x_3 = (1,4,0). \end{array}$$
Find ${\rm span}\{x_1, x_2, x_3 \}$.
Denote by $A$ the matrix with these vectors as rows: $$A = \left[ \begin{array}{r} 1 & 2 & 3 \\ 1 & 2 & -3 \\ 1 & 4 & 0 \end{array}\right]$$
Then $${\rm det}(A) = 0 - 6 + 12 - 6 + 12 - 0 = 12.$$
Since the determinant is non-zero, the vectors are linearly independent; hence $${\rm span}\{x_1, x_2, x_3 \} = {\bf R}^3.$$
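A quick numerical confirmation of both claims (a numpy sketch, not part of the original notes):

```python
import numpy as np

# Rows are x1, x2, x3 from the example.
A = np.array([[1, 2, 3],
              [1, 2, -3],
              [1, 4, 0]])
print(np.linalg.det(A))          # 12.0 (up to rounding): non-zero
print(np.linalg.matrix_rank(A))  # 3, so span{x1, x2, x3} = R^3
```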
Example. Represent the solutions of the (system) equation: $$x_3 = 0$$
as linear combinations of two fixed vectors.
Consider the solution set of the equation: $$\begin{array}{} S &= \{x_3 = 0 \} \\ &= \{(*, *, 0) \} \\ &= \{(\alpha, \beta, 0) \colon \alpha, \beta \in {\bf R} \} \\ &= \{ \alpha(1,0,0) + \beta(0,1,0) \colon \alpha, \beta \in {\bf R} \}. \end{array}$$
Moreover, $$(1,0,0) = e_1, (0,1,0) = e_2 {\rm \hspace{3pt} form \hspace{3pt} a \hspace{3pt} basis \hspace{3pt} of \hspace{3pt}} S.$$
Example. Same problem for: $$x_1 - x_2 + x_3 = 0$$
Let's guess first: $$u = (1, 1, 0), v = (0, 1, 1).$$
Check $$u = \lambda v?$$
$$(1, 1, 0) = \lambda (0, 1, 1)$$ $$(1, 1, 0) = (0, \lambda, \lambda)$$
Clearly there is no solution for non-zero $\lambda$.
Without a guess: $$x_1 = 0 {\rm \hspace{3pt} implies \hspace{3pt}} -x_2 + x_3 = 0, {\rm \hspace{3pt} so}$$
$$x_2 = x_3$$
Pick $$x_3 = 1, {\rm \hspace{3pt} then \hspace{3pt}} v = (0, 1, 1).$$
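To confirm that $u$ and $v$ form a basis of this plane, one can check both conditions numerically (a numpy sketch, not part of the original notes):

```python
import numpy as np

u = np.array([1, 1, 0])
v = np.array([0, 1, 1])

# Both vectors satisfy x1 - x2 + x3 = 0 ...
a = np.array([1, -1, 1])  # coefficients of the equation
print(a @ u, a @ v)  # 0 0

# ... and they are linearly independent: the 2x3 matrix
# with rows u, v has rank 2.
print(np.linalg.matrix_rank(np.vstack([u, v])))  # 2
```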
Single linear equation
Let's start with these examples:
Homogeneous equation: $x_1 + x_2 - x_3 = 0$.
Non-homogeneous equation: $x_1 + x_2 - x_3 = 1$. [The right-hand side is a non-zero number.]
The solution set $S$ of the second equation is not a subspace.
Indeed, let us assume that the set $S$ is a linear subspace. Let $x = (1,0,0) \in S$ and $y = (3,0,2) \in S$.
For $S$ to be a linear subspace, a linear combination of $x$ and $y$ must also be a vector in $S$. Let $$z = x + y = (1+3, 0+0, 0+2) = (4,0,2).$$
Now, take the coordinates of $z$ and substitute them into the equation: $$z_1 + z_2 - z_3 = 4 + 0 - 2 = 2 \neq 1.$$
Therefore $z \not\in S$, and so $S$ is not a linear subspace.
But what about $$z = x - y?$$
$$z = (1-3, 0-0, 0-2) = (-2, 0, -2).$$
So, $z_1 + z_2 - z_3 = -2 + 0 + 2 = 0 \neq 1$. So, in this case too $z \not\in S$.
But $z = x - y$ satisfies the homogeneous equation!
This is also true in the general case.
Indeed, let x and y be $$x = (x_1, x_2, x_3) \in S {\rm \hspace{3pt} and}$$
$$y = (y_1, y_2, y_3) \in S.$$
Then, $$z = x - y = (x_1 -y_1, x_2 -y_2, x_3 -y_3).$$
Now, since $x, y \in S$, we have $$\begin{array}{l} x_1 &+ x_2 &- x_3 = 1 \\ y_1 &+ y_2 &- y_3 = 1 \\ (x_1 -y_1) &+ (x_2 - y_2) &- (x_3 - y_3) = 0 \end{array}$$
Therefore, $z \in S'$, the solution set of the homogeneous equation.
The $S$ and $S'$ are two parallel planes: $$S: x_1 + x_2 - x_3 = 1,$$
$$S': x_1 + x_2 - x_3 = 0.$$
This means that $S$ is obtained from $S'$ by a shift. Find it.
Theorem. Given a non-homogeneous linear equation, its solution set has the form $$S = S_1 + z,$$
where $S_1$ is the solution set to the corresponding homogeneous equation (subspace) and $z$ is a vector.
Example. Consider an equation (solution set S is a plane):
$$2x_1 - x_2 + 3x_3 = 5.$$
Rewrite: $$2x_1 = x_2 - 3x_3 + 5,$$
$$x_1 = (1/2)x_2 - (3/2)x_3 + (5/2).$$
Let $x_2 = \alpha$ and $x_3 = \beta$ be the parameters. Then, $$x_1 = (1/2) \alpha - (3/2) \beta + (5/2).$$
A solution is $x = (x_1, x_2, x_3) \in {\bf R}^3$, so
$S$ is the set of all such $x$'s. Thus, $$x = \alpha v_1 + \beta v_2 + z$$
Here: $$\begin{array}{} z = (5/2, 0, 0), \\ v_1 = (1/2, 1, 0), \\ v_2 = (-3/2, 0, 1). \end{array}$$
This is the answer.
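The answer can be verified numerically: the particular solution $z = (5/2, 0, 0)$ (obtained by setting $\alpha = \beta = 0$) solves the equation, and the two coefficient vectors solve the homogeneous equation. A numpy sketch (not part of the original notes):

```python
import numpy as np

a = np.array([2, -1, 3])   # coefficients of 2x1 - x2 + 3x3
z = np.array([5/2, 0, 0])  # particular solution (alpha = beta = 0)
v1 = np.array([1/2, 1, 0])   # coefficient vector of alpha
v2 = np.array([-3/2, 0, 1])  # coefficient vector of beta

print(a @ z)           # 5.0: z solves the equation
print(a @ v1, a @ v2)  # 0.0 0.0: v1, v2 solve the homogeneous equation

# Hence every alpha*v1 + beta*v2 + z is a solution:
alpha, beta = 2.0, -1.0
x = alpha * v1 + beta * v2 + z
print(a @ x)  # 5.0
```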
Two linear equations
Consider a system of two equations: $$S_1: x_1 + x_2 - x_3 = 0$$ $$S_2: x_1 + 2x_2 + x_3 = 0$$
Here the solution set of each equation is a subspace. The solution set of the system is the intersection of these two subspaces: $$S = S_1 \cap S_2.$$
$S$ may be a line...
Adding the two equations:
$$2x_1 + 3x_2 = 0,$$ $$x_1 = -(3/2)x_2.$$
Substituting in the equation for $S_1$: $$\begin{array}{r} -(3/2)x_2 + x_2 - x_3 = 0 \\ -(1/2)x_2 - x_3 = 0 \\ x_3 = -(1/2)x_2 \end{array}$$
Choose a parameter $$x_2 = \alpha.$$
Then, the solution set $S$ is given by: $$\begin{array}{} x_1 = -(3/2) \alpha \\ x_2 = \alpha \\ x_3 = -(1/2) \alpha \end{array}$$
This solution set is a line: $$S = {\rm span}\{(-3/2, 1, -1/2)\}.$$
Also, after rescaling, $$S = {\rm span}\{(-3, 2, -1)\}.$$
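As a numerical check of the solution line (a numpy sketch, not part of the original notes):

```python
import numpy as np

# Coefficient matrix of the two homogeneous equations.
A = np.array([[1, 1, -1],
              [1, 2, 1]])
d = np.array([-3, 2, -1])  # direction vector (alpha = 2 above)
print(A @ d)  # [0 0]: d solves both equations

# Rank 2 in R^3 leaves a 1-dimensional solution space: a line.
print(np.linalg.matrix_rank(A))  # 2
```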
Now what if these equations had non-zero entries on the right-hand sides ("free terms")?
We rewrite the work above and change just a few things: $$S*_1: x_1 + x_2 - x_3 = 1,$$
$$S*_2: x_1 + 2x_2 + x_3 = 2.$$
The solution set $S*$ is the intersection of these two planes: $$S* = S*_1 \cap S*_2.$$
Adding the two equations: $$2x_1 + 3x_2 = 3, $$ $$x_1 = -(3/2)x_2 + (3/2)$$
Substituting in the equation for $S*_1$: $$\begin{array}{} -(3/2)x_2 + (3/2) + x_2 - x_3 = 1, \\ -(1/2)x_2 - x_3 = -(1/2), \\ x_3 = -(1/2)x_2 + (1/2). \end{array}$$
Suppose, $x_2 = \alpha$. Then, the solution set is $$\begin{array}{} x_1 = -(3/2) \alpha + (3/2) \\ x_2 = \alpha \\ x_3 = -(1/2) \alpha + (1/2) \end{array}$$
In other words, $$x = \alpha(-3/2, 1, -1/2) + (3/2, 0, 1/2).$$
Also, $(-3/2, 1, -1/2)$ spans the solution set $S$ of the corresponding homogeneous system, and $z = (3/2, 0, 1/2)$ is a particular solution.
Hence, $S* = S + z$.
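The relation $S* = S + z$ can be confirmed numerically (a numpy sketch, not part of the original notes):

```python
import numpy as np

A = np.array([[1, 1, -1],
              [1, 2, 1]])
b = np.array([1, 2])

z = np.array([3/2, 0, 1/2])    # particular solution (alpha = 0)
d = np.array([-3/2, 1, -1/2])  # solves the homogeneous system

print(A @ z)  # [1. 2.]: z is in S*
print(A @ d)  # [0. 0.]: d is in S
print(A @ (z + 5 * d))  # [1. 2.]: shifting z by any element of S stays in S*
```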
Example of two equations but solution set is a plane:
$$x_1 + x_2 + x_3 = 0$$
$$2x_1 + 2x_2 + 2x_3 = 0.$$
In this case the second equation is redundant: it describes the same plane.
Three linear equations
The solution set of a system of $3$ equations can be:
- Point - when the three planes are in general position;
- Plane - when the equations represent identical planes;
- Line - when the planes all pass through a common line;
- ${\bf R}^3$ - when, for example, the equations are $0 = 0$;
- Empty - when, for example, two of the planes are parallel.
Example. Find the solution set of the system: $$\begin{array}{} x_1 - x_3 &= 3 & (1) \\ x_2 + 3x_3 &= 5 & (2) \\ 2x_1 - x_2 - 5x_3 &= 1 &(3) \end{array}$$
Substituting $x_1 = x_3 + 3$ from (1) into (3): $$2(x_3 + 3) - x_2 - 5x_3 = 1$$
$$x_2 + 3x_3 = 5 \hspace{5pt}(4)$$
Comparing (4) with (2), it is evident that (4) coincides with (2). Comparing (1) with (2), it is evident that (2) is not a multiple of (1). Then, the solution set is a line.
Specifically, we have one parameter, say, $\alpha = x_3$. Then the solution set is:
- $x_1 = \alpha + 3$
- $x_2 = -3 \alpha + 5$
- $x_3 = \alpha$
In other words, $$x = (x_1, x_2, x_3) = ( \alpha + 3, -3 \alpha + 5, \alpha ).$$
Next $$S = {\rm span}\{x \} + z,$$
find $x$ and $z$: $$x = (1, -3, 1) {\rm \hspace{3pt} and \hspace{3pt}} z = (3, 5, 0).$$
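A quick check that this line really solves the system (a numpy sketch, not part of the original notes):

```python
import numpy as np

A = np.array([[1, 0, -1],
              [0, 1, 3],
              [2, -1, -5]])
b = np.array([3, 5, 1])

v = np.array([1, -3, 1])  # direction vector of the line
z = np.array([3, 5, 0])   # particular solution

print(A @ z)  # [3 5 1]: z solves the system
print(A @ v)  # [0 0 0]: v solves the homogeneous system
```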
Example. Find $\alpha _1, \alpha _2, \alpha _3$ so that
$$\{ {\rm Solution \hspace{3pt} of \hspace{3pt}} \alpha _1x_1 + \alpha _2x_2 + \alpha _3x_3 = 0 \} = {\rm span}\{x_1, x_2 \}, {\rm \hspace{3pt} where}$$
$$x_1 = (1, -1, 1) {\rm \hspace{3pt} and \hspace{3pt}} x_2 = (2, 1, 1).$$
Rewrite: $$\begin{array}{} {\rm span}\{x_1, x_2 \} &= \{ \alpha x_1 + \beta x_2: \alpha , \beta \in {\bf R} \} \\ &= \{( \alpha + 2 \beta , - \alpha + \beta , \alpha + \beta ): \alpha , \beta \in {\bf R} \} \end{array}$$
Solution: $$\begin{array}{} x_1 &= \alpha + 2 \beta \\ x_2 &= - \alpha + \beta \\ x_3 &= \alpha + \beta \end{array}$$
Substitute: $$\alpha _1( \alpha + 2 \beta ) + \alpha _2(- \alpha + \beta ) + \alpha _3( \alpha + \beta ) = 0$$
$$\alpha ( \alpha _1 - \alpha _2 + \alpha _3) + \beta (2 \alpha _1 + \alpha _2 + \alpha _3) = 0$$
Since $\alpha$ and $\beta$ are arbitrary, we have $$( \alpha _1 - \alpha _2 + \alpha _3) = 0$$
$$(2 \alpha _1 + \alpha _2 + \alpha _3) = 0$$
Exercise: solve this system.
Review example. Given: $u = (1,1,0), v = (0,1,1)$. Is vector $(0,1,-1)$ a linear combination of $u$ and $v$? In other words, find $\alpha$ and $\beta$ such that
$$(0,1,-1) = \alpha u + \beta v$$
Rewrite: $$\begin{array}{} (0,1,-1) &= \alpha (1,1,0) + \beta (0,1,1) \\ &= ( \alpha , \alpha + \beta , \beta ) \end{array}$$
$$0 = \alpha , \hspace{5pt} 1 = \alpha + \beta , \hspace{5pt} -1 = \beta$$
The first and third equations give $\alpha = 0$ and $\beta = -1$, but then $\alpha + \beta = -1 \neq 1$. This is clearly a contradiction, so the answer is no.
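A rank computation gives the same answer: appending $(0,1,-1)$ as a column raises the rank, so it is not in the span. A numpy sketch (not part of the original notes):

```python
import numpy as np

u = np.array([1, 1, 0])
v = np.array([0, 1, 1])
w = np.array([0, 1, -1])

# w is in span{u, v} iff appending it as a column keeps the rank at 2.
print(np.linalg.matrix_rank(np.column_stack([u, v])))     # 2
print(np.linalg.matrix_rank(np.column_stack([u, v, w])))  # 3: w is NOT in span{u, v}
```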
More generally,
Theorem. A system of $m$ homogeneous equations with $n$ unknowns has infinitely many solutions when $n > m$.
Geometrically, each of the $m$ equations determines a linear subspace, giving $S_1, S_2, \ldots, S_m$.
The solution set of the system is the intersection of $S_1, S_2, \ldots, S_m$: $$S = S_1 \cap S_2 \cap \ldots \cap S_m.$$
Then $S$ is a linear subspace of ${\bf R}^n$.
Observe that $0 \in S$. Now, if $S \neq \{0 \}$, then there is $x \in S \setminus \{0 \}$. Then there are infinitely many solutions: $${\rm span}\{x \} \subset S.$$
Examples. (1) Solve the system: $$x_1 = 0,$$ $$x_3 = 0$$
Then $$\begin{array}{} S &= \{(0, \alpha , 0): \alpha \in {\bf R} \} \\ &= {\rm line} \\ &= {\rm span}\{(0, 1, 0)\}. \end{array}$$ (2) Solve the system: $$x_1 - x_3 = 0,$$ $$x_1 + x_3 = 0.$$ Adding the two equations gives $2x_1 = 0$, so $x_1 = 0$.
Substituting for $x_1$ we get $x_3 = 0$. The answer is the same as above.
(3) Solve the system: $$x_1 - x_2 = 0 $$ $$-x_1 + x_2 = 0.$$
Then $$x_1 = x_2$$
Find $v_1, v_2$ such that $$S = {\rm span}\{v_1, v_2 \}.$$
Let $x_2$ be a parameter $\alpha$ and $x_3$ be a parameter $\beta$. Then $$\begin{array}{} S &= \{( \alpha , \alpha , \beta ) \colon \alpha , \beta \in {\bf R} \} \\ &= \{ \alpha v_1 + \beta v_2 \colon \alpha , \beta \in {\bf R} \}\\ &= \{ \alpha (1,1,0) + \beta (0,0,1) \colon \alpha , \beta \in {\bf R} \} \end{array}$$
$$S = {\rm span}\{(1,1,0), (0,0,1) \}$$
Note: generically, a line is the intersection of two distinct, non-parallel planes $P_1$ (equation 1) and $P_2$ (equation 2); here, however, the two equations describe the same plane, so the intersection is a plane.
Examples. Consider these systems:
(1) $$\left\{ \begin{array}{} x_1 + x_2 + x_3 = 1 \\ x_1 + x_2 + x_3 = 2 \end{array} \right.$$ The left-hand sides coincide but the right-hand sides differ, so there are no solutions. (2) $$\left\{ \begin{array}{} x_1 + x_2 + x_3 &= 1 {\rm \hspace{3pt} (same \hspace{3pt} equation \hspace{3pt} as \hspace{3pt} in \hspace{3pt} (1))} \\ 2x_1 + 2x_2 + x_3 &= 1 \end{array} \right.$$
Subtracting the first equation from the second: $$x_1 + x_2 = 0,$$
so there are infinitely many solutions.
(3) $$\left\{ \begin{array}{} x_1 + x_2 &= 1 \\ x_1 + x_2 &= 1 \\ x_1 + x_2 &= 1 \\ \end{array} \right.$$
So there are infinitely many solutions.
(4) $$0 \cdot x_1 + 0 \cdot x_2 + 0 \cdot x_3 = 1, {\rm \hspace{3pt} then}$$
$$0 = 1.$$
So there is no solution.
Affine subspaces
An affine subspace $A$ is a subset given by $$A = x + S,$$
where $S$ is a linear subspace and $x$ is a vector.
Here is what we mean: $$A = x + S = \{x + s \colon s \in S \}$$
$${\rm set}= {\rm vector}+{\rm set}.$$
Of course, this representation doesn't have to be unique: $$A = x_1 + S = x_2 + S = x_3 + S.$$
What do we know about $x$?
First, $x \in A$ itself: take $s = 0$.
On the other hand, any element of $A$ can serve as $x$: if $x' \in A$, then $A = x' + S$.
Suppose, $A = x + S = y + T$ (equal as sets), where $S, T$ are linear subspaces. Then, $S = T$ and $y \in A$.
The converse is also true, i.e., if $y \in A = x + S$, then $A = y + S$.
Affine subspaces are solution sets of systems of non-homogeneous equations.
Example. System of non-homogeneous equations: $$\left\{ \begin{array}{} x_1 + x_2 + x_3 &= 1 &(1) \\ x_1 - x_3 &= 2 &(2) \end{array} \right.$$
Transforming into the corresponding system of homogeneous equations: $$\left\{ \begin{array}{} x_1 + x_2 + x_3 &= 0 \\ x_1 - x_3 &= 0 \end{array} \right.$$
The solution set to the latter is a linear subspace, say, $S$ and the solution to the former is, say, $A$. Then $$A = S + x.$$
What is $x$?
First, $x$ is a solution to the non-homogeneous system: $$x \in A.$$
Suppose we have $S$, find $A$. Turns out, all we need is $x$, i.e., any single solution to the non-homogeneous system.
- If $x = (3, -3, 1)$ then $A = (3, -3, 1) + S$.
- If $x = (2, -1, 0)$ then $A = (2, -1, 0) + S$.
Each such $x$ satisfies both equations (1) and (2) upon substitution.
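Both particular solutions can be checked, along with the fact that their difference solves the homogeneous system (a numpy sketch, not part of the original notes):

```python
import numpy as np

A = np.array([[1, 1, 1],
              [1, 0, -1]])
b = np.array([1, 2])

x1 = np.array([3, -3, 1])
x2 = np.array([2, -1, 0])
print(A @ x1, A @ x2)  # both [1 2]: two particular solutions
print(A @ (x1 - x2))   # [0 0]: their difference solves the homogeneous system
```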
Theorem. The solution set $A$ of a non-homogeneous system of equations is the sum of the solution set $S$ of the corresponding homogeneous system of equations and any solution $x$ of the non-homogeneous system: $$A = x + S.$$
Case 1. Let us consider the case with $2$ equations and $3$ variables. The solution set corresponds to the intersection of $2$ planes in ${\bf R}^3$. This is typically a line and atypically either no intersection (parallel planes) or coincident planes.
Case 2. Let us consider the case with $3$ equations and $3$ variables. The solution set corresponds to the intersection of $3$ planes in ${\bf R}^3$. This is typically a point and atypically either no intersection (e.g., parallel planes), a line, or a plane.
We know that a plane is determined by three points $A, B, C$. Or we can use vectors instead:
To find these vectors: $$x = \overrightarrow{OA},$$
$$v_1 = \overrightarrow{AB},$$
$$v_2 = \overrightarrow{AC}.$$
Then the plane is the affine subspace $x + {\rm span}\{v_1, v_2 \}$.
Review example. Find $a_1, a_2, a_3$ so that the solution set of $a_1x_1 + a_2x_2 + a_3x_3 = 0$ is ${\rm span}\{x_1, x_2 \}$, with $x_1 = (1, -1, 1)$ and $x_2 = (2, 1, 1)$, as in the exercise above.
Substituting $x_1$ and $x_2$, separately, in the given equation we have $$\left\{ \begin{array}{} a_1 \cdot 1 + a_2 \cdot (-1) + a_3 \cdot 1 &= 0 &(1) \\ a_1 \cdot 2 + a_2 \cdot 1 + a_3 \cdot 1 &= 0 &(2) \end{array} \right.$$
Let's solve the system. Rearranging the equations (1) and (2), $$\left\{ \begin{array}{} a_1 - a_2 + a_3 &= 0 &(3) \\ 2a_1 + a_2 + a_3 &= 0 &(4) \end{array} \right.$$
We have a system of $2$ equations and $3$ unknowns so we need $1$ more equation to make the answer unique. Taking $$a_2 = 1, $$
we can rewrite equations (3) and (4) as $$\left\{ \begin{array}{} a_1 + a_3 &= 1 &(5) \\ 2a_1 + a_3 &= -1 &(6) \end{array} \right.$$
Solving (5) and (6), we have $$a_1 = -2, a_3 = 3.$$
To verify, substituting $a_1, a_2$ and $a_3$ back into equation (3) we have $$a_1 - a_2 + a_3 = -2 - 1 + 3 = 0.$$
Therefore, the answer is $(-2, 1, 3)$.
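An alternative route, not used above: the coefficient vector $(a_1, a_2, a_3)$ must be orthogonal to both $x_1$ and $x_2$, so the cross product produces it directly (up to scale). A numpy sketch (not part of the original notes):

```python
import numpy as np

x1 = np.array([1, -1, 1])
x2 = np.array([2, 1, 1])

# The coefficient vector is orthogonal to both spanning vectors.
a = np.cross(x1, x2)
print(a)            # [-2  1  3], matching the answer
print(a @ x1, a @ x2)  # 0 0
```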
Review example. Suppose $S_1 = {\rm span}\{x_1, x_2 \}$ and $S_2 = {\rm span}\{y_1, y_2 \}$ are two planes.
Find $z$ such that $$\begin{array}{} S_1 \cap S_2 = {\rm span}\{z \}, {\rm \hspace{3pt} so} \\ z \in S_1 \cap S_2, {\rm \hspace{3pt} or} \\ z \in S_1 {\rm \hspace{3pt} and \hspace{3pt}} z \in S_2. \end{array}$$
Rewriting the last part: $$(z = ) \alpha x_1 + \beta x_2 = \gamma y_1 + \delta y_2.$$
Find $\alpha , \beta$ (or $\gamma , \delta$).
Dimension of vector space
Consider a system of two equations: $$\left\{ \begin{array}{} x_1 + x_2 - x_3 &= 0 \\ 2x_1 + 2x_2 - 2x_3 &= 0 \end{array} \right.$$
The second equation is redundant, so the solution set $S$ is the plane $x_1 + x_2 - x_3 = 0$. It can be spanned by different pairs of vectors: $$\begin{array}{} S &= {\rm span}\{u_1, u_2\} \\ &= {\rm span}\{v_1, v_2\}, \end{array}$$
where, for example, $u_1 = (1, 0, 1), u_2 = (0, 1, 1)$, or $v_1 = (1, -1, 0), v_2 = (1, 1, 2)$.
If $S = {\rm span}\{u, v, w \}$, is there redundancy? In other words, can we drop one of them from the list so that $S$ remains the same? Yes, when one of them, say $u$, is a linear combination of the others: $u = \alpha v + \beta w$.
Rewrite: $$\alpha v + \beta w - u = 0.$$
Then $0$ is a linear combination of $u, v, w$ with not all coefficients zero. We say that they are "linearly dependent".
Definition. Vectors $x_1, \ldots, x_s$ are said to be linearly dependent if there exists a linear combination, with not all zero coefficients, of them equal to zero. Otherwise, they are said to be linearly independent.
Example. $s = 2$. Two vectors $x_1$ and $x_2$ are linearly dependent when $$\alpha x_1 + \beta x_2 = 0, {\rm \hspace{3pt} with \hspace{3pt}} \alpha , \beta {\rm \hspace{3pt} not \hspace{3pt} both \hspace{3pt} zero.}$$
If $\alpha \neq 0$, solving for $x_1$: $$x_1 = -( \beta / \alpha )x_2.$$
Therefore, $x_1$ is a multiple of $x_2$ (and vice versa when $\beta \neq 0$).
Example. $s = 3$. Three vectors $u, v, w$ are linearly dependent:
$$\alpha u + \beta v + \gamma w = 0,$$
then, if $\alpha \neq 0$, we can solve for $u$: $$u = -\frac{\beta v + \gamma w}{\alpha} = -\frac{\beta}{\alpha}v - \frac{\gamma}{\alpha}w.$$
Therefore, $u$ is a linear combination of $v$ and $w$.
Example. Are the vectors $u = (1,1,0), v = (0,1,1), w = (1,1,1)$ linearly independent? To answer the question, we need to find $\alpha , \beta , \gamma$ such that
$$\alpha u + \beta v + \gamma w = 0.$$
Rewrite: $$\alpha (1,1,0) + \beta (0,1,1) + \gamma(1,1,1) = 0$$
Thus we have $3$ equations, for each of the coordinates: $$\left\{ \begin{array}{} \alpha + \gamma = 0 \\ \alpha + \beta + \gamma = 0 \\ \beta + \gamma = 0 \end{array} \right.$$
We can solve the system, but we only care about one question: is there a non-zero solution? Answer: no (verify). Hence, $u, v$ and $w$ are linearly independent.
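The determinant test from earlier gives the same conclusion (a numpy sketch, not part of the original notes):

```python
import numpy as np

M = np.array([[1, 1, 0],   # u
              [0, 1, 1],   # v
              [1, 1, 1]])  # w
# Non-zero determinant means only the zero solution exists,
# i.e. u, v, w are linearly independent.
print(np.linalg.det(M))  # 1.0 (up to rounding)
```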
Suppose $S = {\rm span}\{u, v, w \}$. Try to remove the vectors one at a time: check whether it is possible to write
$${\rm span}\{u, v, w \} = {\rm span}\{u, v \} = {\rm span}\{u, w \} = {\rm span}\{u \}, {\rm \hspace{3pt} etc.}$$
If it is impossible, then ${\rm dim \hspace{3pt}} S = 3$.
Example. Show that the standard unit vectors form a basis of ${\bf R}^3$.
The "standard unit vectors" are: $$u = (1, 0, 0), v = (0, 1, 0), w = (0, 0, 1).$$
We need to prove two things:
1. ${\bf R}^3 = {\rm span}\{u,v,w \}.$
Indeed, any vector is representable in terms of these three: $$(a, b, c) = au + bv + cw.$$
2. $u, v, w$ are linearly independent.
Indeed, if $$\alpha u + \beta v + \gamma w = 0$$
then $$\alpha (1,0,0) + \beta (0,1,0) + \gamma(0,0,1) = 0$$
hence $$\alpha = \beta = \gamma = 0.$$
Definition. Given a vector space (or a linear subspace) $S$, then vectors $v_1, v_2, \ldots, v_n$ are called a basis of $S$ if
- $S = {\rm span}\{v_1, v_2, \ldots, v_n \}$, and
- $v_1, v_2, \ldots, v_n$ are linearly independent.
We also say that the dimension of $S$ is $n$ (the "the" part needs to be proven.).
Example. If $S$ is a linear subspace with one basis $\{v_1, v_2 \}$, then show that ${\rm dim \hspace{3pt}} S = 2$. Let us suppose there exists another basis $\{u_1, \ldots, u_n \}$ with say $n = 3$. In particular, it means that
$${\rm span}\{u_1, u_2, u_3 \} = S.$$
We need to show that $u_1, u_2, u_3$ are linearly dependent. Since $\{v_1, v_2 \}$ is a basis, we can write $$\begin{array}{} u_1 &= \alpha v_1 + \beta v_2 \\ u_2 &= \gamma v_1 + \delta v_2 \\ u_3 &= \lambda v_1 + \theta v_2 \end{array}$$
Next, we check linear independence $u_1, u_2, u_3$ by investigating existence of scalars $a, b, c$ such that $$au_1 + bu_2 + cu_3 = 0.$$
Rewrite: $$a( \alpha v_1 + \beta v_2) + b( \gamma v_1 + \delta v_2) + c( \lambda v_1 + \theta v_2) = 0$$
$$v_1(a \alpha + b \gamma + c \lambda) + v_2(a \beta + b \delta + c \theta) = 0$$
But, $v_1$ and $v_2$ are linearly independent. So,
$$a \alpha + b \gamma + c \lambda = 0,$$
$$a \beta + b \delta + c \theta = 0.$$
This is a homogeneous system of $2$ equations in the $3$ unknowns $a, b, c$, so by the theorem above it has infinitely many solutions; in particular, it has a non-zero solution. Therefore $u_1, u_2, u_3$ are linearly dependent and cannot form a basis.
The definition implies the following.
Theorem. If ${\rm dim \hspace{3pt}} S = n$, then any collection of linearly independent vectors in $S$ has at most $n$ elements.
If a subspace $S$ has a basis $\{v_1, v_2, \ldots, v_n \}$ and also another basis $\{u_1, u_2, \ldots, u_m \}$, then, by the theorem applied both ways, $m \leq n$ and $n \leq m$. Hence $m = n$.
Theorem. All bases of $S$ must have the same number of elements.
This implies that the dimension of $S$ is well defined.
Suppose, $B = \{u_1, u_2, \ldots, u_n \}$ is a basis of $S$.
Further suppose, $$\left\{ \begin{array}{} x = a_1u_1 + a_2u_2 + \ldots + a_nu_n &(1) {\rm \hspace{3pt} and \hspace{3pt} also} \\ x = b_1u_1 + b_2u_2 + \ldots + b_nu_n &(2). \end{array} \right.$$
Subtracting both sides of equation (2) from both sides of equation (1), we obtain $$(a_1 - b_1)u_1 + (a_2 - b_2)u_2 + \ldots + (a_n - b_n)u_n = 0$$
Since, $B = \{u_1, u_2, \ldots, u_n \}$ is a basis of $S$ so its elements, i.e. $u_1, u_2, \ldots u_n$ must be linearly independent. Therefore,
$a_1 = b_1, a_2 = b_2, \ldots, a_n = b_n$.
Hence,
Theorem. If $B$ is a basis of $S$ then every element of $S$ is represented by a unique linear combination of the elements of $B$.
For this uniqueness, it makes sense to call $a_1, a_2, \ldots, a_n$ the coordinates of $x$ with respect to $B$: $$x = (a_1, a_2, \ldots, a_n).$$
Definition. If $A$ is an affine subspace then its dimension is defined as $${\rm dim \hspace{3pt}} A = {\rm dim \hspace{3pt}} S,$$
when $A = z + S$ and $S$ is a linear subspace.
Recall, if $A = z + S = u + T$, then $S = T$. So the dimension of $A$ is well defined.
Theorem. Suppose $S'$ is a proper subspace of $S$. Then ${\rm dim \hspace{3pt}} S' < {\rm dim \hspace{3pt}} S$.
Indeed, since $S \neq S'$, there is $x \in S \setminus S'$. Let $$T = {\rm span}(S' \cup \{x \}).$$
Then ${\rm dim \hspace{3pt}} T = {\rm dim \hspace{3pt}} S' + 1$, because $x$ lies outside $S'$, and $T \subset S$, so ${\rm dim \hspace{3pt}} S' < {\rm dim \hspace{3pt}} T \leq {\rm dim \hspace{3pt}} S$.
This idea helps us with the next topic.
Building a basis
How do we build a basis for a subspace $S$ of ${\bf R}^n$? This is the procedure:
- Pick $x_1 \in S \setminus \{0 \}$;
- Pick $x_2 \in S \setminus {\rm span}\{x_1 \} \rightarrow {\rm dim \hspace{3pt} span}\{x_1, x_2 \} = 2$;
- Continue inductively;
- Pick $x_k \in S \setminus {\rm span}\{x_1, x_2, \ldots, x_{k-1} \} \rightarrow {\rm dim \hspace{3pt} span}\{x_1, x_2, \ldots, x_k \} = k$;
- Continue;
- Pick $x_n \in S \setminus {\rm span}\{x_1, x_2, \ldots, x_{n-1} \} \rightarrow {\rm dim \hspace{3pt} span}\{x_1, x_2, \ldots, x_n \} = n$;
- Stop.
We stop when we can't pick $$x_{n+1} \in S \setminus {\rm span}\{x_1, x_2, \ldots, x_n \},$$
i.e., when this set is empty. Then $\{x_1, \ldots, x_n \}$ is a basis of $S$ and ${\rm dim \hspace{3pt}} S = n$.
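The procedure above can be sketched in code. Here $S$ is assumed to be given by a finite (possibly redundant) spanning set, and "lies outside the span so far" is tested with a rank computation (a numpy sketch, not part of the original notes):

```python
import numpy as np

def build_basis(vectors):
    """Greedily keep the vectors that increase the rank,
    mirroring the pick-outside-the-span procedure above."""
    basis = []
    for v in vectors:
        candidate = basis + [v]
        # v is outside span(basis) iff adding it raises the rank.
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            basis.append(v)
    return basis

# S = R^3 described by a redundant spanning set:
vs = [np.array([1, 1, 0]), np.array([2, 2, 0]),
      np.array([0, 1, 1]), np.array([1, 1, 1])]
basis = build_basis(vs)
print(len(basis))  # 3 = dim S; the multiple (2, 2, 0) was dropped
```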
Definition. If $S'$ is an affine subspace of $S$ and ${\rm dim \hspace{3pt}}S' = {\rm dim \hspace{3pt}} S - 1$ then $S'$ is called a hyperplane in $S$.
In ${\bf R}^3$, any plane is a hyperplane.
Theorem. $S$ is a hyperplane in ${\bf R}^n$ iff $S$ is the solution set of a linear equation $$a_1x_1 + a_2x_2 + \ldots + a_nx_n = b,$$
where not all of $a_1, a_2, \ldots, a_n$ are zero.
Review example. Suppose $$S = \{(x_1, x_2, x_3) \colon x_2 = 0 \}.$$
Find a parametric representation of $S$. Simple: $$S = \{(x_1, 0, x_3) \colon x_1, x_3 \in {\bf R} \}$$
Now, find an affine subspace $A$ parallel to $S$ and passing through $z = (-3, 7, 4)$. Recall,
$$A = z + S, {\rm \hspace{3pt} for \hspace{3pt} any \hspace{3pt}} z \in A.$$
Compute: $$\begin{array}{} A &= z + \{(x_1, 0, x_3)\} \\ &= \{(-3, 7, 4) + (x_1, 0, x_3)\} \\ &= \{(x_1 -3, 7, x_3 + 4)\} \\ &= \{(x_1, 7, x_3)\}, \end{array}$$ since $x_1$ and $x_3$ range over all of ${\bf R}$.
So, $A$ is $S$ shifted $7$ units in the $x_2$-direction: the plane $x_2 = 7$.
How about a different example? Consider functions: $\{1, x, x^2, x^3 \}$. Is this set linearly independent?
Suppose $$a_0 + a_1x + a_2x^2 + a_3x^3 = 0 {\rm \hspace{3pt} for \hspace{3pt} all \hspace{3pt}} x.$$
Since this holds for all $x$, this polynomial is the zero polynomial: $$a_0 = a_1 = a_2 = a_3 = 0.$$
So, the answer is Yes. For more see Function spaces.