
Section 1.4 Linear Independence

In algebra you're introduced to the idea of “like terms”, which are algebraic terms that are similar enough that they can be combined. In matrix algebra we have a similar concept for vectors that are “geometrically like terms”. A simple example would be the vectors
\begin{equation*} \vv_1 = \mqty[1 \\ 2]\text{ and }\vv_2 = \mqty[-2 \\ -4]\text{.} \end{equation*}
It's clear to see that \(\vv_2\) is a scalar multiple of \(\vv_1\text{,}\) and so \(\vv_1\) and \(\vv_2\) must lie on the same line through the origin in \(\RR^2\text{.}\)
A slightly more complicated example is the collection given by the columns of the matrix
\begin{equation*} A = \mqty[\vv_1 \amp \vv_2 \amp \vv_3 \amp \vv_4 \amp \vv_5] = \mqty[ 1 \amp -1 \amp 0 \amp 2 \amp 5 \\ 0 \amp 1 \amp 1 \amp -2 \amp 0 \\ 1 \amp -1 \amp 0 \amp 3 \amp 4]\text{.} \end{equation*}
Then it can be checked that
\begin{equation*} \vv_3 = \vv_1 + \vv_2\text{ and } \vv_5 = 5\vv_1 -2\vv_2 - \vv_4\text{,} \end{equation*}
so these columns are in some sense redundant just like the vectors in \(\RR^2\) we considered above. In particular, we can write \(\vv_3\) as a linear combination of \(\vv_1\) and \(\vv_2\text{,}\) and we can write \(\vv_5\) as another linear combination of \(\vv_1, \vv_2\) and \(\vv_4\text{.}\) We make this idea of redundancy more precise using the concepts of a span and linear independence.
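The two redundancy relations above are easy to check numerically. Here is a quick sketch using Python with NumPy (the text's own computations are by hand; the variable names mirror the columns of \(A\)):

```python
import numpy as np

# The five columns of the matrix A from the text.
v1 = np.array([1, 0, 1])
v2 = np.array([-1, 1, -1])
v3 = np.array([0, 1, 0])
v4 = np.array([2, -2, 3])
v5 = np.array([5, 0, 4])

# Check the two linear-combination claims.
print(np.array_equal(v3, v1 + v2))            # True
print(np.array_equal(v5, 5*v1 - 2*v2 - v4))   # True
```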

Subsection Span

The span of a collection of vectors is the set of all possible linear combinations of vectors in the set. In symbols,
\begin{equation*} \spn{S} = \{\sum_{i=1}^n c_i\vv_i : c_i\in\RR\text{ and }\vv_i\in S\}\text{.} \end{equation*}
One reason to consider spans of sets is that they are relatively simple geometric objects: the span of a single vector is either a point (only for the zero vector) or a line; the span of two vectors is a point, a line or a plane; and so on.

Example 1.4.1. Finding a Span.

Find the span of the set \(\{\ivec{1\\-1}\}\text{.}\)
Solution.
By definition, the span is the set of all possible linear combinations of vectors in this set. Since the set only has one vector, any linear combination would therefore look like \(\lambda\ivec{1\\-1}\) for some scalar \(\lambda\text{.}\) The span of the set must then be the line in \(\RR^2\) passing through the origin and the point \((1,-1)\text{,}\) i.e., the line \(y = -x\text{.}\)
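The claim that every element of this span lies on the line \(y = -x\) can be verified symbolically. A small sketch using SymPy (not part of the text's solution, just a check):

```python
import sympy as sp

lam = sp.symbols('lambda')
# A generic element of the span: lambda * (1, -1).
point = lam * sp.Matrix([1, -1])
x, y = point
# Every such point satisfies y = -x, i.e. y + x = 0.
print(sp.simplify(y + x))  # 0
```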

Subsection Definition and Examples of Linear Independence

When working with spans of collections of vectors, it's often useful to recognize any linear combinations that are already present in the set. For example, let
\begin{equation*} \uu_1 = \mqty[1\\2\\0], \uu_2 = \mqty[-2 \\ -4\\1]\text{ and }\uu_3 = \mqty[0 \\ 0 \\ 1] \end{equation*}
and consider the set \(S = \{\uu_1, \uu_2, \uu_3\}\text{.}\) If you stare at this for a bit you might notice that
\begin{equation*} \uu_3 = 2\uu_1 + \uu_2\text{,} \end{equation*}
in other words, \(\uu_3\) is a linear combination of \(\uu_1\) and \(\uu_2\text{.}\) It then follows that \(\spn{S} = \spn\{\uu_1, \uu_2\}\text{.}\) In particular,
\begin{equation*} \spn{S} = \{\lambda\uu_1 + \mu\uu_2 : \lambda,\mu\in\RR\} \end{equation*}
which is a parametric description of the plane in \(\RR^3\) containing the origin and the points \((1,2,0)\) and \((-2,-4,1)\text{.}\) The fact that \(\uu_3\) also lies in the span implies that the point \((0,0,1)\) is on the plane as well. The collection \(S\) here is one of our first examples of a linearly dependent set. To help motivate the definition below, note that the linear combination \(2\uu_1 + \uu_2 = \uu_3\) can be rearranged to \(2\uu_1 + \uu_2 - \uu_3 = \mathbf{0}\text{.}\) Expressions such as this are also called linear dependence relations.
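The claim that adding \(\uu_3\) does not enlarge the span can be checked computationally: the matrix with columns \(\uu_1, \uu_2\) and the matrix with columns \(\uu_1, \uu_2, \uu_3\) have the same rank. A SymPy sketch (rank is introduced formally below; here it is used only as a numerical check):

```python
import sympy as sp

u1 = sp.Matrix([1, 2, 0])
u2 = sp.Matrix([-2, -4, 1])
u3 = sp.Matrix([0, 0, 1])

# u3 is a linear combination of u1 and u2:
print(2*u1 + u2 == u3)  # True

# Adding u3 does not enlarge the span: both matrices have rank 2.
print(sp.Matrix.hstack(u1, u2).rank())       # 2
print(sp.Matrix.hstack(u1, u2, u3).rank())   # 2
```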

Definition 1.4.2. Linear Independence and Linear Dependence.

Let \(S = \qty{\vv_i}_{i=1}^{m}\subseteq\RR^n\text{.}\) We say that \(S\) is linearly independent if the only solution of \(\sum_{i=1}^{m}c_i\vv_i = \vb{0}\) is \(c_1 = c_2 = \cdots = c_m = 0\text{.}\) Otherwise we say that \(S\) is linearly dependent.
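The definition translates directly into a computation: place the vectors as the columns of a matrix \(M\); the set is linearly independent exactly when \(M\vb{c} = \vb{0}\) has only the trivial solution, i.e., when the null space of \(M\) is trivial. A hedged SymPy sketch (the helper function name is ours, not the text's):

```python
import sympy as sp

def is_linearly_independent(*vectors):
    """Columns are independent iff M c = 0 has only the trivial solution."""
    M = sp.Matrix.hstack(*vectors)
    return len(M.nullspace()) == 0

# The set {u1, u2, u3} from the discussion above is dependent...
u1, u2, u3 = sp.Matrix([1, 2, 0]), sp.Matrix([-2, -4, 1]), sp.Matrix([0, 0, 1])
print(is_linearly_independent(u1, u2, u3))  # False
# ...while {u1, u2} alone is independent.
print(is_linearly_independent(u1, u2))      # True
```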
As another example of a linearly dependent set, consider once again the matrix
\begin{equation*} A = \mqty[\vv_1 \amp \vv_2 \amp \vv_3 \amp \vv_4 \amp \vv_5] = \mqty[ 1 \amp -1 \amp 0 \amp 2 \amp 5 \\ 0 \amp 1 \amp 1 \amp -2 \amp 0 \\ 1 \amp -1 \amp 0 \amp 3 \amp 4]\text{.} \end{equation*}
Since \(\vv_3 = \vv_1 + \vv_2\) and \(\vv_5 = 5\vv_1-2\vv_2-\vv_4\text{,}\) it follows (by adding the two rearranged relations \(\vv_1 + \vv_2 - \vv_3 = \vb{0}\) and \(5\vv_1 - 2\vv_2 - \vv_4 - \vv_5 = \vb{0}\) and negating) that
\begin{equation*} -6\vv_1 + \vv_2 + \vv_3 + \vv_4 + \vv_5 = \vb{0}\text{.} \end{equation*}
Therefore the columns of \(A\) form a linearly dependent set.

Example 1.4.3. Determine if a Set is Linearly Independent.

Are the vectors \(\uu=\mqty[1\\2]\) and \(\vv = \mqty[-2\\-4]\) linearly independent?
Solution.
First, note that \(\vv = -2\uu\text{.}\) Therefore
\begin{equation*} 2\uu + \vv = \vb{0} \end{equation*}
which shows that these vectors are linearly dependent. In fact, any collection of two vectors is linearly dependent if and only if one of the vectors is a scalar multiple of the other. This is a useful test to keep in mind!
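For two vectors in \(\RR^2\text{,}\) the scalar-multiple test amounts to checking whether the \(2\times 2\) matrix with those vectors as columns has zero determinant. A NumPy sketch (the function name is ours), applied to the vectors \(\vv_1, \vv_2\) from the start of this section:

```python
import numpy as np

def dependent_pair(u, v):
    """Two vectors in R^2 are dependent iff the 2x2 determinant is zero."""
    return bool(np.isclose(np.linalg.det(np.column_stack([u, v])), 0.0))

# v2 = -2 * v1 from the start of the section: dependent.
print(dependent_pair([1, 2], [-2, -4]))   # True
# A pair where neither is a scalar multiple of the other: independent.
print(dependent_pair([1, 2], [0, 1]))     # False
```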
A useful concept for determining if a collection of vectors is linearly independent is that of a pivot column.

Definition 1.4.4. Pivot Columns.

Let \(A\) be a matrix. The pivot columns of \(A\) are those columns which contain leading entries in any echelon form of \(A\text{.}\)
In the matrix \(A\) at the beginning of this section, the columns \(\vv_1, \vv_2\) and \(\vv_4\) are the pivot columns. It can be seen that they're also linearly independent, and this is no coincidence.
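This claim can be verified by row reducing \(A\text{.}\) For instance, SymPy's `rref` method returns both the reduced echelon form and the (0-indexed) pivot column positions:

```python
import sympy as sp

# The matrix A from the beginning of the section.
A = sp.Matrix([[1, -1, 0,  2, 5],
               [0,  1, 1, -2, 0],
               [1, -1, 0,  3, 4]])

R, pivots = A.rref()
print(pivots)  # (0, 1, 3): columns v1, v2 and v4 are the pivot columns
```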

Definition 1.4.5. Rank of a Matrix.

Let \(A\) be a matrix. The rank of \(A\) is the size of the largest linearly independent subset of the columns of \(A\text{.}\) This is denoted by \(\rank{A}\text{.}\)
The rank is closely related to the pivot columns: in fact, \(\rank{A}\) equals the number of pivot columns of \(A\text{.}\)
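As a quick computational check of this relationship, SymPy's `rank` method agrees with the number of pivot positions reported by `rref` for the matrix \(A\) above:

```python
import sympy as sp

A = sp.Matrix([[1, -1, 0,  2, 5],
               [0,  1, 1, -2, 0],
               [1, -1, 0,  3, 4]])

# The rank equals the number of pivot columns.
print(A.rank())          # 3
print(len(A.rref()[1]))  # 3
```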

Subsection Bases in \(\RR^n\)

Definition 1.4.7. Basis of \(\RR^n\).

Let \(\mathcal{B} = \qty{\vb{b}_1,\ldots,\vb{b}_n}\) denote a subset of vectors in \(\RR^n\text{.}\) We say that \(\mathcal{B}\) is a basis for \(\RR^n\) if it satisfies the following properties:
  1. \(\mathcal{B}\) is linearly independent.
  2. \(\mathcal{B}\) is a spanning set, i.e., \(\spn{\mathcal{B}} = \RR^n\text{.}\)
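For \(n\) vectors in \(\RR^n\text{,}\) both conditions can be checked at once: the set is a basis exactly when the matrix with those vectors as columns has rank \(n\) (equivalently, is invertible). A hedged SymPy sketch (the helper function name is ours):

```python
import sympy as sp

def is_basis(vectors, n):
    """n vectors form a basis of R^n iff the matrix they form has rank n."""
    if len(vectors) != n:
        return False
    return sp.Matrix.hstack(*vectors).rank() == n

e1, e2 = sp.Matrix([1, 0]), sp.Matrix([0, 1])
print(is_basis([e1, e2], 2))                                  # True: the standard basis
print(is_basis([sp.Matrix([1, 2]), sp.Matrix([-2, -4])], 2))  # False: a dependent pair
```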

Definition 1.4.8. Column Space.

The column space of a matrix \(A\) is the span of the columns of \(A\text{.}\) Equivalently, the column space is the set of all vectors of the form \(A\vb{x}\text{.}\) The column space of \(A\) is denoted by \(\col{A}\text{.}\)
The column space of an \(m\times n\) matrix \(A\) consists of all vectors of the form \(A\xx\) where \(\xx\in\RR^n\text{.}\) This is because the product \(A\xx\) can be written as a linear combination of the columns of \(A\text{,}\) and all linear combinations of the columns appear in this manner. Therefore the column space also relates to consistency of systems. In particular, the linear system \(A\xx = \bb\) is consistent if and only if \(\bb\in\col{A}\text{.}\)
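The consistency criterion translates into a rank test: \(\bb\in\col{A}\) exactly when appending \(\bb\) as an extra column does not increase the rank. A SymPy sketch using a small illustrative matrix of our own choosing (not one from the text):

```python
import sympy as sp

A = sp.Matrix([[1, -1, 0],
               [0,  1, 1],
               [1, -1, 0]])

def in_column_space(A, b):
    """b is in col(A) iff appending b does not raise the rank."""
    return A.rank() == sp.Matrix.hstack(A, b).rank()

print(in_column_space(A, sp.Matrix([1, 0, 1])))  # True: b is the first column of A
print(in_column_space(A, sp.Matrix([0, 0, 1])))  # False: A x = b is inconsistent
```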

Example 1.4.9. Dimension and Basis of the Column Space.

Let
\begin{equation*} A = \mqty[0 \amp 1 \amp 0 \amp 2 \\ -1 \amp 0 \amp -4 \amp 2 \\ 0 \amp 4 \amp 0 \amp -1]\text{.} \end{equation*}
Find \(\dim\col{A}\) and a basis for the column space of \(A\text{.}\)
Solution.
We can answer this question completely by identifying the pivot columns of \(A\text{.}\) By row reduction, we see that an echelon form of \(A\) is given by
\begin{equation*} \mqty[1 \amp 0 \amp 4 \amp 0 \\ 0 \amp 1 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \amp 1]\text{.} \end{equation*}
This shows that the first, second and fourth columns of \(A\) are pivot columns. Therefore these columns also form a basis of \(\col{A}\) and \(\dim\col{A} = \rank{A} = 3\text{.}\)
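The row reduction in this example can be carried out by machine; here is a Python/SymPy sketch (a stand-in for software such as Octave, which the text uses elsewhere):

```python
import sympy as sp

A = sp.Matrix([[ 0, 1,  0,  2],
               [-1, 0, -4,  2],
               [ 0, 4,  0, -1]])

R, pivots = A.rref()
sp.pprint(R)      # the reduced echelon form displayed above
print(pivots)     # (0, 1, 3): columns 1, 2 and 4 are pivot columns
print(A.rank())   # 3
```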