5.4. Linear dependence

An important concept in linear algebra is whether a vector from a set of vectors can be expressed as a linear combination of the other vectors in the set. If so, we say that the vector is linearly dependent upon the other vectors. Geometrically speaking, two vectors are linearly dependent when they lie on the same line through the origin (one is a scalar multiple of the other); three vectors in \(\mathbb{R}^3\) are linearly dependent when they lie in a common plane through the origin. Linear dependence helps identify redundant or superfluous vectors within a set and provides insight into the dimension and structure of vector spaces.

Definition 5.5 (Linear dependence)

Let \(v_1, v_2, \ldots, v_n \in V\) and consider the equation

(5.2) \[ \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0, \]

where \(\alpha_1, \alpha_2, \ldots, \alpha_n \in F\). The objects \(v_1, v_2, \ldots, v_n \in V\) are said to be linearly independent over \(F\) if the only solution to the above equation is when all of the \(\alpha_i\) values are zero (this solution is called the trivial solution). If equation (5.2) is satisfied with \(\alpha_i \neq 0\) for at least one \(i\), then \(v_1, v_2, \ldots, v_n \in V\) are said to be linearly dependent over \(F\).
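For vectors in \(\mathbb{R}^n\) this definition can be tested numerically: stack the vectors as the columns of a matrix and compare the rank with the number of vectors, since full column rank means equation (5.2) has only the trivial solution. Below is a minimal sketch using NumPy (the helper `is_linearly_independent` is our own naming, not a library routine):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given vectors in R^n are linearly independent."""
    # Full column rank means the only solution to equation (5.2)
    # is the trivial one.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_linearly_independent([np.array([1, 0]), np.array([0, 1])]))  # True
print(is_linearly_independent([np.array([1, 2]), np.array([2, 4])]))  # False: (2, 4) = 2(1, 2)
```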

Another way to think about linear independence is that a set of vectors is linearly independent if none of the vectors in the set can be represented as a linear combination of the other vectors in the same set. For example, are the matrices

\[\begin{split} \begin{align*} A &= \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}, & B &= \begin{pmatrix} -1 & -1 \\ 0 & -2 \end{pmatrix}, & C &= \begin{pmatrix} -4 & 0 \\ 0 & -8 \end{pmatrix}, \end{align*} \end{split}\]

linearly independent over \(\mathbb{R}\)? We can see by inspection that \(B = -A\), so \(A\), \(B\) and \(C\) are linearly dependent since

\[ 1A + 1B + 0C = \vec{0}_{2\times 2}. \]

So if any two members of a set are scalar multiples of each other then the set is linearly dependent, because we can choose \(\alpha_i\) values, not all zero, that satisfy equation (5.2).
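The same idea gives a quick numerical check for the matrices above: flattening each \(2 \times 2\) matrix into a vector in \(\mathbb{R}^4\) turns the question into a rank computation. A short sketch using NumPy:

```python
import numpy as np

A = np.array([[1, 1], [0, 2]])
B = np.array([[-1, -1], [0, -2]])
C = np.array([[-4, 0], [0, -8]])

# Flatten each matrix into a vector in R^4 and stack them as rows.
M = np.vstack([A.flatten(), B.flatten(), C.flatten()])

# The rank is 2 < 3, confirming that A, B and C are linearly dependent.
print(np.linalg.matrix_rank(M))  # 2
```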

Example 5.5

Determine whether the following are linearly dependent

(i)   \(\begin{pmatrix} 1 \\ 0 \\ 2 \end{pmatrix}, \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix}, \begin{pmatrix} -3 \\ -4 \\ -2 \end{pmatrix} \in \mathbb{R}^3\) over \(\mathbb{R}\)

Solution

Let \(\alpha_1, \alpha_2, \alpha_3 \in \mathbb{R}\); then equation (5.2) becomes

\[\begin{split} \begin{align*} \alpha_1 \begin{pmatrix} 1 \\ 0 \\ 2 \end{pmatrix} + \alpha_2 \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix} + \alpha_3 \begin{pmatrix} -3 \\ -4 \\ -2 \end{pmatrix} &= \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}. \end{align*} \end{split}\]

This holds if and only if

\[\begin{split} \begin{align*} \alpha_1 + 2\alpha_2 - 3\alpha_3 &= 0, \\ \alpha_2 - 4\alpha_3 &= 0, \\ 2\alpha_1 + 3\alpha_2 - 2\alpha_3 &= 0. \end{align*} \end{split}\]

Solving this homogeneous system using Gauss-Jordan elimination

\[\begin{split} \begin{align*} & \left( \begin{array}{ccc|c} 1 & 2 & -3 & 0 \\ 0 & 1 & -4 & 0 \\ 2 & 3 & -2 & 0 \end{array} \right) \begin{matrix} \\ \\ R_3 - 2R_1 \end{matrix} & \longrightarrow &\left( \begin{array}{ccc|c} 1 & 2 & -3 & 0 \\ 0 & 1 & -4 & 0 \\ 0 & -1 & 4 & 0 \end{array} \right) \begin{matrix} R_1 - 2R_2 \\ \\ R_3 + R_2 \end{matrix} \\ \\ \longrightarrow &\left( \begin{array}{ccc|c} 1 & 0 & 5 & 0 \\ 0 & 1 & -4 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right) \end{align*} \end{split}\]

Here \(\alpha_3\) is a free variable, so let \(\alpha_3 = r\); then \(\alpha_1 = -5r\) and \(\alpha_2 = 4r\). The vectors are therefore linearly dependent. For example, choosing \(r = 1\) gives \(\alpha_1 = -5\), \(\alpha_2 = 4\) and \(\alpha_3 = 1\), i.e.,

\[\begin{split} \begin{align*} - 5 \begin{pmatrix} 1 \\ 0 \\ 2 \end{pmatrix} + 4 \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix} + \begin{pmatrix} -3 \\ -4 \\ -2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}. \end{align*} \end{split}\]
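This row reduction can be reproduced with SymPy's `rref()` method, which returns the reduced row echelon form along with the indices of the pivot columns; a short verification sketch, not part of the hand calculation:

```python
import sympy as sym

# The columns of M are the three vectors from part (i).
M = sym.Matrix([[1, 2, -3],
                [0, 1, -4],
                [2, 3, -2]])

R, pivots = M.rref()
print(R)       # Matrix([[1, 0, 5], [0, 1, -4], [0, 0, 0]])
print(pivots)  # (0, 1): only two pivot columns, so alpha_3 is free

# Check the particular solution (alpha_1, alpha_2, alpha_3) = (-5, 4, 1).
print(M * sym.Matrix([-5, 4, 1]))  # Matrix([[0], [0], [0]])
```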

(ii)   \(u = x^2 + x + 1\), \(v = x - 1\) and \(w = x^2 - 1 \in P(\mathbb{R})\) over \(\mathbb{R}\)

Solution

Let \(\alpha_1, \alpha_2, \alpha_3 \in \mathbb{R}\); then we need to ascertain when

\[ \begin{align*} \alpha_1 u + \alpha_2 v + \alpha_3 w = 0. \end{align*} \]

Now

\[\begin{split} \begin{align*} \alpha_1 (x^2 + x + 1) + \alpha_2 (x - 1) + \alpha_3 (x^2 - 1) &= 0 \\ (\alpha_1 + \alpha_3)x^2 + (\alpha_1 + \alpha_2)x + (\alpha_1 - \alpha_2 - \alpha_3)x^0 &= 0. \end{align*} \end{split}\]

For a polynomial to be equal to zero, the coefficients of \(x^i\) must all be equal to zero, therefore

\[\begin{split} \begin{align*} \alpha_1 + \alpha_3 &= 0, \\ \alpha_1 + \alpha_2 &= 0, \\ \alpha_1 - \alpha_2 - \alpha_3 &= 0. \end{align*} \end{split}\]

Solving using Gauss-Jordan elimination

\[\begin{split} \begin{align*} & \left( \begin{array}{ccc|c} 1 & 0 & 1 & 0 \\ 1 & 1 & 0 & 0 \\ 1 & -1 & -1 & 0 \end{array} \right) \begin{matrix} \\ R_2 - R_1 \\ R_3 - R_1 \end{matrix} & \longrightarrow & \left( \begin{array}{ccc|c} 1 & 0 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & -1 & -2 & 0 \end{array} \right) \begin{matrix} \\ \\ R_3 + R_2 \end{matrix} \\ \\ \longrightarrow & \left( \begin{array}{ccc|c} 1 & 0 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & -3 & 0 \end{array} \right) \begin{matrix} \\ \\ -\frac{1}{3}R_3 \end{matrix}& \longrightarrow & \left( \begin{array}{ccc|c} 1 & 0 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 1 & 0 \end{array} \right) \begin{matrix} R_1 - R_3 \\ R_2 + R_3 \\ \phantom{x} \end{matrix} \\ \\ \longrightarrow & \left( \begin{array}{ccc|c} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{array} \right) \end{align*} \end{split}\]

Therefore the only solution is \(\alpha_1 = \alpha_2 = \alpha_3 = 0\), so the polynomials \(u\), \(v\) and \(w\) are linearly independent.
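As in part (i), we can confirm this with a short SymPy sketch, encoding each polynomial by its coefficient vector with respect to \(x^2\), \(x\) and \(x^0\):

```python
import sympy as sym

# Columns hold the coefficients of u, v and w; rows correspond to the
# coefficients of x^2, x and x^0, matching the system above.
M = sym.Matrix([[1,  0,  1],   # coefficients of x^2
                [1,  1,  0],   # coefficients of x
                [1, -1, -1]])  # coefficients of x^0

R, pivots = M.rref()
print(R)       # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(pivots)  # (0, 1, 2): three pivots, so only the trivial solution
```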