5.3. Linear dependence
An important concept in linear algebra is whether a vector from a set of vectors can be expressed as a linear combination of the other vectors in the set. If so, we say that the vector is linearly dependent upon the other vectors. Geometrically speaking, two linearly dependent vectors are parallel: they lie on the same line through the origin (similarly, three linearly dependent vectors lie in a common plane). Linear dependence can help identify redundant or superfluous vectors within a set and provides insight into the dimension and structure of vector spaces.
Definition (Linear dependence)
Let \(v_1, v_2, \ldots, v_n \in V\) and consider the equation

\[
\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0, \tag{5.2}
\]

where \(\alpha_1, \alpha_2, \ldots, \alpha_n \in F\). The objects \(v_1, v_2, \ldots, v_n \in V\) are said to be linearly independent over \(F\) if the only solution to equation (5.2) is \(\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0\) (this is called the trivial solution). If equation (5.2) is satisfied with \(\alpha_i \neq 0\) for at least one \(i\), then \(v_1, v_2, \ldots, v_n \in V\) are said to be linearly dependent over \(F\).
Another way to think about linear independence is that a set of vectors is linearly independent if none of the vectors in the set can be represented as a linear combination of the other vectors in the same set. For example, are the matrices \(A\), \(B\) and \(C\) linearly independent over \(\mathbb{R}\)? If we can see by inspection that \(B = -A\), then \(A\), \(B\) and \(C\) are linearly dependent since

\[
1A + 1B + 0C = 0,
\]

which is a non-trivial solution of equation (5.2).
So if any two members of a set are scalar multiples of each other, then the set is linearly dependent, because we can always choose \(\alpha_i\) values that satisfy equation (5.2) non-trivially.
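This suggests a practical test: stack the vectors as the columns of a matrix and compare the rank of that matrix with the number of vectors; the set is linearly dependent exactly when the rank is smaller. Below is a minimal sketch using NumPy (the helper `is_linearly_dependent` is our own illustrative name, not a library function):

```python
import numpy as np

def is_linearly_dependent(vectors):
    """Return True if the given vectors are linearly dependent.

    The vectors become the columns of a matrix; they are dependent
    exactly when the (numerical) rank of that matrix is less than
    the number of vectors.
    """
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < A.shape[1]

# Two vectors where one is a scalar multiple (-1 times) of the other
print(is_linearly_dependent([[1, 2, 3], [-1, -2, -3]]))  # True
```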
Determine whether the following are linearly dependent
(i) \((1, 0, 2), (2, 1, 3), (-3, -4, -2) \in \mathbb{R}^3\) over \(\mathbb{R}\)
Solution
Let \(\alpha_1, \alpha_2, \alpha_3 \in \mathbb{R}\); then equation (5.2) becomes

\[
\alpha_1 (1, 0, 2) + \alpha_2 (2, 1, 3) + \alpha_3 (-3, -4, -2) = (0, 0, 0).
\]

This holds if and only if

\[
\begin{aligned}
\alpha_1 + 2\alpha_2 - 3\alpha_3 &= 0, \\
\alpha_2 - 4\alpha_3 &= 0, \\
2\alpha_1 + 3\alpha_2 - 2\alpha_3 &= 0.
\end{aligned}
\]
Solving this homogeneous system using Gauss-Jordan elimination:

\[
\left( \begin{array}{ccc|c}
1 & 2 & -3 & 0 \\
0 & 1 & -4 & 0 \\
2 & 3 & -2 & 0
\end{array} \right)
\xrightarrow{R_3 \to R_3 - 2R_1}
\left( \begin{array}{ccc|c}
1 & 2 & -3 & 0 \\
0 & 1 & -4 & 0 \\
0 & -1 & 4 & 0
\end{array} \right)
\xrightarrow[R_3 \to R_3 + R_2]{R_1 \to R_1 - 2R_2}
\left( \begin{array}{ccc|c}
1 & 0 & 5 & 0 \\
0 & 1 & -4 & 0 \\
0 & 0 & 0 & 0
\end{array} \right)
\]
Here \(\alpha_3\) is a free variable, so let \(\alpha_3 = r\); then \(\alpha_1 = -5r\) and \(\alpha_2 = 4r\). The vectors are therefore linearly dependent, e.g., if \(r = 1\), then \(\alpha_1 = -5\) and \(\alpha_2 = 4\), i.e.,

\[
-5(1, 0, 2) + 4(2, 1, 3) + (-3, -4, -2) = (0, 0, 0).
\]
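We can double-check this calculation with SymPy, whose `rref` and `nullspace` methods perform the row reduction for us (a minimal sketch, assuming SymPy is installed):

```python
from sympy import Matrix

# Columns are the vectors (1, 0, 2), (2, 1, 3) and (-3, -4, -2)
A = Matrix([[1, 2, -3],
            [0, 1, -4],
            [2, 3, -2]])

# Reduced row echelon form and pivot columns of the homogeneous system
rref, pivots = A.rref()
print(rref)    # Matrix([[1, 0, 5], [0, 1, -4], [0, 0, 0]])
print(pivots)  # (0, 1), so the third column gives a free variable

# The null space contains the dependence relation found above
print(A.nullspace())  # [Matrix([[-5], [4], [1]])]
```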
(ii) \(u = x^2 + x + 1\), \(v = x - 1\) and \(w = x^2 - 1 \in P(\mathbb{R})\) over \(\mathbb{R}\)
Solution
Let \(\alpha_1, \alpha_2, \alpha_3 \in \mathbb{R}\); then we need to ascertain when

\[
\alpha_1 (x^2 + x + 1) + \alpha_2 (x - 1) + \alpha_3 (x^2 - 1) = 0.
\]

Expanding and collecting the coefficients of the powers of \(x\),

\[
(\alpha_1 + \alpha_3) x^2 + (\alpha_1 + \alpha_2) x + (\alpha_1 - \alpha_2 - \alpha_3) = 0.
\]

For a polynomial to be equal to zero, the coefficients of \(x^i\) must all be equal to zero, therefore

\[
\begin{aligned}
\alpha_1 + \alpha_3 &= 0, \\
\alpha_1 + \alpha_2 &= 0, \\
\alpha_1 - \alpha_2 - \alpha_3 &= 0.
\end{aligned}
\]
Solving using Gauss-Jordan elimination:

\[
\begin{aligned}
\left( \begin{array}{ccc|c}
1 & 0 & 1 & 0 \\
1 & 1 & 0 & 0 \\
1 & -1 & -1 & 0
\end{array} \right)
&\xrightarrow[R_3 \to R_3 - R_1]{R_2 \to R_2 - R_1}
\left( \begin{array}{ccc|c}
1 & 0 & 1 & 0 \\
0 & 1 & -1 & 0 \\
0 & -1 & -2 & 0
\end{array} \right)
\xrightarrow{R_3 \to R_3 + R_2}
\left( \begin{array}{ccc|c}
1 & 0 & 1 & 0 \\
0 & 1 & -1 & 0 \\
0 & 0 & -3 & 0
\end{array} \right) \\
&\xrightarrow{R_3 \to -\frac{1}{3} R_3}
\left( \begin{array}{ccc|c}
1 & 0 & 1 & 0 \\
0 & 1 & -1 & 0 \\
0 & 0 & 1 & 0
\end{array} \right)
\xrightarrow[R_2 \to R_2 + R_3]{R_1 \to R_1 - R_3}
\left( \begin{array}{ccc|c}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0
\end{array} \right)
\end{aligned}
\]
Therefore the only solution is \(\alpha_1 = \alpha_2 = \alpha_3 = 0\) so the polynomials \(u\), \(v\) and \(w\) are linearly independent.
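The same check in SymPy: the coefficient matrix is square, so a non-zero determinant (equivalently, an empty null space) confirms that only the trivial solution exists (a minimal sketch, assuming SymPy is installed):

```python
from sympy import Matrix

# Each column holds the (x^2, x, constant) coefficients of one
# polynomial: u = x^2 + x + 1, v = x - 1, w = x^2 - 1
A = Matrix([[1, 0, 1],
            [1, 1, 0],
            [1, -1, -1]])

print(A.det())        # -3, non-zero
print(A.nullspace())  # [], only the trivial solution
```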