Vector Spaces
Solution to Exercise 5.1
Let \(\vec{u}, \vec{v}, \vec{w} \in \mathbb{R}^3\) and \(\alpha, \beta \in \mathbb{R}\), then:
A1: \(\vec{u} + (\vec{v} + \vec{w}) = (\vec{u} + \vec{v}) + \vec{w} \checkmark\)
A2: \(\vec{u} + \vec{v} = \vec{v} + \vec{u} \checkmark\)
A3: \(\vec{u} + \vec{0} = \vec{u} \checkmark\)
A4: \(\vec{u} + (-\vec{u}) = \vec{0} \checkmark\)
M1: \(\alpha(\beta \vec{u}) = (\alpha \beta) \vec{u} \checkmark\)
M2: \(1 \vec{u} = \vec{u} \checkmark\)
M3: \(\alpha(\vec{u} + \vec{v}) = \alpha\vec{u} + \alpha \vec{v} \checkmark\)
M4: \((\alpha + \beta) \vec{u} = \alpha \vec{u} + \beta \vec{u} \checkmark\)
All of the vector space axioms hold for \(\mathbb{R}^3\), so \(\mathbb{R}^3\) is a vector space.
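These axioms can also be sanity-checked numerically. The following sketch is not part of the original solution; it uses NumPy with arbitrarily chosen vectors and scalars:

```python
import numpy as np

# Arbitrary test vectors and scalars in R^3 (any values would do).
u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 4.0, -1.0])
w = np.array([2.0, 0.0, 7.0])
alpha, beta = 2.5, -3.0

assert np.allclose(u + (v + w), (u + v) + w)                  # A1: associativity
assert np.allclose(u + v, v + u)                              # A2: commutativity
assert np.allclose(u + np.zeros(3), u)                        # A3: additive identity
assert np.allclose(u + (-u), np.zeros(3))                     # A4: additive inverse
assert np.allclose(alpha * (beta * u), (alpha * beta) * u)    # M1
assert np.allclose(1 * u, u)                                  # M2
assert np.allclose(alpha * (u + v), alpha * u + alpha * v)    # M3
assert np.allclose((alpha + beta) * u, alpha * u + beta * u)  # M4
print("All eight axioms hold for these test values.")
```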
Solution to Exercise 5.2
(a) \(U\) is non-empty since \(\vec{0} \in U\). Let \(\vec{u} = (u_1, u_2, 0), \vec{v} = (v_1, v_2, 0) \in U\) and \(\alpha \in \mathbb{R}\), then
\[\vec{u} + \vec{v} = (u_1 + v_1, u_2 + v_2, 0) \in U, \qquad \alpha \vec{u} = (\alpha u_1, \alpha u_2, 0) \in U,\]
therefore \(U\) is a subspace.
(b) \(V\) is non-empty since \((1,2,0) \in V\). However, \(\alpha (1, 2, 0) = (\alpha, 2\alpha, 0) \notin V\) for some \(\alpha \in \mathbb{R}\), so \(V\) is not closed under scalar multiplication and is not a subspace.
(c) \(W\) is non-empty since \(\vec{0} \in W\). Let \(\vec{u} = (0, u_2, 0), \vec{v} = (0, v_2, 0) \in W\) and \(\alpha \in \mathbb{R}\), then
\[\vec{u} + \vec{v} = (0, u_2 + v_2, 0) \in W, \qquad \alpha \vec{u} = (0, \alpha u_2, 0) \in W,\]
therefore \(W\) is a subspace. Note that \(W \subseteq U\), so \(W\) is also a subspace of \(U\); a subset of a subspace is not automatically a subspace itself, which is why we checked the closure conditions directly.
(d) \(X\) is not a subspace since if \(\vec{u} = (1, 1, 0), \vec{v} = (-1, 1, 0) \in X\) then \(\vec{u} + \vec{v} = (0, 2, 0) \notin X\).
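As a rough numerical illustration of the closure checks in part (a), the sketch below (not part of the original solution, arbitrary values, assuming NumPy) tests membership of \(U\), i.e. a zero third component, for a sum and a scalar multiple:

```python
import numpy as np

def in_U(vec):
    """Membership test for U: vectors of the form (x, y, 0)."""
    return np.isclose(vec[2], 0.0)

u = np.array([3.0, -1.0, 0.0])    # arbitrary elements of U
v = np.array([-2.0, 5.0, 0.0])
alpha = 4.2                       # arbitrary scalar

print(in_U(u + v))       # True: U is closed under addition
print(in_U(alpha * u))   # True: U is closed under scalar multiplication
```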
Solution to Exercise 5.3
(a) \(A\) is non-empty since \(\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \in A\). Let \(U = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \in A\) then
so \(A\) is not a subspace.
(b) \(B\) is non-empty since \(0_{2\times 2} \in B\). Let \(U = \begin{pmatrix} u_{11} & 0 \\ u_{11} & u_{11} \end{pmatrix}, V = \begin{pmatrix} v_{11} & 0 \\ v_{11} & v_{11} \end{pmatrix} \in B\) and \(\alpha \in \mathbb{R}\), then
\[U + V = \begin{pmatrix} u_{11} + v_{11} & 0 \\ u_{11} + v_{11} & u_{11} + v_{11} \end{pmatrix} \in B, \qquad \alpha U = \begin{pmatrix} \alpha u_{11} & 0 \\ \alpha u_{11} & \alpha u_{11} \end{pmatrix} \in B,\]
so \(B\) is a subspace.
(c) \(C\) is not a subspace since \(U = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \in C\) but \(2U = \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix} \notin C\).
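For part (b), a similar numerical sketch (arbitrary values, assuming NumPy; not part of the original solution) confirms that sums and scalar multiples of matrices of the form \(\begin{pmatrix} b & 0 \\ b & b \end{pmatrix}\) have the same form:

```python
import numpy as np

def in_B(M):
    """Membership test for B: matrices of the form [[b, 0], [b, b]]."""
    return (np.isclose(M[0, 1], 0.0)
            and np.isclose(M[0, 0], M[1, 0])
            and np.isclose(M[0, 0], M[1, 1]))

def element_of_B(b):
    return np.array([[b, 0.0], [b, b]])

U, V = element_of_B(2.0), element_of_B(-3.5)   # arbitrary elements of B
alpha = 1.7                                    # arbitrary scalar

print(in_B(U + V))       # True: closed under addition
print(in_B(alpha * U))   # True: closed under scalar multiplication
```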
Solution to Exercise 5.4
We need to show that the vectors in the set are linearly independent.
So this set of vectors is a basis for \(\mathbb{R}^3\). Calculating the inverse of the coefficient matrix:
Let \(U = \{(1, 2, 0), (0, 5, 7), (-1, 1, 3)\}\) then
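As a numerical cross-check (a sketch assuming NumPy and that the coefficient matrix has the vectors of \(U\) as its columns; not part of the original working), the determinant and inverse can be computed directly:

```python
import numpy as np

# Coefficient matrix with the vectors of U as its columns (an assumed
# layout; using rows instead gives the same independence conclusion).
A = np.array([[1, 0, -1],
              [2, 5, 1],
              [0, 7, 3]], dtype=float)

print(np.linalg.det(A))   # approximately -6: nonzero, so the vectors of U
                          # are linearly independent and form a basis
print(np.linalg.inv(A))   # inverse of the coefficient matrix
```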
Solution to Exercise 5.5
We need to find two vectors in \(\mathbb{R}^4\) that, together with \((1, 1, 2, 4)\) and \((2, -1, -5, 2)\), form a linearly independent set. Let's choose \((1, 0, 0, 0)\) and \((0, 1, 0, 0)\) and check that the four vectors are linearly independent:
Therefore \(\{(1, 1, 2, 4), (2, -1, -5, 2), \vec{e}_1, \vec{e}_2 \}\) is a basis for \(\mathbb{R}^4\). Note that we could have chosen any two vectors in \(\mathbb{R}^4\) for which the four vectors together form a linearly independent set.
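A quick numerical confirmation (a sketch assuming NumPy, not part of the original solution) is that the matrix whose columns are these four vectors has full rank:

```python
import numpy as np

# Columns: (1, 1, 2, 4), (2, -1, -5, 2), e1 and e2.
A = np.column_stack([
    [1, 1, 2, 4],
    [2, -1, -5, 2],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
])

print(np.linalg.matrix_rank(A))   # 4: the four vectors are linearly
                                  # independent, so they form a basis of R^4
```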
Solution to Exercise 5.6
We need to determine which of the vectors \(\vec{u}\), \(\vec{v}\), \(\vec{w}\), \(\vec{x}\) and \(\vec{y}\) are linearly dependent on the others (and therefore remove them from the basis).
The fifth column does not have a pivot element, so \(\vec{y}\) is linearly dependent on the other vectors. Therefore a basis for \(W\) is \(\{ \vec{u}, \vec{v}, \vec{w}, \vec{x}\}\) and \(\dim(W) = 4\).
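The vectors \(\vec{u}, \dots, \vec{y}\) and the row reduction are not reproduced above, so the sketch below only illustrates the method with hypothetical placeholder vectors (assuming SymPy): stack the vectors as columns, row reduce, and keep the pivot columns as the basis.

```python
import sympy as sp

# Hypothetical placeholder vectors (NOT the u, v, w, x, y from the exercise),
# chosen so that the fifth one depends on the first four.
u = sp.Matrix([1, 0, 0, 0])
v = sp.Matrix([0, 1, 0, 0])
w = sp.Matrix([0, 0, 1, 0])
x = sp.Matrix([0, 0, 0, 1])
y = u + 2 * v - w                     # deliberately dependent

A = sp.Matrix.hstack(u, v, w, x, y)   # vectors as columns
rref, pivot_cols = A.rref()

# Columns without a pivot correspond to vectors that are linearly dependent
# on the preceding ones; the pivot columns give a basis.
print(pivot_cols)                     # (0, 1, 2, 3): the fifth column has no pivot
```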