These linear independence exercises span every core topic in undergraduate linear algebra: the definition of a linearly independent set and the trivial solution, testing sets of vectors in \(\mathbb{R}^n\) by row reduction, using the determinant for square systems, finding dependence relations and expressing vectors as linear combinations, connecting independence to rank, nullity, and the null space, applying the Wronskian to sets of functions, and proving results in abstract vector spaces. Problems are ordered from straightforward definition checks to multi-step proofs, giving a complete path from first encounter to exam readiness.
Definition of Linear Independence and Dependence
A set \(\{v_1, v_2, \ldots, v_k\}\) is linearly independent if the vector equation \(c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = \mathbf{0}\) has only the trivial solution \(c_1 = c_2 = \cdots = c_k = 0\). These problems train students to apply this definition directly, identify dependence by inspection, and write explicit dependence relations.
Problem 1: Applying the Definition in \(\mathbb{R}^2\)
Easy
Consider the vectors \(u = \begin{pmatrix} 2 \\ 5 \end{pmatrix}\) and \(v = \begin{pmatrix} 6 \\ 15 \end{pmatrix}\) in \(\mathbb{R}^2\).
- Write out the vector equation \(c_1 u + c_2 v = \mathbf{0}\) as a system of two scalar equations and solve it.
- State whether \(\{u, v\}\) is linearly independent or linearly dependent, and justify your answer using the definition.
Solution to question 1:
The equation \(c_1 u + c_2 v = \mathbf{0}\) yields the system:
\[ 2c_1 + 6c_2 = 0, \quad 5c_1 + 15c_2 = 0. \]
Both equations reduce to \(c_1 = -3c_2\). Since \(c_2\) is a free variable, there are infinitely many solutions, for instance \(c_1 = 3\), \(c_2 = -1\).
Solution to question 2:
Because the system has a nontrivial solution (not all coefficients zero), the set \(\{u, v\}\) is linearly dependent. Explicitly, \(3u - v = \mathbf{0}\), so \(v = 3u\): one vector is a scalar multiple of the other.
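As a quick sanity check, the dependence relation can be verified numerically. This is a sketch in Python (not part of the original exercise), using exact rational arithmetic:

```python
from fractions import Fraction

# The columns u and v from Problem 1.
u = [Fraction(2), Fraction(5)]
v = [Fraction(6), Fraction(15)]

# Check the claimed relation 3u - v = 0 componentwise.
relation = [3 * ui - vi for ui, vi in zip(u, v)]
print(relation)  # [Fraction(0, 1), Fraction(0, 1)]
```

Every component vanishes, confirming \(3u - v = \mathbf{0}\).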
Problem 2: Independence by Inspection in \(\mathbb{R}^3\)
Easy
Without performing any row reduction, determine whether each set below is linearly independent or linearly dependent. Justify each answer with a brief reason referencing a standard theorem or observation.
- \(\left\{ \begin{pmatrix}1\\0\\0\end{pmatrix},\ \begin{pmatrix}0\\1\\0\end{pmatrix},\ \begin{pmatrix}0\\0\\1\end{pmatrix},\ \begin{pmatrix}2\\-1\\3\end{pmatrix} \right\}\) in \(\mathbb{R}^3\).
- \(\left\{ \begin{pmatrix}4\\-2\\6\end{pmatrix},\ \begin{pmatrix}-6\\3\\-9\end{pmatrix} \right\}\) in \(\mathbb{R}^3\).
- \(\left\{ \begin{pmatrix}1\\2\\3\end{pmatrix},\ \mathbf{0},\ \begin{pmatrix}0\\1\\4\end{pmatrix} \right\}\) in \(\mathbb{R}^3\).
Solution to question 1:
Linearly dependent. There are 4 vectors in \(\mathbb{R}^3\). Since the number of vectors exceeds the dimension of the space (\(4 > 3\)), the set must be linearly dependent.
Solution to question 2:
Linearly dependent. Observe that \(-6/4 = 3/(-2) = -9/6 = -3/2\), so the second vector equals \(-\tfrac{3}{2}\) times the first. Two vectors are dependent if and only if one is a scalar multiple of the other.
Solution to question 3:
Linearly dependent. Any set containing the zero vector is linearly dependent, because \(0 \cdot v_1 + 1 \cdot \mathbf{0} + 0 \cdot v_3 = \mathbf{0}\) is a nontrivial combination equal to zero.
Problem 3: Writing an Explicit Dependence Relation
Medium
Let \(v_1 = \begin{pmatrix}1\\2\\-1\end{pmatrix}\), \(v_2 = \begin{pmatrix}3\\0\\1\end{pmatrix}\), and \(v_3 = \begin{pmatrix}5\\4\\-1\end{pmatrix}\).
- Show that \(\{v_1, v_2, v_3\}\) is linearly dependent by finding a nontrivial solution to \(c_1 v_1 + c_2 v_2 + c_3 v_3 = \mathbf{0}\).
- Express \(v_3\) as a linear combination of \(v_1\) and \(v_2\).
Solution to question 1:
\[ \begin{pmatrix}1 & 3 & 5 & 0\\2 & 0 & 4 & 0\\-1 & 1 & -1 & 0\end{pmatrix} \xrightarrow{R_2-2R_1,\ R_3+R_1} \begin{pmatrix}1 & 3 & 5 & 0\\0 & -6 & -6 & 0\\0 & 4 & 4 & 0\end{pmatrix} \xrightarrow{\frac{2}{3}R_2+R_3\to R_3} \begin{pmatrix}1 & 3 & 5 & 0\\0 & -6 & -6 & 0\\0 & 0 & 0 & 0\end{pmatrix}. \]
Column 3 is free (\(c_3 = t\)). From row 2: \(-6c_2 = 6t \Rightarrow c_2 = -t\). From row 1: \(c_1 = -3c_2 - 5c_3 = 3t - 5t = -2t\). Taking \(t = 1\) gives the nontrivial solution \((c_1,c_2,c_3) = (-2,-1,1)\), so \(-2v_1 - v_2 + v_3 = \mathbf{0}\).
Solution to question 2:
\[ v_3 = 2v_1 + v_2 = 2\begin{pmatrix}1\\2\\-1\end{pmatrix} + \begin{pmatrix}3\\0\\1\end{pmatrix} = \begin{pmatrix}5\\4\\-1\end{pmatrix}. \checkmark \]
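The relation and the null-space computation can be cross-checked symbolically; this is a sketch assuming SymPy is available:

```python
import sympy as sp

v1 = sp.Matrix([1, 2, -1])
v2 = sp.Matrix([3, 0, 1])
v3 = sp.Matrix([5, 4, -1])

# The nontrivial relation found above: -2*v1 - v2 + v3 = 0.
assert -2*v1 - v2 + v3 == sp.zeros(3, 1)

# SymPy recovers the relation from the null space of [v1 | v2 | v3].
A = sp.Matrix.hstack(v1, v2, v3)
print(A.nullspace())  # one basis vector, proportional to (-2, -1, 1)
```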
Testing Linear Independence by Row Reduction
Row reduction is the most systematic method for testing independence: form the matrix \(A = [v_1 \mid v_2 \mid \cdots \mid v_k]\) and reduce to echelon form. The vectors are linearly independent if and only if every column contains a pivot position, equivalently if the homogeneous system \(Ax = \mathbf{0}\) has only the trivial solution.
Problem 4: Row Reduction Test in \(\mathbb{R}^3\)
Easy
Determine whether the set \(\{v_1, v_2, v_3\}\) is linearly independent, where
\[ v_1 = \begin{pmatrix}1\\0\\2\end{pmatrix},\quad v_2 = \begin{pmatrix}3\\1\\-1\end{pmatrix},\quad v_3 = \begin{pmatrix}0\\2\\5\end{pmatrix}. \]
- Set up the matrix \(A = [v_1 \mid v_2 \mid v_3]\) and reduce it to row echelon form.
- Count the pivot positions and state your conclusion.
Solution to question 1:
\[ A = \begin{pmatrix}1 & 3 & 0\\0 & 1 & 2\\2 & -1 & 5\end{pmatrix} \xrightarrow{R_3 - 2R_1} \begin{pmatrix}1 & 3 & 0\\0 & 1 & 2\\0 & -7 & 5\end{pmatrix} \xrightarrow{R_3 + 7R_2} \begin{pmatrix}1 & 3 & 0\\0 & 1 & 2\\0 & 0 & 19\end{pmatrix}. \]
Solution to question 2:
There are 3 pivot positions, one in each column. The homogeneous system \(Ax = \mathbf{0}\) has only the trivial solution, so \(\{v_1, v_2, v_3\}\) is linearly independent.
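The pivot count can be confirmed with SymPy's `rref`, which returns the reduced form together with the pivot columns (a sketch, assuming SymPy is available):

```python
import sympy as sp

A = sp.Matrix([[1, 3, 0],
               [0, 1, 2],
               [2, -1, 5]])

rref, pivots = A.rref()
print(pivots)         # (0, 1, 2): a pivot in every column
print(A.rank() == 3)  # True -> columns linearly independent
```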
Problem 5: Four Vectors in \(\mathbb{R}^3\) (Reduction and Dependence Relation)
Medium
Consider the vectors in \(\mathbb{R}^3\):
\[ v_1 = \begin{pmatrix}1\\2\\1\end{pmatrix},\quad v_2 = \begin{pmatrix}0\\1\\3\end{pmatrix},\quad v_3 = \begin{pmatrix}2\\0\\-4\end{pmatrix},\quad v_4 = \begin{pmatrix}3\\3\\0\end{pmatrix}. \]
- Without calculation, explain why \(\{v_1, v_2, v_3, v_4\}\) must be linearly dependent.
- Use row reduction on \([v_1 \mid v_2 \mid v_3 \mid v_4]\) to find a nontrivial dependence relation among these four vectors.
Solution to question 1:
There are 4 vectors in \(\mathbb{R}^3\), and \(4 > 3 = \dim(\mathbb{R}^3)\). Any set containing more vectors than the dimension of the ambient space is automatically linearly dependent.
Solution to question 2:
\[ \begin{pmatrix}1&0&2&3\\2&1&0&3\\1&3&-4&0\end{pmatrix} \xrightarrow{R_2-2R_1,\ R_3-R_1} \begin{pmatrix}1&0&2&3\\0&1&-4&-3\\0&3&-6&-3\end{pmatrix} \xrightarrow{R_3-3R_2} \begin{pmatrix}1&0&2&3\\0&1&-4&-3\\0&0&6&6\end{pmatrix}. \]
Column 4 has no pivot, so \(c_4\) is free; set \(c_4 = 1\). From row 3: \(6c_3 + 6 = 0 \Rightarrow c_3 = -1\). Row 2: \(c_2 - 4(-1) - 3 = 0 \Rightarrow c_2 = -1\). Row 1: \(c_1 + 2(-1) + 3 = 0 \Rightarrow c_1 = -1\). The dependence relation is:
\[ -v_1 - v_2 - v_3 + v_4 = \mathbf{0}, \quad \text{i.e.,} \quad v_4 = v_1 + v_2 + v_3. \]
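A symbolic cross-check of both parts (a sketch, assuming SymPy is available):

```python
import sympy as sp

v1, v2, v3, v4 = (sp.Matrix(c) for c in ([1, 2, 1], [0, 1, 3], [2, 0, -4], [3, 3, 0]))

# The dependence relation found above.
assert v1 + v2 + v3 == v4

A = sp.Matrix.hstack(v1, v2, v3, v4)
print(A.rank())  # 3: four vectors of rank 3, so a nontrivial kernel exists
```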
Problem 6: Vectors in \(\mathbb{R}^4\) with a Real Parameter
Hard
Let \(h \in \mathbb{R}\) and consider
\[ v_1 = \begin{pmatrix}1\\0\\0\\h\end{pmatrix},\quad v_2 = \begin{pmatrix}0\\1\\0\\2\end{pmatrix},\quad v_3 = \begin{pmatrix}0\\0\\1\\-1\end{pmatrix},\quad v_4 = \begin{pmatrix}2\\-1\\h\\0\end{pmatrix}. \]
- Row-reduce the matrix \([v_1 \mid v_2 \mid v_3 \mid v_4]\) and determine for which values of \(h\) the set is linearly independent.
- For the value(s) of \(h\) where the set is linearly dependent, write an explicit dependence relation.
Solution to question 1:
Writing \(A = [v_1|v_2|v_3|v_4]\):
\[ A = \begin{pmatrix}1&0&0&2\\0&1&0&-1\\0&0&1&h\\h&2&-1&0\end{pmatrix}. \]
Apply \(R_4 \leftarrow R_4 - hR_1 - 2R_2 + R_3\):
\[ R_4 \rightarrow \begin{pmatrix}h-h & 2-2 & -1+1 & 0 - 2h + 2 + h\end{pmatrix} = \begin{pmatrix}0 & 0 & 0 & 2-h\end{pmatrix}. \]
The fourth pivot is \(2 - h\). The set is linearly independent when \(2 - h \neq 0\), i.e., \(\boxed{h \neq 2}\).
Solution to question 2:
When \(h = 2\), row 4 vanishes and \(c_4\) is free. Set \(c_4 = 1\). Back-substitution gives \(c_3 = -h = -2\), \(c_2 = 1\), \(c_1 = -2\). The dependence relation is:
\[ -2v_1 + v_2 - 2v_3 + v_4 = \mathbf{0}. \]
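The parametric determinant can be computed symbolically as a cross-check (a sketch, assuming SymPy is available):

```python
import sympy as sp

h = sp.symbols('h')
A = sp.Matrix([[1, 0, 0, 2],
               [0, 1, 0, -1],
               [0, 0, 1, h],
               [h, 2, -1, 0]])

d = A.det()            # the determinant is 2 - h
print(sp.solve(d, h))  # [2]: the columns are dependent exactly at h = 2
```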
Determinant Test for Linear Independence
When a set consists of exactly \(n\) vectors in \(\mathbb{R}^n\), the independence question reduces to a single scalar: the vectors are linearly independent if and only if \(\det[v_1 \mid \cdots \mid v_n] \neq 0\). This section develops fluency with computing and interpreting determinants in this context, including parametric problems where the critical values of a parameter must be found.
Problem 7: \(3 \times 3\) Determinant Test
Easy
Let \(w_1 = (2,\, 1,\, 0)\), \(w_2 = (0,\, 3,\, -1)\), and \(w_3 = (1,\, 0,\, 4)\).
- Form the \(3 \times 3\) matrix whose columns are \(w_1, w_2, w_3\) and compute its determinant by cofactor expansion along the first row.
- Conclude whether \(\{w_1, w_2, w_3\}\) is linearly independent or dependent.
Solution to question 1:
\[ A = \begin{pmatrix}2&0&1\\1&3&0\\0&-1&4\end{pmatrix}. \]
\[ \det(A) = 2\begin{vmatrix}3&0\\-1&4\end{vmatrix} - 0\begin{vmatrix}1&0\\0&4\end{vmatrix} + 1\begin{vmatrix}1&3\\0&-1\end{vmatrix} = 2(12-0) - 0 + 1(-1-0) = 24 - 1 = 23. \]
Solution to question 2:
Since \(\det(A) = 23 \neq 0\), the matrix is invertible and the set \(\{w_1, w_2, w_3\}\) is linearly independent.
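The cofactor expansion can be checked in one line (a sketch, assuming SymPy is available):

```python
import sympy as sp

# Columns are w1, w2, w3 from Problem 7.
A = sp.Matrix([[2, 0, 1],
               [1, 3, 0],
               [0, -1, 4]])
print(A.det())  # 23: nonzero, so the columns are independent
```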
Problem 8: Finding the Parameter That Forces Dependence
Medium
For a real parameter \(k\), define the matrix
\[ A(k) = \begin{pmatrix}k & 1 & 0\\2 & k & 1\\0 & 1 & k\end{pmatrix}. \]
- Compute \(\det(A(k))\) as a polynomial in \(k\).
- Find all values of \(k\) for which the columns of \(A(k)\) are linearly dependent.
Solution to question 1:
\[ \det(A(k)) = k(k^2 - 1) - 1(2k - 0) + 0 = k^3 - k - 2k = k^3 - 3k. \]
Factoring: \(\det(A(k)) = k(k^2 - 3)\).
Solution to question 2:
The columns are linearly dependent exactly when \(\det(A(k)) = 0\):
\[ k(k^2 - 3) = 0 \implies k = 0,\quad k = \sqrt{3},\quad k = -\sqrt{3}. \]
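A symbolic cross-check of the polynomial and its roots (a sketch, assuming SymPy is available):

```python
import sympy as sp

k = sp.symbols('k')
A = sp.Matrix([[k, 1, 0],
               [2, k, 1],
               [0, 1, k]])

d = sp.factor(A.det())
print(d)               # k*(k**2 - 3)
print(sp.solve(d, k))  # roots 0 and +/- sqrt(3)
```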
Problem 9: Upper-Triangular Matrices and Independence — A Proof
Hard
Let \(M\) be an \(n \times n\) upper-triangular matrix, meaning \(m_{ij} = 0\) for all \(i > j\).
- Prove that if all diagonal entries of \(M\) are nonzero, then the column vectors of \(M\) are linearly independent.
- Give a \(2\times 2\) example showing that the conclusion may fail if even one diagonal entry is zero.
Solution to question 1:
For any triangular matrix, \(\det(M) = m_{11} m_{22} \cdots m_{nn}\). If all diagonal entries are nonzero, then \(\det(M) \neq 0\). A square matrix with nonzero determinant is invertible; by the determinant test for independence, its columns are therefore linearly independent. \(\square\)
Solution to question 2:
Let \(M = \begin{pmatrix}0 & 1\\ 0 & 1\end{pmatrix}\). This is upper-triangular with \(m_{11} = 0\). Its determinant is \(0 \cdot 1 - 1 \cdot 0 = 0\), so its columns \(\begin{pmatrix}0\\0\end{pmatrix}\) and \(\begin{pmatrix}1\\1\end{pmatrix}\) are linearly dependent (the first column is the zero vector).
Linear Independence, Rank, and the Null Space
Linear independence of the columns of a matrix \(A\) is equivalent to \(\operatorname{null}(A) = \{\mathbf{0}\}\), which occurs precisely when \(\operatorname{rank}(A)\) equals the number of columns. The Rank-Nullity Theorem, \(\operatorname{rank}(A) + \operatorname{nullity}(A) = n\), ties these ideas together and is the engine behind many theoretical and computational results.
Problem 10: Rank and Column Independence
Easy
Let \(A = \begin{pmatrix}1 & 2 & 0\\ 0 & 1 & -1\\ 2 & 5 & -1\end{pmatrix}\).
- Find the row echelon form of \(A\) and determine \(\operatorname{rank}(A)\).
- Are the columns of \(A\) linearly independent? Explain using the rank and the Rank-Nullity Theorem.
Solution to question 1:
\[ \begin{pmatrix}1&2&0\\0&1&-1\\2&5&-1\end{pmatrix} \xrightarrow{R_3-2R_1} \begin{pmatrix}1&2&0\\0&1&-1\\0&1&-1\end{pmatrix} \xrightarrow{R_3-R_2} \begin{pmatrix}1&2&0\\0&1&-1\\0&0&0\end{pmatrix}. \]
There are two pivot positions, so \(\operatorname{rank}(A) = 2\).
Solution to question 2:
\(A\) has 3 columns but \(\operatorname{rank}(A) = 2 < 3\). By the Rank-Nullity Theorem, \(\operatorname{nullity}(A) = 3 - 2 = 1 > 0\), meaning \(Ax = \mathbf{0}\) has a nontrivial solution. The columns are therefore linearly dependent.
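Rank and nullity can be verified directly (a sketch, assuming SymPy is available):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0],
               [0, 1, -1],
               [2, 5, -1]])

print(A.rank())            # 2, fewer than the 3 columns
print(len(A.nullspace()))  # 1: nullity 1 > 0, so the columns are dependent
```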
Problem 11: Finding a Basis for the Null Space
Medium
Let \(B = \begin{pmatrix}1&-1&2&0\\2&-2&4&1\\0&0&0&1\end{pmatrix}\).
- Find \(\operatorname{rank}(B)\) and \(\operatorname{nullity}(B)\) using the Rank-Nullity Theorem.
- Find a basis for \(\operatorname{null}(B)\) and verify that the basis vectors are linearly independent.
Solution to question 1:
\[ B \xrightarrow{R_2 - 2R_1} \begin{pmatrix}1&-1&2&0\\0&0&0&1\\0&0&0&1\end{pmatrix} \xrightarrow{R_3-R_2} \begin{pmatrix}1&-1&2&0\\0&0&0&1\\0&0&0&0\end{pmatrix}. \]
Pivot columns: 1 and 4. So \(\operatorname{rank}(B) = 2\) and \(\operatorname{nullity}(B) = 4 - 2 = 2\).
Solution to question 2:
Free variables: \(x_2 = s\) and \(x_3 = t\). From row 2: \(x_4 = 0\). From row 1: \(x_1 = x_2 - 2x_3 = s - 2t\). The general solution is
\[ x = s\begin{pmatrix}1\\1\\0\\0\end{pmatrix} + t\begin{pmatrix}-2\\0\\1\\0\end{pmatrix}. \]
A basis for \(\operatorname{null}(B)\) is \(\left\{n_1 = (1,1,0,0)^T,\, n_2 = (-2,0,1,0)^T\right\}\). These are independent because the system \(c_1 n_1 + c_2 n_2 = \mathbf{0}\) gives, from component 2: \(c_1 = 0\), and from component 3: \(c_2 = 0\).
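SymPy can produce the null-space basis directly; this sketch (assuming SymPy is available) also confirms each basis vector is a genuine solution of \(Bx = \mathbf{0}\):

```python
import sympy as sp

B = sp.Matrix([[1, -1, 2, 0],
               [2, -2, 4, 1],
               [0, 0, 0, 1]])

ns = B.nullspace()
print(B.rank(), len(ns))  # 2 2, matching rank + nullity = 4
for n in ns:
    assert B * n == sp.zeros(3, 1)  # each basis vector lies in null(B)
```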
Problem 12: Rank-Nullity Reasoning Without Explicit Computation
Hard
Suppose \(A\) is a \(7 \times 6\) matrix with \(\operatorname{rank}(A) = 4\).
- Determine \(\operatorname{nullity}(A)\), \(\operatorname{nullity}(A^T)\), and the dimensions of the four fundamental subspaces of \(A\).
- Are the columns of \(A\) linearly independent? Are the rows of \(A\) linearly independent? Justify both answers.
Solution to question 1:
By the Rank-Nullity Theorem applied to \(A\) (with 6 columns): \(\operatorname{nullity}(A) = 6 - 4 = 2\). \(A^T\) is \(6 \times 7\) with \(\operatorname{rank}(A^T) = 4\), so \(\operatorname{nullity}(A^T) = 7 - 4 = 3\). The four fundamental subspaces have dimensions:
| Subspace | Dimension |
|---|---|
| Column space of \(A\) (in \(\mathbb{R}^7\)) | 4 |
| Null space of \(A\) (in \(\mathbb{R}^6\)) | 2 |
| Row space of \(A\) (in \(\mathbb{R}^6\)) | 4 |
| Left null space of \(A\) (in \(\mathbb{R}^7\)) | 3 |
Solution to question 2:
Columns: \(A\) has 6 columns but \(\operatorname{rank}(A) = 4 < 6\). The columns are not linearly independent (\(\operatorname{nullity}(A) = 2 > 0\)).
Rows: The row space has dimension 4, but \(A\) has 7 rows. Since \(4 < 7\), the rows are also not linearly independent.
Linear Independence of Functions and the Wronskian
In function spaces, linear independence is tested using the Wronskian determinant. For differentiable functions \(f_1, \ldots, f_k\) on an interval \(I\), if \(W[f_1,\ldots,f_k](x_0) \neq 0\) at some point \(x_0 \in I\), the functions are linearly independent on \(I\). These exercises also reinforce how the definition alone suffices for direct proofs of dependence.
Problem 13: Wronskian of Two Exponential Functions
Easy
Consider \(f(x) = e^{2x}\) and \(g(x) = e^{-x}\) on \(\mathbb{R}\).
- Compute the Wronskian \(W[f, g](x)\).
- Use the result to conclude whether \(\{f, g\}\) is linearly independent on \(\mathbb{R}\).
Solution to question 1:
\[ W[f,g](x) = \begin{vmatrix}e^{2x} & e^{-x}\\ 2e^{2x} & -e^{-x}\end{vmatrix} = e^{2x}(-e^{-x}) - e^{-x}(2e^{2x}) = -e^x - 2e^x = -3e^x. \]
Solution to question 2:
Since \(W[f,g](x) = -3e^x \neq 0\) for every \(x \in \mathbb{R}\), the functions \(f\) and \(g\) are linearly independent on \(\mathbb{R}\).
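SymPy's built-in `wronskian` reproduces this computation (a sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
f, g = sp.exp(2*x), sp.exp(-x)

# Wronskian of f and g with respect to x.
W = sp.simplify(sp.wronskian([f, g], x))
print(W)  # -3*exp(x)
```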
Problem 14: Three Trigonometric Functions (Wronskian and Direct Check)
Medium
Let \(f_1(x) = 1\), \(f_2(x) = \cos^2 x\), \(f_3(x) = \sin^2 x\) on \(\mathbb{R}\).
- Compute \(W[f_1, f_2, f_3](x)\) and state what you can conclude from the result.
- Show directly from the definition that \(\{f_1, f_2, f_3\}\) is linearly dependent, and write an explicit dependence relation.
Solution to question 1:
Compute derivatives: \(f_1' = 0\), \(f_2' = -\sin(2x)\), \(f_3' = \sin(2x)\), \(f_2^{\prime\prime} = -2\cos(2x)\), \(f_3^{\prime\prime} = 2\cos(2x)\). Expanding along the first column:
\[ W = \begin{vmatrix}1 & \cos^2 x & \sin^2 x\\ 0 & -\sin(2x) & \sin(2x)\\ 0 & -2\cos(2x) & 2\cos(2x)\end{vmatrix} = 1\cdot\bigl[(-\sin 2x)(2\cos 2x)-(\sin 2x)(-2\cos 2x)\bigr] = 0. \]
The Wronskian is identically zero. For a general set of functions this is inconclusive, but dependence is confirmed directly below.
Solution to question 2:
The Pythagorean identity states \(\cos^2 x + \sin^2 x = 1\) for all \(x \in \mathbb{R}\), so:
\[ 1 \cdot f_1(x) - 1 \cdot f_2(x) - 1 \cdot f_3(x) = 1 - \cos^2 x - \sin^2 x = 0. \]
The coefficients \((c_1,c_2,c_3)=(1,-1,-1)\) are not all zero, so \(\{f_1,f_2,f_3\}\) is linearly dependent.
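Both parts can be confirmed symbolically (a sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = sp.Integer(1), sp.cos(x)**2, sp.sin(x)**2

# The Wronskian vanishes identically, which by itself is inconclusive.
W = sp.simplify(sp.wronskian([f1, f2, f3], x))
print(W)  # 0

# The direct dependence relation 1*f1 - 1*f2 - 1*f3 = 0.
print(sp.simplify(f1 - f2 - f3))  # 0
```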
Problem 15: Independence of Exponentials with Distinct Exponents via the Vandermonde Determinant
Hard
Let \(\lambda_1, \lambda_2, \lambda_3\) be three distinct real numbers and set \(f_i(x) = e^{\lambda_i x}\), \(i=1,2,3\).
- Evaluate \(W[f_1, f_2, f_3]\) at \(x = 0\) and identify the resulting determinant as a Vandermonde determinant.
- Use the Vandermonde formula \(\det V(\lambda_1,\lambda_2,\lambda_3) = (\lambda_2-\lambda_1)(\lambda_3-\lambda_1)(\lambda_3-\lambda_2)\) to prove that \(\{e^{\lambda_1 x}, e^{\lambda_2 x}, e^{\lambda_3 x}\}\) is linearly independent when the \(\lambda_i\) are distinct.
Solution to question 1:
Since \(f_i^{(k)}(x) = \lambda_i^k e^{\lambda_i x}\), at \(x=0\) we have \(f_i^{(k)}(0) = \lambda_i^k\). The Wronskian at \(x=0\) is therefore:
\[ W[f_1,f_2,f_3](0) = \begin{vmatrix}1 & 1 & 1\\ \lambda_1 & \lambda_2 & \lambda_3\\ \lambda_1^2 & \lambda_2^2 & \lambda_3^2\end{vmatrix}, \]
which is the Vandermonde determinant \(V(\lambda_1,\lambda_2,\lambda_3)\).
Solution to question 2:
By the Vandermonde formula:
\[ \det V(\lambda_1,\lambda_2,\lambda_3) = (\lambda_2-\lambda_1)(\lambda_3-\lambda_1)(\lambda_3-\lambda_2). \]
Since the \(\lambda_i\) are distinct, each factor is nonzero, so \(\det V \neq 0\). Therefore \(W[f_1,f_2,f_3](0) \neq 0\), and \(\{e^{\lambda_1 x}, e^{\lambda_2 x}, e^{\lambda_3 x}\}\) is linearly independent on \(\mathbb{R}\). \(\square\)
Linear Independence in Abstract Vector Spaces
The definition of linear independence applies verbatim in polynomial spaces, matrix spaces, and arbitrary vector spaces: the only linear combination of the elements equal to the zero element is the trivial one. These exercises build proof-writing skills and connect independence to bases, spanning sets, and dimension.
Problem 16: Independence in the Polynomial Space \(\mathcal{P}_2\)
Easy
In the vector space \(\mathcal{P}_2\) of polynomials of degree at most 2, consider \(p_1 = 1 + x\), \(p_2 = 1 – x\), \(p_3 = x^2\).
- Write \(c_1 p_1 + c_2 p_2 + c_3 p_3 = 0\) (zero polynomial) and equate coefficients of \(1\), \(x\), and \(x^2\) to zero to solve for \(c_1, c_2, c_3\).
- Is \(\{p_1, p_2, p_3\}\) a basis for \(\mathcal{P}_2\)? Justify your answer.
Solution to question 1:
\[ c_1(1+x)+c_2(1-x)+c_3 x^2 = (c_1+c_2)+(c_1-c_2)x+c_3 x^2 = 0. \]
Equating coefficients: \(c_1+c_2=0\), \(c_1-c_2=0\), \(c_3=0\). The first two equations give \(c_1=c_2\) and \(c_1=-c_2\), so \(c_1=c_2=0\). Combined with \(c_3=0\), the only solution is trivial.
Solution to question 2:
\(\{p_1,p_2,p_3\}\) is linearly independent. Since \(\dim(\mathcal{P}_2)=3\) and the set contains 3 independent vectors, it also spans \(\mathcal{P}_2\) and is therefore a basis.
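The coefficient-comparison argument can be automated (a sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
p1, p2, p3 = 1 + x, 1 - x, x**2

c1, c2, c3 = sp.symbols('c1 c2 c3')
combo = sp.expand(c1*p1 + c2*p2 + c3*p3)

# Equate the coefficients of 1, x, x^2 to zero and solve.
eqs = [combo.coeff(x, k) for k in range(3)]
print(sp.solve(eqs, [c1, c2, c3]))  # only the trivial solution c1 = c2 = c3 = 0
```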
Problem 17: Independence in the Space of \(2\times 2\) Matrices
Medium
In \(M_{2\times 2}(\mathbb{R})\), the space of all \(2\times 2\) real matrices, consider
\[ A_1=\begin{pmatrix}1&0\\0&0\end{pmatrix},\quad A_2=\begin{pmatrix}0&1\\1&0\end{pmatrix},\quad A_3=\begin{pmatrix}0&0\\0&1\end{pmatrix},\quad A_4=\begin{pmatrix}1&0\\0&1\end{pmatrix}. \]
- Prove that \(\{A_1, A_2, A_3\}\) is linearly independent.
- Determine whether \(\{A_1, A_2, A_3, A_4\}\) is linearly independent. If not, express \(A_4\) as a linear combination of the others.
Solution to question 1:
Suppose \(c_1 A_1+c_2 A_2+c_3 A_3 = \mathbf{0}_{2\times2}\). Equating entries:
\[ \begin{pmatrix}c_1 & c_2\\ c_2 & c_3\end{pmatrix}=\begin{pmatrix}0&0\\0&0\end{pmatrix} \implies c_1=0,\quad c_2=0,\quad c_3=0. \]
Only the trivial solution exists, so \(\{A_1,A_2,A_3\}\) is linearly independent.
Solution to question 2:
\(\{A_1,A_2,A_3,A_4\}\) is linearly dependent. Note that the count alone does not decide this, since 4 vectors in the 4-dimensional space \(M_{2\times2}\) could be independent; instead we verify directly that \(A_4 = A_1 + A_3\):
\[ A_1+A_3=\begin{pmatrix}1&0\\0&0\end{pmatrix}+\begin{pmatrix}0&0\\0&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}=A_4. \]
The dependence relation is \(A_1 + A_3 - A_4 = \mathbf{0}\).
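A standard computational trick is to flatten each matrix into a vector in \(\mathbb{R}^4\) and test independence by rank; this sketch (assuming SymPy is available) does exactly that:

```python
import sympy as sp

A1 = sp.Matrix([[1, 0], [0, 0]])
A2 = sp.Matrix([[0, 1], [1, 0]])
A3 = sp.Matrix([[0, 0], [0, 1]])
A4 = sp.Matrix([[1, 0], [0, 1]])

# Flatten each 2x2 matrix into a column vector and stack them side by side.
M = sp.Matrix.hstack(*(A.reshape(4, 1) for A in (A1, A2, A3, A4)))
print(M.rank())  # 3 < 4, so {A1, A2, A3, A4} is dependent
assert A1 + A3 == A4  # the explicit relation
```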
Problem 18: Independence Is Preserved by Injective Linear Maps
Hard
Let \(T: V \to W\) be a linear transformation with \(\ker(T) = \{\mathbf{0}\}\).
- Prove that if \(\{v_1, \ldots, v_k\}\) is linearly independent in \(V\), then \(\{T(v_1), \ldots, T(v_k)\}\) is linearly independent in \(W\).
- Show by example that the conclusion can fail when \(\ker(T) \neq \{\mathbf{0}\}\).
Solution to question 1:
Suppose \(c_1 T(v_1)+\cdots+c_k T(v_k)=\mathbf{0}_W\). By linearity of \(T\):
\[ T\!\left(\sum_{i=1}^k c_i v_i\right) = \mathbf{0}_W. \]
So \(\sum c_i v_i \in \ker(T) = \{\mathbf{0}_V\}\), giving \(c_1 v_1+\cdots+c_k v_k=\mathbf{0}_V\). Since the \(v_i\) are linearly independent in \(V\), all \(c_i=0\). Therefore \(\{T(v_1),\ldots,T(v_k)\}\) is linearly independent. \(\square\)
Solution to question 2:
Let \(T:\mathbb{R}^2\to\mathbb{R}\) be the projection \(T(x,y)=x\). Then \(\ker(T)=\{(0,y) : y \in \mathbb{R}\}\neq\{\mathbf{0}\}\). The set \(\{v_1,v_2\}=\{(1,0),(1,1)\}\) is linearly independent in \(\mathbb{R}^2\), but \(T(v_1) = T(v_2) = 1\), so \(T(v_1) - T(v_2) = 0\) is a nontrivial dependence relation: the images are linearly dependent in \(\mathbb{R}\).
Problem 19: Mutually Orthogonal Nonzero Vectors Are Linearly Independent
Hard
Let \(\{v_1, v_2, \ldots, v_k\}\) be nonzero vectors in an inner product space with \(\langle v_i, v_j\rangle = 0\) for all \(i \neq j\).
- Prove that \(\{v_1, \ldots, v_k\}\) is linearly independent.
- Explain why the linear independence of the standard basis \(\{e_1, \ldots, e_n\}\) in \(\mathbb{R}^n\) with the dot product is an immediate corollary of this result.
Solution to question 1:
Suppose \(c_1 v_1+\cdots+c_k v_k = \mathbf{0}\). Fix any index \(j\) and take the inner product of both sides with \(v_j\):
\[ \left\langle \sum_{i=1}^k c_i v_i,\; v_j\right\rangle = \langle\mathbf{0}, v_j\rangle = 0. \]
By linearity and orthogonality (\(\langle v_i,v_j\rangle=0\) for \(i\neq j\)):
\[ c_j\langle v_j, v_j\rangle = c_j\|v_j\|^2 = 0. \]
Since \(v_j \neq \mathbf{0}\), we have \(\|v_j\|^2 > 0\), so \(c_j = 0\). Since \(j\) was arbitrary, all coefficients are zero and the set is linearly independent. \(\square\)
Solution to question 2:
The standard basis vectors satisfy \(e_i \cdot e_j = \delta_{ij}\), so they are mutually orthogonal and nonzero. The theorem guarantees their linear independence as an immediate special case; no row reduction is needed.