Essence of Linear Algebra by 3Blue1Brown

Quizzes covering the famous YouTube playlist "Essence of Linear Algebra" from the channel 3Blue1Brown

Public
9 Quizzes
3Blue1Brown: Linear Combinations, Span, and Basis Vectors - Chapter 2 Quiz

This quiz assesses a deep conceptual and applied understanding of linear combinations, vector span, and basis vectors, as introduced in the 3Blue1Brown 'Essence of Linear Algebra' Chapter 2 video. Questions cover interpreting vector coordinates, defining and visualizing span in 2D and 3D, identifying linear dependence and independence, and applying the technical definition of a basis. Challenging questions require nuanced reasoning and an ability to connect abstract definitions to geometric intuition.

Key formulas/concepts involved:

* **Linear Combination**: A vector $\vec{v}$ is a linear combination of vectors $\vec{v_1}, \vec{v_2}, \dots, \vec{v_k}$ if it can be expressed as $$\vec{v} = c_1\vec{v_1} + c_2\vec{v_2} + \dots + c_k\vec{v_k}$$ where $c_1, c_2, \dots, c_k$ are scalars.
* **Span**: The span of a set of vectors $S = \{\vec{v_1}, \vec{v_2}, \dots, \vec{v_k}\}$ is the set of all possible linear combinations of those vectors. It is denoted $\text{span}(S)$.
* **Linear Dependence**: A set of vectors $S = \{\vec{v_1}, \vec{v_2}, \dots, \vec{v_k}\}$ is linearly dependent if at least one vector in $S$ can be expressed as a linear combination of the others, or equivalently, if there exist scalars $c_1, c_2, \dots, c_k$, not all zero, such that $$c_1\vec{v_1} + c_2\vec{v_2} + \dots + c_k\vec{v_k} = \vec{0}$$
* **Linear Independence**: A set of vectors is linearly independent if it is not linearly dependent, meaning the only way to form the zero vector as a linear combination is with all scalars equal to zero.
* **Basis**: A basis for a vector space $V$ is a set of linearly independent vectors that spans $V$. For a vector space of dimension $n$, a basis always consists of exactly $n$ linearly independent vectors.
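The definitions above can be checked numerically. Here is a minimal NumPy sketch (the specific vectors and the rank-based dependence test are illustrative choices, not taken from the video):

```python
import numpy as np

# Two candidate basis vectors for 2D space.
v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 1.0])

# A linear combination c1*v1 + c2*v2 with scalars c1, c2.
c1, c2 = 2.0, -1.0
v = c1 * v1 + c2 * v2  # -> [-1.0, 3.0]

# v1 and v2 span the full 2D plane iff they are linearly independent,
# i.e. the matrix with v1, v2 as columns has rank 2.
rank = np.linalg.matrix_rank(np.column_stack([v1, v2]))

# A dependent pair: v3 is a scalar multiple of v1, so {v1, v3} only
# spans a 1D line and the rank drops to 1.
v3 = -2.0 * v1
rank_dep = np.linalg.matrix_rank(np.column_stack([v1, v3]))
```

The rank check is one common numerical stand-in for the "not all scalars zero" definition of dependence.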

Linear Algebra
3Blue1Brown: Linear Transformations and Matrices - Chapter 3 Quiz

This quiz rigorously tests your understanding of 2D linear transformations and their relationship to matrices, based on the foundational concepts presented in Chapter 3 of 3Blue1Brown's 'Essence of Linear Algebra' series. It covers visual interpretations of linearity, the role of basis vectors, matrix construction, matrix-vector multiplication as linear combinations, and the implications of various transformation types. Questions are designed to be challenging, requiring a deep conceptual and computational understanding of the topic.

**Key Formulas and Concepts:**

* **Definition of Linear Transformation (Visual):**
  1. All lines must remain lines, without getting curved.
  2. The origin must remain fixed in place.
  (Together these imply that grid lines remain parallel and evenly spaced.)
* **Definition of Linear Transformation (Algebraic):** A transformation $$L$$ is linear if for all vectors $$\vec{v}, \vec{w}$$ and any scalar $$c$$:
  1. **Additivity:** $$L(\vec{v} + \vec{w}) = L(\vec{v}) + L(\vec{w})$$
  2. **Homogeneity/Scaling:** $$L(c\vec{v}) = cL(\vec{v})$$
  (A consequence of these properties is $$L(\vec{0}) = \vec{0}$$.)
* **Representing 2D Linear Transformations with Matrices:** A 2D linear transformation is completely described by where the standard basis vectors $$\hat{i} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$ and $$\hat{j} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$ land. If $$L(\hat{i}) = \begin{pmatrix} a \\ c \end{pmatrix}$$ and $$L(\hat{j}) = \begin{pmatrix} b \\ d \end{pmatrix}$$, then the transformation matrix $$A$$ is $$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
* **Matrix-Vector Multiplication as Linear Combination:** To find where a vector $$\vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}$$ lands under the transformation represented by matrix $$A$$, we compute $$A\vec{v}$$: $$A\vec{v} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = x \begin{pmatrix} a \\ c \end{pmatrix} + y \begin{pmatrix} b \\ d \end{pmatrix} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}$$ This treats the original vector's components $$x, y$$ as scaling factors for the transformed basis vectors.
* **Geometric Interpretations:**
  * **Rotation:** e.g., 90° counterclockwise is $$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$.
  * **Shear:** e.g., a shear along the x-axis is $$\begin{pmatrix} 1 & k \\ 0 & 1 \end{pmatrix}$$.
  * **Scaling:** e.g., uniform scaling by $$k$$ is $$\begin{pmatrix} k & 0 \\ 0 & k \end{pmatrix}$$.
  * **Reflection:** e.g., reflection across the x-axis is $$\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$$.
* **Linear Dependence of Columns:** If $$L(\hat{i})$$ and $$L(\hat{j})$$ are linearly dependent, the transformation squishes 2D space onto a 1D line.
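The "columns are where the basis vectors land" idea can be verified directly. A small NumPy sketch (the 90° rotation example matches the matrix listed above; the test vector is my own):

```python
import numpy as np

# Columns of A are where the basis vectors i-hat and j-hat land.
L_i = np.array([0.0, 1.0])   # 90° CCW rotation sends i-hat to (0, 1)
L_j = np.array([-1.0, 0.0])  # ...and j-hat to (-1, 0)
A = np.column_stack([L_i, L_j])

v = np.array([3.0, 2.0])

# Matrix-vector multiplication is exactly the linear combination
# x * L(i-hat) + y * L(j-hat).
by_matrix = A @ v
by_combination = v[0] * L_i + v[1] * L_j
```

Both computations give the same result, which is the point of the "matrix-vector multiplication as linear combination" concept above.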

Linear Algebra
3Blue1Brown: 3D Transformations - Chapter 5 Quiz

This quiz delves into the fundamental concepts of three-dimensional linear transformations as presented in the 3Blue1Brown "Essence of Linear Algebra" series, specifically Chapter 5. It tests your understanding of how 2D linear algebra principles extend to 3D space, the role of basis vectors in defining transformations, matrix representation of these transformations, and the mechanics of vector and matrix multiplication in three dimensions. Expect questions that require critical thinking, application of theoretical knowledge, and a strong grasp of vector and matrix operations.

**Important Formulae:**

* **Standard basis vectors in 3D:**
  * $\hat{i} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$
  * $\hat{j} = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}$
  * $\hat{k} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$
* **Representation of a vector $\vec{v}$ in terms of basis vectors:** $\vec{v} = \begin{pmatrix} x \\ y \\ z \end{pmatrix} = x\hat{i} + y\hat{j} + z\hat{k}$
* **Matrix representation of a 3D linear transformation $T$:** If $T(\hat{i}) = \begin{pmatrix} a_{11} \\ a_{21} \\ a_{31} \end{pmatrix}$, $T(\hat{j}) = \begin{pmatrix} a_{12} \\ a_{22} \\ a_{32} \end{pmatrix}$, and $T(\hat{k}) = \begin{pmatrix} a_{13} \\ a_{23} \\ a_{33} \end{pmatrix}$, then the transformation matrix $A$ is: $$A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}$$
* **Applying a transformation $T$ to a vector $\vec{v}$ (matrix-vector multiplication):** $T(\vec{v}) = A\vec{v}$, where $$A\vec{v} = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = x \begin{pmatrix} a_{11} \\ a_{21} \\ a_{31} \end{pmatrix} + y \begin{pmatrix} a_{12} \\ a_{22} \\ a_{32} \end{pmatrix} + z \begin{pmatrix} a_{13} \\ a_{23} \\ a_{33} \end{pmatrix} = \begin{pmatrix} a_{11}x + a_{12}y + a_{13}z \\ a_{21}x + a_{22}y + a_{23}z \\ a_{31}x + a_{32}y + a_{33}z \end{pmatrix}$$
* **Composition of transformations (matrix-matrix multiplication):**
  * If $T_1$ is represented by matrix $A$ and $T_2$ by matrix $B$, then applying $T_2$ first and then $T_1$ is represented by the matrix product $AB$.
  * For $3 \times 3$ matrices $A = [a_{ij}]$ and $B = [b_{ij}]$, the element $(AB)_{ij}$ is the dot product of the $i$-th row of $A$ and the $j$-th column of $B$: $$(AB)_{ij} = \sum_{k=1}^{3} a_{ik}b_{kj}$$
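The composition rule ("apply $T_2$ first, then $T_1$" equals multiplying by $AB$) is easy to demonstrate in code. A NumPy sketch with two illustrative 3D transformations of my own choosing:

```python
import numpy as np

# Columns are the landing spots of i-hat, j-hat, k-hat.
# B: rotate 90° counterclockwise about the z-axis (viewed from +z).
B = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
# A: scale the x-axis by 2, leave y and z alone.
A = np.diag([2.0, 1.0, 1.0])

v = np.array([1.0, 0.0, 0.0])

# Applying B first and then A is the same as multiplying by AB once.
composed = A @ B
step_by_step = A @ (B @ v)
at_once = composed @ v
```

Note the right-to-left reading order: in `A @ B`, the matrix nearest the vector (`B`) acts first.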

Linear Algebra
3Blue1Brown: The Determinant - Chapter 6 Quiz

This quiz rigorously tests your understanding of the determinant concept as presented in the 3Blue1Brown 'Essence of Linear Algebra' Chapter 6 video. It covers the determinant's role as an area/volume scaling factor, its implications for orientation reversal, the meaning of a zero determinant (dimension collapse, linear dependence), the geometric intuition behind the 2x2 formula ($ad-bc$), the extension to 3D with parallelepipeds and the right-hand rule, and the property $\text{det}(M_1 M_2) = \text{det}(M_1)\,\text{det}(M_2)$. The questions range from conceptual interpretations to application-based scenarios, requiring deep insight into the visual and geometric aspects of linear transformations.

**Key Formulas and Concepts:**

* **2D Determinant**: For a matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, $\text{det}(A) = ad - bc$.
* **Geometric Meaning (2D)**: $\text{det}(A)$ is the signed area of the parallelogram formed by the transformed basis vectors $L(\hat{i}) = \begin{pmatrix} a \\ c \end{pmatrix}$ and $L(\hat{j}) = \begin{pmatrix} b \\ d \end{pmatrix}$. The absolute value $|\text{det}(A)|$ is the area scaling factor; a negative sign indicates orientation inversion.
* **Geometric Meaning (3D)**: $\text{det}(A)$ is the signed volume of the parallelepiped formed by the transformed basis vectors $L(\hat{i})$, $L(\hat{j})$, $L(\hat{k})$. The absolute value $|\text{det}(A)|$ is the volume scaling factor; a negative sign indicates orientation inversion (e.g., changing from the right-hand rule to the left-hand rule).
* **Zero Determinant**: If $\text{det}(A) = 0$, the transformation collapses space into a lower dimension (a line or a point in 2D; a plane, line, or point in 3D). This implies the column vectors of $A$ are linearly dependent.
* **Determinant of a Product**: For matrices $M_1$ and $M_2$, $\text{det}(M_1 M_2) = \text{det}(M_1)\,\text{det}(M_2)$. This reflects the cumulative effect of sequential scaling factors.
* **Basis Vectors**: The transformation of the unit square (2D) or unit cube (3D), whose edges are defined by the standard basis vectors, provides the fundamental geometric interpretation of the determinant.
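The $ad - bc$ formula, the product property, and the zero-determinant collapse can all be checked with NumPy. The particular matrices below are illustrative examples of my own:

```python
import numpy as np

# 2x2 determinant ad - bc as a signed area scaling factor.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
det_A = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]  # ad - bc = 6
# The unit square (area 1) maps to a parallelogram of area |det A| = 6.

# Orientation flip: reflection across the x-axis has determinant -1.
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])

# det(M1 M2) = det(M1) det(M2): sequential scaling factors multiply.
prod_ok = np.isclose(np.linalg.det(A @ R), np.linalg.det(A) * np.linalg.det(R))

# Zero determinant: linearly dependent columns collapse 2D onto a line.
S = np.array([[2.0, 4.0],
              [1.0, 2.0]])
det_S = np.linalg.det(S)  # 0: second column is twice the first
```

Here `A @ R` first reflects, then stretches; its determinant $-6$ records both the area factor 6 and the single orientation flip.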

Linear Algebra
3Blue1Brown: Essence of Linear Algebra - Chapter 7 - Inverse matrices, column space and null space - Quiz

This quiz rigorously assesses your understanding of inverse matrices, column space, and null space as presented in the 3Blue1Brown 'Essence of Linear Algebra' Chapter 7 video. It focuses on geometric interpretations and conceptual relationships, avoiding computational methods, in keeping with the video's scope. Questions are designed to be challenging, requiring deep comprehension of how linear transformations, determinants, rank, column space, and null space interrelate.

Key concepts covered include:

- **Linear Systems ($$Ax=v$$):** Geometric interpretation as a linear transformation mapping $$x$$ to $$v$$.
- **Inverse Matrices ($$A^{-1}$$):** Existence condition (non-zero determinant, no squishing of space), geometric meaning as 'undoing' a transformation, and the algebraic property $$A^{-1}A=I$$.
- **Column Space:** The set of all possible outputs of a transformation (the span of the columns), its dimensionality defining the rank, and its role in the existence of solutions for $$Ax=v$$.
- **Rank:** The number of dimensions in the column space. Full rank implies no squishing; non-full rank implies squishing to a lower dimension.
- **Null Space (Kernel):** The set of all vectors that are mapped to the zero vector by a transformation ($$Ax=0$$), its significance when the determinant is zero, and its relation to the existence and uniqueness of solutions.

Mathematical formulas used in this topic:

- **Linear System:** $$Ax = v$$, where $$A$$ is the coefficient matrix, $$x$$ is the vector of variables, and $$v$$ is the constant vector.
- **Inverse Matrix Property:** $$A^{-1}A = I$$ (where $$I$$ is the identity matrix).
- **Solving with the Inverse:** $$x = A^{-1}v$$ (if $$A$$ is invertible).

Prepare to think critically about the visual and conceptual implications of these core linear algebra ideas.
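The invertible and singular cases above contrast nicely in code. A NumPy sketch (the matrices and the null-space vector are hand-picked examples, not from the video):

```python
import numpy as np

# Invertible case: det != 0, so A^{-1} exists and x = A^{-1} v.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
v = np.array([3.0, 2.0])
A_inv = np.linalg.inv(A)
x = A_inv @ v  # the unique solution of A x = v
# Sanity check of the defining property A^{-1} A = I.
identity_ok = np.allclose(A_inv @ A, np.eye(2))

# Singular case: det = 0, space is squished onto a line (rank 1),
# so a whole line of vectors (the null space) maps to the zero vector.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
rank_S = np.linalg.matrix_rank(S)  # 1: the column space is a line
n = np.array([2.0, -1.0])          # S @ n = 0, so n is in the null space
```

Geometrically, `A_inv` plays the transformation of `A` in reverse; no such reverse exists for `S`, since a line cannot be "unsquished" back into a plane.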

Linear Algebra
3Blue1Brown: Linear Algebra: Nonsquare Matrix Transformations Between Dimensions - Chapter 8 Quiz

This quiz delves into the sophisticated geometric interpretations of nonsquare matrices as linear transformations between different dimensional spaces, as presented in 3Blue1Brown's 'Essence of Linear Algebra' Chapter 8. It assesses your deep understanding of how such matrices map input vectors from one dimension to output vectors in another.

Key concepts include:

1. **Matrix Dimensions and Geometric Mapping**: A matrix with $$m$$ rows and $$n$$ columns (an $$m \times n$$ matrix) represents a linear transformation from an $$n$$-dimensional input space to an $$m$$-dimensional output space. The number of columns, $$n$$, indicates the dimension of the input space (the number of basis vectors), and the number of rows, $$m$$, indicates the dimension of the output space (the number of coordinates of each transformed basis vector).
2. **Column Vectors as Transformed Basis Vectors**: The columns of the transformation matrix are precisely the coordinates of where the standard basis vectors of the input space land in the output space.
3. **Linearity Preservation**: Despite changing dimensions, linear transformations maintain the properties that grid lines remain parallel and evenly spaced (where visualizable) and that the origin maps to the origin ($$T(\mathbf{0}) = \mathbf{0}$$).
4. **Column Space**: The column space of an $$m \times n$$ matrix is the set of all possible output vectors. Geometrically, it is the subspace of the output space onto which the entire input space is mapped.
5. **Rank and Full Rank**: The rank of a matrix is the dimension of its column space. A nonsquare $$m \times n$$ matrix is 'full rank' if its rank equals the minimum of $$m$$ and $$n$$, meaning the transformation does not squish the input space into a lower-dimensional subspace than its dimensions require.
6. **Information Loss and Embedding**: Transformations from higher to lower dimensions ($$m < n$$) generally involve a loss of information (a 'squishification' onto a lower-dimensional subspace). Transformations from lower to higher dimensions ($$m > n$$) embed the lower-dimensional space into a higher one.

This quiz demands careful consideration of these concepts, often requiring you to synthesize information and apply abstract reasoning to geometric scenarios. Be prepared for questions that test both conceptual understanding and the ability to interpret mathematical notation.
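Both directions of dimension change can be sketched with NumPy. The $2 \times 3$ and $3 \times 2$ matrices below are hypothetical examples chosen to show the shape conventions described above:

```python
import numpy as np

# A 2x3 matrix maps 3D inputs to 2D outputs: its three columns say
# where i-hat, j-hat, k-hat land in the plane.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
v3 = np.array([1.0, 2.0, 3.0])
out2 = M @ v3  # a 2D vector: 3D information squished into the plane

# Full rank here means rank == min(2, 3) == 2: the outputs fill the
# whole plane rather than collapsing onto a line.
rank_M = np.linalg.matrix_rank(M)

# A 3x2 matrix embeds the 2D plane into 3D: its column space is a
# plane through the origin inside 3D space.
E = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
v2 = np.array([2.0, 3.0])
out3 = E @ v2  # a 3D vector lying on that embedded plane
```

Notice that `M` loses information (many 3D inputs share one 2D output), while `E` loses none; its 3D outputs just never leave a 2D plane.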

Linear Algebra
3Blue1Brown: Linear Algebra: Dot Products and Duality - Chapter 9 Quiz

This quiz rigorously tests your understanding of dot products, their geometric and algebraic interpretations, and their profound connection to linear transformations from a multi-dimensional space to the one-dimensional number line, as explained in 3Blue1Brown's 'Essence of Linear Algebra' Chapter 9: 'Dot products and duality'. The questions delve into the concept of duality, where vectors can be seen as the 'embodiment' of such linear transformations. Mastery of these concepts requires a nuanced understanding of:

**Numerical Definition of the Dot Product:** For two vectors $$v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}$$ and $$w = \begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{pmatrix}$$, their dot product is $$v \cdot w = v_1 w_1 + v_2 w_2 + \dots + v_n w_n$$

**Geometric Interpretation of the Dot Product:** The dot product $$v \cdot w$$ can be interpreted as the length of the projection of $$w$$ onto $$v$$, multiplied by the length of $$v$$. The sign indicates direction: $$v \cdot w = ||v|| \cdot ||w|| \cdot \cos(\theta)$$, where $$\theta$$ is the angle between $$v$$ and $$w$$.

**Linear Transformations to the Number Line:** A linear transformation $$T: \mathbb{R}^n \to \mathbb{R}$$ maps vectors to scalars. It can be represented by a $$1 \times n$$ matrix $$A = \begin{pmatrix} a_1 & a_2 & \dots & a_n \end{pmatrix}$$. Applying this transformation to a vector $$x$$ gives $$T(x) = A x = \begin{pmatrix} a_1 & a_2 & \dots & a_n \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = a_1 x_1 + a_2 x_2 + \dots + a_n x_n$$

**Duality:** The central idea that any linear transformation from a vector space to the number line corresponds to a unique vector in that space, such that applying the transformation is equivalent to taking the dot product with that vector. Conversely, any vector defines such a linear transformation.

This quiz will challenge your conceptual grasp of these interconnections.
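The duality idea (a $1 \times n$ matrix and an $n$-dimensional vector are "the same thing") can be demonstrated in a few lines of NumPy; the concrete vectors here are illustrative examples of my own:

```python
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([2.0, -1.0])

# Numerical definition: componentwise products, summed.
dot_numeric = v[0] * w[0] + v[1] * w[1]

# Geometric definition: ||v|| ||w|| cos(theta) gives the same number.
cos_theta = (v @ w) / (np.linalg.norm(v) * np.linalg.norm(w))
dot_geometric = np.linalg.norm(v) * np.linalg.norm(w) * cos_theta

# Duality: reshaping v into a 1x2 matrix gives a linear map R^2 -> R,
# and applying it equals taking a dot product with v.
T = v.reshape(1, 2)
x = np.array([5.0, 6.0])
by_transform = (T @ x)[0]
by_dot = v @ x
```

The last two lines are the whole point of duality: a row matrix acting on `x` and the vector `v` dotted with `x` are the same computation viewed two ways.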

Linear Algebra