The link is: https://www.youtube.com/watch?v=XkY2DOUCWMU&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=4

Syllabus:
- Vectors, what even are they?
- Linear combinations, span, and basis vectors
- Linear transformations and matrices
- Matrix multiplication as composition
- Three-dimensional linear transformations
- The determinant
- Inverse matrices, column space and null space
- Non-square matrices as transformations between dimensions
- Dot products and duality
- Cross products
- Cross products in the light of linear transformations
- Cramer's rule, explained geometrically
- Change of basis
- Eigenvectors and eigenvalues
- Abstract vector spaces

Aided by the animations in 3blue1brown's tutorial, it's intuitively easier to understand a linear transformation as resetting i-hat and j-hat to a new pair of vectors; those two vectors, written as columns, are the matrix. Matrix multiplication can then be understood as the composition of two separate linear transformations.
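A minimal sketch of that idea (the specific rotation and shear matrices are my own examples, not from the video): the columns of a 2x2 matrix record where i-hat and j-hat land, and multiplying two matrices gives the same result as applying the transformations one after another.

```python
# Where i-hat and j-hat land become the columns of the matrix.
# A 90-degree counterclockwise rotation sends i-hat (1,0) to (0,1)
# and j-hat (0,1) to (-1,0).
rotation = [[0, -1],
            [1,  0]]  # columns are the images of i-hat and j-hat

shear = [[1, 1],
         [0, 1]]      # i-hat stays put, j-hat moves to (1, 1)

def apply(m, v):
    """Apply a 2x2 matrix (list of rows) to a 2-vector."""
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

def compose(m2, m1):
    """Matrix product m2 @ m1: first apply m1, then m2."""
    return [[m2[0][0]*m1[0][0] + m2[0][1]*m1[1][0],
             m2[0][0]*m1[0][1] + m2[0][1]*m1[1][1]],
            [m2[1][0]*m1[0][0] + m2[1][1]*m1[1][0],
             m2[1][0]*m1[0][1] + m2[1][1]*m1[1][1]]]

# The composed matrix acts exactly like rotation-then-shear.
v = [1, 2]
assert apply(compose(shear, rotation), v) == apply(shear, apply(rotation, v))
```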

In three dimensions it's the same idea: just add k-hat on top of i-hat and j-hat, so you wrap your head around a space rather than a plane. The determinant of a transformation can be understood as the factor by which it scales a unit area (or volume) under the matrix. Then come the concepts of inverse matrices, column space, and null space: the rank is the number of dimensions in the output, and the set of all possible outputs Av is called the column space of the matrix A.
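A small illustration of rank, column space, and null space (the matrix below is my own example): when the columns are linearly dependent, the plane gets squashed onto a line, so the rank drops to 1, the determinant is 0, and a whole line of inputs lands on the zero vector.

```python
# A matrix whose columns are linearly dependent squashes the plane
# onto a line: rank 1, determinant 0.
squash = [[2, 4],
          [1, 2]]  # second column is 2x the first

def det2(m):
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

def apply(m, v):
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

# Every output Av lies on the line spanned by (2, 1): the column space.
for v in [[1, 0], [0, 1], [3, -5], [2, 7]]:
    x, y = apply(squash, v)
    assert x == 2 * y  # all outputs satisfy x = 2y

assert det2(squash) == 0            # areas collapse to zero
# The null space is non-trivial: (2, -1) is sent to the zero vector.
assert apply(squash, [2, -1]) == [0, 0]
```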

Then there is the determinant of a transformation, which is basically the factor by which areas get scaled up or down during the transformation. Down to the math, the formula is easy to memorize.

To get at the logic behind this formula, or to deduce it, think geometrically about what the transformation does to the unit square.
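The original note refers to a formula that is not reproduced in the text; the standard 2x2 formula is det([[a, b], [c, d]]) = ad − bc. A quick sanity check of it on a few transformations whose area scaling is obvious (my own examples):

```python
# Standard 2x2 determinant: det([[a, b], [c, d]]) = a*d - b*c.
# Geometrically, a*d is the area of the axis-aligned rectangle spanned
# by (a, 0) and (0, d); b*c corrects for the skew from the off-diagonal
# terms.
def det2(a, b, c, d):
    return a*d - b*c

# Scaling by 3 in x and 2 in y multiplies areas by 6.
assert det2(3, 0, 0, 2) == 6
# A shear slides the unit square sideways without changing its area.
assert det2(1, 1, 0, 1) == 1
# Swapping i-hat and j-hat flips orientation: the sign goes negative.
assert det2(0, 1, 1, 0) == -1
```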

Linear systems of equations are a good application of matrix math.
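A minimal sketch of that application (the particular system is my own example): writing the system as Ax = v, the solution is found by undoing the transformation, i.e. applying the inverse matrix, which for 2x2 matrices has a closed form.

```python
# Solve A x = v for a 2x2 system by applying the inverse transform:
# A^-1 = (1/det) * [[d, -b], [-c, a]] for A = [[a, b], [c, d]].
def solve2(A, v):
    (a, b), (c, d) = A
    det = a*d - b*c
    assert det != 0, "singular matrix: no unique solution"
    return [(d*v[0] - b*v[1]) / det,
            (-c*v[0] + a*v[1]) / det]

# The system:  2x + 1y = 5
#              1x + 3y = 10
A = [[2, 1],
     [1, 3]]
x = solve2(A, [5, 10])
assert x == [1.0, 3.0]  # check: 2*1 + 3 = 5 and 1 + 3*3 = 10
```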

Dot products are a bit hard to understand intuitively or visually as projection; leveraging the idea of duality in math helps to get a grip on them.
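The duality idea can be checked numerically (the vector w below is my own example): taking the dot product with a fixed vector w is the same as applying the 1x2 matrix whose single row is w, i.e. a linear transformation from the plane down to the number line.

```python
# Duality: the dot product with w = (2, 3) equals applying the
# 1x2 matrix [2 3], a linear map from 2D space to the number line.
w = [2, 3]

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

def row_matrix_apply(row, v):
    """Apply the 1x2 matrix [row] to v, giving a single number."""
    return row[0]*v[0] + row[1]*v[1]

for v in [[1, 0], [0, 1], [4, -2]]:
    assert dot(w, v) == row_matrix_apply(w, v)
```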

Next, the cross product of two vectors results in a vector, perpendicular to both inputs, with length equal to the area of the parallelogram they span.
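A short sketch of the 3D cross product (the input vectors are my own examples), verifying both the perpendicularity and the parallelogram-area interpretation:

```python
# Cross product of two 3D vectors: perpendicular to both, with length
# equal to the area of the parallelogram they span.
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

assert cross([1, 0, 0], [0, 1, 0]) == [0, 0, 1]  # i-hat x j-hat = k-hat

u, v = [2, 1, 0], [1, 3, 0]
w = cross(u, v)
assert dot(w, u) == 0 and dot(w, v) == 0  # perpendicular to both
assert w == [0, 0, 5]  # parallelogram area = 2*3 - 1*1 = 5
```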

Linear algebra has important applications in machine learning. From Andrew Ng's series of videos, I cut several screenshots related to matrix operations, as follows:

1 is the identity for numbers, because 1z = z1 = z for any number z. Similarly, there is the concept of the identity matrix, with 1s showing up diagonally: for any matrix A, A·I = I·A = A. In general, matrix multiplication is not commutative, but when one of the matrices is the identity matrix, commutativity holds.
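Both claims can be checked directly (the matrices A and B are my own examples): the identity matrix commutes with any matrix, while two arbitrary matrices generally do not commute.

```python
# The identity matrix has 1s on the diagonal; A @ I = I @ A = A,
# even though matrix multiplication is not commutative in general.
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I = [[1, 0],
     [0, 1]]
A = [[2, 3],
     [4, 5]]
B = [[0, 1],
     [1, 0]]

assert matmul(A, I) == matmul(I, A) == A   # identity commutes with A
assert matmul(A, B) != matmul(B, A)        # in general, order matters
```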