From the course: Machine Learning Foundations: Linear Algebra


Orthogonal matrix


- [Instructor] When we explored standard basis vectors, we didn't mention that they're orthonormal, meaning they're orthogonal to each other. In other words, they're at right angles to each other, so their dot product is zero, and each of them has unit norm. Let's see them in our coordinate system. We can represent them as vectors e1 = (1, 0) and e2 = (0, 1). Now, you're probably thinking about the connection between orthonormal vectors and orthogonal matrices. An orthogonal matrix is usually denoted by Q. Orthonormal vectors make up all of the rows and all of the columns of an orthogonal matrix. To understand the valuable property of this kind of matrix, we first need to understand what it means to calculate the transpose of a matrix. The transpose of a matrix is a flipped version of the original matrix, meaning we just have to switch its rows and columns to get the transpose. It is denoted as the capital A with a superscript T. For…
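The ideas above can be checked numerically. The following sketch (not part of the course itself; it assumes NumPy) verifies that e1 and e2 are orthonormal, and uses a 2×2 rotation matrix as an example of an orthogonal matrix Q whose transpose equals its inverse:

```python
import numpy as np

# Standard basis vectors e1 and e2 from the transcript.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Orthogonal: their dot product is zero.
print(np.dot(e1, e2))        # 0.0
# Orthonormal: each also has unit norm.
print(np.linalg.norm(e1))    # 1.0
print(np.linalg.norm(e2))    # 1.0

# A 2x2 rotation matrix is a classic orthogonal matrix Q:
# its rows and its columns are orthonormal vectors.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The valuable property: Q transposed times Q is the identity,
# so the transpose of Q is also its inverse.
print(np.allclose(Q.T @ Q, np.eye(2)))      # True
print(np.allclose(Q.T, np.linalg.inv(Q)))   # True
```

Because Q.T equals the inverse of Q, inverting an orthogonal matrix is just a row/column swap, which is one reason these matrices are so useful in practice.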
