From the course: Machine Learning Foundations: Linear Algebra

Defining linear algebra

- [Instructor] When you hear the term linear algebra, the first thing that comes to mind is probably algebra, and you may have flashbacks of high school math. Hopefully, that sparks joy. If not, don't worry. We'll work through the linear algebra story step by step together. Despite the similar name, linear algebra is not an advanced or applied version of algebra. Mathematicians sometimes become furious when people conflate these two areas of mathematics, just as software developers are not keen on people mixing up Java with JavaScript. Algebra is a branch of mathematics in which arithmetic operations and formal manipulations are applied to abstract symbols rather than specific numbers. In short, algebra is the study of mathematical symbols and the rules for manipulating those symbols. Linear algebra is a branch of mathematics that lets you concisely describe coordinates and interactions of planes in higher dimensions and perform operations on them. In short, linear algebra is the study of vectors and linear functions. The main building blocks and areas of linear algebra are systems of linear equations, vectors and matrices, linear transformations, determinants, and vector spaces. If you haven't worked with or even heard of vectors and matrices before, don't worry. No prior knowledge is needed. We'll learn everything together, starting from the beginning. Linear algebra is extremely important in machine learning, also called ML, because when you build a machine learning model, you are dealing with vectors or matrices. Large data sets have many rows and columns; these are nothing but matrices. When you split your data set into training and testing data, you are performing operations on those matrices, as the short sketch below illustrates. Let's jump in and explore linear algebra in depth.
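
To make that last point concrete, here is a minimal sketch in Python with NumPy, assuming a toy data set in which each row is a sample and each column is a feature. The 10 x 3 shape and the 80/20 split ratio are illustrative choices only, not values from the course.

import numpy as np

# A toy data set: 10 samples (rows) x 3 features (columns).
# In linear algebra terms, this is a 10 x 3 matrix.
X = np.arange(30).reshape(10, 3)

# Shuffle the row indices so the split is not biased by ordering,
# then take the first 80% of rows as training data and the rest as test data.
rng = np.random.default_rng(seed=0)
indices = rng.permutation(X.shape[0])

split = int(0.8 * X.shape[0])
train_idx, test_idx = indices[:split], indices[split:]

X_train = X[train_idx]   # an 8 x 3 matrix
X_test = X[test_idx]     # a 2 x 3 matrix

print(X_train.shape)  # (8, 3)
print(X_test.shape)   # (2, 3)

Slicing the rows of X like this is already a matrix operation: the training and test sets are simply smaller matrices taken from the original one.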
