# Linear Algebra for Machine Learning – Part 3


This is part 3 of the linear algebra series. The goal of every article in this series is to make linear algebra more understandable from both geometric and applied perspectives.

Linear algebra is a branch of mathematics that is widely used throughout science and engineering. A good understanding of linear algebra is essential for working with many machine learning algorithms, especially deep learning. Linear algebra is a very wide field: covering all of it would take a full semester course, and many of its topics are not particularly relevant to machine learning. So I will omit topics that are not essential for understanding machine learning algorithms. If you are a beginner or have no prior experience with linear algebra, it is recommended to follow this series of articles in the given order.

Note: If you haven’t read part 2, click here.

The goal of this article is to introduce some special types of matrices, i.e. the identity matrix and the matrix inverse. We will also cover what a system of linear equations is and how to solve it, some important properties of the determinant and their geometric significance, and more.

After completing this article you will know:

• What is an identity matrix?
• What are the matrix inverse and the adjoint of a matrix?
• What is the determinant, and what is its geometric significance?

So let’s begin ,

##### Identity matrix

An identity matrix is a pure diagonal matrix, i.e. $I_{ij} = 1$ where $i = j$ and $I_{ij} = 0$ where $i \neq j$. The 2-D array representation of the 3×3 identity matrix is given below:

$$I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

This property of the identity matrix makes it a very good candidate for computation: the dot product between an identity matrix and a vector does not change the vector, i.e. $I_n x = x$. The concept of the identity matrix is also very useful for finding the inverse of a matrix. One of the most important properties involving the identity matrix is

$$A A^{-1} = A^{-1} A = I$$

where $A$ is a known matrix, $A^{-1}$ is the inverse of matrix $A$, and $I$ is the identity matrix.
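These identities are easy to check numerically. Here is a minimal NumPy sketch; the matrix and vector values are arbitrary, chosen only for illustration:

```python
import numpy as np

# 3x3 identity matrix: ones on the diagonal, zeros elsewhere
I = np.eye(3)

# Multiplying by the identity leaves a vector unchanged: I @ x == x
x = np.array([2.0, -1.0, 5.0])
assert np.allclose(I @ x, x)

# A @ inv(A) == I for an invertible matrix A (illustrative values)
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))
```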

Before going to matrix inversion, let’s first understand why we even need to find the inverse of a matrix. If you have followed this series from the beginning, we now know enough linear algebra notation to write down a system of linear equations:

$$A x = b$$

where $A$ is a known matrix, $b$ is a known vector, and $x$ is an unknown vector of variables that must be found in order to satisfy the equation. The dot product of each row of $A$ with $x$ gives a single entry of $b$. This can be written as

$$a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n = b_1$$
$$\vdots$$
$$a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n = b_m$$

To find the solution of this system of linear equations, multiply both sides by $A^{-1}$:

$$A^{-1} A x = A^{-1} b$$

We know that $A^{-1} A = I$ and $I x = x$, so the final equation becomes

$$x = A^{-1} b$$

So in order to find $x$ we need to find $A^{-1}$. Of course, this depends on it being possible to find $A^{-1}$ at all. The necessary and sufficient condition for a square matrix $A$ to possess an inverse is that $\det(A) \neq 0$, where $\det(A)$ is the determinant of $A$. The proof of this is not required and not very important here, but if you are interested, ask in the comments below.
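The derivation above can be checked with a small NumPy sketch. The coefficients below are made up purely for illustration; note that in practice `np.linalg.solve` is preferred over explicitly forming the inverse, since it is faster and numerically more stable:

```python
import numpy as np

# Illustrative system A x = b:  3x + y = 9,  x + 2y = 8
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# x = inv(A) @ b, exactly as derived above
x_via_inverse = np.linalg.inv(A) @ b

# Preferred in practice: solve the system directly
x = np.linalg.solve(A, b)

assert np.allclose(x, x_via_inverse)
assert np.allclose(A @ x, b)  # x satisfies the original system
```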

##### Finding inverse of A

Calculating $A^{-1}$ is quite easy; just follow the steps given below.

Step 1: Find the adjoint of $A$, written adj(A).

Step 2: Divide each element of adj(A) by the determinant |A|.

The main part of this whole calculation is finding adj(A), and frankly speaking it is the computationally expensive part. The idea behind adj(A) is simple: replace every element of $A$ with its cofactor, then transpose the resulting cofactor matrix. To see how to find a cofactor, consider the example given below.

Consider a 3×3 matrix

$$A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}$$

The cofactor of an element $a_{ij}$ is $C_{ij} = (-1)^{i+j} M_{ij}$, where the minor $M_{ij}$ is the determinant of the 2×2 matrix obtained by deleting row $i$ and column $j$ of $A$. So its cofactor matrix is

$$C = \begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix}, \qquad \operatorname{adj}(A) = C^{T}$$

After finding adj(A), divide each of its elements by the determinant. To find the determinant, expand along the first row:

$$\det(A) = a_{11} C_{11} + a_{12} C_{12} + a_{13} C_{13}$$
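The whole procedure (cofactors, then adjoint, then divide by the determinant) can be sketched in a few lines of NumPy. The sample matrix is arbitrary, and the helper name `adjugate` is mine, not a NumPy function:

```python
import numpy as np

def adjugate(A):
    """Adjoint (adjugate) of a square matrix: transpose of the cofactor matrix."""
    n = A.shape[0]
    cof = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Minor M_ij: delete row i and column j, take the determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return cof.T

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [5.0, 6.0, 0.0]])

# inv(A) = adj(A) / det(A); compare against NumPy's built-in inverse
A_inv = adjugate(A) / np.linalg.det(A)
assert np.allclose(A_inv, np.linalg.inv(A))
```

This cofactor-based method is fine for small matrices, but libraries use factorization-based algorithms because the cofactor approach becomes very expensive as the matrix grows.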

The determinant has one of the deepest geometric significances, which most textbooks fail to cover: the absolute value of the determinant specifies how much the linear transformation represented by the matrix scales area (in 2-D) or volume (in 3-D) when applied to a region of space.
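A quick numeric check of this scaling interpretation, with matrix values chosen only for illustration: the unit square spanned by the standard basis vectors is mapped by $A$ to the parallelogram spanned by the columns of $A$, and that parallelogram's area equals $|\det(A)|$.

```python
import numpy as np

# A 2-D linear transformation (illustrative values)
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

det = np.linalg.det(A)

# Image of the unit square is the parallelogram spanned by the columns of A;
# its area is the 2-D cross product |u_x * v_y - u_y * v_x|
u, v = A[:, 0], A[:, 1]
area = abs(u[0] * v[1] - u[1] * v[0])

assert np.isclose(area, abs(det))  # the map scales area by |det(A)|
```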

In the next article we will cover some of the most used matrix concepts: norms, eigendecomposition, and SVD, formulations on which many algorithms are built, such as recommender systems and PCA. If you have any doubts, feel free to drop your queries in the comments section below. Hope you enjoyed reading it.