Author: abhinavsinghml

I write blog posts about machine learning and data science.

Effects of hyperparameters on MLPs


This week I was doing some simple experiments with MLPs, trying out different architectures and different learning rates, with batch normalization and sometimes dropout applied at random. There was no definite goal to this experiment other than to observe the behavior of neural networks when we change the learning rate on different architectures. The goal of this article is to share some of the simple findings...
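As a rough sketch of the kind of experiment described above, the snippet below builds a small MLP in Keras with a configurable learning rate, optional batch normalization, and optional dropout; the input size, layer widths, and learning-rate grid are assumptions made for illustration, not the exact setup from these experiments.

```python
from tensorflow import keras

def build_mlp(learning_rate, use_batchnorm=False, dropout_rate=0.0):
    """Small MLP whose training behavior we can compare across learning rates."""
    model = keras.Sequential()
    model.add(keras.Input(shape=(20,)))          # assumed input dimension
    model.add(keras.layers.Dense(64, activation="relu"))
    if use_batchnorm:
        model.add(keras.layers.BatchNormalization())
    if dropout_rate > 0:
        model.add(keras.layers.Dropout(dropout_rate))
    model.add(keras.layers.Dense(1, activation="sigmoid"))
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Same architecture, different learning rates: compare how training behaves.
for lr in (1e-1, 1e-2, 1e-3):
    model = build_mlp(lr, use_batchnorm=True, dropout_rate=0.2)
    print(f"learning rate {lr}: {model.count_params()} parameters")
```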

Feature selection


"How well your machine learning model predicts depends upon what data you use to train it." If the data itself is bad, don't expect good results. Now the immediate question arises: "How do you define bad data?" The answer is quite difficult, as the quality of data depends upon many factors, but among all of them the choice of features used to create the data is...
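As a small illustration of choosing features, here is a sketch using scikit-learn's SelectKBest on a synthetic dataset; the dataset, the ANOVA F-score criterion, and k=5 are assumptions for this example rather than choices taken from the article.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data: 10 features, only 3 of which actually carry signal.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=3, n_redundant=0, random_state=0)

# Keep the 5 features with the highest ANOVA F-score against the target.
selector = SelectKBest(score_func=f_classif, k=5)
X_selected = selector.fit_transform(X, y)

print("original shape:", X.shape)
print("selected shape:", X_selected.shape)
print("kept feature indices:", selector.get_support(indices=True))
```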

Introduction to batch normalization


The goal of this article is to answer three simple questions: What is batch normalization? Why do we need batch normalization? How do we use it in a neural network? What is batch normalization? Batch normalization refers to the normalization done over a batch of data between hidden layers. Consider the diagram below for a complete understanding. The normalization is performed on the activation...
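For a concrete picture of the transformation described above, here is a minimal NumPy sketch of batch normalization applied to one batch of activations; the batch size, feature count, eps, and the gamma/beta parameters are illustrative assumptions.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of activations, then scale and shift.

    x     : array of shape (batch_size, features)
    gamma : learnable scale, shape (features,)
    beta  : learnable shift, shape (features,)
    """
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta              # scaled and shifted output

# Example: a batch of 4 samples, each with 3 hidden-unit activations.
activations = np.random.randn(4, 3) * 5 + 2
out = batch_norm(activations, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.var(axis=0))     # roughly 0 and 1 per feature
```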

Setting up your environment


Setting up an environment can sometimes be a difficult process, and choosing between the different options can be confusing. So in this article we will cover how to set up your own data science environment on your machine. This article is not operating system dependent; the procedure followed here can be used on any operating system. Instead of comparing all the different environments for data science purposes, we...

Importance of randomness in machine learning


Random numbers are a very important part of machine learning; they are the starting point for predicting the output in the presence of input data. In order to understand the working of coefficients correctly you must understand the importance of randomness and how to generate it. After reading this article you will know: the importance of randomness, its use in machine learning, and how to generate it. Random...
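As a small example of generating (and reproducing) random numbers in Python, here is a sketch using NumPy; the seed value and array shapes are arbitrary choices made for the illustration.

```python
import numpy as np

# Seeding the generator makes the "random" sequence reproducible,
# which matters when you want to repeat an experiment exactly.
rng = np.random.default_rng(seed=42)

weights = rng.normal(loc=0.0, scale=1.0, size=5)  # e.g. random starting coefficients
indices = rng.permutation(10)                     # e.g. shuffling training examples

print(weights)
print(indices)

# Re-creating the generator with the same seed reproduces the same numbers.
rng_again = np.random.default_rng(seed=42)
assert np.allclose(rng_again.normal(0.0, 1.0, 5), weights)
```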

Introduction to hypothesis testing


Hypothesis testing is probably the most confusing topic in the whole of statistics; even experienced machine learning engineers suffer from a lack of crisp understanding of it. In this tutorial we will try to answer the following questions: What is hypothesis testing? What is it used for? What is hypothesis testing? The assumption that the input data possesses a specific structure, and on that we...
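To give a feel for what a hypothesis test looks like in practice, here is a minimal sketch of a two-sample t-test with SciPy; the synthetic samples and the 0.05 significance level are assumptions for the example, not values from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two synthetic samples: under the null hypothesis they share the same mean.
sample_a = rng.normal(loc=0.0, scale=1.0, size=50)
sample_b = rng.normal(loc=0.5, scale=1.0, size=50)

t_stat, p_value = stats.ttest_ind(sample_a, sample_b)

alpha = 0.05  # chosen significance level
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < alpha:
    print("Reject the null hypothesis: the means differ.")
else:
    print("Fail to reject the null hypothesis.")
```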

Weight initialization in neural networks


Weight initialization is one of the major factors that directly affects the convergence of the model, i.e. how well the optimizer minimizes the loss function. Before we even get started I am assuming that the reader is aware of what neural networks are, what hidden layers are, how they are connected, etc. Though I will provide an abstract overview of the working of a neural network. We will be...
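To make the effect of the starting weights concrete, here is a minimal NumPy sketch comparing a naive unit-variance initialization with the Glorot/Xavier scheme for one fully connected layer; the layer sizes are arbitrary assumptions, and the article itself may discuss other schemes.

```python
import numpy as np

rng = np.random.default_rng(1)
fan_in, fan_out = 256, 128   # sizes of the incoming and outgoing layers (assumed)

# Naive initialization: unit-variance Gaussians, which can make activations
# grow or shrink from layer to layer and slow down convergence.
w_naive = rng.normal(0.0, 1.0, size=(fan_in, fan_out))

# Glorot/Xavier initialization: variance scaled by the layer sizes so the
# signal keeps roughly the same magnitude in the forward and backward pass.
limit = np.sqrt(6.0 / (fan_in + fan_out))
w_glorot = rng.uniform(-limit, limit, size=(fan_in, fan_out))

x = rng.normal(0.0, 1.0, size=(32, fan_in))   # a batch of inputs
print("naive output std :", (x @ w_naive).std())
print("glorot output std:", (x @ w_glorot).std())
```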

How to learn mathematics for machine learning


Learning and understanding mathematics is key to understanding machine learning algorithms, as machine learning is applied mathematics. Mathematics is just a language like any other, but it is so unambiguous and precise that we can explain phenomena with it. Understanding mathematics is crucial for machine learning, and by understanding I mean knowing what each step means, both...

Introduction to TensorFlow Part 4


If you are new to TensorFlow, we recommend starting this series from part 1. This is part 4 of the TensorFlow series. The goal of every article in it is to help you understand one of the most popular deep learning libraries out there. This series is entirely focused on beginners who are either starting to learn deep learning and want to build their own neural nets or want to build state-of-the-art...
