Gradient Descent Scikit-learn Tutorial

Table of Contents

What is Gradient Descent?
Concept
Example
Video Tutorial

1 What is Gradient Descent?

In this blog we’ll be looking at a very important concept in error minimization known as gradient descent, and the idea behind it. The image we have in front of us (Figure-1) is a three-dimensional view of a gradient descent error surface.

Figure-1

The point marked as the global minimum is the point where you have the minimum error. Your machine learning model can start from any error value, at any point on this surface, and the objective of your machine learning model is to get down to that minimum error.
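To make this concrete, below is a minimal sketch of gradient descent on a one-dimensional quadratic error curve. The function, the starting point, and the learning rate are illustrative assumptions, not the surface shown in Figure-1.

    # Minimal gradient descent sketch on f(w) = (w - 3)**2, whose global
    # minimum sits at w = 3. All values here are illustrative assumptions.
    def gradient(w):
        return 2 * (w - 3)  # derivative of (w - 3)**2

    w = 10.0             # arbitrary starting point ("any error value")
    learning_rate = 0.1  # step size

    for step in range(50):
        w = w - learning_rate * gradient(w)

    print(f"w after 50 steps: {w:.4f}")  # approaches the global minimum at 3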

2 Concept

Now the two major approaches to reach the global minimum are either to take very big steps (a large learning rate) or to take very small steps (a small learning rate).

Figure-2
  1. The problem with a very big step is that you might overshoot the minimum and end up on the other side of the curve. The problem with a very small rate is that the steps keep getting smaller and smaller, and as you approach the global minimum you end up taking infinitely many tiny steps, so you may never actually reach it.
  2. So, to handle that we use something known as a targeted gradient descent, in which you take a step size suited to the model, which lets you reach the global minimum at a much faster rate (see the sketch after Figure-3).
Figure-3
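To see the effect of the step size, here is a rough sketch comparing three learning rates on the simple curve f(w) = w**2 (minimum at w = 0). The specific rates are assumptions chosen to illustrate overshooting, slow convergence, and a well-chosen step.

    # Compare step sizes on f(w) = w**2; its gradient is 2*w.
    def run_descent(learning_rate, steps=100, w=5.0):
        for _ in range(steps):
            w = w - learning_rate * 2 * w
        return w

    print("too big  :", run_descent(1.1))    # overshoots and diverges away from 0
    print("too small:", run_descent(0.001))  # still far from 0 after 100 steps
    print("suitable :", run_descent(0.1))    # effectively reaches 0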

3 Example

  1. Firstly, we load the bank data using pandas.
  2. Then we import the necessary libraries; here we are importing the KFold class from sklearn's model_selection module.
  3. We pass the number of folds (which also sets the number of iterations) and set shuffle to ‘False’.
  4. Now we keep track, for every fold, of which portion of the data is used for training and which for testing.
  5. The held-out data is never seen by the machine learning model during training, which allows us to evaluate it properly. (A sketch of these steps follows Figure-4.)
Figure-4
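Below is a sketch of steps 1–3 from the list above. The file name 'bank.csv' and the number of folds are assumptions; substitute the actual bank dataset and settings used in the video.

    import pandas as pd
    from sklearn.model_selection import KFold

    # 1. Load the bank data with pandas (file name assumed here)
    bank_data = pd.read_csv('bank.csv')

    # 2-3. Set up K-fold cross-validation: n_splits gives the number of
    # folds (and hence iterations), with shuffle set to False
    kf = KFold(n_splits=5, shuffle=False)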

In the image we can see the training and testing observations we get at each K-fold iteration over time; a sketch of inspecting these splits follows.
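A quick way to inspect those splits, assuming the bank_data and kf objects from the previous sketch, is to iterate over kf.split and count the rows on each side:

    # For every K-fold iteration, report how many observations fall into
    # the training set and how many into the testing set.
    for i, (train_idx, test_idx) in enumerate(kf.split(bank_data), start=1):
        print(f"Iteration {i}: {len(train_idx)} training rows, "
              f"{len(test_idx)} testing rows")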

Figure-5
  1. We can see the accuracy of the model: in the first iteration we have an accuracy of 100%, followed by 93 percent, and so on.
  2. So, the concept of k-fold is essential when evaluating a gradient descent model. We can also refer to the documentation for KFold in sklearn.model_selection. (A sketch of scoring the model across the folds follows below.)
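Here is a sketch of scoring a model across the K folds, as in Figure-5. The target column name ('deposit'), the one-hot encoding of the features, and the choice of LogisticRegression are assumptions; the video does not spell these out.

    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Assumed feature/target split; adjust to the actual bank dataset
    X = pd.get_dummies(bank_data.drop(columns=['deposit']))
    y = bank_data['deposit']

    model = LogisticRegression(max_iter=1000)
    scores = cross_val_score(model, X, y, cv=kf, scoring='accuracy')
    print(scores)  # one accuracy value per iteration, e.g. 1.00, 0.93, ...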

Video Tutorial
