What are bias, variance, overfitting and underfitting in machine learning?
In this blog we are going to discuss the important topics of bias and variance, and also overfitting and underfitting.
Many people train machine learning models but struggle to improve their accuracy because they lack a clear understanding of bias and variance.
In this guide we will walk through these terms and what they mean, and by the end of the blog you should be able to improve your model's accuracy.
As a simple intuition (not the formal statistical definition), you can think of bias as the error on the training data, and variance as the error on the testing data.
Things might not be clear yet, right?
Assume there is a dataset of 10,000 images for training the model and 2,000 images for testing it.
If the model performs well on the training images but not on the testing images, we have the problem of overfitting. This means the accuracy on the training images is far greater than on the testing images: the model does well on data it has seen, but poorly on data it has never seen.
If the model performs poorly on both the training and the testing images, we have the problem of underfitting. This means the accuracy on both sets is not good enough, so the model performs badly overall.
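The two patterns above can be reproduced in a few lines with scikit-learn. This is just an illustrative sketch: it uses synthetic data (generated with `make_classification`) as a stand-in for the 10,000 training / 2,000 testing images, and decision trees instead of an image model, but the same train/test accuracy gap appears.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the training and testing images.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# An unrestricted decision tree can memorize the training set -> overfitting:
# training accuracy is far higher than testing accuracy.
overfit = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("Overfit  train acc:", overfit.score(X_train, y_train))
print("Overfit  test  acc:", overfit.score(X_test, y_test))

# A depth-1 tree (a "stump") is too simple -> underfitting:
# accuracy is mediocre on both the training and the testing set.
underfit = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X_train, y_train)
print("Underfit train acc:", underfit.score(X_train, y_train))
print("Underfit test  acc:", underfit.score(X_test, y_test))
```

Run it and compare the four numbers: the unrestricted tree scores (near) perfectly on the data it trained on but noticeably worse on the held-out data, while the stump scores poorly on both.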
So now, how do BIAS and VARIANCE relate to this?
Earlier we defined them as:
BIAS: error on the training data
VARIANCE: error on the testing data
So in the case of overfitting, the model performs well on the training images, meaning the training error is low: this is LOW BIAS. But its performance on the testing images is not good enough, which leads to HIGH VARIANCE.
Similarly, in the case of underfitting, both training and testing accuracy are low, so the error is high on both sets, which leads to HIGH BIAS and HIGH VARIANCE.
Therefore the error on both the training and testing images should be low: for the model to perform well, it must have LOW BIAS and LOW VARIANCE.
There are various optimization, regularization and initialization techniques that push a model towards LOW BIAS and LOW VARIANCE, and therefore towards better accuracy.
All of these techniques will be discussed in a later post.
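As a small taste of one such technique, here is a hedged sketch of L2 regularization using scikit-learn's `Ridge` on made-up noisy data (a sine curve, chosen only for illustration). A high-degree polynomial with almost no regularization (`alpha` near 0) fits the training points almost perfectly, the low-bias/high-variance overfitting pattern, while increasing `alpha` shrinks the weights and reduces variance at the cost of a little training error.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 40)).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + rng.normal(0, 0.2, 40)  # noisy targets
X_train, y_train = X[::2], y[::2]   # every other point for training
X_test, y_test = X[1::2], y[1::2]   # the rest held out for testing

def fit_eval(alpha):
    """Fit a degree-15 polynomial with ridge penalty `alpha`;
    return (training MSE, testing MSE)."""
    model = make_pipeline(PolynomialFeatures(degree=15), Ridge(alpha=alpha))
    model.fit(X_train, y_train)
    return (mean_squared_error(y_train, model.predict(X_train)),
            mean_squared_error(y_test, model.predict(X_test)))

# Tiny alpha -> little regularization -> low bias but high variance (overfit).
# Larger alpha -> weights shrink -> variance drops, training error rises a bit.
for alpha in (1e-8, 1e-3, 1.0):
    train_mse, test_mse = fit_eval(alpha)
    print(f"alpha={alpha:g}  train MSE={train_mse:.4f}  test MSE={test_mse:.4f}")
```

Note how the training error grows as `alpha` increases: regularization deliberately trades a little bias for a larger reduction in variance.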
For more such information, subscribe to our blog. For queries and information related to other topics, comment down below.