What are the different ways of preventing overfitting in a deep neural network? Explain the intuition behind each.


  1. L2 norm regularization : Penalizes the sum of squared weights, pushing them toward zero. Smaller weights produce a smoother, less complex function that is less likely to overfit.
  2. L1 norm regularization : Also pushes weights toward zero, but additionally drives many of them exactly to zero, inducing sparsity in the weights. A less common form of regularization.
  3. Dropout regularization : Randomly drops some hidden units during each training pass, so the network cannot become too reliant on any single neuron and must learn more robust, redundant representations.
  4. Early stopping : Halts training once validation performance stops improving, before the weights adjust to fit noise in the training data.
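The L1 and L2 penalties in items 1 and 2 can be sketched as simple functions of the weight vector (a minimal NumPy illustration, not tied to any particular framework; the `lam` regularization strength is an arbitrary example value):

```python
import numpy as np

def l2_penalty(weights, lam):
    # L2 norm regularization: penalize the sum of squared weights,
    # pushing them toward zero (but rarely exactly zero).
    return lam * np.sum(weights ** 2)

def l1_penalty(weights, lam):
    # L1 norm regularization: penalize the sum of absolute weights;
    # its constant-magnitude gradient drives many weights exactly to
    # zero, which is why it induces sparsity.
    return lam * np.sum(np.abs(weights))

w = np.array([0.5, -2.0, 0.0, 1.5])
print(l2_penalty(w, 0.1))  # ~0.65  (0.1 * 6.5)
print(l1_penalty(w, 0.1))  # ~0.4   (0.1 * 4.0)
```

Either penalty is added to the training loss, so gradient descent trades off fitting the data against keeping the weights small.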
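Dropout (item 3) is commonly implemented as "inverted dropout": a random mask zeroes units at training time and the survivors are scaled up so the expected activation is unchanged. A minimal NumPy sketch (the 0.5 keep probability is an example value):

```python
import numpy as np

def dropout(activations, keep_prob, rng):
    # Inverted dropout: zero each unit with probability (1 - keep_prob),
    # then divide by keep_prob so the expected activation is unchanged
    # and no rescaling is needed at test time.
    mask = rng.random(activations.shape) < keep_prob
    return (activations * mask) / keep_prob

rng = np.random.default_rng(0)
a = np.ones((4, 8))
dropped = dropout(a, keep_prob=0.5, rng=rng)
# each entry is now either 0.0 (dropped) or 2.0 (kept and rescaled)
```

Because a different random subset of units is active on every pass, no single neuron can carry the prediction on its own.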
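Early stopping (item 4) is usually driven by a held-out validation loss with a "patience" window. A small self-contained sketch that decides the stopping epoch from a precomputed list of validation losses (the loss values and `patience=3` are illustrative assumptions):

```python
def early_stopping_epoch(val_losses, patience=3):
    # Return the epoch at which training would stop: when the validation
    # loss has not improved for `patience` consecutive epochs.
    best = float("inf")
    since_best = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, since_best = loss, 0
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return len(val_losses) - 1

# validation loss falls, then rises as the network starts to overfit
losses = [1.0, 0.8, 0.7, 0.72, 0.75, 0.80, 0.9]
print(early_stopping_epoch(losses))  # -> 5
```

In practice one also restores the weights from the best epoch (epoch 2 here), so the deployed model is the one taken just before overfitting set in.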

