Batch vs Mini-Batch vs Stochastic Gradient Descent
Batch norm and layer norm are common normalization techniques in deep neural networks. This brief video explains why normalization is needed and the main types of normalization layers.
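To make the difference between the two concrete, here is a minimal NumPy sketch (the learnable scale and shift parameters, gamma and beta, that real batch-norm and layer-norm layers carry are omitted). Batch norm normalizes each feature using statistics computed across the samples in a batch, while layer norm normalizes each sample using statistics computed across its own features:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature (column) with statistics
    # computed across the batch dimension (axis 0).
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def layer_norm(x, eps=1e-5):
    # Normalize each sample (row) with statistics
    # computed across its own feature dimension (axis 1).
    mean = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(4, 3) * 5.0 + 2.0  # 4 samples, 3 features, shifted and scaled
print(batch_norm(x).mean(axis=0))      # ~0 for every feature
print(layer_norm(x).mean(axis=1))      # ~0 for every sample
```

The only difference between the two functions is the axis the statistics are computed over, which is why layer norm (unlike batch norm) behaves identically regardless of batch size.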
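Since the title contrasts batch, mini-batch, and stochastic gradient descent, here is a rough sketch of the distinction (a hypothetical `gradient_descent` helper for least-squares linear regression, not code from the video): the three variants differ only in how many samples feed each parameter update.

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.1, epochs=100, seed=0):
    # Least-squares linear regression. batch_size selects the variant:
    # len(X) -> batch GD, 1 -> stochastic GD, in between -> mini-batch GD.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        idx = rng.permutation(len(X))          # shuffle once per epoch
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)  # MSE gradient
            w -= lr * grad
    return w

X = np.random.randn(100, 2)
y = X @ np.array([3.0, -1.0])                  # true weights: [3, -1]
print(gradient_descent(X, y, batch_size=len(X)))  # batch GD: one update per epoch
print(gradient_descent(X, y, batch_size=16))      # mini-batch GD
print(gradient_descent(X, y, batch_size=1))       # stochastic GD: noisiest updates
```

Smaller batches give cheaper but noisier updates; the full batch gives the exact gradient at the cost of one pass over all data per step.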