Batch vs Mini-Batch vs Stochastic Gradient Descent
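The core difference between the three variants is how much data each parameter update sees: batch gradient descent uses the full dataset per step, mini-batch gradient descent uses a small subset, and stochastic gradient descent uses a single example. Below is a minimal NumPy sketch of that idea, assuming a linear-regression MSE loss; the data, learning rate, and epoch count are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

def gradient(w, X_part, y_part):
    # Gradient of the MSE loss for a linear model on a subset of the data.
    return 2 * X_part.T @ (X_part @ w - y_part) / len(y_part)

def train(batch_size, lr=0.05, epochs=100):
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)  # reshuffle each epoch
        for start in range(0, n, batch_size):
            part = idx[start:start + batch_size]
            w -= lr * gradient(w, X[part], y[part])
    return w

w_batch = train(batch_size=len(y))  # batch GD: full dataset per update
w_mini  = train(batch_size=16)      # mini-batch GD: small subsets per update
w_sgd   = train(batch_size=1)       # stochastic GD: one example per update
print(w_batch, w_mini, w_sgd, sep="\n")
```

All three should recover weights close to `[2.0, -1.0, 0.5]`; the trade-off is that smaller batches give noisier but cheaper updates.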
Batch norm and layer norm are common normalization techniques. This brief video explains why normalization is needed in deep neural networks and covers the common types of normalization.
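As a quick illustration (not taken from the video itself): batch norm normalizes each feature across the batch, while layer norm normalizes each example across its features. The NumPy sketch below assumes 2-D activations of shape (batch, features) and omits the learnable scale and shift parameters.

```python
import numpy as np

# Illustrative activations: (batch, features).
x = np.random.default_rng(1).normal(size=(4, 8))
eps = 1e-5  # small constant to avoid division by zero

# Batch norm: normalize each feature across the batch dimension (axis 0).
bn = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

# Layer norm: normalize each example across its features (axis 1).
ln = (x - x.mean(axis=1, keepdims=True)) / np.sqrt(x.var(axis=1, keepdims=True) + eps)

print(bn.mean(axis=0).round(6))  # ~0 for every feature
print(ln.mean(axis=1).round(6))  # ~0 for every example
```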
This brief video discusses mixed data and the challenges it poses for machine learning algorithms.
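As a small example of the challenge (again, not from the video): mixed data usually needs different preprocessing per column type before most algorithms can consume it. Below is a sketch using scikit-learn's ColumnTransformer; the DataFrame columns and values are hypothetical.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy mixed-type table; column names and values are made up for illustration.
df = pd.DataFrame({
    "age":    [25, 40, 31],           # numeric
    "income": [48000, 90000, 62000],  # numeric
    "city":   ["NY", "SF", "NY"],     # categorical
})

# Scale numeric columns; one-hot encode categorical ones.
pre = ColumnTransformer([
    ("num", StandardScaler(), ["age", "income"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
])

X = pre.fit_transform(df)
print(X)
```

Fitting such a transformer on the training split and reusing it at inference time keeps the numeric scaling and category encodings consistent.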