Knowledge Distillation
Posted on June 12, 2024 by MLNerds

This video covers model compression and explains what knowledge distillation is, including the distillation loss and the common frameworks employed for knowledge distillation.
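As a companion to the video, here is a minimal sketch of the classic distillation loss (soft-target KL term plus hard-label cross-entropy, following Hinton et al., 2015). It uses plain NumPy rather than a deep-learning framework, and the temperature `T` and mixing weight `alpha` are illustrative choices, not values prescribed by the video.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax (numerically stabilized)."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.5):
    """Weighted sum of a soft-target term and a hard-label term.

    - Soft term: KL(teacher || student) on temperature-softened
      distributions, scaled by T^2 so gradients keep a comparable
      magnitude as T grows (as in Hinton et al., 2015).
    - Hard term: ordinary cross-entropy with the ground-truth label.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft_loss = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T * T
    hard_loss = -float(np.log(softmax(student_logits)[true_label]))
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

During training, the teacher's logits are precomputed (or produced in inference mode) for each batch, and only the student's parameters are updated to minimize this loss.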