Model Distillation
Definition
A technique for compressing a large, complex model (the teacher) into a smaller, more efficient model (the student) that approximates the teacher's performance. Instead of training only on the original hard labels, the student learns from the teacher's softened output probabilities (soft labels), which carry richer information about how the teacher ranks the classes; in practice this soft-target loss is usually blended with the standard cross-entropy loss on the ground-truth labels.
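The blended loss described above can be sketched as follows. This is a minimal illustration in pure Python, not a production implementation; the `temperature` and `alpha` parameters follow the common convention from Hinton et al. (2015), where the temperature softens both distributions and `alpha` weights the soft-target term against the hard-label cross-entropy.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities, optionally softened by a temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Blend of soft-target KL divergence and hard-label cross-entropy."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # Soft targets: KL(teacher || student) over temperature-softened outputs
    kl = sum(pt * (math.log(pt) - math.log(ps))
             for pt, ps in zip(p_teacher, p_student))
    # Hard targets: cross-entropy against the ground-truth label
    ce = -math.log(softmax(student_logits)[true_label])
    # temperature**2 rescales the soft-target gradients (Hinton et al. convention)
    return alpha * temperature**2 * kl + (1 - alpha) * ce
```

A higher temperature flattens the teacher's distribution, exposing the relative probabilities of incorrect classes ("dark knowledge") that the student would not see from hard labels alone.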