File:Knowledge Distillation.png: Revision history


25 April 2025

  • 02:56, 25 April 2025 Ciarang (talk | contribs) 187 bytes (+187) Diagram of knowledge distillation: a complex teacher model transfers its learned knowledge to a smaller student model using soft predictions, enabling efficient edge deployment.
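
The file description summarizes the soft-prediction transfer that the diagram depicts. As a minimal sketch of that idea (following Hinton et al., 2015; the function name and hyperparameter values here are illustrative, not taken from the file), the student is trained against the teacher's temperature-softened output distribution alongside the hard labels:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: the teacher's probabilities, softened by temperature T
    # so that the student also learns from the relative scores of wrong classes.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_preds = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between the softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al., 2015).
    kd = F.kl_div(soft_preds, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the hard ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # alpha balances imitating the teacher vs. fitting the labels (illustrative value).
    return alpha * kd + (1 - alpha) * ce
```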