
File:Knowledge-Distillation.webp

From Edge Computing Wiki

Summary

Diagram of knowledge distillation: a complex teacher model transfers learned knowledge to a smaller student model using soft predictions to enable efficient edge deployment.
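The transfer described in the caption is usually implemented as a blended loss: the student matches the teacher's temperature-softened predictions while also fitting the ground-truth labels. Below is a minimal sketch of such a distillation loss, assuming a PyTorch-style setup; the function name, temperature, and weighting factor alpha are illustrative assumptions, not values taken from the diagram.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend soft-target guidance from the teacher with the hard-label loss.

    temperature and alpha are illustrative hyperparameters (assumptions).
    """
    # Soften both output distributions, as in the diagram's "soft predictions";
    # scaling by T^2 keeps the gradient magnitude comparable to the hard loss.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In practice the teacher's logits are computed once (or on the fly with gradients disabled), and only the smaller student is trained, which is what makes the approach attractive for edge deployment.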

File history


Date/Time: 02:53, 25 April 2025 (current)
Dimensions: 2,048 × 850 (280 KB)
User: Ciarang (talk | contribs)
Comment: Diagram of knowledge distillation: a complex teacher model transfers learned knowledge to a smaller student model using soft predictions to enable efficient edge deployment.

There are no pages that use this file.