Knowledge Distillation in Deep Learning - Keras Implementation

Knowledge distillation is a technique used in deep learning to transfer the knowledge learned by a large, complex model (the teacher model) to a smaller, simpler model (the student model). The idea is to train the student not only on the hard labels used to train the teacher, but also on the teacher's output distribution (soft targets). By learning from the teacher's predictions, the student can often reach better accuracy than if it had been trained on the hard labels alone. The process is…
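To make the idea concrete, below is a minimal sketch of how distillation can be wired up in Keras, following the classic TF 2.x custom `train_step` pattern: a trained teacher produces soft targets (temperature-scaled softmax over its logits), and the student is optimized on a weighted mix of the usual hard-label loss and a KL-divergence loss against those soft targets. The class name `Distiller`, the hyperparameters `temperature` and `alpha`, and the variable names are illustrative assumptions, not a reference implementation.

```python
import tensorflow as tf
from tensorflow import keras


class Distiller(keras.Model):
    """Trains a student model against a frozen, pre-trained teacher."""

    def __init__(self, teacher, student, temperature=3.0, alpha=0.1):
        super().__init__()
        self.teacher = teacher            # assumed already trained
        self.student = student
        self.temperature = temperature    # softens the logits into "soft targets"
        self.alpha = alpha                # weight of the hard-label loss

    def compile(self, optimizer, metrics, student_loss_fn, distill_loss_fn):
        super().compile(optimizer=optimizer, metrics=metrics)
        self.student_loss_fn = student_loss_fn    # e.g. cross-entropy on true labels
        self.distill_loss_fn = distill_loss_fn    # e.g. KL divergence on soft targets

    def train_step(self, data):
        x, y = data
        # Teacher runs in inference mode; its weights are not updated.
        teacher_logits = self.teacher(x, training=False)

        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)

            # Hard-label loss: student vs. ground-truth labels.
            student_loss = self.student_loss_fn(y, student_logits)

            # Distillation loss: student vs. teacher soft targets.
            # Scaling by T^2 keeps gradient magnitudes comparable (Hinton et al.).
            distill_loss = self.distill_loss_fn(
                tf.nn.softmax(teacher_logits / self.temperature, axis=1),
                tf.nn.softmax(student_logits / self.temperature, axis=1),
            ) * (self.temperature ** 2)

            loss = self.alpha * student_loss + (1.0 - self.alpha) * distill_loss

        # Only the student's weights are updated.
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))

        self.compiled_metrics.update_state(y, student_logits)
        return {m.name: m.result() for m in self.metrics}
```

Usage would look roughly like the following, where `teacher_model`, `student_model`, `x_train`, and `y_train` are placeholders for your own trained teacher, smaller student architecture, and training data:

```python
distiller = Distiller(teacher=teacher_model, student=student_model,
                      temperature=3.0, alpha=0.1)
distiller.compile(
    optimizer=keras.optimizers.Adam(),
    metrics=[keras.metrics.SparseCategoricalAccuracy()],
    student_loss_fn=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    distill_loss_fn=keras.losses.KLDivergence(),
)
distiller.fit(x_train, y_train, epochs=3)
```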
