clean-IT: Towards Sustainable Digital Technologies

Christian Bartz (HPI) - Knowledge Distillation

Time effort: approx. 11 minutes

About this video


AI models tend to have a high memory and computing footprint, so mechanisms to decrease that footprint are needed. Knowledge distillation is one such technique for reducing the footprint of a neural network. It exploits the fact that many neural networks are overparameterized: a large, already trained network (the teacher) can be compressed by using its outputs as training targets for a much smaller network (the student). Such small models with distilled knowledge can then be deployed in real-world scenarios and help keep the footprint of deep neural networks low.
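The video itself contains no code; as a rough illustration of the idea described above, here is a minimal sketch of the classic distillation setup (Hinton et al., "Distilling the Knowledge in a Neural Network") in PyTorch. The layer sizes, temperature T, and loss weight alpha are illustrative assumptions rather than values from the course, and the teacher's weights are assumed to come from a prior training run.

```python
# Minimal knowledge-distillation sketch in PyTorch. Sizes and
# hyperparameters are illustrative assumptions, not course values.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher: large and already trained (weights assumed loaded).
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
# Hypothetical student: a much smaller network to be trained.
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

teacher.eval()  # the teacher stays frozen; only the student is updated
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

T = 4.0      # temperature: softens the teacher's output distribution
alpha = 0.7  # weight of the distillation term vs. the hard-label term

def distillation_step(x, y):
    with torch.no_grad():
        teacher_logits = teacher(x)  # soft targets from the teacher
    student_logits = student(x)
    # KL divergence between the softened distributions, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, y)  # ground-truth labels
    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: one training step on a random batch (stand-in for real data).
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
print(distillation_step(x, y))
```

After training, only the small student network needs to be deployed, which is what keeps the inference-time memory and computing footprint low.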

Christian Bartz is currently a Ph.D. student at the Chair of Internet Technologies and Systems at the Hasso Plattner Institute (HPI), University of Potsdam, Germany. Prior to his Ph.D. studies, he received his master's degree in IT-Systems Engineering from HPI in 2016. His current research interests revolve around computer vision and deep learning, especially text recognition in scene images, handwriting recognition for automated analysis of archival material, and automated generation of suitable training data for machine learning algorithms.