clean-IT: Towards Sustainable Digital Technologies

This video belongs to the openHPI course clean-IT: Towards Sustainable Digital Technologies.

Christian Bartz (HPI) - Knowledge Distillation

Time effort: approx. 11 minutes

About this video

AI models tend to have a high memory and computing footprint, so mechanisms to reduce that footprint are needed. Knowledge distillation is one such process for shrinking a neural network. It exploits the fact that many neural networks are overparameterized: the outputs of an already trained, large network can be used as training targets for a much smaller network. Such small models with distilled knowledge can then be deployed in real-world scenarios and help keep the footprint of deep neural networks low.
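
To make the teacher-student idea concrete, here is a minimal sketch in PyTorch (not taken from the course itself): a small student network is trained on a mixture of the true labels and the temperature-softened outputs of a frozen, larger teacher, following the classic formulation by Hinton et al. The layer sizes, temperature, and loss weighting below are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=4.0, alpha=0.5):
        # Soft-target term: KL divergence between the temperature-softened
        # teacher and student distributions, scaled by T^2 (Hinton et al., 2015).
        soft = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        # Hard-target term: ordinary cross-entropy against the true labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard

    # Toy setup: a large "teacher" and a much smaller "student" (sizes are illustrative).
    teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
    student = nn.Sequential(nn.Linear(784, 32), nn.ReLU(), nn.Linear(32, 10))
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    x = torch.randn(64, 784)             # stand-in input batch
    y = torch.randint(0, 10, (64,))      # stand-in ground-truth labels

    teacher.eval()
    with torch.no_grad():                # the teacher is frozen; it only provides targets
        teacher_logits = teacher(x)

    optimizer.zero_grad()
    loss = distillation_loss(student(x), teacher_logits, y)
    loss.backward()                      # only the student's weights are updated
    optimizer.step()

In practice the teacher would be a fully trained model and the student would be trained over many such batches; the key point is that the student learns from the teacher's full output distribution rather than from the hard labels alone.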

Christian Bartz is currently a Ph.D. student at the Chair of Internet Technologies and Systems at the Hasso Plattner Institute (HPI), University of Potsdam, Germany. Prior to his Ph.D. studies, he received his master's degree in IT-Systems Engineering from HPI in 2016. His current research interests revolve around computer vision and deep learning, especially text recognition in scene images, handwriting recognition for the automated analysis of archival material, and the automated generation of suitable training data for machine learning algorithms.