clean-IT: Towards Sustainable Digital Technologies
clean-IT Initiative

This video is part of the openHPI course clean-IT: Towards Sustainable Digital Technologies.

Christian Bartz (HPI) - Knowledge Distillation

Estimated time: about 11 minutes

About this video


AI models tend to have a high memory and computing footprint, so mechanisms to reduce that footprint are needed. Knowledge distillation is one such technique for shrinking a neural network. It exploits the fact that many neural networks are overparameterized: the outputs of an already trained, large network serve as training targets for a much smaller network. Such small models with distilled knowledge can then be deployed in real-world scenarios and help keep the footprint of deep neural networks low.
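To make the procedure concrete, below is a minimal PyTorch sketch of the standard distillation recipe: soft targets from a frozen teacher, blended with the usual hard-label loss. The architectures, temperature T, and weighting alpha are illustrative assumptions, not the specific setup discussed in the video.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher: large network, assumed already trained.
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
# Hypothetical student: much smaller network to be trained.
student = nn.Sequential(nn.Linear(784, 30), nn.ReLU(), nn.Linear(30, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soften both output distributions with temperature T; the T*T factor
    # keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
teacher.eval()

# One illustrative training step on a random batch (stand-in for real data).
x = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))
with torch.no_grad():  # the teacher stays frozen
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

In a full training loop, the same step would run over the real dataset for several epochs; only the small student is updated, so the deployed model carries the teacher's knowledge at a fraction of the cost.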

Christian Bartz is currently a Ph.D. student at the Chair of Internet Technologies and Systems at the Hasso Plattner Institute (HPI), University of Potsdam, Germany. Prior to his Ph.D. studies, he received his master's degree in IT-Systems Engineering from HPI in 2016. His current research interests revolve around computer vision and deep learning, especially text recognition in scene images, handwriting recognition for the automated analysis of archival material, and the automated generation of suitable training data for machine learning algorithms.