clean-IT: Towards Sustainable Digital Technologies

This video belongs to the openHPI course clean-IT: Towards Sustainable Digital Technologies. Would you like to see more?

Christian Bartz (HPI) - Knowledge Distillation

Duration: 11 minutes


About this video


AI models tend to have a high memory and computing footprint, so mechanisms to reduce that footprint are necessary. Knowledge distillation is one such process for reducing the footprint of a neural network. It exploits the fact that many neural networks are overparameterized and can be compressed: the outputs of an already trained network are used as training targets for a much smaller network. Such small models with distilled knowledge can then be deployed in real-world scenarios and help keep the footprint of using deep neural networks low.
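To make the idea concrete, here is a minimal sketch of a single distillation training step. PyTorch, the layer sizes, the temperature, and the loss weighting are illustrative assumptions; the video does not prescribe a specific framework or architecture. The small student network is trained against both the ground-truth labels and the teacher's temperature-softened output distribution (the soft targets), following the common soft-target formulation of knowledge distillation.

```python
# Minimal knowledge-distillation sketch (framework and sizes are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher/student networks for a 10-class task: the teacher is
# large and assumed to be already trained; the student is much smaller.
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Weighted sum of a soft-target term and the usual hard-label term."""
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# One illustrative training step on random stand-in data.
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
inputs = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))

teacher.eval()
with torch.no_grad():                      # the teacher stays frozen
    teacher_logits = teacher(inputs)

optimizer.zero_grad()
loss = distillation_loss(student(inputs), teacher_logits, labels)
loss.backward()
optimizer.step()
```

The temperature smooths the teacher's output so the student can also learn from the relative probabilities the teacher assigns to incorrect classes, which is where much of the transferred knowledge lies.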

Christian Bartz is currently a Ph.D. student at the Chair of Internet Technologies and Systems at the Hasso Plattner Institute (HPI), University of Potsdam, Germany. Prior to his Ph.D. studies, he received his master's degree in IT-Systems Engineering from HPI in 2016. His current research interests revolve around computer vision and deep learning, especially text recognition in scene images, handwriting recognition for the automated analysis of archival material, and the automated generation of suitable training data for machine learning algorithms.