Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems

Bibliographic Details
Other Authors: Pedrycz, Witold (Editor), Chen, Shyi-Ming (Editor)
Format: eBook
Language: English
Published: Cham: Springer International Publishing, 2023
Edition: 1st ed. 2023
Series: Studies in Computational Intelligence
Collection: Springer eBooks 2005- (for collection details, see MPG.ReNa)
Description
Summary: The book provides timely coverage of the paradigm of knowledge distillation, an efficient form of model compression. Knowledge distillation is positioned within the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, along with recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is of relevance to a broad audience, including researchers and practitioners active in machine learning and pursuing fundamental and applied research on advanced learning paradigms.
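
For orientation, the following is a minimal sketch of the soft-target teacher–student distillation loss that the summary alludes to (Hinton-style knowledge distillation). It is not taken from the book: the network sizes, temperature T, and weighting alpha are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Blend the soft-target KL term (teacher -> student) with hard-label cross-entropy.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                      # rescale by T^2 to keep gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Illustrative teacher/student pair: the large teacher is frozen,
# only the lightweight student is trained.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10)).eval()
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(32, 784)             # dummy input batch
y = torch.randint(0, 10, (32,))      # dummy labels
with torch.no_grad():
    t_logits = teacher(x)            # teacher provides soft targets only
loss = distillation_loss(student(x), t_logits, y)
loss.backward()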
Physical Description: VIII, 232 p., 70 illus., 51 illus. in color; online resource
ISBN: 9783031320958