BIROn - Birkbeck Institutional Research Online

    Knowledge distillation: a survey

    Gou, J. and Yu, B. and Maybank, S. and Tao, D. (2021) Knowledge distillation: a survey. International Journal of Computer Vision, ISSN 0920-5691. (In Press)

    KD_Survey-arxiv.pdf - Author's Accepted Manuscript



    In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of model parameters. However, it is a challenge to deploy these cumbersome deep models on devices with limited resources, e.g., mobile phones and embedded devices, not only because of the high computational complexity but also because of the large storage requirements. To this end, a variety of model compression and acceleration techniques have been developed. As a representative type of model compression and acceleration, knowledge distillation effectively learns a small student model from a large teacher model. It has received rapidly increasing attention from the community. This paper provides a comprehensive survey of knowledge distillation from the perspectives of knowledge categories, training schemes, teacher-student architectures, distillation algorithms, performance comparison and applications. Furthermore, challenges in knowledge distillation are briefly reviewed, and directions for future research are discussed.
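    The core idea summarized in the abstract — a small student learning from a large teacher — is most commonly realized as response-based distillation, where the student matches the teacher's temperature-softened output distribution in addition to the hard labels. Below is a minimal NumPy sketch of that loss, under the assumption of the standard formulation (temperature T, mixing weight alpha); all function names are illustrative, not from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; T > 1 flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.5):
    """Weighted sum of a soft-target term and a hard-label term.

    Soft term: KL(teacher || student) at temperature T, scaled by T^2 so its
    gradient magnitude stays comparable to the hard cross-entropy term.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T * T
    hard = float(-np.log(softmax(student_logits)[true_label]))
    return alpha * soft + (1.0 - alpha) * hard
```

When the student's logits already match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, which is a quick sanity check on the implementation.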


    Item Type: Article
    Keyword(s) / Subject(s): Deep neural networks, Model compression, Knowledge distillation, Knowledge transfer, Teacher-student architecture
    School: School of Business, Economics & Informatics > Computer Science and Information Systems
    Depositing User: Steve Maybank
    Date Deposited: 17 May 2021 14:11
    Last Modified: 22 Mar 2022 01:10

