Beyond Knowledge Distillation: Collaborative Learning for Bidirectional Model Assistance

Bibliographic Details
Main Authors: Jinzhuo Wang, Wenmin Wang, Wen Gao
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8409945/
Description
Summary: Knowledge distillation (KD) is a powerful technique that enables a well-trained large model to assist a small model. However, KD is constrained to a teacher-student setting, so it may not be appropriate in general situations where the learning abilities of the two models are uncertain or not significantly different. In this paper, we propose a collaborative learning (CL) method, a flexible strategy that achieves bidirectional assistance between two models through a mutual knowledge base (MKB). The MKB collects mutual information and provides assistance; it is updated along with the learning process of the two models and deployed separately once converged. We show that CL can be applied to any two deep neural networks and is easily extended to multiple networks. Compared with the teacher-student framework, CL achieves bidirectional assistance and imposes no specific requirements on the involved models, such as pretraining or differing abilities. Experimental results demonstrate that CL efficiently improves the learning ability and convergence speed of both models, outperforming a range of relevant methods, including ensemble learning and several KD-based approaches. More importantly, we show that state-of-the-art models such as DenseNet can be greatly improved using CL alongside other popular models.
ISSN: 2169-3536
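The summary's core idea can be illustrated with a toy sketch. This is not the paper's implementation (the abstract does not give the MKB's exact form): here the "mutual knowledge base" is assumed to be an exponential moving average of the two peers' averaged predictions, and each peer is trained on its own cross-entropy loss plus a term pulling its predictions toward the MKB, so assistance flows in both directions. Two linear softmax classifiers stand in for the two deep networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable 2-class data
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = (X @ w_true > 0).astype(int)
Y = np.eye(2)[y]  # one-hot labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Two peer models (neither is a pretrained "teacher")
W1 = rng.normal(scale=0.01, size=(5, 2))
W2 = rng.normal(scale=0.01, size=(5, 2))

# Hypothetical MKB: running average of the peers' consensus predictions,
# updated along with training (a stand-in for the paper's MKB)
mkb = np.full((200, 2), 0.5)

lr, alpha, ema = 0.1, 0.5, 0.9  # alpha weights the MKB-assistance term
for step in range(300):
    p1, p2 = softmax(X @ W1), softmax(X @ W2)
    # Update the MKB with the current consensus of both peers
    mkb = ema * mkb + (1 - ema) * 0.5 * (p1 + p2)
    # Each peer: gradient of CE to labels, plus CE toward the MKB as a
    # soft target (d/dlogits of CE with soft target t is p - t)
    g1 = X.T @ ((p1 - Y) + alpha * (p1 - mkb)) / len(X)
    g2 = X.T @ ((p2 - Y) + alpha * (p2 - mkb)) / len(X)
    W1 -= lr * g1
    W2 -= lr * g2

acc1 = (softmax(X @ W1).argmax(1) == y).mean()
acc2 = (softmax(X @ W2).argmax(1) == y).mean()
print(f"peer 1 accuracy: {acc1:.2f}, peer 2 accuracy: {acc2:.2f}")
```

Because the MKB is a separate object updated alongside both learners, it can be kept and deployed on its own after convergence, which matches the bidirectional, symmetric setup the summary contrasts with the one-way teacher-student framework.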