Multi-exit self-distillation with appropriate teachers

Wujie SUN, Defang CHEN, Can WANG, Deshi YE, Yan FENG, Chun CHEN

Front. Inform. Technol. Electron. Eng., 2024, Vol. 25, Issue 4: 585-599. DOI: 10.1631/FITEE.2200644


Abstract

Multi-exit architecture allows early-stop inference to reduce computational cost, which can be used in resource-constrained circumstances. Recent works combine the multi-exit architecture with self-distillation to simultaneously achieve high efficiency and decent performance at different network depths. However, existing methods mainly transfer knowledge from deep exits or a single ensemble to guide all exits, without considering that inappropriate learning gaps between students and teachers may degrade the model performance, especially in shallow exits. To address this issue, we propose Multi-exit self-distillation with Appropriate TEachers (MATE) to provide diverse and appropriate teacher knowledge for each exit. In MATE, multiple ensemble teachers are obtained from all exits with different trainable weights. Each exit subsequently receives knowledge from all teachers, while focusing mainly on its primary teacher to keep an appropriate gap for efficient knowledge transfer. In this way, MATE achieves diversity in knowledge distillation while ensuring learning efficiency. Experimental results on CIFAR-100, TinyImageNet, and three fine-grained datasets demonstrate that MATE consistently outperforms state-of-the-art multi-exit self-distillation methods with various network architectures.
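The abstract describes forming one ensemble teacher per exit from trainable combinations of all exit outputs, then having each exit distill mainly from its primary teacher while still receiving a smaller signal from the others. The sketch below illustrates that idea with NumPy on dummy logits; the weight matrix `W`, the mixing coefficient `alpha`, and the temperature `T` are illustrative assumptions, not the paper's actual parameterization or loss.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q):
    # KL divergence KL(p || q) between two discrete distributions.
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

rng = np.random.default_rng(0)
num_exits, num_classes = 4, 10
exit_logits = rng.normal(size=(num_exits, num_classes))  # dummy exit outputs

# Hypothetical stand-in for the trainable ensemble weights: one weight
# vector per teacher, normalized over exits. In the actual method these
# weights would be learned jointly with the network.
W = softmax(rng.normal(size=(num_exits, num_exits)))
teacher_logits = W @ exit_logits  # teacher i = weighted combination of all exits

# Each exit distills mainly from its primary teacher (weight alpha),
# with the remaining (1 - alpha) spread over the other teachers.
alpha, T = 0.8, 3.0  # illustrative values
losses = []
for i in range(num_exits):
    p_student = softmax(exit_logits[i], T)
    loss = alpha * kl(softmax(teacher_logits[i], T), p_student)
    for j in range(num_exits):
        if j != i:
            loss += (1 - alpha) / (num_exits - 1) * kl(
                softmax(teacher_logits[j], T), p_student)
    losses.append(loss)
```

Because every teacher is itself a mixture over exits, a shallow exit's primary teacher can stay close to its own capacity, which is the "appropriate gap" the abstract refers to.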

Keywords

Multi-exit architecture / Knowledge distillation / Learning gap

Cite this article

Wujie SUN, Defang CHEN, Can WANG, Deshi YE, Yan FENG, Chun CHEN. Multi-exit self-distillation with appropriate teachers. Front. Inform. Technol. Electron. Eng, 2024, 25(4): 585‒599 https://doi.org/10.1631/FITEE.2200644

RIGHTS & PERMISSIONS

2024 Zhejiang University Press