ADOCIL: Enhancing Image Classification with Attention Distillation for Online Class-Incremental Learning
Keywords:
Catastrophic forgetting, class-incremental learning, two-stage sampling, ADVC, attention distillation

Abstract
Catastrophic forgetting is a major challenge in online class-incremental learning. Existing replay-based methods achieve a degree of effectiveness but are limited because they consider neither the quality of the replayed samples nor the key semantic information in a single-pass data stream. To address these issues, we propose Online Class-Incremental Learning based on Attention Distillation (ADOCIL), a framework with three components. First, a two-stage sampling method is used in the replay stage to improve the quality of the samples drawn. Second, we introduce Attention-based Dual-View Consistency (ADVC), which enables the model to fully exploit the critical semantic information within a single-pass data stream. Third, to further mitigate catastrophic forgetting, we introduce attention distillation, which maps the attention maps of the teacher model onto the student model and thus preserves knowledge of historical tasks. Extensive experiments demonstrate the effectiveness of ADOCIL.
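To make the attention distillation component concrete, the sketch below shows one common way to distill spatial attention maps from a frozen teacher into a student, in the style of activation-based attention transfer. This is a minimal illustration under assumed conventions, not ADOCIL's exact formulation: the function names (`attention_map`, `attention_distillation_loss`), the squared-mean channel pooling, and the L2 normalization are assumptions introduced here for clarity.

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # Collapse the channel dimension of a feature map (B, C, H, W)
    # into a spatial attention map, then L2-normalize per sample.
    # Assumed convention: squared activations averaged over channels.
    att = feat.pow(2).mean(dim=1)   # (B, H, W)
    att = att.flatten(1)            # (B, H*W)
    return F.normalize(att, p=2, dim=1)

def attention_distillation_loss(student_feats, teacher_feats):
    # Match the student's attention maps to the frozen teacher's,
    # layer by layer, so the spatial saliency learned on historical
    # tasks is preserved while new classes are learned.
    loss = 0.0
    for fs, ft in zip(student_feats, teacher_feats):
        loss = loss + (attention_map(fs) - attention_map(ft.detach())).pow(2).mean()
    return loss / len(student_feats)
```

In a replay-based setup, this loss would typically be added to the classification loss on the mixture of incoming and replayed samples, with the teacher being a frozen copy of the model from the previous task.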
