ADOCIL: Enhancing Image Classification with Attention Distillation for Online Class-Incremental Learning

Authors

  • Jinyong Cheng Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, China & Shandong Provincial Key Laboratory of Computing Power Internet and Service Computing, Shandong Fundamental Research Center for Computer Science, Jinan, China
  • Mengyun Chen Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, China & Shandong Provincial Key Laboratory of Computing Power Internet and Service Computing, Shandong Fundamental Research Center for Computer Science, Jinan, China
  • Baoyu Du Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, China & Shandong Provincial Key Laboratory of Computing Power Internet and Service Computing, Shandong Fundamental Research Center for Computer Science, Jinan, China
  • Min Guo Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, China & Shandong Provincial Key Laboratory of Computing Power Internet and Service Computing, Shandong Fundamental Research Center for Computer Science, Jinan, China

Keywords:

Catastrophic forgetting, class-incremental learning, two-stage sampling, ADVC, attention distillation

Abstract

Catastrophic forgetting is a major challenge for online class-incremental learning. Existing replay-based methods achieve a degree of effectiveness, but they are limited because they consider neither the quality of the replayed samples nor the key semantic information in a single-pass data stream. To address these issues, we propose ADOCIL, a framework for Online Class-Incremental Learning based on Attention Distillation, which consists of three parts. First, a two-stage sampling method is used in the replay stage to improve the quality of the selected samples. Second, we introduce Attention-based Dual-View Consistency (ADVC), which enables the model to fully exploit the critical semantic information within a single-pass data stream. Third, to further mitigate catastrophic forgetting, we introduce attention distillation, which transfers the attention maps of the teacher model to the student model and thereby reduces forgetting of historical tasks. Extensive experiments demonstrate the effectiveness of ADOCIL.
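The abstract does not specify ADOCIL's exact distillation formulation, but the idea of mapping a teacher's attention maps onto a student can be illustrated with a standard spatial attention-transfer loss: each layer's feature tensor is reduced to a normalized spatial attention map, and the student is penalized for deviating from the teacher's maps. The function names `attention_map` and `attention_distillation_loss` below are hypothetical, and this is only a minimal NumPy sketch of the generic technique, not the paper's implementation.

```python
import numpy as np

def attention_map(features, eps=1e-8):
    """Reduce a (C, H, W) feature tensor to a normalized spatial attention map.

    The map is the channel-wise sum of squared activations, flattened and
    L2-normalized, as in common attention-transfer formulations.
    """
    amap = (features ** 2).sum(axis=0).ravel()
    return amap / (np.linalg.norm(amap) + eps)

def attention_distillation_loss(teacher_feats, student_feats):
    """Squared distance between teacher and student attention maps.

    teacher_feats / student_feats: lists of (C, H, W) arrays, one per
    paired layer; the loss is averaged over layers.
    """
    losses = [
        np.sum((attention_map(t) - attention_map(s)) ** 2)
        for t, s in zip(teacher_feats, student_feats)
    ]
    return float(np.mean(losses))
```

Because the maps are normalized per layer, the loss is invariant to the overall activation scale and depends only on where each model concentrates its attention; the loss is zero when the student's attention pattern matches the teacher's exactly.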

Published

2025-10-30

How to Cite

Cheng, J., Chen, M., Du, B., & Guo, M. (2025). ADOCIL: Enhancing Image Classification with Attention Distillation for Online Class-Incremental Learning. Computing and Informatics, 44(5). Retrieved from http://147.213.75.17/ojs/index.php/cai/article/view/7148