Our paper on incremental learning with adaptive model search has been accepted to IEEE Access

[2022.01.29]

The following paper has been accepted to IEEE Access:

Incremental Learning with Adaptive Model Search and a Nominal Loss Model by Chanho Ahn, Eunwoo Kim, and Songhwai Oh

  • Abstract: In incremental learning, tasks are learned sequentially without access to the data used to train previous tasks. Catastrophic forgetting is a major bottleneck: a network trained on a new task performs poorly on the previous ones. This paper proposes an adaptive model search method that uses a different subset of the backbone network's parameters depending on the input image, mitigating catastrophic forgetting. The model search prevents forgetting by minimizing updates to the parameters critical for previous tasks while learning a new task. It relies on a trainable model search network that selects a model structure for each input image so as to minimize the loss functions of all tasks. We also propose a method for approximating the loss function of previous tasks using only the network parameters; the parameters critical for previous tasks can then be identified by their influence on this approximated loss. The proposed approximation can, in theory, reach a parameter set with a lower loss value than the parameter set learned on the previous task. The proposed framework is the first model search method to consider the performance of both current and previous tasks in incremental learning. Empirical studies show that the proposed method outperforms its competitors on both old and new tasks while requiring less computation.
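To make the idea of approximating a previous task's loss from parameters alone more concrete, here is a minimal, illustrative sketch (not the paper's actual formulation): a quadratic surrogate that penalizes movement of each parameter away from its previously learned value, weighted by a hypothetical per-parameter importance score. Parameters with high importance are the "critical" ones whose updates should be minimized; all names (`approx_prev_loss`, `combined_loss`, `importance`) are assumptions for illustration.

```python
import numpy as np

def approx_prev_loss(theta, theta_star, importance):
    # Quadratic surrogate for the previous task's loss, built only from
    # the stored parameters theta_star and per-parameter importance
    # weights (an illustrative stand-in, in the spirit of the paper's
    # nominal loss model; not its actual construction).
    return float(np.sum(importance * (theta - theta_star) ** 2))

def combined_loss(theta, new_task_loss, theta_star, importance, lam=1.0):
    # New-task loss plus the surrogate penalty, so training on the new
    # task discourages updates to parameters critical for previous tasks.
    return new_task_loss(theta) + lam * approx_prev_loss(theta, theta_star, importance)

# Toy usage: two parameters, the first far more important to the old task.
theta_star = np.array([1.0, -2.0])
importance = np.array([10.0, 0.1])
moved_critical = approx_prev_loss(theta_star + np.array([0.5, 0.0]), theta_star, importance)
moved_free = approx_prev_loss(theta_star + np.array([0.0, 0.5]), theta_star, importance)
# Moving the critical parameter incurs a much larger surrogate penalty.
```

Under this kind of surrogate, the gradient of the penalty with respect to each parameter directly reflects that parameter's influence on the approximated previous-task loss, which is how critical parameters can be identified without revisiting old data.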