Semi-Supervised Class Incremental Learning

Authors
  • Lechat, Alexis
  • Herbin, Stéphane
  • Jurie, Frédéric
Publication Date
Jan 10, 2021
Source
HAL-INRIA
Language
English
License
Unknown
Abstract

This paper addresses the problem of class-incremental learning, in which batches of samples annotated with new classes are introduced sequentially during training. The main objective is to limit the drop in classification performance on old classes, a phenomenon commonly called catastrophic forgetting. We propose a new method that exploits the availability of a large quantity of non-annotated images in addition to the annotated batches. These images are used to regularize the classifier and give the feature space a more stable structure. We demonstrate on two image datasets, MNIST and STL-10, that our approach improves the overall performance of classifiers trained under an incremental learning protocol, even with small annotated batches.
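The abstract only outlines the approach at a high level. The sketch below is a minimal, hypothetical illustration of class-incremental training that combines a supervised loss on the newly annotated batch with a regularization term computed on unlabeled images, not the authors' actual method. It assumes PyTorch; SmallNet, the perturbation-based consistency term, the distillation term, and all hyperparameters are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Toy feature extractor with a classifier head that grows as new classes arrive."""
    def __init__(self, in_dim=784, feat_dim=64, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, n_classes)

    def add_classes(self, n_new):
        # Extend the output layer, keeping the weights of the old classes.
        old = self.classifier
        new = nn.Linear(old.in_features, old.out_features + n_new)
        with torch.no_grad():
            new.weight[:old.out_features] = old.weight
            new.bias[:old.out_features] = old.bias
        self.classifier = new

    def forward(self, x):
        return self.classifier(self.features(x))


def train_increment(model, labeled_loader, unlabeled_loader, old_model=None,
                    lam_unsup=1.0, lam_distill=1.0, epochs=1, lr=1e-3):
    """One incremental step: cross-entropy on the new annotated batch, plus
    (i) a consistency regularizer on unlabeled images and (ii) distillation
    against the previous model to limit forgetting on old classes."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for (x_l, y_l), x_u in zip(labeled_loader, unlabeled_loader):
            logits_l = model(x_l)
            loss = F.cross_entropy(logits_l, y_l)

            # Unlabeled regularization: predictions should be stable under a
            # small input perturbation (a simple stand-in for any
            # semi-supervised consistency criterion).
            p_u = F.softmax(model(x_u), dim=1).detach()
            p_u_noisy = F.softmax(model(x_u + 0.1 * torch.randn_like(x_u)), dim=1)
            loss = loss + lam_unsup * F.mse_loss(p_u_noisy, p_u)

            # Distillation on the old-class logits to reduce catastrophic forgetting.
            if old_model is not None:
                with torch.no_grad():
                    old_logits = old_model(x_l)
                n_old = old_logits.shape[1]
                loss = loss + lam_distill * F.kl_div(
                    F.log_softmax(logits_l[:, :n_old], dim=1),
                    F.softmax(old_logits, dim=1), reduction="batchmean")

            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

In a typical incremental protocol one would, before each new batch of classes, keep a frozen copy of the current model (e.g. copy.deepcopy(model).eval()) as old_model, call add_classes with the number of newly introduced classes, and then call train_increment on the new annotated batch together with the pool of unlabeled images.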
