
Towards Realistic Class-Incremental Learning with Free-Flow Increments

Zhiming Xu, Baile Xu, Jian Zhao, Furao Shen, Suorong Yang

Abstract

Class-incremental learning (CIL) is typically evaluated under predefined schedules with equal-sized tasks, leaving more realistic and complex cases unexplored. However, a practical CIL system should learn immediately when any number of new classes arrive, without forcing fixed-size tasks. We formalize this setting as Free-Flow Class-Incremental Learning (FFCIL), where data arrives as a more realistic stream with a highly variable number of unseen classes at each step. This setting makes many existing CIL methods brittle and leads to clear performance degradation. We propose a model-agnostic framework for robust CIL under free-flow arrivals. It comprises a class-wise mean (CWM) objective that replaces the sample-frequency-weighted loss with uniformly aggregated class-conditional supervision, thereby stabilizing the learning signal across free-flow class increments, as well as method-wise adjustments that improve robustness for representative CIL paradigms. Specifically, we constrain distillation to replayed data, normalize the scale of contrastive and knowledge transfer losses, and introduce Dynamic Intervention Weight Alignment (DIWA) to prevent over-adjustment caused by unstable statistics from small class increments. Experiments confirm a clear performance degradation across various CIL baselines under FFCIL, while our strategies yield consistent gains.
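The core idea behind the CWM objective can be illustrated with a minimal sketch: instead of averaging the loss over samples (which implicitly weights each class by how often it appears in a free-flow step), losses are first averaged within each class and then averaged uniformly across classes. The function names and inputs below are hypothetical, assuming per-sample losses (e.g., cross-entropy) have already been computed:

```python
import numpy as np

def sample_mean_loss(losses, labels):
    # Standard objective: mean over samples. Classes with more
    # samples in the current step dominate the learning signal.
    return float(losses.mean())

def class_wise_mean_loss(losses, labels):
    # CWM-style objective (sketch, not the authors' exact code):
    # average losses within each class first, then uniformly
    # across classes, so a class with few samples in a free-flow
    # increment contributes as much as a well-populated one.
    classes = np.unique(labels)
    per_class = [losses[labels == c].mean() for c in classes]
    return float(np.mean(per_class))
```

With a batch of three samples from class 0 (loss 1.0 each) and one from class 1 (loss 4.0), the sample mean is 1.75 while the class-wise mean is 2.5, restoring equal per-class weight.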

Paper Structure

This paper contains 18 sections, 20 equations, 6 figures, and 3 tables.

Figures (6)

  • Figure 1: Illustration of FFCIL. (a) Unlike equal-size tasks, FFCIL allows variable per-step class increments. (b) Existing CIL methods experience a substantial accuracy drop under FFCIL, even with the same number of classes and learning stages.
  • Figure 2: The proposed strategies for FFCIL. Class-wise mean loss enforces class-invariant updates, mitigating instability caused by free-flow class exposure. Replay-only distillation excludes new-class samples, reducing sensitivity to free-flow class arrivals. Objectives whose magnitudes depend on the sample or the activated class space are scale-normalized. The dynamic weight alignment scheme regulates calibration strength by new class increments to prevent over-adjustment.
  • Figure 3: BiC confusion matrices on CIFAR-100 for equal-split CIL, Free-Flow with original method, and Free-Flow with our framework.
  • Figure 4: Impact of FFCIL step schedules on CIFAR-100: (a) iCaRL and (b) DER under ascending, descending, and highly fluctuating schedules.
  • Figure 5: Step-wise accuracy on CIFAR-100 under an extreme FFCIL schedule, with 90 classes introduced initially, followed by 1–2 classes per step.
  • ...and 1 more figure