Dynamic Dual Buffer with Divide-and-Conquer Strategy for Online Continual Learning

Abstract

Online Continual Learning (OCL) involves sequentially arriving data and is particularly challenged by catastrophic forgetting, which significantly impairs model performance. To address this issue, we introduce a novel framework, Online Dynamic Expandable Dual Memory (ODEDM), which integrates a short-term memory for rapidly buffering incoming samples with a long-term memory structured into sub-buffers anchored by cluster prototypes, enabling the storage of diverse, category-specific samples to mitigate forgetting. We propose a K-means-based strategy for prototype identification and an optimal-transport-based mechanism for retaining critical samples, prioritising those exhibiting high similarity to their corresponding prototypes; this design preserves semantically rich information. Additionally, we propose a Divide-and-Conquer (DAC) optimisation strategy that decomposes memory updates into smaller subproblems, thereby reducing computational overhead. ODEDM functions as a plug-and-play module that can be seamlessly integrated with existing rehearsal-based approaches. Experimental results under both standard and imbalanced OCL settings show that ODEDM consistently achieves state-of-the-art performance across multiple datasets, delivering substantial improvements over the DER family as well as recent methods such as VR-MCL and POCL.
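To make the long-term buffer mechanics concrete, below is a minimal sketch of the prototype-anchored update described in the abstract: each class's features are clustered with K-means, one sub-buffer is anchored per cluster prototype, and the samples most similar to their prototype are retained. The function name, the cosine-similarity criterion, and the per-cluster quota are assumptions for illustration; in particular, the paper uses an optimal-transport-based retention mechanism, which this sketch replaces with a simple similarity ranking.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical sketch, not the authors' implementation: cluster each
# class's features with K-means, anchor one sub-buffer per prototype,
# and keep the samples closest to their prototype. The paper retains
# samples via optimal transport; cosine similarity stands in here.
def update_long_term_buffer(features, labels, k=4, per_cluster=10):
    """Return retained sample indices, grouped by (class, cluster)."""
    retained = {}
    # The per-class loop loosely mirrors the divide-and-conquer idea
    # of decomposing the memory update into smaller subproblems.
    for cls in np.unique(labels):
        idx = np.where(labels == cls)[0]
        feats = features[idx]
        km = KMeans(n_clusters=min(k, len(idx)), n_init=10).fit(feats)
        for c, proto in enumerate(km.cluster_centers_):
            members = idx[km.labels_ == c]
            # Cosine similarity of each member to its cluster prototype.
            f = features[members]
            sim = (f @ proto) / (np.linalg.norm(f, axis=1)
                                 * np.linalg.norm(proto) + 1e-12)
            order = np.argsort(-sim)[:per_cluster]  # most similar first
            retained[(int(cls), c)] = members[order]
    return retained
```

A quick usage example with random features: `update_long_term_buffer(np.random.randn(500, 64), np.random.randint(0, 5, 500))` returns a dictionary mapping each (class, cluster) pair to the indices of its retained samples, i.e. the contents of one prototype-anchored sub-buffer.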