A Unified Framework for Continual Learning and Machine Unlearning

1RespAI Lab, KIIT Bhubaneswar  2SagepilotAI  3EPFL  4University of South Florida

Corresponding Author: murari.mandalfcs@kiit.ac.in

Motivation for our paper

The lack of a consistent framework for continual learning and unlearning makes it difficult to develop flexible models that can learn and forget dynamically. Disjointed techniques are limited in their ability to adapt to new knowledge while removing outdated or sensitive information, making them unsuitable for real-world applications. The figure above contrasts the isolated continual learning and machine unlearning problems with the unified continual learning-unlearning (CLUL) problem.

Proposed framework for our paper

We propose a controlled knowledge distillation framework for managing continual learning (CL) and unlearning (UL) operations. It consists of a CL teacher, a UL teacher, and a student model, optimized through a unified loss function. The framework integrates new information while preserving previously acquired knowledge: contrastive distillation maintains similarities in existing knowledge, while adaptive distillation incorporates new learning. For unlearning, a KL-divergence term drives the targeted removal of obsolete information. The framework remains consistent even after arbitrary sequences of CL and UL requests.
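
To make the shape of the objective concrete, the sketch below shows one plausible way to combine these terms in PyTorch. This is a minimal illustration, not the paper's exact method: the weights alpha and beta, the temperature tau, the forget_mask tensor, and the use of a plain temperature-scaled KL for both distillation terms are assumptions; the paper's contrastive and adaptive distillation losses are more specific than this.

import torch
import torch.nn.functional as F

def distill_kl(student_logits, teacher_logits, tau=2.0):
    # Temperature-scaled KL divergence between student and teacher output distributions.
    return F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau ** 2

def unified_clul_loss(student_logits, cl_teacher_logits, ul_teacher_logits,
                      labels, forget_mask, alpha=0.5, beta=0.5):
    # Supervised term on the current batch (new-task samples plus replayed buffer samples).
    ce = F.cross_entropy(student_logits, labels)
    zero = torch.zeros((), device=student_logits.device)

    retain = ~forget_mask
    # Retention: keep the student close to the CL teacher on data that must be remembered.
    kd_retain = distill_kl(student_logits[retain], cl_teacher_logits[retain]) if retain.any() else zero
    # Unlearning: pull the student toward the UL teacher (a model that never saw the
    # forget data) on the samples flagged for removal.
    kd_forget = distill_kl(student_logits[forget_mask], ul_teacher_logits[forget_mask]) if forget_mask.any() else zero

    return ce + alpha * kd_retain + beta * kd_forget

In a training step, the student and both frozen teachers would score the same mixed batch, and only the student would receive gradients from this combined loss.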

Abstract

Continual learning and machine unlearning are crucial challenges in machine learning, typically addressed separately. Continual learning focuses on adapting to new knowledge while preserving past information, whereas unlearning involves selectively forgetting specific subsets of data. In this paper, we introduce a novel framework that jointly tackles both tasks by leveraging controlled knowledge distillation. Our approach enables efficient learning with minimal forgetting and effective targeted unlearning. By incorporating a fixed memory buffer, the system supports learning new concepts while retaining prior knowledge. The distillation process is carefully managed to ensure a balance between acquiring new information and forgetting specific data as needed. Experimental results on benchmark datasets show that our method matches or exceeds the performance of existing approaches in both continual learning and machine unlearning. This unified framework is the first to address both challenges simultaneously, paving the way for adaptable models capable of dynamic learning and forgetting while maintaining strong overall performance.
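
To make the fixed memory buffer mentioned above concrete, here is a minimal reservoir-sampling sketch. The class name, its add/remove_class/sample methods, and the policy of purging unlearned classes from the buffer are hypothetical illustrations; this page does not specify the buffer's exact replay or eviction policy.

import random
import torch

class FixedMemoryBuffer:
    # Fixed-capacity buffer that keeps a uniform random sample of all examples seen so far.
    def __init__(self, capacity):
        self.capacity = capacity
        self.examples = []  # list of (x, y) tensor pairs
        self.seen = 0

    def add(self, x, y):
        # Reservoir sampling: each incoming example replaces a stored one with
        # probability capacity / seen, keeping the sample uniform over the stream.
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.examples) < self.capacity:
                self.examples.append((xi.clone(), yi.clone()))
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.examples[j] = (xi.clone(), yi.clone())

    def remove_class(self, cls):
        # Drop buffered examples of an unlearned class so replay cannot reintroduce it.
        self.examples = [(xi, yi) for xi, yi in self.examples if yi.item() != cls]

    def sample(self, batch_size):
        # Draw a replay minibatch; assumes the buffer is non-empty.
        batch = random.sample(self.examples, min(batch_size, len(self.examples)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

A replayed minibatch from this buffer would be mixed into each new-task batch so the distillation terms can see old data, and remove_class would be invoked when an unlearning request arrives.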

Quantitative Results

Baseline Results

Ablation Studies

BibTeX


@misc{chatterjee2024unifiedframeworkcontinuallearning,
  title={A Unified Framework for Continual Learning and Machine Unlearning}, 
  author={Romit Chatterjee and Vikram Chundawat and Ayush Tarun and Ankur Mali and Murari Mandal},
  year={2024},
  eprint={2408.11374},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2408.11374}, 
}