Class-incremental learning on multivariate time series via shape-aligned temporal distillation


Bibliographic Details
Main Authors: Qiao, Zhongzheng, Hu, Minghui, Jiang, Xudong, Suganthan, Ponnuthurai Nagaratnam, Savitha, Ramasamy
Other Authors: School of Electrical and Electronic Engineering
Format: Conference Paper
Language: English
Published: 2023
Online Access: https://hdl.handle.net/10356/165392
Description
Summary: Class-incremental learning (CIL) on multivariate time series (MTS) is an important yet understudied problem. Motivated by practical, privacy-sensitive scenarios, we propose a novel distillation-based strategy that uses a single-headed classifier and stores no historical samples. We exploit Soft Dynamic Time Warping (Soft-DTW) for knowledge distillation, aligning the feature maps along the temporal dimension before computing the discrepancy. Compared with Euclidean distance, Soft-DTW is more effective at overcoming catastrophic forgetting and balancing the stability-plasticity dilemma. We construct two novel MTS-CIL benchmarks for comprehensive experiments. Combined with a prototype augmentation strategy, our framework significantly outperforms other prominent exemplar-free algorithms.
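To illustrate the core idea, the Soft-DTW discrepancy that the abstract contrasts with Euclidean distance can be sketched as follows. This is a minimal NumPy sketch of the standard soft-DTW recursion (a soft-minimum dynamic program over a pairwise cost matrix), not the authors' implementation; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def soft_dtw(X, Y, gamma=1.0):
    """Soft-DTW discrepancy between two feature-map sequences.

    X: (n, d) array, Y: (m, d) array -- rows are per-timestep feature
    vectors (e.g. teacher and student feature maps along time).
    gamma > 0 controls the smoothness of the soft minimum.
    This is an illustrative sketch, not the paper's code.
    """
    n, m = len(X), len(Y)
    # Pairwise squared-Euclidean cost between all timestep pairs.
    D = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    # R[i, j] = soft-DTW cost of aligning X[:i] with Y[:j].
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Soft minimum over the three admissible predecessor cells,
            # computed with a max-shift for numerical stability.
            r = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            rmin = r.min()
            softmin = rmin - gamma * np.log(np.exp(-(r - rmin) / gamma).sum())
            R[i, j] = D[i - 1, j - 1] + softmin
    return R[n, m]
```

Unlike a per-timestep Euclidean loss, this discrepancy first finds a (soft) alignment between the two sequences, so a feature pattern that merely shifts in time incurs little penalty; as gamma approaches 0 it recovers classic DTW.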