Continual unsupervised representation learning


Bibliographic details
Main Authors: Rao, D, Visin, F, Rusu, AA, Teh, YW, Pascanu, R, Hadsell, R
Format: Conference item
Published: Conference on Neural Information Processing Systems, 2019
Collection: OXFORD
Description: Continual learning aims to improve the ability of modern learning systems to deal with non-stationary distributions, typically by attempting to learn a series of tasks sequentially. Prior art in the field has largely considered supervised or reinforcement learning tasks, and often assumes full knowledge of task labels and boundaries. In this work, we propose an approach (CURL) to tackle a more general problem that we will refer to as unsupervised continual learning. The focus is on learning representations without any knowledge about task identity, and we explore scenarios where there are abrupt changes between tasks, smooth transitions from one task to another, or even where the data is shuffled. The proposed approach performs task inference directly within the model, is able to dynamically expand to capture new concepts over its lifetime, and incorporates additional rehearsal-based techniques to deal with catastrophic forgetting. We demonstrate the efficacy of CURL in an unsupervised learning setting with MNIST and Omniglot, where the lack of labels ensures no information is leaked about the task. Further, we demonstrate strong performance compared to prior art in an i.i.d. setting, or when adapting the technique to supervised tasks such as incremental class learning.
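The description notes that the model performs task inference directly within itself and can dynamically expand to capture new concepts. A minimal, hypothetical sketch of that general mechanism is given below; the class name, thresholds, buffer size, and the unit-variance Gaussian components are illustrative assumptions, not CURL's actual latent-variable architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class ExpandingMixture:
    """Toy mixture model that infers a 'task' per input and grows new
    components when inputs are poorly explained (illustrative only)."""

    def __init__(self, dim, novelty_threshold=-5.0, buffer_size=3):
        self.means = [np.zeros(dim)]            # start with one component
        self.novelty_threshold = novelty_threshold
        self.poorly_explained = []              # buffer of hard samples
        self.buffer_size = buffer_size

    def log_likelihoods(self, x):
        # Unit-variance Gaussian log-density per component, up to a constant.
        return np.array([-0.5 * np.sum((x - m) ** 2) for m in self.means])

    def infer_task(self, x):
        # "Task inference": pick the component that best explains x.
        return int(np.argmax(self.log_likelihoods(x)))

    def observe(self, x):
        ll = self.log_likelihoods(x)
        if ll.max() < self.novelty_threshold:
            self.poorly_explained.append(x)
            if len(self.poorly_explained) >= self.buffer_size:
                # Dynamic expansion: spawn a component from hard samples.
                self.means.append(np.mean(self.poorly_explained, axis=0))
                self.poorly_explained = []
        return self.infer_task(x)

model = ExpandingMixture(dim=2)
for _ in range(5):
    # Stream inputs from a novel concept far from the initial component.
    model.observe(rng.normal(loc=[10.0, 10.0], scale=0.1))
print(len(model.means))  # → 2 (a new component was added for the novel concept)
```

The full method additionally uses rehearsal-style techniques to counter catastrophic forgetting, which this sketch omits.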
Record id: oxford-uuid:ca2d1fa2-b48c-4da2-af4f-f0563e54ac67
Institution: University of Oxford