Continual Domain Adaptation through Pruning-aided Domain-specific Weight Modulation

04/15/2023
by Prasanna B, et al.

In this paper, we propose a method to address unsupervised domain adaptation (UDA) in the practical setting of continual learning (CL). The goal is to update the model on continually changing domains while preserving domain-specific knowledge, thereby preventing catastrophic forgetting of previously seen domains. To this end, we build a framework that preserves domain-specific features by exploiting the model's inherent capacity through pruning. We also enable effective inference via a novel batch-norm-based metric that accurately selects the final set of model parameters to use. Our approach not only achieves state-of-the-art performance but also significantly reduces catastrophic forgetting of past domains. Our code is publicly available.
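The two ideas in the abstract lend themselves to a short sketch: reserving a fraction of the network's currently free weights for each new domain via magnitude pruning (in the spirit of PackNet-style weight modulation), and comparing a test batch's activation statistics against each domain's stored batch-norm statistics to decide which domain-specific parameters to load at inference. The PyTorch snippet below is an illustrative sketch under those assumptions, not the authors' implementation; the helper names (prune_for_domain, bn_statistic_distance, free_mask, stored_stats) are hypothetical.

```python
# Illustrative sketch only (not the paper's code). Assumes a PackNet-style
# setup in PyTorch: magnitude pruning reserves part of the free capacity for
# each new domain, and a batch-norm statistic distance picks which domain's
# parameters to use at test time. All helper names are hypothetical.
import torch
import torch.nn as nn


def prune_for_domain(weight, free_mask, keep_ratio=0.5):
    """Keep the largest-magnitude `keep_ratio` fraction of currently free
    weights for the new domain; the rest remain free for future domains."""
    free_vals = weight[free_mask.bool()].abs()
    if free_vals.numel() == 0:
        return torch.zeros_like(free_mask)
    k = max(1, int(keep_ratio * free_vals.numel()))
    threshold = free_vals.topk(k).values.min()
    return ((weight.abs() >= threshold) & free_mask.bool()).float()


def bn_statistic_distance(model, stored_stats, batch):
    """Distance between a test batch's activation statistics and the BatchNorm
    statistics stored for one domain (one plausible form of a 'batch-norm
    based metric'; the exact metric used in the paper may differ)."""
    feats, hooks = {}, []

    def make_hook(name):
        # Capture the input to each BatchNorm layer during the forward pass.
        def hook(module, inputs, output):
            feats[name] = inputs[0].detach()
        return hook

    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d) and name in stored_stats:
            hooks.append(m.register_forward_hook(make_hook(name)))
    model.eval()
    with torch.no_grad():
        model(batch)
    for h in hooks:
        h.remove()

    dist = 0.0
    for name, (mu, var) in stored_stats.items():
        x = feats[name]
        batch_mu = x.mean(dim=(0, 2, 3))
        batch_var = x.var(dim=(0, 2, 3), unbiased=False)
        dist += (batch_mu - mu).abs().mean().item()
        dist += (batch_var - var).abs().mean().item()
    return dist


# Usage sketch: pick the domain whose stored BN statistics best match the
# incoming batch, then activate that domain's weight mask before inference:
# best = min(all_domain_stats,
#            key=lambda d: bn_statistic_distance(model, all_domain_stats[d], batch))
```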



