Taxonomy Adaptive Cross-Domain Adaptation in Medical Imaging via Optimization Trajectory Distillation

by Jianan Fan, et al.

The success of automated medical image analysis depends on large-scale, expert-annotated training sets. Unsupervised domain adaptation (UDA) has emerged as a promising approach to alleviate the burden of labeled data collection. However, UDA methods generally operate under a closed-set adaptation setting, assuming an identical label set between the source and target domains, which is over-restrictive in clinical practice, where new classes commonly exist across datasets due to taxonomic inconsistency. While several methods have been proposed to tackle both domain shifts and incoherent label sets, none of them account for the common characteristics of the two issues or consider the learning dynamics along network training. In this work, we propose optimization trajectory distillation, a unified approach that addresses the two technical challenges from a new perspective. It exploits the low-rank nature of the gradient space and devises a dual-stream distillation algorithm to regularize the learning dynamics of insufficiently annotated domains and classes with external guidance obtained from reliable sources. Our approach resolves the issue of inadequate navigation along network optimization, which is the major obstacle in the taxonomy adaptive cross-domain adaptation scenario. We evaluate the proposed method extensively on several tasks with clinical and open-world significance. The results demonstrate its effectiveness and improvements over previous methods.
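To make the core idea more concrete, the following is a minimal sketch of how distillation over a low-rank gradient space might look. It is not the authors' implementation: the functions `low_rank_basis` and `distill_gradient`, the SVD-based subspace construction, and the blending weight `alpha` are all illustrative assumptions. The sketch estimates a low-rank basis from the gradient history of a reliable (well-annotated) stream and uses it to regularize the gradient of the insufficiently annotated stream.

```python
import numpy as np

def low_rank_basis(grad_history, k):
    """Top-k basis of the gradient space from a reliable stream.

    grad_history: (T, D) array of T flattened gradients collected along
    the optimization trajectory of the well-annotated source stream.
    SVD exposes the low-rank structure of that trajectory.
    """
    _, _, vt = np.linalg.svd(grad_history, full_matrices=False)
    return vt[:k]  # (k, D) top-k right singular vectors

def distill_gradient(target_grad, basis, alpha=0.5):
    """Regularize a target-stream gradient with external guidance.

    Projects the gradient onto the source trajectory subspace and
    blends it with the raw gradient (alpha controls guidance strength).
    """
    projected = basis.T @ (basis @ target_grad)
    return (1.0 - alpha) * target_grad + alpha * projected
```

In a training loop, `distill_gradient` would be applied to the flattened gradient of the target-domain batch before the optimizer step, so that updates for under-annotated classes stay close to the subspace traced by reliable supervision.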


Related papers:

- Unsupervised Domain Adaptation Using Feature Disentanglement And GCNs For Medical Image Classification
- Collaborative Unsupervised Domain Adaptation for Medical Image Diagnosis
- CTP-Net For Cross-Domain Trajectory Prediction
- Knowledge Distillation for BERT Unsupervised Domain Adaptation
- TADA: Taxonomy Adaptive Domain Adaptation
- MemSAC: Memory Augmented Sample Consistency for Large Scale Domain Adaptation
- Partial Domain Adaptation Using Selective Representation Learning For Class-Weight Computation
