On the Importance of Asymmetry for Siamese Representation Learning

04/01/2022
by Xiao Wang, et al.

Many recent self-supervised frameworks for visual representation learning are based on certain forms of Siamese networks. Such networks are conceptually symmetric, with two parallel encoders, but are often practically asymmetric, as numerous mechanisms are devised to break the symmetry. In this work, we conduct a formal study on the importance of asymmetry by explicitly distinguishing the two encoders within the network: one produces source encodings and the other targets. Our key insight is that keeping a relatively lower variance in the target than in the source generally benefits learning. This is empirically justified by our results from five case studies covering different variance-oriented designs, and is aligned with our preliminary theoretical analysis on the baseline. Moreover, we find that the improvements from asymmetric designs generalize well to longer training schedules, multiple other frameworks, and newer backbones. Finally, the combined effect of several asymmetric designs achieves a state-of-the-art accuracy on ImageNet linear probing and competitive results on downstream transfer. We hope our exploration will inspire more research in exploiting asymmetry for Siamese representation learning.
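The variance insight can be illustrated with one well-known asymmetric design: a momentum (EMA) target encoder, as used in frameworks such as MoCo and BYOL. The sketch below is a toy illustration, not the paper's method: it treats per-step encodings as a noisy sequence and shows that an exponential moving average of that sequence (the "target" track) has much lower variance than the sequence itself (the "source"). All names here are illustrative.

```python
import numpy as np

def ema(values, momentum=0.99):
    """Exponential moving average with decay `momentum` (the EMA update
    used for momentum target encoders: avg <- m * avg + (1 - m) * value)."""
    out = []
    avg = values[0]
    for v in values:
        avg = momentum * avg + (1 - momentum) * v
        out.append(avg)
    return np.array(out)

rng = np.random.default_rng(0)
source = rng.normal(size=10_000)     # toy stand-in for noisy source encodings
target = ema(source, momentum=0.99)  # slowly moving "target" track

# The EMA track varies far less than the raw sequence it follows.
print(source.var() > target.var())   # → True
```

For i.i.d. noise with variance σ² and decay m, the EMA's stationary variance is roughly σ²·(1 − m)/(1 + m), so with m = 0.99 the target varies about 200× less than the source, which is consistent with the abstract's claim that a lower-variance target benefits learning.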

