A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation

07/24/2023
by   Jinjing Zhu, et al.

In this paper, we strive to answer the question: "how can convolutional neural network (CNN)-based and vision transformer (ViT)-based models collaboratively learn by selecting and exchanging reliable knowledge between them for semantic segmentation?" Accordingly, we propose an online knowledge distillation (KD) framework that simultaneously learns compact yet effective CNN-based and ViT-based models, with two key technical breakthroughs that take full advantage of CNNs and ViTs while compensating for their limitations. First, we propose heterogeneous feature distillation (HFD), which improves the students' consistency in the low-layer feature space by having them mimic each other's heterogeneous features. Second, to help the two students learn reliable knowledge from each other, we propose bidirectional selective distillation (BSD), which dynamically transfers selective knowledge. This is achieved by 1) region-wise BSD, which determines the direction of knowledge transfer between corresponding regions in the feature space, and 2) pixel-wise BSD, which discerns which prediction knowledge to transfer in the logit space. Extensive experiments on three benchmark datasets demonstrate that our framework outperforms state-of-the-art online distillation methods by a large margin and confirm its efficacy in collaborative learning between ViT-based and CNN-based models.
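The abstract gives no implementation details, but as a rough illustration of the two ideas it names, here is a minimal PyTorch sketch: HFD rendered as projected low-layer feature matching, and pixel-wise BSD rendered as per-pixel teacher selection by ground-truth confidence. The function names, the 1x1 projection, the confidence criterion, and the temperature tau are all assumptions made for illustration, not the authors' actual formulation.

```python
# Hypothetical sketch of the two losses described in the abstract.
# The reliability criterion (per-pixel confidence in the ground-truth
# class) and all names here are assumptions, not the paper's method.
import torch
import torch.nn as nn
import torch.nn.functional as F

def hfd_loss(feat_cnn, feat_vit, proj):
    """Heterogeneous feature distillation (HFD), sketched as aligning a
    CNN student's low-layer features with the ViT student's features
    (both shaped (B, C, H, W)) through a learned 1x1 projection."""
    return F.mse_loss(proj(feat_cnn), feat_vit.detach())

def pixel_wise_bsd(logits_cnn, logits_vit, labels, tau=1.0):
    """Pixel-wise bidirectional selective distillation (BSD), sketched as:
    at each pixel, the student more confident in the ground-truth class
    acts as the teacher, and the other student mimics its prediction.

    logits_*: (B, C, H, W) raw predictions of the two students.
    labels:   (B, H, W) ground-truth class indices (assumed all valid).
    """
    prob_cnn = F.softmax(logits_cnn, dim=1)
    prob_vit = F.softmax(logits_vit, dim=1)

    # Confidence each student assigns to the true class at every pixel.
    gt = labels.unsqueeze(1)                      # (B, 1, H, W)
    conf_cnn = prob_cnn.gather(1, gt).squeeze(1)  # (B, H, W)
    conf_vit = prob_vit.gather(1, gt).squeeze(1)

    cnn_teaches = (conf_cnn > conf_vit).float()   # 1 where CNN is teacher

    def kl(student, teacher):
        # Per-pixel KL(teacher || student) with temperature scaling.
        log_p = F.log_softmax(student / tau, dim=1)
        q = F.softmax(teacher / tau, dim=1)
        return (q * (q.clamp_min(1e-8).log() - log_p)).sum(dim=1)

    # Gradients flow only into the receiving (less reliable) student.
    loss_vit = kl(logits_vit, logits_cnn.detach()) * cnn_teaches
    loss_cnn = kl(logits_cnn, logits_vit.detach()) * (1.0 - cnn_teaches)
    return (loss_vit + loss_cnn).mean() * tau * tau

# Example wiring: project 256-channel CNN features into a 384-dim ViT space.
proj = nn.Conv2d(256, 384, kernel_size=1)
```

Detaching the teacher-side logits at each pixel keeps the transfer one-directional per location, which is one plausible way to realize the "selective" direction of knowledge flow the abstract describes.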


Related research

09/18/2023
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling
Small CNN-based models usually require transferring knowledge from a lar...

09/06/2022
Transformer-CNN Cohort: Semi-supervised Semantic Segmentation by the Best of Both Students
The popular methods for semi-supervised semantic segmentation mostly ado...

03/11/2019
Structured Knowledge Distillation for Semantic Segmentation
In this paper, we investigate the knowledge distillation strategy for tr...

09/16/2020
Compressing Facial Makeup Transfer Networks by Collaborative Distillation and Kernel Decomposition
Although the facial makeup transfer network has achieved high-quality pe...

06/25/2018
PILOT: A Pixel Intensity Driven Illuminant Color Estimation Framework for Color Constancy
In this study, a CNN based Pixel Intensity driven iLluminant cOlor esTim...

10/25/2021
MUSE: Feature Self-Distillation with Mutual Information and Self-Information
We present a novel information-theoretic approach to introduce dependenc...

11/23/2022
Structural Knowledge Distillation for Object Detection
Knowledge Distillation (KD) is a well-known training paradigm in deep ne...
