Multi-Granularity Distillation Scheme Towards Lightweight Semi-Supervised Semantic Segmentation

08/22/2022
by Jie Qin, et al.

Despite varying degrees of progress in semi-supervised semantic segmentation (SSSS), most of its recent successes rely on unwieldy models, and lightweight solutions remain largely unexplored. We find that existing knowledge distillation techniques pay more attention to pixel-level concepts from labeled data and fail to take the more informative cues within unlabeled data into account. Consequently, we offer the first attempt at lightweight SSSS models via a novel multi-granularity distillation (MGD) scheme, where multi-granularity is captured from three aspects: i) complementary teacher structure; ii) labeled-unlabeled data cooperative distillation; iii) hierarchical and multi-level loss setting. Specifically, MGD is formulated as a labeled-unlabeled data cooperative distillation scheme, which helps to take full advantage of the diverse data characteristics that are essential in the semi-supervised setting. An image-level semantic-sensitive loss, a region-level content-aware loss, and a pixel-level consistency loss are set up to enrich hierarchical distillation abstraction via structurally complementary teachers. Experimental results on PASCAL VOC2012 and Cityscapes reveal that MGD outperforms competitive approaches by a large margin under diverse partition protocols. For example, the performance of the ResNet-18 and MobileNet-v2 backbones is boosted by 11.5% and 4.6% respectively under the 1/16 partition protocol on Cityscapes. Although the FLOPs of the model backbone are compressed by 3.4-5.3x (ResNet-18) and 38.7-59.6x (MobileNet-v2), the model still achieves satisfactory segmentation results.
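Since the abstract only names the three loss granularities, a minimal PyTorch sketch may help make the idea concrete. The region size, temperature, loss weights, and pooling of logits below are illustrative assumptions, not the authors' implementation; the paper's actual losses may operate on intermediate features and use different aggregation.

```python
# Hypothetical sketch of a multi-granularity distillation loss in the spirit
# of MGD. All parameter names and choices here are illustrative assumptions.
import torch
import torch.nn.functional as F


def _kl(student_logits, teacher_logits, tau=1.0):
    """KL divergence between softened class distributions (per location)."""
    p_t = F.softmax(teacher_logits / tau, dim=1)
    log_p_s = F.log_softmax(student_logits / tau, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * tau ** 2


def multi_granularity_distillation_loss(
    student_logits,           # (B, C, H, W) student segmentation logits
    teacher_logits,           # (B, C, H, W) teacher logits, same spatial size
    region_size=32,           # assumed region granularity for the region term
    weights=(1.0, 1.0, 1.0),  # assumed weights for (image, region, pixel)
    tau=2.0,                  # assumed distillation temperature
):
    w_img, w_reg, w_pix = weights

    # Pixel-level consistency: match per-pixel class distributions.
    pixel_loss = _kl(student_logits, teacher_logits, tau)

    # Region-level content-aware term: pool logits into coarse regions so the
    # student mimics the teacher's aggregated predictions over local content.
    s_reg = F.avg_pool2d(student_logits, region_size)
    t_reg = F.avg_pool2d(teacher_logits, region_size)
    region_loss = _kl(s_reg, t_reg, tau)

    # Image-level semantic-sensitive term: match global class distributions.
    s_img = student_logits.mean(dim=(2, 3), keepdim=True)
    t_img = teacher_logits.mean(dim=(2, 3), keepdim=True)
    image_loss = _kl(s_img, t_img, tau)

    return w_img * image_loss + w_reg * region_loss + w_pix * pixel_loss


if __name__ == "__main__":
    # Toy check: the loss needs only teacher predictions, not ground-truth
    # masks, so it applies to labeled and unlabeled images alike.
    s = torch.randn(2, 21, 128, 128)  # e.g. PASCAL VOC2012: 21 classes
    t = torch.randn(2, 21, 128, 128)
    print(multi_granularity_distillation_loss(s, t).item())
```

Because none of the three terms requires ground-truth masks, the same loss can be applied to both labeled and unlabeled batches, which is what makes the labeled-unlabeled cooperative distillation possible in the semi-supervised setting.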



Related research

01/02/2019
Learning Efficient Detector with Semi-supervised Adaptive Distillation
Knowledge Distillation (KD) has been used in image classification for mo...

06/07/2023
CorrMatch: Label Propagation via Correlation Matching for Semi-Supervised Semantic Segmentation
In this paper, we present a simple but performant semi-supervised semant...

07/21/2020
Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation
Deep learning methods show promising results for overlapping cervical ce...

09/06/2022
Transformer-CNN Cohort: Semi-supervised Semantic Segmentation by the Best of Both Students
The popular methods for semi-supervised semantic segmentation mostly ado...

08/03/2023
Guided Distillation for Semi-Supervised Instance Segmentation
Although instance segmentation methods have improved considerably, the d...

06/27/2021
Semi-supervised Semantic Segmentation with Directional Context-aware Consistency
Semantic segmentation has made tremendous progress in recent years. Howe...

08/18/2023
Diverse Cotraining Makes Strong Semi-Supervised Segmentor
Deep co-training has been introduced to semi-supervised segmentation and...
