Cross-stitch Networks for Multi-task Learning

04/12/2016
by Ishan Misra, et al.

Multi-task learning in Convolutional Networks has shown remarkable success in recognition. This success can be largely attributed to learning shared representations from multiple supervisory tasks. However, existing multi-task approaches rely on hand-designed network architectures specific to the tasks at hand, which do not generalize. In this paper, we propose a principled approach to learning shared representations in ConvNets using multi-task learning. Specifically, we propose a new sharing unit: the "cross-stitch" unit. These units combine the activations from multiple networks and can be trained end-to-end. A network with cross-stitch units can learn an optimal combination of shared and task-specific representations. Our proposed method generalizes across multiple tasks and shows dramatically improved performance over baseline methods for categories with few training examples.
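The core idea can be sketched in a few lines: at a given layer, a cross-stitch unit takes the activations of two task-specific networks and outputs learned linear combinations of them, so the degree of sharing is itself a trainable parameter. Below is a minimal NumPy illustration of that mixing step (a sketch of the idea only, not the authors' implementation; the class name, initial alpha values, and near-identity initialization are assumptions for illustration):

```python
import numpy as np

class CrossStitchUnit:
    """Mixes activations of two task networks with a learnable 2x2 matrix.

    Initialized near the identity, so each network starts mostly
    task-specific; training can move alpha toward stronger sharing.
    (Illustrative sketch; values and names are assumptions.)
    """

    def __init__(self, alpha_same=0.9, alpha_diff=0.1):
        # In a real network, alpha would be a trainable parameter,
        # typically one such matrix per channel or per layer.
        self.alpha = np.array([[alpha_same, alpha_diff],
                               [alpha_diff, alpha_same]])

    def forward(self, x_a, x_b):
        # Linear combination applied element-wise at every
        # activation location of the two feature maps.
        out_a = self.alpha[0, 0] * x_a + self.alpha[0, 1] * x_b
        out_b = self.alpha[1, 0] * x_a + self.alpha[1, 1] * x_b
        return out_a, out_b
```

Because the unit reduces to a plain linear combination, it can be dropped between corresponding layers of two existing networks and trained end-to-end with standard backpropagation.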

Related research

Stochastic Filter Groups for Multi-Task CNNs: Learning Specialist and Generalist Convolution Kernels (08/26/2019)
The performance of multi-task learning in Convolutional Neural Networks ...

Fully-adaptive Feature Sharing in Multi-Task Networks with Applications in Person Attribute Classification (11/16/2016)
Multi-task learning aims to improve generalization performance of multip...

Real-Time Video Deblurring via Lightweight Motion Compensation (05/25/2022)
While motion compensation greatly improves video deblurring quality, sep...

Multi-Task Learning via Co-Attentive Sharing for Pedestrian Attribute Recognition (04/07/2020)
Learning to predict multiple attributes of a pedestrian is a multi-task ...

Cross-Stitched Multi-task Dual Recursive Networks for Unified Single Image Deraining and Desnowing (11/15/2022)
We present the Cross-stitched Multi-task Unified Dual Recursive Network ...

Learning to Push by Grasping: Using multiple tasks for effective learning (09/28/2016)
Recently, end-to-end learning frameworks are gaining prevalence in the f...

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks (03/13/2017)
In this work, we present a compact, modular framework for constructing n...
