Local-to-Global Self-Attention in Vision Transformers

07/10/2021
by Jinpeng Li, et al.

Transformers have demonstrated great potential in computer vision tasks. To avoid the dense computation of self-attention on high-resolution visual data, some recent Transformer models adopt a hierarchical design in which self-attention is computed only within local windows. This design significantly improves efficiency but lacks global feature reasoning in the early stages. In this work, we design a multi-path Transformer structure that enables local-to-global reasoning at multiple granularities in each stage. The proposed framework is computationally efficient and highly effective. With a marginal increase in computational overhead, our model achieves notable improvements in both image classification and semantic segmentation. Code is available at https://github.com/ljpadam/LG-Transformer
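To make the local-versus-global trade-off concrete, the sketch below contrasts the two attention paths the abstract describes: self-attention restricted to non-overlapping local windows, and a coarse global path that attends over a downsampled token grid. This is a minimal NumPy illustration under our own assumptions (single head, no learned Q/K/V projections, sum-fusion of the two paths); it is not the paper's actual LG-Transformer implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens):
    """Plain single-head self-attention over a token set of shape (N, C).
    For brevity Q = K = V = tokens; real models use learned projections."""
    scale = tokens.shape[-1] ** -0.5
    attn = softmax(tokens @ tokens.T * scale)
    return attn @ tokens

def window_attention(feat, window):
    """Local path: split an (H, W, C) feature map into non-overlapping
    window x window patches and attend only within each patch, so cost
    grows linearly with the number of windows."""
    H, W, C = feat.shape
    out = np.empty_like(feat)
    for i in range(0, H, window):
        for j in range(0, W, window):
            patch = feat[i:i+window, j:j+window].reshape(-1, C)
            out[i:i+window, j:j+window] = self_attention(patch).reshape(window, window, C)
    return out

def coarse_global_attention(feat, stride):
    """Global path: average-pool the map by `stride`, run full attention
    on the much shorter token sequence, then upsample back."""
    H, W, C = feat.shape
    pooled = feat.reshape(H // stride, stride, W // stride, stride, C).mean(axis=(1, 3))
    mixed = self_attention(pooled.reshape(-1, C)).reshape(H // stride, W // stride, C)
    return mixed.repeat(stride, axis=0).repeat(stride, axis=1)

# One hypothetical local-to-global block: fuse the two paths by summation.
feat = np.random.randn(8, 8, 16)
out = window_attention(feat, window=4) + coarse_global_attention(feat, stride=4)
print(out.shape)  # (8, 8, 16)
```

The local path keeps per-window cost constant as resolution grows, while the pooled global path gives every output location a (coarse) view of the whole image; running both per stage is one simple way to read "local-to-global reasoning at multiple granularities".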

Related research

- Semantic-Aware Local-Global Vision Transformer (11/27/2022): Vision Transformers have achieved remarkable progresses, among which Swi...
- NomMer: Nominate Synergistic Context in Vision Transformer for Visual Recognition (11/25/2021): Recently, Vision Transformers (ViT), with the self-attention (SA) as the...
- PyramidTNT: Improved Transformer-in-Transformer Baselines with Pyramid Architecture (01/04/2022): Transformer networks have achieved great progress for computer vision ta...
- SimViT: Exploring a Simple Vision Transformer with sliding windows (12/24/2021): Although vision Transformers have achieved excellent performance as back...
- HEAL-SWIN: A Vision Transformer On The Sphere (07/14/2023): High-resolution wide-angle fisheye images are becoming more and more imp...
- E(2)-Equivariant Vision Transformer (06/11/2023): Vision Transformer (ViT) has achieved remarkable performance in computer...
- A Fast Training-Free Compression Framework for Vision Transformers (03/04/2023): Token pruning has emerged as an effective solution to speed up the infer...
