Neighborhood Attention Transformer
Abstract
We present Neighborhood Attention (NA), the first efficient and scalable
sliding-window attention mechanism for vision. NA is a pixel-wise operation,
localizing self-attention (SA) to the nearest neighboring pixels, and therefore
enjoys linear time and space complexity, compared to the quadratic complexity
of SA. The sliding-window pattern allows NA's receptive field to grow without
needing extra pixel shifts, and preserves translational equivariance, unlike
Swin Transformer's Window Self Attention (WSA). We develop NATTEN (Neighborhood
Attention Extension), a Python package with efficient C++ and CUDA kernels,
which allows NA to run up to 40% faster than Swin's WSA while using up to 25%
less memory. We further present Neighborhood Attention Transformer (NAT), a new
hierarchical transformer design based on NA that boosts image classification
and downstream vision performance. Experimental results on NAT are competitive;
NAT-Tiny reaches 83.2% top-1 accuracy on ImageNet, 51.4% mAP on MS-COCO, and
48.4% mIoU on ADE20K, an improvement of 1.9% in ImageNet accuracy, 1.0% in
COCO mAP, and 2.6% in ADE20K mIoU over a Swin model of similar size. To support
more research based on sliding-window attention, we open source our project and
release our checkpoints at:
https://github.com/SHI-Labs/Neighborhood-Attention-Transformer
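
To make the mechanism concrete, below is a minimal, naive PyTorch sketch of neighborhood attention for a single head, based only on the description in the abstract: each query pixel attends to the kernel_size x kernel_size window of its nearest neighboring pixels, with the window clamped at image borders. The function name, tensor shapes, and the explicit per-pixel loop are illustrative assumptions for readability; this is not NATTEN's API, whose C++/CUDA kernels avoid such a loop entirely.

```python
# Naive single-head Neighborhood Attention sketch (illustrative, not NATTEN).
# Assumes H, W >= kernel_size so a full window always fits inside the image.
import torch
import torch.nn.functional as F

def neighborhood_attention(q, k, v, kernel_size=7):
    """q, k, v: (H, W, d) tensors for a single attention head."""
    H, W, d = q.shape
    r = kernel_size // 2
    out = torch.empty_like(q)
    for i in range(H):
        for j in range(W):
            # Clamp the window at the borders so every query still attends
            # to exactly kernel_size x kernel_size nearest-neighbor keys.
            i0 = min(max(i - r, 0), H - kernel_size)
            j0 = min(max(j - r, 0), W - kernel_size)
            keys = k[i0:i0 + kernel_size, j0:j0 + kernel_size].reshape(-1, d)
            vals = v[i0:i0 + kernel_size, j0:j0 + kernel_size].reshape(-1, d)
            # Scaled dot-product attention over the local neighborhood only.
            attn = F.softmax(q[i, j] @ keys.T / d ** 0.5, dim=-1)
            out[i, j] = attn @ vals
    return out

# Each of the H*W queries scores only kernel_size^2 keys, so the total cost
# is O(H * W * k^2 * d): linear in the number of pixels, versus the
# (H*W)^2 pairwise interactions of full self-attention.
q = k = v = torch.randn(14, 14, 32)
y = neighborhood_attention(q, k, v)
print(y.shape)  # torch.Size([14, 14, 32])
```

Because the window slides with the query rather than being fixed to a partition grid, every pixel keeps the same local operator regardless of its position, which is the translational-equivariance property the abstract contrasts with Swin's partitioned WSA.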
| Reference Key | shi2022neighborhood |
|---|---|
| Authors | Ali Hassani; Steven Walton; Jiachen Li; Shen Li; Humphrey Shi |
| Journal | arXiv |
| Year | 2022 |
| DOI | Not found |
| URL | |
| Keywords | |

Use this key for automatic citation when writing papers with the SciMatic Article Manager or Thesis Manager.
Citations

No citations found. To add a citation, contact the administrator: info@scimatic.org