Alternatives To Ffd_cvpr2020
Each project is listed with its stars, most recent commit, total releases, latest release date, open issues, license, and language, where available.
Awesome Transformer Attention — 3,895 stars, updated 4 months ago, 15 open issues
An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
X Transformers — 3,840 stars, updated 4 months ago, 317 releases (latest December 02, 2023), 55 open issues, MIT license, Python
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Awesome Speech Recognition Speech Synthesis Papers — 2,869 stars, updated 7 months ago, 2 open issues, MIT license
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Eeg Dl — 563 stars, updated a year ago, 10 releases (latest May 13, 2020), 4 open issues, MIT license, Python
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Triplet Attention — 383 stars, updated 3 years ago, MIT license, Jupyter Notebook
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Action Recognition Visual Attention — 338 stars, updated 8 years ago, 8 open issues, Jupyter Notebook
Action recognition using soft-attention-based deep recurrent neural networks
Linformer Pytorch — 323 stars, updated 2 years ago, 41 releases (latest October 10, 2020), 2 open issues, MIT license, Python
A practical implementation of Linformer for PyTorch.
Attention_is_all_you_need — 293 stars, updated 7 years ago, 1 open issue, BSD-3-Clause license, Jupyter Notebook
A Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Multi Scale Attention — 166 stars, updated 4 years ago, Python
Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"
Atpapers — 105 stars, updated 3 years ago
Worth-reading papers and related resources on the attention mechanism, Transformers, and pretrained language models (PLMs) such as BERT.
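Nearly every project above builds on the same primitive: scaled dot-product attention from "Attention Is All You Need" (Vaswani et al., 2017). A minimal NumPy sketch for orientation (an illustration only, not code taken from any of the listed repositories):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Computes softmax(q @ k.T / sqrt(d_k)) @ v (Vaswani et al., 2017)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (len_q, len_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ v                              # (len_q, d_v)

# Toy self-attention: 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Each output row is a convex combination of the value rows, weighted by query–key similarity; the listed projects differ mainly in how they structure, restrict, or approximate this computation (e.g. Linformer replaces the full attention matrix with a low-rank projection).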