Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|
Attention Is All You Need Pytorch | 7,910 | | | | 7 months ago | | | 74 | mit | Python |
A PyTorch implementation of the Transformer model in "Attention Is All You Need". | | | | | | | | | | |
Nmt | 6,085 | | | | 2 years ago | | | 275 | apache-2.0 | Python |
TensorFlow Neural Machine Translation Tutorial | | | | | | | | | | |
Transformer | 3,882 | | | | a year ago | | | 134 | apache-2.0 | Python |
A TensorFlow Implementation of the Transformer: Attention Is All You Need | | | | | | | | | | |
Seq2seq Attn | 1,167 | | | | 3 years ago | | | 14 | mit | Lua |
Sequence-to-sequence model with LSTM encoder/decoders and attention | | | | | | | | | | |
Keras Attention | 656 | | | | 5 years ago | | | 22 | agpl-3.0 | Python |
Visualizing RNNs using the attention mechanism | | | | | | | | | | |
Attentiongan | 564 | | | | 10 months ago | | | 16 | other | Python |
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation | | | | | | | | | | |
Seq2seq | 502 | | | | 4 years ago | | | 11 | mit | Python |
Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch | | | | | | | | | | |
Pytorch Original Transformer | 376 | | | | 3 years ago | | | | mit | Jupyter Notebook |
My implementation of the original Transformer model (Vaswani et al.). A playground.py file is additionally included for visualizing otherwise seemingly hard concepts. Currently includes pretrained IWSLT models. | | | | | | | | | | |
Keras Transformer | 312 | 13 | 3 | | 2 years ago | 39 | January 22, 2022 | | mit | Python |
Transformer implemented in Keras | | | | | | | | | | |
Transformer Tensorflow | 304 | | | | 6 years ago | | | 5 | | Python |
TensorFlow implementation of "Attention Is All You Need" (June 2017) | | | | | | | | | | |
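All of the projects above implement some form of the attention mechanism from "Attention Is All You Need" (Vaswani et al., 2017). As a point of reference for comparing them, here is a minimal NumPy sketch of the core operation they share, scaled dot-product attention: softmax(QKᵀ/√d_k)·V. The function name and toy shapes are illustrative, not taken from any listed repository.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable row-wise softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a convex combination of value rows

# toy example: 2 queries attending over 3 key/value pairs, dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

The full Transformer implementations in the table wrap this primitive with learned Q/K/V projections, multiple heads, and masking; the seq2seq-with-attention repositories use closely related additive or dot-product scoring over RNN states.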