mlpnlp-nmt

This is a sample implementation of an LSTM encoder-decoder with an attention mechanism, intended mainly as an aid to understanding recently developed machine translation frameworks based on deep neural networks.
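The core of the attention mechanism can be sketched in plain Python. This is an illustrative sketch only, not code from this repository: the function name `attention_context` and the use of simple dot-product scoring are assumptions for the example; in a real NMT model the decoder state and encoder states would be learned LSTM hidden vectors.

```python
import math

def attention_context(decoder_state, encoder_states):
    """Dot-product attention sketch: score each encoder state against the
    current decoder state, softmax the scores, and return the weighted sum
    (the context vector) together with the attention weights."""
    # 1. Alignment scores: dot product of the decoder state with each encoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # 2. Softmax over the scores (subtract the max for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # 3. Context vector: attention-weighted sum of the encoder states.
    dim = len(encoder_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights

# Toy example: three encoder states; the decoder state is most similar
# to the second one, so it should receive the largest attention weight.
enc = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
dec = [0.0, 2.0]
context, weights = attention_context(dec, enc)
```

At each decoding step the context vector is concatenated with the decoder state before predicting the next target word, which is what lets the model focus on different source positions over time.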
Alternatives To mlpnlp-nmt
| Project Name | Stars | Most Recent Commit | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| Attention Is All You Need Pytorch | 7,910 | 7 months ago | 74 | mit | Python | A PyTorch implementation of the Transformer model in "Attention is All You Need". |
| Nmt | 6,085 | 2 years ago | 275 | apache-2.0 | Python | TensorFlow Neural Machine Translation Tutorial |
| Transformer | 3,882 | a year ago | 134 | apache-2.0 | Python | A TensorFlow Implementation of the Transformer: Attention Is All You Need |
| Seq2seq Attn | 1,167 | 3 years ago | 14 | mit | Lua | Sequence-to-sequence model with LSTM encoder/decoders and attention |
| Keras Attention | 656 | 5 years ago | 22 | agpl-3.0 | Python | Visualizing RNNs using the attention mechanism |
| Attentiongan | 564 | 10 months ago | 16 | other | Python | AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation |
| Seq2seq | 502 | 4 years ago | 11 | mit | Python | Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch |
| Pytorch Original Transformer | 376 | 3 years ago | – | mit | Jupyter Notebook | An implementation of the original Transformer model (Vaswani et al.), with a playground.py file for visualizing otherwise hard-to-grasp concepts; includes IWSLT pretrained models. |
| Keras Transformer | 312 | 2 years ago | – | mit | Python | Transformer implemented in Keras (latest release January 22, 2022) |
| Transformer Tensorflow | 304 | 6 years ago | 5 | – | Python | TensorFlow implementation of 'Attention Is All You Need (2017. 6)' |
Categories: Python, Neural Network, Translation, LSTM, Decoder, Encoder, Attention, Attention Mechanism, NMT, Encoder Decoder