Keras_attention_seq2seq

A human-readable, Keras-based sequence-to-sequence framework/model with attention, so you may not need to write the complicated code yourself.
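
The repository's own API is not spelled out on this page, so the following is only a rough illustration of the kind of model the description refers to: a minimal sketch of a Keras encoder-decoder with dot-product (Luong-style) attention, written with the tf.keras functional API. The vocabulary sizes, dimensions, and layer choices are illustrative assumptions, not this project's actual code.

```python
# Minimal sketch (assumed shapes and sizes), not this repository's API.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

src_vocab, tgt_vocab, emb_dim, units = 5000, 5000, 128, 256

# Encoder: embed source tokens and run an LSTM, keeping the full
# sequence of hidden states so the decoder can attend over them.
enc_inputs = layers.Input(shape=(None,), name="encoder_tokens")
enc_emb = layers.Embedding(src_vocab, emb_dim, mask_zero=True)(enc_inputs)
enc_seq, enc_h, enc_c = layers.LSTM(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder: embed target tokens (teacher forcing) and run an LSTM
# initialised with the encoder's final state.
dec_inputs = layers.Input(shape=(None,), name="decoder_tokens")
dec_emb = layers.Embedding(tgt_vocab, emb_dim, mask_zero=True)(dec_inputs)
dec_seq = layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[enc_h, enc_c])

# Dot-product attention: each decoder step attends over all encoder steps,
# and the resulting context vector is concatenated with the decoder output.
context = layers.Attention()([dec_seq, enc_seq])
dec_concat = layers.Concatenate()([dec_seq, context])
outputs = layers.Dense(tgt_vocab, activation="softmax")(dec_concat)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy batch just to show the expected input/output shapes.
src = np.random.randint(1, src_vocab, size=(2, 7))
tgt_in = np.random.randint(1, tgt_vocab, size=(2, 5))
tgt_out = np.random.randint(1, tgt_vocab, size=(2, 5))
model.fit([src, tgt_in], tgt_out, epochs=1, verbose=0)
```

The sketch only shows teacher-forced training; at inference time the decoder would be run step by step, feeding back the previously predicted token starting from a start-of-sequence symbol.
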
Alternatives To Keras_attention_seq2seq
| Project Name | Stars | Most Recent Commit | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| Attention Is All You Need Pytorch | 7,910 | 6 months ago | 74 | mit | Python | A PyTorch implementation of the Transformer model in "Attention is All You Need". |
| Nmt | 6,085 | a year ago | 275 | apache-2.0 | Python | TensorFlow Neural Machine Translation Tutorial. |
| Transformer | 3,882 | a year ago | 134 | apache-2.0 | Python | A TensorFlow implementation of the Transformer: "Attention Is All You Need". |
| Seq2seq Attn | 1,167 | 3 years ago | 14 | mit | Lua | Sequence-to-sequence model with LSTM encoder/decoders and attention. |
| Keras Attention | 656 | 5 years ago | 22 | agpl-3.0 | Python | Visualizing RNNs using the attention mechanism. |
| Attentiongan | 564 | 9 months ago | 16 | other | Python | AttentionGAN for unpaired and multi-domain image-to-image translation. |
| Seq2seq | 502 | 4 years ago | 11 | mit | Python | Minimal Seq2Seq model with attention for neural machine translation in PyTorch. |
| Pytorch Original Transformer | 376 | 3 years ago | | mit | Jupyter Notebook | My implementation of the original Transformer model (Vaswani et al.), with a playground.py file for visualizing otherwise hard concepts and pretrained IWSLT models. |
| Keras Transformer | 312 | 2 years ago | | mit | Python | Transformer implemented in Keras (39 releases, latest January 22, 2022). |
| Transformer Tensorflow | 304 | 6 years ago | 5 | | Python | TensorFlow implementation of "Attention Is All You Need" (2017.6). |
Related Searches

Python, Translation, Keras, Attention, Sequence To Sequence, Attention Mechanism, Machine Translation, Keras Tensorflow