Awesome Open Source
Search results for "attention" + "encoder-decoder"
26 search results found
Pytorch Seq2seq (⭐ 5,024) — Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
A Pytorch Tutorial To Image Captioning (⭐ 2,084) — Show, Attend, and Tell: a PyTorch tutorial to image captioning.
Encoder_decoder (⭐ 256) — Four styles of encoder-decoder models in Python, with Theano, Keras, and Seq2Seq.
Snli Attention (⭐ 228) — SNLI with word-by-word attention using an LSTM encoder-decoder.
Multi Scale Attention (⭐ 166) — Code for the paper "Multi-scale Guided Attention for Medical Image Segmentation".
Rnn For Joint Nlu (⭐ 165) — PyTorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454).
Abstractive Summarization (⭐ 113) — Abstractive summarization using an LSTM encoder-decoder architecture with local attention.
Tf Var Attention (⭐ 55) — TensorFlow implementation of Variational Attention for Sequence-to-Sequence Models (COLING 2018).
Bidirectiona Lstm For Text Summarization (⭐ 49) — A bidirectional encoder-decoder LSTM neural network trained for text summarization on the CNN/DailyMail dataset (MIT808 project).
Mlpnlp Nmt (⭐ 43) — Sample code for an LSTM encoder-decoder with an attention mechanism, mainly for understanding a recently developed machine translation framework based on deep neural networks.
Banglatranslator (⭐ 28) — Bangla machine translator.
Pytorch_neural_machine_translation_attention (⭐ 24) — Neural machine translation with attention (PyTorch).
Image Caption (⭐ 24) — Using an LSTM or a Transformer to solve image captioning in PyTorch.
Tensorflow Seq2seq (⭐ 21) — An English-to-French translation task implemented as a seq2seq encoder-decoder with RNN layers, an attention mechanism, and a beam-search inference decoder in TensorFlow 1.3.
Dhs_summit_2019_image_captioning (⭐ 18) — Image captioning using attention models.
Tf Var Attention (⭐ 14) — Variational Attention for Sequence-to-Sequence Models.
Tf_encdec_seq2seq (⭐ 12) — Configurable encoder-decoder sequence-to-sequence model, built with TensorFlow.
Pointer_generator_summarizer (⭐ 12) — Pointer-generator network: seq2seq with attention, pointing, and coverage mechanisms for abstractive summarization.
Image Captioning (⭐ 12) — Three architectures for image captioning: merged encoder-decoder, Bahdanau attention, and Transformers.
Veritas (⭐ 11) — Attention-based recurrent neural nets plus POS embeddings for authorship recognition.
Self Attention (⭐ 9) — A complete implementation of the Transformer, with detailed construction of the encoder, decoder, and self-attention.
Inter Intra Attentions (⭐ 8) — An experimental custom seq2seq model with both layer-wise (inter-layer) attention and intra-layer attention (attention to previous hidden states of the same RNN unit) for abstractive summarization.
Translate_machine_translation (⭐ 8) — Vietnamese and Chinese to English translation.
Chatbot Pytorch (⭐ 6) — PyTorch implementation of a Japanese chatbot.
Attentionmnistseq2seq (⭐ 6) — Encoder-decoder with Bahdanau attention to predict MNIST digit sequences.
Attention Hred (⭐ 5) — Project on a hierarchical encoder-decoder.
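Several of the repositories above (e.g. Attentionmnistseq2seq, Mlpnlp Nmt) use Bahdanau-style additive attention: at each decoding step, the decoder scores every encoder hidden state and takes a softmax-weighted sum as its context vector. A minimal pure-Python sketch of that scoring step follows — scalar toy states and hand-picked weights (`w_dec`, `w_enc`, `v`) stand in for the learned matrices, and all names are hypothetical, not taken from any repo above:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def bahdanau_context(decoder_state, encoder_states, w_dec, w_enc, v):
    # Additive attention: score_i = v * tanh(w_dec*s + w_enc*h_i),
    # a scalar stand-in for v^T tanh(W_dec s + W_enc h_i).
    scores = [v * math.tanh(w_dec * decoder_state + w_enc * h)
              for h in encoder_states]
    weights = softmax(scores)
    # Context vector = attention-weighted sum of encoder states.
    context = sum(w * h for w, h in zip(weights, encoder_states))
    return context, weights

ctx, attn = bahdanau_context(0.5, [0.1, 0.9, -0.3],
                             w_dec=1.0, w_enc=1.0, v=1.0)
```

With these toy inputs, the encoder state most similar in sign and magnitude to the decoder state receives the largest attention weight, and the weights sum to 1 by construction.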
Related Searches: Python Attention (2,327) · Pytorch Attention (645) · Jupyter Notebook Attention (546)
Copyright 2018-2024 Awesome Open Source. All rights reserved.