Awesome Open Source
Search results for jupyter notebook attention
277 search results found
Annotated_deep_learning_paper_implementations
⭐
41,877
🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Nlp Tutorial
⭐
13,226
Natural Language Processing Tutorial for Deep Learning Researchers
Pytorch Seq2seq
⭐
5,024
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Pytorch Gat
⭐
1,815
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Transformer Explainability
⭐
1,596
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Nlp Models Tensorflow
⭐
1,329
Gathers machine learning and TensorFlow deep learning models for NLP problems, 1.13 < TensorFlow < 2.0
Attention Transfer
⭐
1,120
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Simplecvreproduction
⭐
1,021
Replication of simple CV projects, including attention, classification, detection, keypoint detection, etc.
Vit Pytorch
⭐
984
PyTorch reimplementation of the Vision Transformer ("An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale")
Bottom Up Attention
⭐
979
Bottom-up attention model for image captioning and VQA, based on Faster R-CNN and Visual Genome
Attention Learn To Route
⭐
931
Attention based model for learning to solve different routing problems
Transformer
⭐
691
Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Dota Doai
⭐
676
This repo is the codebase for our team to participate in DOTA related competitions, including rotation and horizontal detection.
Stereo Transformer
⭐
555
Revisiting Stereo Depth Estimation From a Sequence-to-Sequence Perspective with Transformers. (ICCV 2021 Oral)
Attention Networks For Classification
⭐
477
Hierarchical Attention Networks for Document Classification in PyTorch
Dla
⭐
421
Deep learning for audio processing
Pytorch Original Transformer
⭐
376
My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Deep_learning_nlp
⭐
359
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Draw
⭐
346
Reimplementation of DRAW
Action Recognition Visual Attention
⭐
338
Action recognition using soft attention based deep recurrent neural networks
Transformer_time_series
⭐
331
Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (NeurIPS 2019)
Mprgdeeplearninglecturenotebook
⭐
324
Chinese Chatbot
⭐
318
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates meaningful replies to most everyday questions. Uploaded.
Attention Analysis
⭐
309
Attention_is_all_you_need
⭐
293
Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.
Adaptiveattention
⭐
288
Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
Hiecoattenvqa
⭐
284
Latex_ocr
⭐
276
💎 Math Formula OCR (mathematical formula recognition)
Attention Pytorch
⭐
264
Hands-on practice with attention mechanisms.
Eeap Examples
⭐
241
Code for Document Similarity on Reuters dataset using Encode, Embed, Attend, Predict recipe
Da Rnn
⭐
234
📃 **Unofficial** PyTorch Implementation of DA-RNN (arXiv:1704.02971)
Tokencut
⭐
233
(CVPR 2022) Pytorch implementation of "Self-supervised transformers for unsupervised object discovery using normalized cut"
Jddc_solution_4th
⭐
225
Fourth-place solution to the 2018 JDDC competition.
Sohu_competition
⭐
216
First-place solution to Sohu's 2018 content recognition competition.
Pytorch Lesson Zh
⭐
214
A PyTorch tutorial series (teaching guaranteed, mastery not included).
Probing Vits
⭐
209
Probing the representations of Vision Transformers.
Bottom Up Attention.pytorch
⭐
207
A PyTorch reimplementation of bottom-up-attention models
Cvpr2019_pyramid Feature Attention Network For Saliency Detection
⭐
201
Code and model of the Pyramid Feature Selective Network for saliency detection.
Attention_network_with_keras
⭐
201
An example attention network with simple dataset.
Graph Convolution Nlp
⭐
200
Graph Convolution Network for NLP
Doc Han Att
⭐
193
Hierarchical Attention Networks for Chinese Sentiment Classification
Hey Jetson
⭐
189
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Simpleselfattention
⭐
184
A simpler version of the self-attention layer from SAGAN, and some image classification results.
Rl4co
⭐
183
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
Davsod
⭐
182
Shifting More Attention to Video Salient Object Detection, CVPR 2019 (Oral)
Multihead Siamese Nets
⭐
173
Implementation of Siamese Neural Networks built upon multihead attention mechanism for text semantic similarity task.
Rnn For Joint Nlu
⭐
165
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Graph_attention_pool
⭐
161
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Asrframe
⭐
155
An automatic speech recognition framework: a complete framework for Chinese speech recognition, providing multiple models.
Csa Inpainting
⭐
149
Coherent Semantic Attention for image inpainting(ICCV 2019)
Speech2text
⭐
148
A Deep-Learning-Based Persian Speech Recognition System
Speechcmdrecognition
⭐
148
A neural attention model for speech command recognition
Attentionn
⭐
147
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
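The soft (scaled dot-product) attention at the heart of many repositories in this list can be written in a few lines of NumPy. A minimal sketch for orientation, not code from any listed repo:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Soft attention: each query attends over all keys.

    q: (n_q, d), k: (n_k, d), v: (n_k, d_v)
    Returns the (n_q, d_v) weighted sum of values and the attention weights.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v, weights

# Tiny demo: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)        # (2, 4)
print(w.sum(axis=-1))   # each row of attention weights sums to 1
```

Multi-head attention repeats this computation over several learned projections of q, k, and v, then concatenates the per-head outputs.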
__init__
⭐
130
Absvit
⭐
120
Official code for "Top-Down Visual Attention from Analysis by Synthesis" (CVPR 2023 highlight)
Hierarchical Attention Network
⭐
117
Implementation of Hierarchical Attention Networks in PyTorch
Abstractive Summarization
⭐
113
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Colab Tricks
⭐
110
Tricks for Colab power users
Pytorch Model Zoo
⭐
108
A collection of deep learning models implemented in PyTorch
Tf2deepfloorplan
⭐
107
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-Boundary-Guided Attention. Supports TensorBoard, quantization, Flask, TFLite, Docker, GitHub Actions, and Google Colab.
Linear Attention Recurrent Neural Network
⭐
107
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN. (LARNN)
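The windowed-attention idea described above can be sketched in a few lines of NumPy. This is a hypothetical illustration of attending over a sliding window of recent states, not the LARNN code itself; the function name and shapes are assumptions for the example.

```python
import numpy as np

def windowed_attention_readout(past_states, query, window=5):
    """Attend over a sliding window of past states.

    A hypothetical sketch of windowed attention, not the LARNN implementation.
    past_states: (t, d) stacked previous cell states
    query: (d,) current query vector
    """
    window_states = past_states[-window:]               # keep only recent states
    scores = window_states @ query / np.sqrt(query.shape[0])
    scores -= scores.max()                              # numerical stability
    w = np.exp(scores)
    w /= w.sum()                                        # softmax over the window
    return w @ window_states                            # attention readout

states = np.random.default_rng(1).normal(size=(10, 8))
readout = windowed_attention_readout(states, states[-1], window=4)
print(readout.shape)  # (8,)
```

The window cap keeps the attention cost constant per step, instead of growing with the full sequence length.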
Bertqa Attention On Steroids
⭐
105
BertQA - Attention on Steroids
Weibosentiment
⭐
104
Chinese Weibo sentiment analysis based on various machine learning and deep learning methods.
Pytorch Question Answering
⭐
102
Important paper implementations for Question Answering using PyTorch
Set_transformer
⭐
101
PyTorch implementation of Set Transformer
Abstractive Text Summarization
⭐
99
PyTorch implementation/experiments on abstractive text summarization, following the "Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond" paper.
Automatic Generation Of Text Summaries
⭐
98
Automatic text summarization using two methods: extractive (TextRank) and abstractive (seq2seq).
Pytorch Transfomer
⭐
95
My implementation of the Transformer architecture from the "Attention Is All You Need" paper, applied to time series.
Delf_enhanced
⭐
93
Wrapper of DELF Tensorflow Model
Ranger Mish Imagewoof 5
⭐
93
Repo to build on / reproduce the record breaking Ranger-Mish-SelfAttention setup on FastAI ImageWoof dataset 5 epochs
Machine Learning
⭐
92
My Attempt(s) In The World Of ML/DL....
Kdd_winniethebest
⭐
91
KDD Cup 2020 Challenges for Modern E-Commerce Platform: Multimodalities Recall first place
Transformer_image_caption
⭐
85
Image Captioning based on Bottom-Up and Top-Down Attention model
Adaptive
⭐
84
PyTorch implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
100 Days Of Nlp
⭐
83
Neural Vqa Attention
⭐
81
❓ Attention-based Visual Question Answering in Torch
Opam_tip2018
⭐
80
Source code of our TIP 2018 paper "Object-Part Attention Model for Fine-grained Image Classification"
Algorithm Whiteboard Resources
⭐
78
This is where we share notebooks/projects used on our YouTube channel.
Image Captions
⭐
77
BERT + Image Captioning
Mad
⭐
74
Code for "Online and Linear Time Attention by Enforcing Monotonic Alignments"
Natural Language Joint Query Search
⭐
70
Search photos on Unsplash based on OpenAI's CLIP model; supports search with joint image+text queries and attention visualization.
Attention_flow
⭐
68
Group Level Emotion Recognition
⭐
67
Model submitted for the ICMI 2018 EmotiW Group-Level Emotion Recognition Challenge
Cs224n_project
⭐
65
Neural Image Captioning in TensorFlow.
Positional Encoding
⭐
63
Encoding position with the word embeddings.
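The sinusoidal positional encoding from "Attention Is All You Need", which several repositories here rely on, can be sketched directly from its formula (a minimal illustration, not code from the listed repo):

```python
import numpy as np

def sinusoidal_positional_encoding(max_len, d_model):
    """Sinusoidal positional encodings ("Attention Is All You Need").

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    Assumes d_model is even.
    """
    pos = np.arange(max_len)[:, None]                 # (max_len, 1)
    i = np.arange(0, d_model, 2)[None, :]             # (1, d_model / 2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                      # even dims: sine
    pe[:, 1::2] = np.cos(angles)                      # odd dims: cosine
    return pe

pe = sinusoidal_positional_encoding(50, 16)
print(pe.shape)    # (50, 16)
print(pe[0, :4])   # position 0 encodes as [0, 1, 0, 1, ...]
```

These encodings are simply added to the word embeddings, giving the (otherwise permutation-invariant) attention layers a notion of token order.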
Scifive
⭐
62
SciFive: a text-text transformer model for biomedical literature
Imcap_keras
⭐
62
Image captioning with spatial attention using keras with tensorflow backend
E2e Glstm Sc
⭐
61
Code for paper "Image Caption Generation with Text-Conditional Semantic Attention"
Synthesizer
⭐
60
A PyTorch implementation of the paper - "Synthesizer: Rethinking Self-Attention in Transformer Models"
Arelu
⭐
58
AReLU: Attention-based-Rectified-Linear-Unit
Rarnn
⭐
57
Recursive application of recurrent neural networks, for hierarchical intent parsing
How To Train Your Neural Net
⭐
55
Deep learning research implemented on notebooks using PyTorch.
Attentionmask
⭐
51
AttentionMask: Attentive, Efficient Object Proposal Generation Focusing on Small Objects (ACCV 2018, accepted as oral)
Cs224n Gpu That Talks
⭐
50
Attention, I'm Trying to Speak: End-to-end speech synthesis (CS224n '18)
Factored Attention
⭐
50
This repository contains code for reproducing results in our paper Interpreting Potts and Transformer Protein Models Through the Lens of Simplified Attention
Slot Attention Pytorch
⭐
50
PyTorch implementation of the paper "Object-Centric Learning with Slot Attention"
Image Captioning With Semantic Attention
⭐
47
Attention
⭐
47
Deep Learning Tensorflow
⭐
47
Gathers Tensorflow deep learning models.
S2am
⭐
46
[TIP 2020] Improving the Harmony of the Composite Image by Spatial-Separated Attention Module
1-100 of 277 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.