Awesome Open Source
Search results for attention
2,121 search results found
Annotated_deep_learning_paper_implementations
⭐
41,877
🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Vit Pytorch
⭐
16,298
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Numpy Ml
⭐
14,162
Machine learning, in numpy
Nlp Tutorial
⭐
13,226
Natural Language Processing Tutorial for Deep Learning Researchers
External Attention Pytorch
⭐
10,361
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization and convolution modules, helpful for a deeper understanding of the papers.
Attention Is All You Need Pytorch
⭐
7,910
A PyTorch implementation of the Transformer model in "Attention is All You Need".
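Several entries in this list implement the Transformer from "Attention Is All You Need". As a reference point, here is a minimal NumPy sketch of its core scaled dot-product attention; the function name, shapes, and random inputs are illustrative, not taken from any listed repository.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V, weights

# Toy example: 3 queries attending over 5 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, so every row of `w` sums to 1.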
Text_classification
⭐
7,628
all kinds of text classification models and more with deep learning
Espnet
⭐
7,563
End-to-End Speech Processing Toolkit
Gpt Neox
⭐
6,366
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Nmt
⭐
6,085
TensorFlow Neural Machine Translation Tutorial
Bertviz
⭐
5,547
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Pytorch Seq2seq
⭐
5,024
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Informer2020
⭐
4,553
The GitHub repository for the paper "Informer" accepted by AAAI 2021.
Transformer
⭐
3,882
A TensorFlow Implementation of the Transformer: Attention Is All You Need
X Transformers
⭐
3,840
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Nlp_paper_study
⭐
3,373
This repository mainly collects reading notes on top-conference NLP papers, aimed at NLP algorithm engineers.
Scenic
⭐
2,733
Scenic: A Jax Library for Computer Vision Research and Beyond
Spektral
⭐
2,317
Graph Neural Networks with Keras and Tensorflow 2.
Leader Line
⭐
2,208
Draw a leader line in your web page.
Image_segmentation
⭐
2,186
Pytorch implementation of U-Net, R2U-Net, Attention U-Net, and Attention R2U-Net.
A Pytorch Tutorial To Image Captioning
⭐
2,084
Show, Attend, and Tell | a PyTorch Tutorial to Image Captioning
Gat
⭐
2,078
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
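Several of the listed repositories (Gat, Pytorch Gat, Pygat) implement the graph attention layer from Veličković et al. A minimal single-head NumPy sketch of its attention coefficients is below; all names and the dense double loop are illustrative simplifications, not code from those repos.

```python
import numpy as np

def gat_layer(h, W, a, adj):
    """Single-head GAT: alpha_ij = softmax_j(LeakyReLU(a^T [W h_i || W h_j]))."""
    z = h @ W                                   # (N, F') projected node features
    N = z.shape[0]
    e = np.zeros((N, N))                        # un-normalized scores e_ij
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else 0.2 * s   # LeakyReLU, slope 0.2
    e = np.where(adj > 0, e, -1e9)              # mask out non-neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # softmax over each node's neighbors
    return alpha @ z, alpha                     # attention-weighted aggregation

# Toy graph: 4 fully connected nodes, 3 input features, 2 output features.
rng = np.random.default_rng(1)
h = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))                       # attention vector over [z_i || z_j]
adj = np.ones((4, 4))
out, alpha = gat_layer(h, W, a, adj)
```

Real implementations vectorize the score computation and add multi-head attention, but the normalization over each node's neighborhood is the defining step.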
Reformer Pytorch
⭐
1,917
Reformer, the efficient Transformer, in Pytorch
Deepvoice3_pytorch
⭐
1,906
PyTorch implementation of convolutional neural networks-based text-to-speech synthesis models
Self Attention Gan
⭐
1,822
Pytorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)
Pytorch Gat
⭐
1,815
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Absa Pytorch
⭐
1,782
Aspect-Based Sentiment Analysis, PyTorch implementations.
Attention Module
⭐
1,746
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
Yoloair
⭐
1,714
🔥🔥🔥 YOLOv5, YOLOv6, YOLOv7, YOLOv8, PPYOLOE, YOLOX, YOLOR, YOLOv4, YOLOv3, Transformer, Attention, TOOD and Improved-YOLOv5-YOLOv7... Supports improving backbone, neck, head, loss, IoU, NMS and other modules 🚀
Pygat
⭐
1,684
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Deberta
⭐
1,673
The implementation of DeBERTa
Chinese Text Classification Pytorch
⭐
1,609
Chinese text classification: TextCNN, TextRNN, FastText, TextRCNN, BiLSTM_At
Transformer Explainability
⭐
1,596
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Yolov4 Pytorch
⭐
1,572
This is a pytorch repository of YOLOv4, attentive YOLOv4 and mobilenet YOLOv4 with PASCAL VOC and COCO
Nlp Journey
⭐
1,563
Documents, papers and code related to Natural Language Processing, including Topic Model, Word Embedding, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc. All code is implemented in TensorFlow 2.0.
Danet
⭐
1,556
Dual Attention Network for Scene Segmentation (CVPR2019)
Bi Att Flow
⭐
1,510
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
Swipebackhelper
⭐
1,378
Make your Activity swipeable to close.
Fast Transformers
⭐
1,369
Pytorch library for fast transformer implementations
Nlp Models Tensorflow
⭐
1,329
Gathers machine learning and Tensorflow deep learning models for NLP problems, 1.13 < Tensorflow < 2.0
Hopfield Layers
⭐
1,258
Hopfield Networks is All You Need
Rcan
⭐
1,252
PyTorch code for our ECCV 2018 paper "Image Super-Resolution Using Very Deep Residual Channel Attention Networks"
Unet Segmentation Pytorch Nest Of Unets
⭐
1,236
Implementation of different kinds of Unet Models for Image Segmentation - Unet , RCNN-Unet, Attention Unet, RCNN-Attention Unet, Nested Unet
Deep_architecture_genealogy
⭐
1,196
Deep Learning Architecture Genealogy Project
Gansformer
⭐
1,188
Generative Adversarial Transformers
Seq2seq Attn
⭐
1,167
Sequence-to-sequence model with LSTM encoder/decoders and attention
Attention Transfer
⭐
1,120
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Lambda Networks
⭐
1,110
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Attention Gated Networks
⭐
1,099
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation
Rebar
⭐
1,089
ATTENTION: Please find the canonical repository here:
Nlp Paper
⭐
1,070
Papers in the field of natural language processing (with reading notes), model reproductions, data processing, etc. (code in TensorFlow and Py
Ccnet
⭐
1,061
CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
Awesome Yolo Object Detection
⭐
1,047
🚀🚀🚀 A collection of some awesome public YOLO object detection series projects.
Simplecvreproduction
⭐
1,021
Replication of simple CV Projects including attention, classification, detection, keypoint detection, etc.
Transformer
⭐
1,014
PyTorch Implementation of "Attention Is All You Need"
Textclassifier
⭐
1,003
Text classifier for Hierarchical Attention Networks for Document Classification
Sparse_attention
⭐
1,002
Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers"
Vit Pytorch
⭐
984
Pytorch reimplementation of the Vision Transformer (An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale)
Bottom Up Attention
⭐
979
Bottom-up attention model for image captioning and VQA, based on Faster R-CNN and Visual Genome
Attention Ocr
⭐
973
Visual Attention based OCR
Attention Ocr
⭐
957
A Tensorflow model for text recognition (CNN + seq2seq with visual attention) available as a Python package and compatible with Google Cloud ML Engine.
Coordattention
⭐
937
Code for our CVPR2021 paper coordinate attention
Attention Learn To Route
⭐
931
Attention based model for learning to solve different routing problems
Ride
⭐
920
Test data editor for Robot Framework
Arctic Captions
⭐
831
Awesome Image Captioning
⭐
828
A curated list of image captioning and related area resources. :-)
Nlp Tutorials
⭐
794
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Performer Pytorch
⭐
777
An implementation of Performer, a linear attention-based transformer, in Pytorch
Django Bootstrap
⭐
735
Django Form Implementation of the Twitter-Bootstrap UI
Attentiondeepmil
⭐
727
Implementation of Attention-based Deep Multiple Instance Learning in PyTorch
Self Attention Gan
⭐
722
Awesome Fast Attention
⭐
717
A list of efficient attention modules.
Speech Transformer
⭐
714
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Tf Rnn Attention
⭐
703
Tensorflow implementation of attention mechanism for text classification tasks.
Dockerjenkins_tutorial
⭐
693
A repository for items learned in my Getting Started with Jenkins and Docker tutorial series
Transformer
⭐
691
Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
Dota Doai
⭐
676
This repo is the codebase for our team to participate in DOTA related competitions, including rotation and horizontal detection.
Spatial Transformer Network
⭐
661
A Tensorflow implementation of Spatial Transformer Networks.
Ecanet
⭐
658
Code for ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
Keras Attention
⭐
656
Visualizing RNNs using the attention mechanism
Seq2seq Pytorch
⭐
653
Sequence to Sequence Models with PyTorch
Textclassification Keras
⭐
649
Text classification models implemented in Keras, including: FastText, TextCNN, TextRNN, TextBiRNN, TextAttBiRNN, HAN, RCNN, RCNNVariant, etc.
Joeynmt
⭐
647
Minimalist NMT for educational purposes
Long Range Arena
⭐
635
Long Range Arena for Benchmarking Efficient Transformers
Image_captioning
⭐
634
Tensorflow implementation of "Show, Attend and Tell: Neural Image Caption Generation with Visual Attention"
Vad
⭐
632
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Cbam.pytorch
⭐
630
Unofficial implementation of the paper "CBAM: Convolutional Block Attention Module".
Text Classification
⭐
622
Implementation of papers for text classification task on DBpedia
Tensorflow Ocr
⭐
616
🖺 OCR using TensorFlow with attention
Longnet
⭐
613
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Rlseq2seq
⭐
610
Deep Reinforcement Learning For Sequence to Sequence Models
Tab Transformer Pytorch
⭐
609
Implementation of TabTransformer, attention network for tabular data, in Pytorch
Bottom Up Attention Vqa
⭐
606
An efficient PyTorch implementation of the winning entry of the 2017 VQA Challenge.
Transformer Tts
⭐
599
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Moran_v2
⭐
593
MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition
Quill Image Extend Module
⭐
583
An enhancement module for vue-quill-editor: image upload, paste-to-insert, and drag-and-drop insert; can be used together with other modules.
Nlp Paper
⭐
579
NLP Paper
Keras Self Attention
⭐
570
Attention mechanism for processing sequential data that considers the context for each timestamp.
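Entries like Keras Self Attention, Tf Rnn Attention, and Textclassifier describe scoring each timestep of a sequence and pooling by the resulting weights. A minimal NumPy sketch of this additive (Bahdanau-style) pooling is below; the function name and parameter shapes are illustrative assumptions, not any listed library's API.

```python
import numpy as np

def additive_attention_pool(H, w, b, u):
    """Score each timestep, softmax over time, and return the weighted sum."""
    # H: (T, d) sequence of RNN hidden states; w: (d, d); b: (d,); u: (d,)
    scores = np.tanh(H @ w + b) @ u            # (T,) one relevance score per timestep
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # attention distribution over time
    return weights @ H, weights                # sentence vector + weights

# Toy example: a sequence of 5 hidden states of dimension 3.
rng = np.random.default_rng(2)
H = rng.normal(size=(5, 3))
w = rng.normal(size=(3, 3))
b = rng.normal(size=(3,))
u = rng.normal(size=(3,))
vec, weights = additive_attention_pool(H, w, b, u)
```

The returned weights can also be inspected directly, which is what visualization tools in this list (e.g. Keras Attention) plot per timestep.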
R Net
⭐
569
Tensorflow Implementation of R-Net
Attentiongan
⭐
564
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation
Related Searches
Python Attention (2,327)
Pytorch Attention (645)
Jupyter Notebook Attention (557)
Copyright 2018-2024 Awesome Open Source. All rights reserved.