Awesome Open Source
Search results for attention mechanism
attention-mechanism
x
649 search results found
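Nearly every repository in these results builds on the same core operation, scaled dot-product attention: each query is scored against all keys, the scores are softmax-normalized, and the values are averaged under those weights. As a minimal NumPy sketch (illustrative only, not taken from any listed project; the function name is our own):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q: (Tq, d) queries; k, v: (Tk, d) keys and values. Returns (Tq, d)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)             # softmax over the key axis
    return w @ v                                   # attention-weighted average of values
```

A sanity check on the weighting: if all keys are identical, the softmax weights are uniform, so every query returns the plain mean of the values.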
Protein Localization (⭐ 43): Using Transformer protein embeddings with a linear attention mechanism to make SOTA de-novo predictions for the subcellular location of proteins 🔬
Gmflownet (⭐ 42): Global Matching with Overlapping Attention for Optical Flow Estimation, CVPR 2022
Describing_a_knowledge_base (⭐ 42): Code for "Describing a Knowledge Base"
Coordinate Descent Attention (⭐ 41): Implementation of an attention layer where each head can attend to more than just one token, using coordinate descent to pick the top-k
Flash Attention Softmax N (⭐ 41): CUDA and Triton implementations of Flash Attention with SoftmaxN
Video Description With Spatial Temporal Attention (⭐ 41): Published in the proceedings of ACM Multimedia 2017 (MM '17)
Attention_is_all_you_need (⭐ 41)
Mocha Pytorch (⭐ 40): PyTorch implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Keras_attention (⭐ 40): 🔖 An attention layer in Keras
Transformer In Transformer (⭐ 40): An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Sequencing (⭐ 39): A flexible and simple framework for sequence-to-sequence learning
Attention (⭐ 39): Attention-based neural machine translation
Attend And Rectify (⭐ 39): PyTorch code for the IEEE TM/ECCV paper "Attend and Rectify"
Anr (⭐ 39): Code for the CIKM 2018 paper "ANR: Aspect-based Neural Recommender"
Speakerrecognitionfromscratch (⭐ 38): Final project for the Speaker Recognition course on Udemy, 机器之心, 深蓝学院 and 语音之家
Efficient Attention (⭐ 37): [EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling
Attentional Neural Factorization Machine (⭐ 37): Attention, factorization machines, deep learning, recommender systems
Anmm Cikm16 (⭐ 37): Implementation of the attention-based neural matching model proposed in CIKM16 for answer sentence selection
Travis (⭐ 36): TrAVis: Visualise BERT attention in-browser
3d Object Reconstruction From Multi View Monocular Rgb Images (⭐ 36): Hybrid ensemble approach for 3D object reconstruction from multi-view monocular RGB images
Multilingual_nmt (⭐ 36): Experiments on multilingual NMT
Kalman Filtering Attention (⭐ 36): Implementation of the Kalman filtering attention proposed in "Kalman Filtering Attention for User Behavior Modeling in CTR Prediction"
Ffd_cvpr2020 (⭐ 36)
Ecg Synthesis And Classification (⭐ 35): 1D GAN for ECG synthesis, plus three models (CNN, LSTM, and attention mechanism) for ECG classification
Attsets (⭐ 35): 🔥 AttSets in TensorFlow (IJCV 2019)
Autoregressive Linear Attention Cuda (⭐ 35): CUDA implementation of autoregressive linear attention, with all the latest research findings
Multivariate Attention Tcn (⭐ 35)
Attentional Pointnet (⭐ 34): Attentional-PointNet is a deep neural network architecture for 3D object detection in point clouds
Fragmentvc (⭐ 34): Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Ssm Dta (⭐ 34): SSM-DTA: Breaking the Barriers of Data Scarcity in Drug-Target Affinity Prediction (Briefings in Bioinformatics 2023)
Automatic Personality Prediction (⭐ 33): [AAAI 2020] Modeling personality with attentive networks and contextual embeddings
Sigat (⭐ 33): Source code for signed graph attention networks (ICANN 2019) & SDGNN (AAAI 2021)
Atfm (⭐ 33): Attentive Traffic Flow Machines
Dual Mfa Vqa (⭐ 33): Co-attending regions and detections for VQA
Hierarchical Word Sense Disambiguation Using Wordnet Senses (⭐ 33): Word sense disambiguation using word-specific models, all-word models, and hierarchical models in TensorFlow
Ganvinci (⭐ 33): Photorealistic human image editing with GANs; a reimplementation of the paper "FEAT: Face Editing with Attention" with additional changes and improvements
Dcsp_segmentation (⭐ 33)
T5 Pytorch (⭐ 32): Implementation of "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" in PyTorch
Sa Dl (⭐ 32): Sentiment analysis with deep learning models, implemented with TensorFlow and Keras
Stanford Cs231n Assignments 2020 (⭐ 32): Solutions to the assignments for Stanford's CS231n "Convolutional Neural Networks for Visual Recognition" (Spring 2020)
Mambatransformer (⭐ 31): Integrating Mamba/SSMs with Transformer for enhanced long-context, high-quality sequence modeling
Focus Longer To See Better (⭐ 31): [CVPRW 2020] Focus Longer to See Better: Recursively Refined Attention for Fine-Grained Image Classification
Scattnet (⭐ 31): Semantic segmentation network with spatial and channel attention mechanisms for high-resolution remote sensing images
Ear (⭐ 31): Code associated with the paper "Entropy-based Attention Regularization Frees Unintended Bias Mitigation from Lists"
Minimal Nmt (⭐ 31): A minimal NMT example to serve as a seq2seq-plus-attention reference
Panoptic Transformer (⭐ 31): Another attempt at a long-context / efficient transformer
Subcellular_localization (⭐ 30)
Memformer (⭐ 30): Implementation of Memformer, a memory-augmented Transformer, in PyTorch
Deep Implicit Attention (⭐ 29): Experimental implementation of deep implicit attention in PyTorch
Lstm Attention (⭐ 29): Attention-based bidirectional LSTM for classification tasks (ICASSP)
Samn (⭐ 29): Implementation of SAMN: Social Attentional Memory Network
Im2latex (⭐ 29): TensorFlow implementation of Im2Latex
Attswinunet (⭐ 28): Official implementation of the paper "Attention Swin U-Net: Cross-Contextual Attention Mechanism for Skin Lesion Segmentation"
Cosformer Pytorch (⭐ 28): Unofficial PyTorch implementation of the paper "cosFormer: Rethinking Softmax in Attention"
Attention Snn (⭐ 28): Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023)
Mpad (⭐ 28): Message-passing attention networks for document understanding
Manner (⭐ 28): MANNER: Multi-view Attention Network for Noise ERasure (time-domain speech enhancement)
Attention Ocr Toy Example (⭐ 28)
Audio Vision (⭐ 28): Implementations and reviews of audio and computer vision papers in Python using Keras and TensorFlow
Aat (⭐ 27): Code for the paper "Adaptively Aligned Image Captioning via Adaptive Attention Time" (NeurIPS 2019)
Attention_in_graph (⭐ 27): Attention mechanism for graph classification, significant sub-graph mining, and graph distillation
Image Captioning Chinese (⭐ 27): Image captioning in Chinese using an LSTM RNN with an attention mechanism
Gdcan (⭐ 27): [TPAMI 2021] Code release for "Generalized Domain Conditioned Adaptation Network" (https://arxiv.org/abs/2103.12339)
Htm Pytorch (⭐ 27): Implementation of Hierarchical Transformer Memory (HTM) for PyTorch
3han (⭐ 26): 3HAN: A Deep Neural Network for Fake News Detection: https://link.springer.com/chapter/10.1007%2F978-3-
Denet (⭐ 26): Official repo for "Dynamic Extension Nets for Few-shot Semantic Segmentation" (ACM Multimedia 2020)
Glas (⭐ 26): Generative Latent Attentive Sampler
Stgm (⭐ 26): STGM: Spatio-Temporal Graph Mixformer for Traffic Forecasting
Affgcn (⭐ 25): Attention feature fusion based on a spatio-temporal graph convolutional network (AFFGCN)
Mfpnet (⭐ 25): PyTorch implementation of "Remote Sensing Change Detection Based on Multidirectional Adaptive Feature Fusion and Perceptual Similarity"
Show Attend And Tell Keras (⭐ 25): Keras implementation of the "Show, Attend and Tell" paper
Rnn Text Classification Tf (⭐ 25): TensorFlow implementation of attention-based bidirectional RNN text classification
Fed Att (⭐ 25): Attentive federated learning for private NLM
Attention Guided Sparsity (⭐ 25): Attention-based guided structured sparsity of deep neural networks
Languagemodel Using Attention (⭐ 25): PyTorch implementation of a basic language model using attention in an LSTM network
Memory Editable Transformer (⭐ 24): Explorations into editing the knowledge and memories of an attention network
Visual Attention Model (⭐ 24): Chainer implementation of DeepMind's visual attention model paper
Complex Valued Transformer (⭐ 24): Implementation of the transformer proposed in "Building Blocks for a Complex-Valued Transformer Architecture"
Image Caption (⭐ 24): Using an LSTM or Transformer to solve image captioning in PyTorch
Pytorch_neural_machine_translation_attention (⭐ 24): Neural machine translation with attention (PyTorch)
Omninet Pytorch (⭐ 24): Implementation of OmniNet, Omnidirectional Representations from Transformers, in PyTorch
Tranception Pytorch (⭐ 24): Implementation of Tranception, an attention network paired with retrieval that is SOTA for protein fitness prediction
Linear Attention Mechanism (⭐ 24): ⭐ Welcome to my HomePage ⭐
Lyricshan (⭐ 24): Music genre classification by lyrics using a hierarchical attention network
Pause Transformer (⭐ 24): A random morning idea, quickly tried and the architecture shared if it works: allowing the transformer to pause for any amount of time on any token
Adast (⭐ 23): [IEEE TETCI] "ADAST: Attentive Cross-domain EEG-based Sleep Staging Framework with Iterative Self-Training"
Attentiongatedvnet3d (⭐ 23): Attention-gated VNet3D model for KiTS19, the 2019 Kidney Tumor Segmentation Challenge
Txt2txt (⭐ 23): Extremely easy-to-use sequence-to-sequence library with attention, for text-to-text conversion tasks
Mengzi Retrieval Lm (⭐ 23): An experimental implementation of the retrieval-enhanced language model
Layer_augmentation (⭐ 23): Implementation of the NLI model in the ACL 2019 paper "Augmenting Neural Networks with First-order Logic"
Han (⭐ 23): TensorFlow implementation of Z. Hu et al., "Listening to Chaotic Whispers: A Deep Learning Framework for News-oriented Stock Trend Prediction", WSDM 2018
San (⭐ 23): Attention-based feature ranking for propositional data
Compositional Attention Pytorch (⭐ 23): Implementation of "compositional attention" from MILA, a multi-head attention variant reframed as a two-step attention process with disentangled search and retrieval head aggregation, in PyTorch
Keras_attentivenormalization (⭐ 23): Unofficial Keras implementation of the paper "Attentive Normalization"
Multigrid Neural Architectures (⭐ 23): Multigrid Neural Architecture
Sentimentanalysis (⭐ 22): Sentiment analysis with a deep Bi-LSTM + attention model
Daam I2i (⭐ 22): Diffusion attentive attribution maps for interpreting Stable Diffusion for image-to-image attention
Transframer Pytorch (⭐ 22): Implementation of Transframer, DeepMind's U-Net + Transformer architecture for up to 30 seconds of video generation, in PyTorch
Protein Localization Transformer (⭐ 22): Code for CELL-E: Biological Zero-Shot Text-to-Image Synthesis for Protein Localization Prediction
Neural Networks For Time Series Analysis (⭐ 22): Compare how ANNs, RNNs, LSTMs, and LSTMs with attention perform on time-series analysis
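Several entries above (Efficient Attention, Autoregressive Linear Attention Cuda, Linear Attention Mechanism) concern linear attention, which replaces the softmax with a positive feature map so a causal model can be computed from running prefix sums instead of materializing the full T x T attention matrix. A minimal causal sketch in NumPy (our own illustration under the common elu(x)+1 feature map, not code from any listed repository):

```python
import numpy as np

def causal_linear_attention(q, k, v, eps=1e-6):
    """q, k, v: arrays of shape (T, d). Returns (T, d)."""
    # Feature map phi(x) = elu(x) + 1, which keeps all features positive.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(np.minimum(x, 0.0)))
    Q, K = phi(q), phi(k)
    # Running prefix sums over the causal prefix replace the T x T matrix.
    S = np.cumsum(K[:, :, None] * v[:, None, :], axis=0)  # (T, d, d): sum of phi(k_s) v_s^T
    Z = np.cumsum(K, axis=0)                              # (T, d):    sum of phi(k_s)
    num = np.einsum('td,tde->te', Q, S)                   # phi(q_t)^T S_t
    den = np.einsum('td,td->t', Q, Z)[:, None] + eps      # phi(q_t)^T z_t
    return num / den
```

Because position t only sees prefix sums up to t, the output at the first position reduces to v[0], and later values cannot leak backwards, which is the causality property the autoregressive variants above rely on.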
Related Searches
Python Attention Mechanism (517)
Deep Learning Attention Mechanism (370)
301-400 of 649 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.