Awesome Open Source
Search results for python attention mechanism
444 search results found
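Nearly every repository listed below implements some variant of the attention mechanism. For orientation, here is a minimal scaled dot-product attention sketch in plain NumPy; the function names and toy shapes are illustrative only and are not taken from any repository in this list.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q: (n_queries, d), k: (n_keys, d), v: (n_keys, d_v)
    # Score each query against each key, scaled by sqrt(d).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # Each row of weights sums to 1: a distribution over the keys.
    weights = softmax(scores, axis=-1)
    # Output is a weighted average of the values.
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4))
k = rng.standard_normal((3, 4))
v = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)  # (2, 4) (2, 3)
```

Most projects below wrap this core operation with multiple heads, masking, or efficiency tricks (memory compression, linear attention, Flash Attention).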
Stam Pytorch
⭐
59
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Sturcture Inpainting
⭐
59
Source code of AAAI 2020 paper 'Learning to Incorporate Structure Knowledge for Image Inpainting'
Grm
⭐
59
[CVPR'23] The official PyTorch implementation of our CVPR 2023 paper: "Generalized Relation Modeling for Transformer Tracking".
Ntua Slp Semeval2018
⭐
58
Deep-learning models of the NTUA-SLP team submitted to SemEval 2018 Tasks 1, 2, and 3.
Arelu
⭐
58
AReLU: Attention-based Rectified Linear Unit
Memory Compressed Attention
⭐
58
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
Pytorch Attention Guided Cyclegan
⭐
57
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Agent Attention Pytorch
⭐
57
Implementation of Agent Attention in Pytorch
Abivirnet
⭐
56
Attention Bidirectional Video Recurrent Net
Xiaox
⭐
56
flask + seq2seq [TensorFlow 1.0, PyTorch] 🎨 An online chatbot: https://mp.weixin.qq.com/s/VpiAmVSTin3ALA8MnzhCJA or https://ask.hellobi.com/blog/python_shequ/14486
Tf Var Attention
⭐
55
Tensorflow Implementation of Variational Attention for Sequence to Sequence Models (COLING 2018)
Quantumforest
⭐
53
Fast Differentiable Forest lib with the advantages of both decision trees and neural networks
Global And Local Attention Based Free Form Image Inpainting
⭐
52
Official implementation of "Global and local attention-based free-form image inpainting"
Sarcasm Detection
⭐
52
Detecting sarcasm on Twitter using both traditional machine learning and deep learning techniques.
Patient2vec
⭐
51
Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record
Ca Net
⭐
51
Code for Comprehensive Attention Convolutional Neural Networks for Explainable Medical Image Segmentation.
Tacotron
⭐
50
A PyTorch implementation of Location-Relative Attention Mechanisms For Robust Long-Form Speech Synthesis
Perceiver
⭐
50
Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
Setvae
⭐
50
[CVPR'21] SetVAE: Learning Hierarchical Composition for Generative Modeling of Set-Structured Data, in PyTorch
Simple Diffusion
⭐
50
A minimal implementation of a denoising diffusion model in PyTorch.
Transformerx
⭐
49
Flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜)
Global Self Attention Network
⭐
49
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Text2text
⭐
49
Rq Transformer
⭐
49
Implementation of RQ Transformer, proposed in the paper "Autoregressive Image Generation using Residual Quantization"
Flash Genomics Model
⭐
49
My own attempt at a long context genomics model, leveraging recent advances in long context attention modeling (Flash Attention + other hierarchical methods)
Lie Transformer Pytorch
⭐
49
Implementation of Lie Transformer, Equivariant Self-Attention, in Pytorch
Efficient Attention
⭐
48
An implementation of the efficient attention module.
Projectrul
⭐
47
Predicting the remaining useful life of bearings based on the 2012 PHM data
Hierarchical Attention Networks
⭐
47
TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification"
The Clean Transformer
⭐
46
A Transformer implemented cleanly with pytorch-lightning and wandb
Diffusion Policy
⭐
46
Implementation of Diffusion Policy, Toyota Research's supposed breakthrough in leveraging DDPMs for learning policies for real-world Robotics
Adjacent Attention Network
⭐
45
Graph neural network message passing reframed as a Transformer with local attention
Lattice
⭐
45
[NAACL 2022] Robust (Controlled) Table-to-Text Generation with Structure-Aware Equivariance Learning.
Halonet Pytorch
⭐
45
Implementation of the 😇 Attention layer from the paper, Scaling Local Self-Attention For Parameter Efficient Visual Backbones
Tf Bind Transformer
⭐
45
A repository with exploration into using transformers to predict DNA ↔ transcription factor binding
Daf3d
⭐
45
Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
Isab Pytorch
⭐
44
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Chatbot Startkit
⭐
44
This repository holds files for a simple chatbot written in TensorFlow 1.4, with an attention mechanism and bucketing.
Mlpnlp Nmt
⭐
43
Sample code for an LSTM encoder-decoder with attention mechanism, mainly for understanding a recently developed machine translation framework based on deep neural networks.
A Pgnn
⭐
43
Source code and datasets for the paper "Personalizing Graph Neural Networks with Attention Mechanism for Session-based Recommendation"
Gmflownet
⭐
42
Global Matching with Overlapping Attention for Optical Flow Estimation, CVPR 2022
Describing_a_knowledge_base
⭐
42
Code for Describing a Knowledge Base
Attention_is_all_you_need
⭐
41
Video Description With Spatial Temporal Attention
⭐
41
Our paper was published in the proceedings of ACM Multimedia 2017 (MM '17).
Coordinate Descent Attention
⭐
41
Implementation of an Attention layer where each head can attend to more than just one token, using coordinate descent to pick topk
Flash Attention Softmax N
⭐
41
CUDA and Triton implementations of Flash Attention with SoftmaxN.
Mocha Pytorch
⭐
40
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Anr
⭐
39
Code for our CIKM 2018 paper titled "ANR: Aspect-based Neural Recommender"
Attend And Rectify
⭐
39
Pytorch code for the IEEE TM/ECCV paper "Attend and rectify"
Attention
⭐
39
Attention based neural machine translation
Sequencing
⭐
39
A flexible and simple framework for sequence to sequence learning.
Speakerrecognitionfromscratch
⭐
38
Final project for the Speaker Recognition course on Udemy, 机器之心, 深蓝学院 and 语音之家
Efficient Attention
⭐
37
[EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling
Attentional Neural Factorization Machine
⭐
37
Attention, Factorization Machine, Deep Learning, Recommender System
3d Object Reconstruction From Multi View Monocular Rgb Images
⭐
36
Hybrid Ensemble Approach For 3D Object Reconstruction from Multi-View Monocular RGB images
Multilingual_nmt
⭐
36
Experiments on Multilingual NMT
Travis
⭐
36
TrAVis: Visualise BERT attention in-browser
Attsets
⭐
35
🔥AttSets in Tensorflow (IJCV 2019)
Autoregressive Linear Attention Cuda
⭐
35
CUDA implementation of autoregressive linear attention, with all the latest research findings
Attentional Pointnet
⭐
34
Attentional-PointNet is a deep neural network architecture for 3D object detection in point clouds
Fragmentvc
⭐
34
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Ssm Dta
⭐
34
SSM-DTA: Breaking the Barriers of Data Scarcity in Drug-Target Affinity Prediction (Briefings in Bioinformatics 2023)
Ganvinci
⭐
33
Photorealistic human image editing with GANs - Reimplementation of the paper "FEAT: Face Editing with Attention" with additional changes and improvements.
Dcsp_segmentation
⭐
33
Automatic Personality Prediction
⭐
33
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Sigat
⭐
33
source code for signed graph attention networks (ICANN2019) & SDGNN (AAAI2021)
T5 Pytorch
⭐
32
Implementation of Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer in PyTorch.
Sa Dl
⭐
32
Sentiment Analysis with Deep Learning models. Implemented with Tensorflow and Keras.
Minimal Nmt
⭐
31
A minimal NMT example to serve as a seq2seq+attention reference.
Mambatransformer
⭐
31
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
Ear
⭐
31
Code associated with the paper "Entropy-based Attention Regularization Frees Unintended Bias Mitigation from Lists"
Scattnet
⭐
31
Semantic Segmentation Network with Spatial and Channel Attention Mechanism for High-Resolution Remote Sensing Images
Panoptic Transformer
⭐
31
Another attempt at a long-context / efficient transformer by me
Memformer
⭐
30
Implementation of Memformer, a Memory-augmented Transformer, in Pytorch
Lstm Attention
⭐
29
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Deep Implicit Attention
⭐
29
Experimental implementation of deep implicit attention in PyTorch
Samn
⭐
29
This is our implementation of SAMN: Social Attentional Memory Network
Attention Snn
⭐
28
Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023)
Attention Ocr Toy Example
⭐
28
Audio Vision
⭐
28
Implementation and reviews of Audio & Computer vision related papers in python using keras and tensorflow.
Mpad
⭐
28
Message Passing Attention Networks for Document Understanding
Manner
⭐
28
MANNER: Multi-view Attention Network for Noise ERasure (Speech enhancement in time-domain)
Image Captioning Chinese
⭐
27
Image Captioning in Chinese using LSTM RNN with attention mechanism
Attention_in_graph
⭐
27
Attention mechanism for graph classification, significant sub-graph mining, and graph distillation
Gdcan
⭐
27
[TPAMI 2021] Code release for "Generalized Domain Conditioned Adaptation Network" https://arxiv.org/abs/2103.12339
Htm Pytorch
⭐
27
Implementation of Hierarchical Transformer Memory (HTM) for Pytorch
Aat
⭐
27
Code for paper "Adaptively Aligned Image Captioning via Adaptive Attention Time". NeurIPS 2019
Denet
⭐
26
This is the official repo for Dynamic Extension Nets for Few-shot Semantic Segmentation (ACM Multimedia 20).
Glas
⭐
26
Generative Latent Attentive Sampler
3han
⭐
26
3HAN: A Deep Neural Network for Fake News Detection: https://link.springer.com/chapter/10.1007%2F978-3-
Languagemodel Using Attention
⭐
25
Pytorch implementation of a basic language model using Attention in LSTM network
Affgcn
⭐
25
Attention Feature Fusion based on a spatial-temporal Graph Convolutional Network (AFFGCN)
Mfpnet
⭐
25
PyTorch implementation for "Remote Sensing Change Detection Based on Multidirectional Adaptive Feature Fusion and Perceptual Similarity"
Fed Att
⭐
25
Attentive Federated Learning for Private NLM
Rnn Text Classification Tf
⭐
25
Tensorflow implementation of Attention-based Bidirectional RNN text classification.
Attention Guided Sparsity
⭐
25
Attention-Based Guided Structured Sparsity of Deep Neural Networks
Tranception Pytorch
⭐
24
Implementation of Tranception, an attention network, paired with retrieval, that is SOTA for protein fitness prediction
Complex Valued Transformer
⭐
24
Implementation of the transformer proposed in "Building Blocks for a Complex-Valued Transformer Architecture"
Pause Transformer
⭐
24
Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount of time on any token
Lyricshan
⭐
24
Music Genre Classification by Lyrics using a Hierarchical Attention Network
201-300 of 444 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.