Awesome Open Source
Search results for attention mechanism
649 search results found
Simplednn
⭐
95
SimpleDNN is a lightweight open-source machine learning library written in Kotlin, designed to support relevant neural network architectures in natural language processing tasks
Geoman
⭐
95
Tensorflow implementation of GeoMAN, IJCAI-18
Calm Pytorch
⭐
93
Implementation of CALM from the paper "LLM Augmented LLMs: Expanding Capabilities through Composition", out of Google Deepmind
Kac Net
⭐
92
Implementation of Knowledge Aided Consistency for Weakly Supervised Phrase Grounding in Tensorflow
Competitive Inner Imaging Senet
⭐
91
Source code of paper: (not available now)
Long Short Transformer
⭐
91
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Transganformer
⭐
91
Implementation of TransGanFormer, an all-attention GAN that combines the finding from the recent GanFormer and TransGan paper
Self Attentive Emb Tf
⭐
91
Simple Tensorflow Implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017)
Aiatrack
⭐
90
[ECCV'22] The official PyTorch implementation of our ECCV 2022 paper: "AiATrack: Attention in Attention for Transformer Visual Tracking".
Darnn
⭐
87
A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
Trajectory Transformer
⭐
87
Code for "Transformer Networks for Trajectory Forecasting"
Epsanet
⭐
86
Compressive Transformer Pytorch
⭐
86
Pytorch implementation of Compressive Transformers, from Deepmind
Ceit Pytorch
⭐
84
Implementation of Convolutional enhanced image Transformer
Zorro Pytorch
⭐
83
Implementation of Zorro, Masked Multimodal Transformer, in Pytorch
100 Days Of Nlp
⭐
83
Vista Net
⭐
82
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Grounder
⭐
81
Implementation of Grounding of Textual Phrases in Images by Reconstruction in Tensorflow
Various Attention Mechanisms
⭐
81
This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, in Pytorch, Tensorflow, and Keras
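The entry above lists several attention variants; as a point of reference, a minimal dependency-free sketch of additive (Bahdanau-style) attention is shown below. The toy weight matrices and dimensions are hypothetical and not taken from that repository; the score is v · tanh(W_q q + W_k k_i), followed by a softmax and a weighted sum of the keys.

```python
import math

def additive_attention(query, keys, w_q, w_k, v):
    """Sketch of additive (Bahdanau-style) attention with plain lists.

    score_i = v . tanh(W_q q + W_k k_i); the softmaxed scores weight
    the keys into a context vector. Toy dimensions, hypothetical weights.
    """
    def matvec(m, x):
        return [sum(mi * xi for mi, xi in zip(row, x)) for row in m]

    wq_q = matvec(w_q, query)
    scores = []
    for k in keys:
        hidden = [math.tanh(a + b) for a, b in zip(wq_q, matvec(w_k, k))]
        scores.append(sum(vi * hi for vi, hi in zip(v, hidden)))

    # numerically stable softmax over the scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]

    # context vector: attention-weighted sum of the keys
    dim = len(keys[0])
    context = [sum(w * k[d] for w, k in zip(weights, keys)) for d in range(dim)]
    return weights, context
```

The weights always sum to 1, so the context vector is a convex combination of the keys.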
Deepaffinity
⭐
80
Protein-compound affinity prediction through unified RNN-CNN
Eeg Transformer
⭐
79
A practical application of the Transformer (ViT) to 2-D physiological signal (EEG) classification tasks; it can also be tried on EMG, EOG, ECG, etc. Includes attention over the spatial dimension (channel attention) and the temporal dimension, as well as common spatial pattern (CSP), an efficient feature-enhancement method, implemented in Python.
Sanet
⭐
78
Arbitrary Style Transfer with Style-Attentional Networks
Attend_infer_repeat
⭐
77
A Tensorflow implementation of Attend, Infer, Repeat
Transformers
⭐
75
Everything you need to know about Transformers! 🤖
Uniformer Pytorch
⭐
74
Implementation of Uniformer, a simple attention and 3d convolutional net that achieved SOTA in a number of video classification tasks, debuted in ICLR 2022
Mad
⭐
74
Code for "Online and Linear Time Attention by Enforcing Monotonic Alignments"
Mirasol Pytorch
⭐
74
Implementation of 🌻 Mirasol, SOTA Multimodal Autoregressive model out of Google Deepmind, in Pytorch
Mixture Of Attention
⭐
74
Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts
Neural Chatbot
⭐
73
A Neural Network based Chatbot
Fast Transformer Pytorch
⭐
73
Implementation of Fast Transformer in Pytorch
Qformer
⭐
73
The official repo for [Arxiv'23] "Vision Transformer with Quadrangle Attention"
Simpool
⭐
72
This repo contains the official implementation of ICCV 2023 paper "Keep It SimPool: Who Said Supervised Transformers Suffer from Attention Deficit?"
Writing Editing Network
⭐
72
Code for Paper Abstract Writing through Editing Mechanism
Transformercpi
⭐
71
TransformerCPI: Improving compound–protein interaction prediction by sequence-based deep learning with self-attention mechanism and label reversal experiments(BIOINFORMATICS 2020) https://doi.org/10.1093/bioinformatics/btaa524
Feedback Transformer Pytorch
⭐
71
Implementation of Feedback Transformer in Pytorch
Qrc Net
⭐
70
Implementation of Query-guided Regression Network with Context Policy for Phrase Grounding in Tensorflow
Trihorn Net
⭐
70
Official PyTorch implementation of TriHorn-Net
Image Caption Generator
⭐
70
A neural network to generate captions for an image using a CNN and an RNN with beam search.
Treegen
⭐
70
A Tree-Based Transformer Architecture for Code Generation. (AAAI'20)
Image Captioning
⭐
69
Perceiver Ar Pytorch
⭐
69
Implementation of Perceiver AR, Deepmind's new long-context attention network based on Perceiver architecture, in Pytorch
Narre
⭐
67
This is our implementation of NARRE: Neural Attentional Regression with Review-level Explanations
Group Level Emotion Recognition
⭐
67
Model submitted for the ICMI 2018 EmotiW Group-Level Emotion Recognition Challenge
Cs224n_project
⭐
65
Neural Image Captioning in TensorFlow.
Crabnet
⭐
65
Predict materials properties using only the composition information!
Softalignments
⭐
65
Neural machine translation soft alignment visualisations for web and command line
Pytorch_sentiment_rnn
⭐
64
Example Recurrent Neural Networks for Sentiment Analysis (Aspect-Based) on SemEval 2014
Leafgan
⭐
63
Optic Disc Unet
⭐
62
Attention U-Net model with post-processing for retina optic disc segmentation
Taylor Series Linear Attention
⭐
61
Explorations into the recently proposed Taylor Series Linear Attention
Transxnet
⭐
61
TransXNet: Learning Both Global and Local Dynamics with a Dual Dynamic Token Mixer for Visual Recognition
Fake_news_detection_deep_learning
⭐
60
Fake News Detection using Deep Learning models in Tensorflow
Sturcture Inpainting
⭐
59
Source code of AAAI 2020 paper 'Learning to Incorporate Structure Knowledge for Image Inpainting'
Attentional Interfaces
⭐
59
🔍 Attentional interfaces in TensorFlow.
Grm
⭐
59
[CVPR'23] The official PyTorch implementation of our CVPR 2023 paper: "Generalized Relation Modeling for Transformer Tracking".
Stam Pytorch
⭐
59
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Memory Compressed Attention
⭐
58
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
Arelu
⭐
58
AReLU: Attention-based Rectified Linear Unit
Ntua Slp Semeval2018
⭐
58
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Pytorch Attention Guided Cyclegan
⭐
57
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Agent Attention Pytorch
⭐
57
Implementation of Agent Attention in Pytorch
Abivirnet
⭐
56
Attention Bidirectional Video Recurrent Net
Xiaox
⭐
56
Flask + seq2seq [TensorFlow 1.0, Pytorch] 🎨 🎨 An online chatbot. https://mp.weixin.qq.com/s/VpiAmVSTin3ALA8MnzhCJA or https://ask.hellobi.com/blog/python_shequ/14486
Pg Cnn
⭐
56
Occlusion-aware facial expression recognition using a CNN with an attention mechanism
Tree_enhanced_embedding_model
⭐
55
TEM: Tree-enhanced Embedding Model for Explainable Recommendation, WWW2018
Tf Var Attention
⭐
55
Tensorflow Implementation of Variational Attention for Sequence to Sequence Models (COLING 2018)
Deepattention
⭐
55
Deep Visual Attention Prediction (TIP18)
Comparemodels_trecqa
⭐
55
Compare six baseline deep learning models on TrecQA
Transformers4vision
⭐
54
A summarization of Transformer-based architectures for CV tasks, including image classification, object detection, segmentation, and few-shot learning. Updated frequently.
Quantumforest
⭐
53
Fast Differentiable Forest lib with the advantages of both decision trees and neural networks
Global And Local Attention Based Free Form Image Inpainting
⭐
52
Official implementation of "Global and local attention-based free-form image inpainting"
Sarcasm Detection
⭐
52
Detecting sarcasm on Twitter using both traditional machine learning and deep learning techniques.
Ca Net
⭐
51
Code for Comprehensive Attention Convolutional Neural Networks for Explainable Medical Image Segmentation.
Patient2vec
⭐
51
Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record
Perceiver
⭐
50
Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
Simple Diffusion
⭐
50
A minimal implementation of a denoising diffusion model in PyTorch.
Tacotron
⭐
50
A PyTorch implementation of Location-Relative Attention Mechanisms For Robust Long-Form Speech Synthesis
Setvae
⭐
50
[CVPR'21] SetVAE: Learning Hierarchical Composition for Generative Modeling of Set-Structured Data, in PyTorch
Lie Transformer Pytorch
⭐
49
Implementation of Lie Transformer, Equivariant Self-Attention, in Pytorch
Rq Transformer
⭐
49
Implementation of RQ Transformer, proposed in the paper "Autoregressive Image Generation using Residual Quantization"
Global Self Attention Network
⭐
49
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Flash Genomics Model
⭐
49
My own attempt at a long context genomics model, leveraging recent advances in long context attention modeling (Flash Attention + other hierarchical methods)
Transformerx
⭐
49
Flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜)
Text2text
⭐
49
Attention Viz
⭐
49
Visualizing query-key interactions in language + vision transformers
Efficient Attention
⭐
48
An implementation of the efficient attention module.
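For context on the entry above, a toy sketch of the efficient-attention factorization follows (presumably the formulation of Shen et al., "Efficient Attention: Attention with Linear Complexities"; the repository's actual API is not assumed here). Instead of softmax(QKᵀ)V, which is O(n²) in sequence length n, Q is softmax-normalized along the feature axis and K along the sequence axis, and the product is regrouped as Q'(K'ᵀV), which is linear in n. Shapes and values below are hypothetical.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [x / s for x in e]

def efficient_attention(Q, K, V):
    """Sketch of efficient attention with plain lists (toy sizes).

    Q, K: n x d; V: n x dv. Computes softmax_row(Q) @ (softmax_col(K).T @ V),
    avoiding the n x n attention matrix.
    """
    n, d = len(Q), len(Q[0])
    dv = len(V[0])
    # softmax over each row of Q (feature axis)
    Qn = [softmax(row) for row in Q]
    # softmax over each column of K (sequence axis)
    cols = [softmax([K[i][j] for i in range(n)]) for j in range(d)]
    Kn = [[cols[j][i] for j in range(d)] for i in range(n)]
    # G = Kn^T V, a d x dv "global context" matrix
    G = [[sum(Kn[i][j] * V[i][t] for i in range(n)) for t in range(dv)]
         for j in range(d)]
    # output = Qn G, n x dv
    return [[sum(Qn[i][j] * G[j][t] for j in range(d)) for t in range(dv)]
            for i in range(n)]
```

Because both softmaxes produce convex weights, every output row is a convex combination of the rows of V.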
Projectrul
⭐
47
Predicting the remaining useful life of bearings based on the 2012 PHM data
Hierarchical Attention Networks
⭐
47
TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification"
Diffusion Policy
⭐
46
Implementation of Diffusion Policy, Toyota Research's supposed breakthrough in leveraging DDPMs for learning policies for real-world Robotics
The Clean Transformer
⭐
46
A Transformer implemented cleanly with pytorch-lightning and wandb
Adjacent Attention Network
⭐
45
Graph neural network message passing reframed as a Transformer with local attention
Lattice
⭐
45
[NAACL 2022] Robust (Controlled) Table-to-Text Generation with Structure-Aware Equivariance Learning.
Daf3d
⭐
45
Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
Tf Bind Transformer
⭐
45
A repository with exploration into using transformers to predict DNA ↔ transcription factor binding
Halonet Pytorch
⭐
45
Implementation of the 😇 Attention layer from the paper, Scaling Local Self-Attention For Parameter Efficient Visual Backbones
Video Cap
⭐
44
🎬 Video Captioning: ICCV '15 paper implementation
Chatbot Startkit
⭐
44
This repository holds files for the simple chatbot wrote in TensorFlow 1.4, with attention mechanism and bucketing.
Isab Pytorch
⭐
44
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
A Pgnn
⭐
43
Source code and datasets for the paper "Personalizing Graph Neural Networks with Attention Mechanism for Session-based Recommendation"
Protein Localization
⭐
43
Using Transformer protein embeddings with a linear attention mechanism to make SOTA de-novo predictions for the subcellular location of proteins 🔬
201-300 of 649 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.