Awesome Open Source
Search results for python attention mechanism
444 search results found
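Nearly all of the repositories listed below build on the same primitive, scaled dot-product attention: softmax(QKᵀ/√d)·V. As orientation, a minimal NumPy sketch of that core operation (the function name and shapes are illustrative, not taken from any listed repo):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Core attention operation: softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    return weights @ v                             # attention-weighted sum of values

rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, 4, 8))  # unpack into three (4, 8) arrays
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

Most entries below vary this recipe: restricting which keys each query sees, changing the softmax for cheaper kernels, or chunking the computation for memory efficiency.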
Vit Pytorch
⭐
16,298
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Rwkv Lm
⭐
10,705
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
Text_classification
⭐
7,628
all kinds of text classification models and more with deep learning
Palm Rlhf Pytorch
⭐
7,496
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
Transformer
⭐
3,882
A TensorFlow Implementation of the Transformer: Attention Is All You Need
X Transformers
⭐
3,840
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Musiclm Pytorch
⭐
2,686
Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch
Audiolm Pytorch
⭐
2,112
Implementation of AudioLM, a SOTA Language Modeling Approach to Audio Generation out of Google Research, in Pytorch
A Pytorch Tutorial To Image Captioning
⭐
2,084
Show, Attend, and Tell | a PyTorch Tutorial to Image Captioning
Gat
⭐
2,078
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
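The GAT paper linked above scores each edge with a shared linear map plus a learned attention vector, then softmaxes over each node's neighborhood. A minimal single-head NumPy sketch in the spirit of that layer, assuming a dense binary adjacency matrix with self-loops (all names and shapes here are illustrative, not this repo's API):

```python
import numpy as np

def gat_layer(h, adj, W, a, slope=0.2):
    """One single-head graph attention layer (after Veličković et al., 2017).

    h: (n, f_in) node features; adj: (n, n) binary adjacency with self-loops;
    W: (f_in, f_out) shared weights; a: (2*f_out,) attention vector.
    """
    z = h @ W                                     # (n, f_out) transformed features
    f = z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]) splits into a row term + a column term
    e = (z @ a[:f])[:, None] + (z @ a[f:])[None, :]
    e = np.where(e > 0, e, slope * e)             # LeakyReLU
    e = np.where(adj > 0, e, -1e9)                # mask non-neighbors before softmax
    e = e - e.max(axis=1, keepdims=True)
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)    # softmax over each neighborhood
    return att @ z                                # attention-weighted aggregation

rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
adj = np.eye(5) + np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)  # path graph
out = gat_layer(h, adj, W=rng.normal(size=(8, 4)), a=rng.normal(size=(8,)))
print(out.shape)  # (5, 4)
```

The two PyTorch GAT implementations further down this page (Pytorch Gat, Pygat) implement exactly this coefficient scheme, with multi-head concatenation on top.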
Reformer Pytorch
⭐
1,917
Reformer, the efficient Transformer, in Pytorch
Pytorch Gat
⭐
1,815
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Toolformer Pytorch
⭐
1,788
Implementation of Toolformer, Language Models That Can Use Tools, by MetaAI
Pygat
⭐
1,684
Pytorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Make A Video Pytorch
⭐
1,449
Implementation of Make-A-Video, new SOTA text to video generator from Meta AI, in Pytorch
Hopfield Layers
⭐
1,258
Hopfield Networks is All You Need
Whisper Timestamped
⭐
1,217
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Sockeye
⭐
1,190
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Lambda Networks
⭐
1,110
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Alphafold2
⭐
1,086
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Soundstorm Pytorch
⭐
1,054
Implementation of SoundStorm, Efficient Parallel Audio Generation from Google Deepmind, in Pytorch
Textclassifier
⭐
1,003
Text classifier for Hierarchical Attention Networks for Document Classification
Perceiver Pytorch
⭐
980
Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Coca Pytorch
⭐
900
Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in Pytorch
Pointer_summarizer
⭐
859
pytorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Retro Pytorch
⭐
784
Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
Performer Pytorch
⭐
777
An implementation of Performer, a linear attention-based transformer, in Pytorch
Muse Maskgit Pytorch
⭐
739
Implementation of Muse: Text-to-Image Generation via Masked Generative Transformers, in Pytorch
Tf Rnn Attention
⭐
703
Tensorflow implementation of attention mechanism for text classification tasks.
Awesome Attention Mechanism In Cv
⭐
686
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
Phenaki Pytorch
⭐
674
Implementation of Phenaki Video, which uses Mask GIT to produce text guided videos of up to 2 minutes in length, in Pytorch
Keras Attention
⭐
656
Visualizing RNNs using the attention mechanism
Seq2seq Pytorch
⭐
653
Sequence to Sequence Models with PyTorch
Longnet
⭐
613
Implementation of plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Tab Transformer Pytorch
⭐
609
Implementation of TabTransformer, attention network for tabular data, in Pytorch
Transformer Tts
⭐
599
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Moran_v2
⭐
593
MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition
Text Summarization Tensorflow
⭐
586
Tensorflow seq2seq Implementation of Text Summarization.
Keras Self Attention
⭐
570
Attention mechanism for processing sequential data that considers the context for each timestamp.
Eeg Dl
⭐
563
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Memorizing Transformers Pytorch
⭐
556
Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in Pytorch
Self Attention Cv
⭐
550
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Flamingo Pytorch
⭐
549
Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch
Simgnn
⭐
540
A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Bottleneck Transformer Pytorch
⭐
523
Implementation of Bottleneck Transformer in Pytorch
Nmt Keras
⭐
514
Neural Machine Translation with Keras
Wama_modules
⭐
501
A PyTorch Computer Vision (CV) module library for building n-D networks flexibly ~
Parti Pytorch
⭐
487
Implementation of Parti, Google's pure attention-based text-to-image neural network, in Pytorch
Nuwa Pytorch
⭐
466
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Neural_sp
⭐
466
End-to-end ASR/LM implementation with PyTorch
Megabyte Pytorch
⭐
458
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Openstl
⭐
450
OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning
Lamda Rlhf Pytorch
⭐
444
Open-source pre-training implementation of Google's LaMDA in PyTorch. Adding RLHF similar to ChatGPT.
Palm Pytorch
⭐
439
Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways
Mirnet
⭐
436
[ECCV 2020] Learning Enriched Features for Real Image Restoration and Enhancement. SOTA results for image denoising, super-resolution, and image enhancement.
Geotransformer
⭐
422
[CVPR2022] Geometric Transformer for Fast and Robust Point Cloud Registration
Structured Self Attention
⭐
412
A Structured Self-attentive Sentence Embedding
Seam
⭐
408
Self-supervised Equivariant Attention Mechanism for Weakly Supervised Semantic Segmentation, CVPR 2020 (Oral)
Caranet
⭐
407
Context Axial Reverse Attention Network for Small Medical Objects Segmentation
Point Transformer Pytorch
⭐
402
Implementation of the Point Transformer layer, in Pytorch
Meshgpt Pytorch
⭐
394
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Time Series Autoencoder
⭐
386
📈 PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series 📈
Paperrobot
⭐
384
Code for PaperRobot: Incremental Draft Generation of Scientific Ideas
Pytorch Original Transformer
⭐
376
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently included IWSLT pretrained models.
Swarms
⭐
376
Build, Deploy, and Scale Reliable Swarms of Autonomous Agents for Workflow Automation. Join our Community: https://discord.gg/DbjBMJTSWD
Enformer Pytorch
⭐
359
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Changeformer
⭐
335
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
Recurrent Memory Transformer Pytorch
⭐
335
Implementation of Recurrent Memory Transformer, Neurips 2022 paper, in Pytorch
Linformer Pytorch
⭐
323
My take on a practical implementation of Linformer for Pytorch.
Multimodalmamba
⭐
321
A novel implementation of fusing ViT with Mamba into a fast, agile, and high performance Multi-Modal Model. Powered by Zeta, the simplest AI framework ever.
Robotic Transformer Pytorch
⭐
306
Implementation of RT1 (Robotic Transformer) in Pytorch
Keras Gat
⭐
301
Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
Medical Chatgpt
⭐
300
Implementation of ChatGPT, but tailored toward primary care medicine, with the reward signal being the ability to take a thorough, efficient patient history and arrive at a reasonable differential diagnosis
Flash Pytorch
⭐
298
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
Stanet
⭐
296
official implementation of the spatial-temporal attention neural network (STANet) for remote sensing image change detection
Slot Attention
⭐
286
Implementation of Slot Attention from GoogleAI
Seq2seq_chatbot
⭐
286
A simple TensorFlow implementation of a seq2seq-based dialogue system with embedding, attention, and beam search, using the Movie Dialogs corpus
Linear Attention Transformer
⭐
278
Transformer based on a variant of attention that is linear complexity with respect to sequence length
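One common construction behind linear-complexity attention (popularized by Katharopoulos et al., 2020, and among the variants this repo draws on) replaces the softmax with a positive feature map φ, so (φ(Q)φ(K)ᵀ)V can be reassociated as φ(Q)(φ(K)ᵀV) and the n×n score matrix is never formed. A hedged NumPy sketch of the general technique, not this repo's API:

```python
import numpy as np

def elu_feature(x):
    """phi(x) = elu(x) + 1: a positive feature map standing in for exp()."""
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v):
    """O(n) attention via the reassociation phi(Q) (phi(K)^T V)."""
    q, k = elu_feature(q), elu_feature(k)
    kv = k.T @ v                     # (d, d_v) summary, size independent of n
    z = q @ k.sum(axis=0)            # per-query normalizer phi(q_i) . sum_j phi(k_j)
    return (q @ kv) / z[:, None]

rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, 6, 4))
out = linear_attention(q, k, v)
print(out.shape)  # (6, 4)
```

Because φ is positive, each output row is still a convex combination of value rows, just with a different (non-softmax) weighting.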
Itransformer
⭐
275
Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
Tensorflow_end2end_speech_recognition
⭐
275
End-to-End speech recognition implementation based on TensorFlow (CTC, Attention, and MTL training)
Local Attention
⭐
270
An implementation of local windowed attention for language modeling
Memory Efficient Attention Pytorch
⭐
267
Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"
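The paper behind this entry shows that exact softmax attention can be computed over key/value chunks with a running max and normalizer, so the full score matrix is never held in memory. A NumPy sketch of that accumulation (chunk size and names are illustrative):

```python
import numpy as np

def chunked_attention(q, k, v, chunk=4):
    """Exact softmax attention accumulated chunk-by-chunk over keys/values,
    keeping only O(n_q) running state instead of an (n_q, n_k) score matrix."""
    d = q.shape[-1]
    acc = np.zeros((q.shape[0], v.shape[-1]))   # running weighted value sum
    norm = np.zeros(q.shape[0])                 # running softmax denominator
    m = np.full(q.shape[0], -np.inf)            # running max score per query
    for i in range(0, k.shape[0], chunk):
        s = q @ k[i:i + chunk].T / np.sqrt(d)   # scores for this chunk only
        m_new = np.maximum(m, s.max(axis=1))
        scale = np.exp(m - m_new)               # rescale earlier accumulators
        w = np.exp(s - m_new[:, None])
        acc = acc * scale[:, None] + w @ v[i:i + chunk]
        norm = norm * scale + w.sum(axis=1)
        m = m_new
    return acc / norm[:, None]

# Sanity check against the naive full-matrix computation
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, 10, 5))
s = q @ k.T / np.sqrt(5)
w = np.exp(s - s.max(axis=1, keepdims=True))
ref = (w / w.sum(axis=1, keepdims=True)) @ v
print(np.allclose(chunked_attention(q, k, v), ref))  # True
```

The result is algebraically identical to ordinary attention; only the order of accumulation changes, which is the same online-softmax idea FlashAttention later built on.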
Eqtransformer
⭐
260
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Seq2seq Summarizer
⭐
259
Pointer-generator reinforced seq2seq summarization in PyTorch
Q Transformer
⭐
253
Implementation of Q-Transformer, Scalable Offline Reinforcement Learning via Autoregressive Q-Functions, out of Google Deepmind
Magvit2 Pytorch
⭐
244
Implementation of MagViT2 Tokenizer in Pytorch
Hierarchical Multi Label Text Classification
⭐
239
The code of the CIKM'19 paper "Hierarchical Multi-label Text Classification: An Attention-based Recurrent Network Approach"
Attentionalpoolingaction
⭐
228
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Pan
⭐
227
(ECCV2020 Workshops) Efficient Image Super-Resolution Using Pixel Attention.
Yolo Multi Backbones Attention
⭐
223
Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, HuaWei GhostNet), attention, pruning, and quantization
Pytorch Attention
⭐
222
🦖Pytorch implementation of popular Attention Mechanisms, Vision Transformers, MLP-Like models and CNNs.🔥🔥🔥
Attentive Gan Derainnet
⭐
222
Unofficial TensorFlow implementation of the "Attentive Generative Adversarial Network for Raindrop Removal from A Single Image" (CVPR 2018) model https://maybeshewill-cv.github.io/attentive-gan-de
Rt 2
⭐
215
Democratization of RT-2, from "RT-2: New Model Translates Vision and Language into Action"
Colt5 Attention
⭐
207
Implementation of the conditionally routed attention in the CoLT5 architecture, in Pytorch
Equiformer Pytorch
⭐
207
Implementation of the Equiformer, SE3/E3 equivariant attention network that reaches new SOTA, and adopted for use by EquiFold for protein folding
Se3 Transformer Pytorch
⭐
205
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Multimodal Sentiment Analysis
⭐
205
Attention-based multimodal fusion for sentiment analysis
Routing Transformer
⭐
202
Fully featured implementation of Routing Transformer
Saits
⭐
201
The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) model with efficiency for time series imputation (imputing multivariate incomplete time series containing missing data/values). https://arxiv.org/abs/2202.08516
Ttslearn
⭐
197
ttslearn: Library accompanying the Japanese book "Pythonで学ぶ音声合成" (Text-to-Speech with Python)
Related Searches
Python Machine Learning (20,195)
Python Dataset (14,792)
Python Tensorflow (13,736)
Python Deep Learning (13,092)
Python Jupyter Notebook (12,976)
Python Network (11,495)
Python Natural Language Processing (9,064)
Python Artificial Intelligence (8,580)
Python Pytorch (7,877)
Python Neural (7,444)
1-100 of 444 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.