Awesome Open Source
Search results for: artificial-intelligence + attention
47 search results found
Vit Pytorch ⭐ 16,298: Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
X Transformers ⭐ 3,840: A simple but complete full-attention transformer with a set of promising experimental features from various papers
Reformer Pytorch ⭐ 1,917: Reformer, the efficient Transformer, in Pytorch
Lambda Networks ⭐ 1,110: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Performer Pytorch ⭐ 777: An implementation of Performer, a linear attention-based transformer, in Pytorch
Longnet ⭐ 613: Implementation of plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Tab Transformer Pytorch ⭐ 609: Implementation of TabTransformer, an attention network for tabular data, in Pytorch
Self Attention Cv ⭐ 550: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Point Transformer Pytorch ⭐ 402: Implementation of the Point Transformer layer, in Pytorch
Linformer Pytorch ⭐ 323: My take on a practical implementation of Linformer for Pytorch.
Slot Attention ⭐ 286: Implementation of Slot Attention from GoogleAI
Open Musiclm ⭐ 281: Implementation of MusicLM, a text-to-music model published by Google Research, with a few modifications.
Linear Attention Transformer ⭐ 278: Transformer based on a variant of attention with linear complexity with respect to sequence length
Local Attention ⭐ 270: An implementation of local windowed attention for language modeling
Ai_law ⭐ 229: All kinds of baseline models for long text classification (text categorization)
Se3 Transformer Pytorch ⭐ 205: Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
Linformer ⭐ 194: Implementation of Linformer for Pytorch
Spear Tts Pytorch ⭐ 178: Implementation of Spear-TTS, a multi-speaker text-to-speech attention network, in Pytorch
Sinkhorn Transformer ⭐ 178: Sinkhorn Transformer, a practical implementation of Sparse Sinkhorn Attention
Awesome Nlp Resources ⭐ 157: This repository contains landmark research papers in Natural Language Processing that came out in this century.
Axial Attention ⭐ 140: Implementation of Axial Attention, attending to multi-dimensional data efficiently
Llama Qrlhf ⭐ 137: Implementation of the Llama architecture with RLHF + Q-learning
Fagan ⭐ 100: A variant of the Self Attention GAN named FAGAN (Full Attention GAN)
H Transformer 1d ⭐ 100: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Pyanomaly ⭐ 92: A useful toolbox for anomaly detection
Hamburger Pytorch ⭐ 70: Pytorch implementation of the Hamburger module from the ICLR 2021 paper "Is Attention Better Than Matrix Decomposition?"
Arelu ⭐ 58: AReLU: Attention-based Rectified Linear Unit
Homebrewnlp ⭐ 57: A case study of efficient training of large language models using commodity hardware.
Perceiver ⭐ 50: Implementation of Perceiver, General Perception with Iterative Attention, in TensorFlow
Global Self Attention Network ⭐ 49: A Pytorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks
Lie Transformer Pytorch ⭐ 49: Implementation of Lie Transformer, Equivariant Self-Attention, in Pytorch
Adjacent Attention Network ⭐ 45: Graph neural network message passing reframed as a Transformer with local attention
Isab Pytorch ⭐ 44: An implementation of the (Induced) Set Attention Block, from the Set Transformers paper
Unibo Ai ⭐ 37: Notes monorepo for the UniBO Artificial Intelligence MSc.
Reading_comprehension_tf ⭐ 36: Machine reading comprehension in Tensorflow
Axial Positional Embedding ⭐ 27: Axial positional embedding for Pytorch
Han ⭐ 25: Hierarchical Attention Network (HAN) + multi-task learning for fine-grained sentiment analysis of user reviews in the AI Challenger competition. https://challenger.ai/competition/fsauor2018
Attention Sampling Pytorch ⭐ 24: A PyTorch implementation of the paper "Processing Megapixel Images with Deep Attention-Sampling Models".
Deep Learning Visuals ⭐ 24: A collection of 100 deep learning images and visualizations
Attn_gan_pytorch ⭐ 19: Python package for a self-attention GAN implemented as an extension of PyTorch nn.Module. Paper: https://arxiv.org/abs/1805.08318
Oqmrc_2018 ⭐ 18: Code for the AI Challenger 2018 reading comprehension track
Gqa Node Properties ⭐ 18: Recalling node properties from a knowledge graph
Molecule Attention Transformer ⭐ 15: Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Ai_challenger_2018_sentiment_analysis ⭐ 15: Fine-grained sentiment analysis of user reviews, AI Challenger 2018
Pointer_generator_summarizer ⭐ 12: Pointer-Generator Network: Seq2Seq with attention, pointing, and coverage mechanisms for abstractive summarization.
English_chinese_machine_translation_baseline ⭐ 10: English-Chinese machine translation baseline using tensor2tensor
Mobilevlm ⭐ 9: Implementation of the LDP module block in PyTorch and Zeta from the paper "MobileVLM: A Fast, Strong and Open Vision Language Assistant for Mobile Devices"
Image_caption_competition ⭐ 8: AI Challenger image caption competition
Mmca ⭐ 8: The open-source community's implementation of the all-new Multi-Modal Causal Attention from "DeepSpeed-VisualChat: Multi-Round Multi-Image Interleave Chat via Multi-Modal Causal Attention"
Dailypaperclub ⭐ 8: The repository for the exclusive Daily Paper Club hosted at Agora daily at 10pm NYC time on this Discord: https://discord.gg/Gnzh6dnzyz
Cyclegan With Self Attention ⭐ 8: A CycleGAN architecture with embedded self-attention layers that solves three different complex tasks using the same network architecture. The model does not exceed state-of-the-art performance on these tasks, but it learns each one well enough to produce considerably good results.
Attentiongrid ⭐ 7: A network of attention mechanisms at your fingertips, for diverse AI applications.
Tinygptv ⭐ 7: Simple implementation of TinyGPT-V in super simple Zeta lego blocks
Gats ⭐ 6: Implementation of GATS from the paper "GATS: Gather-Attend-Scatter", in pytorch and zeta
Memory Transformer Xl ⭐ 6: A variant of Transformer-XL where the memory is updated not with a queue, but with attention
Compositional Attention ⭐ 5: An implementation of Compositional Attention: Disentangling Search and Retrieval, by MILA
Intstar ⭐ 5: A computation framework for modelling complex adaptive hierarchical systems
Awesome Ai ⭐ 5: Awesome list for all things AI, ML and deep learning
Related Searches
Machine Learning Artificial Intelligence (4,407)
Jupyter Notebook Artificial Intelligence (2,652)
Python Artificial Intelligence (2,382)
Python Attention (2,327)
Artificial Intelligence Neural Network (1,732)
Deep Learning Artificial Intelligence (1,352)
Java Artificial Intelligence (1,340)
Artificial Intelligence Tensorflow (1,225)
Artificial Intelligence Chatgpt (1,141)
Artificial Intelligence Computer Vision (1,009)
Copyright 2018-2024 Awesome Open Source. All rights reserved.