Sparse_attention

Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers"
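The paper cited above factorizes dense self-attention into cheaper patterns, one of which is a strided pattern: each position attends to a local window plus every stride-th earlier position. A minimal NumPy sketch of that mask is below; it is illustrative only, assuming a causal setting, and the function name and stride value are mine, not the repository's API.

```python
import numpy as np

def strided_sparse_mask(n, stride):
    # Boolean mask: True where query i may attend to key j (causal, j <= i).
    # Combines two patterns from "Generating Long Sequences with Sparse
    # Transformers": a local window of width `stride` and a strided pattern
    # that hits every `stride`-th earlier position.
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    causal = j <= i
    local = (i - j) < stride
    strided = (i - j) % stride == 0
    return causal & (local | strided)

mask = strided_sparse_mask(16, stride=4)
```

With stride close to sqrt(n), each row keeps O(sqrt(n)) entries instead of O(n), which is the source of the paper's reduced compute and memory cost.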
Alternatives To Sparse_attention
Project Name · Stars · Most Recent Commit · Releases · Open Issues · License · Language

Bi Att Flow: 1,510 stars · last commit a year ago · 73 open issues · apache-2.0 · Python
  The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.

Sparse_attention: 1,002 stars · last commit 4 years ago · 10 open issues · Python
  Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers".

Keras Attention: 656 stars · last commit 5 years ago · 22 open issues · agpl-3.0 · Python
  Visualizing RNNs using the attention mechanism.

Dl4j Tutorials: 429 stars · last commit 3 years ago · mit · Java
  dl4j basic tutorials; companion videos at https://space.bilibili.com/327018681/#/

Attention_keras: 429 stars · last commit a year ago · 11 open issues · mit · Python
  Keras layer implementation of attention for sequential models.

Neurst: 232 stars · last commit 2 years ago · 3 releases (latest April 14, 2022) · 9 open issues · other license · Python
  Neural end-to-end speech translation toolkit.

Struct Attn: 221 stars · last commit 7 years ago · 1 open issue · mit · Lua
  Code for Structured Attention Networks: https://arxiv.org/abs/1702.00887

Mcan Vqa: 181 stars · last commit 4 years ago · 2 open issues · apache-2.0 · Python
  Deep Modular Co-Attention Networks for visual question answering.

Multihead Siamese Nets: 173 stars · last commit a year ago · 12 open issues · mit · Jupyter Notebook
  Implementation of Siamese neural networks built on a multi-head attention mechanism for the text semantic similarity task.

Image Local Attention: 93 stars · last commit 2 years ago · 2 open issues · Python
  An improved PyTorch implementation of image local attention that reduces GPU memory use by an order of magnitude.
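Local attention, as in the last entry, restricts each query to a fixed neighborhood of keys, which is what cuts the memory footprint relative to dense attention. A minimal 1-D NumPy sketch of the idea follows; it is a conceptual illustration, not that repository's API, and the function name and window parameter are mine.

```python
import numpy as np

def local_attention_1d(q, k, v, window):
    # Each query position attends only to keys within `window` positions
    # on either side, so only O(n * window) scores are ever materialized
    # instead of the O(n^2) map of dense attention.
    n, d = q.shape
    out = np.empty_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)
        w = np.exp(scores - scores.max())   # numerically stable softmax
        w /= w.sum()
        out[i] = w @ v[lo:hi]
    return out
```

Setting `window` to the full sequence length recovers ordinary dense softmax attention, which makes the restriction easy to sanity-check.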