Awesome Open Source
Search results for attention head
27 search results found
Exbert (⭐ 541): A Visual Analysis Tool to Explore Learned Representations in Transformer Models
Linformer Pytorch (⭐ 323): My take on a practical implementation of Linformer for PyTorch.
Attention Analysis (⭐ 309)
Vit Explain (⭐ 260): Explainability for Vision Transformers
Vit (⭐ 204): Implementing Vi(sion)T(ransformer)
The Story Of Heads (⭐ 170): Code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the ACL 2021 paper "Analyzing Source and Target Contributions to NMT Predictions".
Attentions (⭐ 154): PyTorch implementations of several attention mechanisms for deep learning researchers.
Attentionn (⭐ 147): All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention.
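Many of the repositories in this listing revolve around multi-head attention. For readers unfamiliar with the mechanism, here is a minimal NumPy sketch of scaled dot-product attention split across heads; the function and variable names are illustrative and not taken from any of the listed repositories:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    """Self-attention over x of shape (seq_len, d_model).

    Each W* projection matrix is (d_model, d_model); d_model must
    be divisible by n_heads.
    """
    seq_len, d_model = x.shape
    d_head = d_model // n_heads

    # Project the inputs, then split the feature dim into heads:
    # result shape is (n_heads, seq_len, d_head).
    def project(w):
        return (x @ w).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(Wq), project(Wk), project(Wv)

    # Each head attends independently: scores are (n_heads, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)

    # Concatenate heads back together, then apply the output projection.
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo
```

Splitting the projections into heads lets each head learn a different attention pattern over the same sequence, which is exactly the property that pruning and analysis papers in this list (e.g. The Story Of Heads) investigate.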
Attention Tracker (⭐ 126)
Linear Attention Recurrent Neural Network (⭐ 107): A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN. (LARNN)
Collaborative Attention (⭐ 101): Code for "Multi-Head Attention: Collaborate Instead of Concatenate"
Hetsann (⭐ 90): Source code of HetSANN from the AAAI'20 paper "An Attention-based Graph Neural Network for Heterogeneous Structural Learning".
Crabnet (⭐ 65): Predict materials properties using only composition information!
Multi2oie (⭐ 47): Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)
Attention Visualization (⭐ 44): Visualization for simple attention and Google's multi-head attention.
Panosalnet (⭐ 22): Source code for the ACM MM18 paper "Your Attention is Unique: Detecting 360-Degree Video Saliency in Head-Mounted Display for Head Movement Prediction"
Ocr On The Go (⭐ 20): Code for the ICDAR 2019 paper on end-to-end license plate and scene text recognition with multi-head attention models
Vrp_drl_mha (⭐ 17): PyTorch 1.6 and TensorFlow 2.1 implementations of "Attention, Learn to Solve Routing Problems!": Transformer, deep RL (policy gradient, REINFORCE), capacitated vehicle routing problem
Transformer_implementation_and_application (⭐ 15): 300 lines of code (TensorFlow 2) that completely replicate the Transformer model, applied to neural machine translation and chatbots.
Mhka (⭐ 11): Code accompanying the paper "Social Commonsense Reasoning with Multi-Head Knowledge Attention" (EMNLP 2020). Do not hesitate to open an issue if you run into any trouble!
Smhsa (⭐ 9): Source code of the paper "Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction" (IJCAI 2020).
Visualizationanalysisoflearningattention (⭐ 8): Visualization analysis of learned attention, based on PnP head pose estimation
Translob (⭐ 7): Transformers for limit order books
Multi Head Attention Labeller (⭐ 7): Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire the two prediction tasks together.
Transformer_anatomy (⭐ 6): Official PyTorch implementation of "Roles and Utilization of Attention Heads in Transformer-based Neural Language Models" (ACL 2020)
Qanet (⭐ 5): Final project for the BGU NLP course.
Transformer Based Model Learning (⭐ 5): Learning Transformer-related models
Copyright 2018-2024 Awesome Open Source. All rights reserved.