Awesome Open Source
Search results for multi head attention
13 search results found
Tranad (⭐ 450): [VLDB'22] Anomaly detection using Transformers, self-conditioning, and adversarial training.

Deepxi (⭐ 367): Deep Xi: a deep learning approach to a priori SNR estimation, implemented in TensorFlow 2/Keras, for speech enhancement and robust ASR.

Dodrio (⭐ 287): Exploring attention weights in transformer-based models with linguistic knowledge.

Attentions (⭐ 154): PyTorch implementations of several attention mechanisms for deep learning researchers.

Flash_attention_inference (⭐ 72): Performance of the C++ interfaces of FlashAttention, FlashAttention-2, and self-quantized decoding attention in large language model (LLM) inference scenarios.

Multi2oie (⭐ 47): Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020).

Attention Visualization (⭐ 44): Visualization for simple attention and Google's multi-head attention.

Scdino (⭐ 22): Self-supervised Vision Transformers for multiplexed imaging datasets.

Attention (⭐ 21): Several types of attention modules written in PyTorch.

Vrp_drl_mha (⭐ 17): PyTorch 1.6 and TensorFlow 2.1 implementations of "Attention, Learn to Solve Routing Problems!": Transformer, deep RL (policy gradient, REINFORCE), capacitated vehicle routing problem.

Sentencoding (⭐ 15): Sentence encoder and training code for Mean-Max AAE.

Datagrand_bert (⭐ 14): 5th-place solution code for the 2019 DataGrand Cup information extraction competition.

Point Transformer (⭐ 13): This is the official repository of the original Point Transformer architecture.
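For orientation, the sketch below shows the multi-head attention mechanism that the projects above implement or build on: project the input into queries, keys, and values, split each into independent heads, run scaled dot-product attention per head, then recombine. It is a minimal, self-contained PyTorch example; the class name MultiHeadAttention and the dimensions used are illustrative assumptions, not code taken from any listed repository.

# Minimal multi-head attention sketch in plain PyTorch.
# Names and dimensions are illustrative, not from any repository above.
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Project, split into heads, attend per head, recombine, project out."""
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.q_proj = nn.Linear(embed_dim, embed_dim)    # query projection
        self.k_proj = nn.Linear(embed_dim, embed_dim)    # key projection
        self.v_proj = nn.Linear(embed_dim, embed_dim)    # value projection
        self.out_proj = nn.Linear(embed_dim, embed_dim)  # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq_len, embed_dim = x.shape

        # (batch, seq_len, embed_dim) -> (batch, num_heads, seq_len, head_dim)
        def split_heads(t: torch.Tensor) -> torch.Tensor:
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q = split_heads(self.q_proj(x))
        k = split_heads(self.k_proj(x))
        v = split_heads(self.v_proj(x))

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        weights = scores.softmax(dim=-1)   # (batch, num_heads, seq_len, seq_len)
        context = weights @ v              # (batch, num_heads, seq_len, head_dim)

        # Concatenate heads back into the embedding dimension and project.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(context)

# Usage: 4 heads attending over a 64-dimensional embedding.
attn = MultiHeadAttention(embed_dim=64, num_heads=4)
out = attn(torch.randn(2, 10, 64))  # shape: (2, 10, 64)

In practice one would typically reach for torch.nn.MultiheadAttention rather than hand-rolling this; the sketch just makes the head-splitting and recombination explicit.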
Related Searches
Python Multi Head Attention (8)
Deep Learning Multi Head Attention (7)
Pytorch Multi Head Attention (4)
Natural Language Processing Multi Head Attention (4)
Bert Multi Head Attention (3)