Awesome Open Source
Search results for neural network attention mechanism
Filters: attention-mechanism, neural-network
33 search results found
Awesome Speech Recognition Speech Synthesis Papers (⭐ 2,869): Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC).
Gat (⭐ 2,078): Graph Attention Networks (https://arxiv.org/abs/1710.10903).
Pygat (⭐ 1,684): PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).
Awesome Transformer Nlp (⭐ 1,022): A curated list of NLP resources focused on Transformer networks, attention mechanisms, GPT, BERT, ChatGPT, LLMs, and transfer learning.
Simgnn (⭐ 540): A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Deeplearning.ai Natural Language Processing Specialization (⭐ 523): Full work and notes for Coursera's NLP Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai.
Geotransformer (⭐ 422): [CVPR 2022] Geometric Transformer for Fast and Robust Point Cloud Registration.
Attention_is_all_you_need (⭐ 293): Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
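The Transformer implementations listed here all build on the scaled dot-product attention of Vaswani et al. (2017). As a minimal illustrative sketch (NumPy, all names are my own, not from any of these repositories):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V                                  # weighted sum of value rows

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 queries of dimension 8
K = rng.normal(size=(6, 8))   # 6 keys
V = rng.normal(size=(6, 8))   # 6 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Each output row is a convex combination of the value rows, weighted by the softmax-normalized query-key similarities; real implementations add batching, multiple heads, and masking on top of this core.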
Eqtransformer (⭐ 260): EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Ttslearn (⭐ 197): ttslearn, the library accompanying the book "Pythonで学ぶ音声合成" (Text-to-Speech with Python).
Guided Attention Inference Network (⭐ 187): Implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository applies GAIN to the FCN-8 architecture used for segmentation.
Multihead Siamese Nets (⭐ 173): Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
Datastories Semeval2017 Task4 (⭐ 171): Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Hart (⭐ 144): Hierarchical Attentive Recurrent Tracking.
Simplednn (⭐ 95): SimpleDNN, a lightweight open-source machine learning library written in Kotlin, designed to support neural network architectures relevant to natural language processing tasks.
Attend_infer_repeat (⭐ 77): A TensorFlow implementation of Attend, Infer, Repeat.
Neural Chatbot (⭐ 73): A neural network based chatbot.
Simpool (⭐ 72): Official implementation of the ICCV 2023 paper "Keep It SimPool: Who Said Supervised Transformers Suffer from Attention Deficit?"
Image Captioning (⭐ 69)
Crabnet (⭐ 65): Predict materials properties using only composition information!
Ca Net (⭐ 51): Code for "Comprehensive Attention Convolutional Neural Networks for Explainable Medical Image Segmentation".
Perceiver (⭐ 50): Implementation of Perceiver, General Perception with Iterative Attention, in TensorFlow.
Chatbot Startkit (⭐ 44): A simple chatbot written in TensorFlow 1.4, with an attention mechanism and bucketing.
Mlpnlp Nmt (⭐ 43): Sample code for an LSTM encoder-decoder with attention mechanism, mainly for understanding machine translation frameworks based on deep neural networks.
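LSTM encoder-decoder models like this one typically score each encoder state against the current decoder state with additive (Bahdanau-style) attention. A minimal sketch of that scoring step, assuming encoder states H and a decoder state s (all names and shapes are illustrative, not taken from this repository):

```python
import numpy as np

def additive_attention(s, H, W_s, W_h, v):
    """Bahdanau-style additive attention.
    s: decoder state (d,), H: encoder states (T, d),
    W_s, W_h: (d, d) projection matrices, v: (d,) scoring vector."""
    scores = np.tanh(s @ W_s + H @ W_h) @ v     # (T,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over source positions
    context = weights @ H                       # (d,) context vector fed to the decoder
    return context, weights

rng = np.random.default_rng(1)
d, T = 8, 5
s, H = rng.normal(size=d), rng.normal(size=(T, d))
W_s, W_h = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)
context, weights = additive_attention(s, H, W_s, W_h, v)
print(round(weights.sum(), 6))  # 1.0
```

The context vector is then concatenated with the decoder input at each step; in trained models W_s, W_h, and v are learned jointly with the encoder and decoder.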
Attention_is_all_you_need (⭐ 41)
Transformer In Transformer (⭐ 40): An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Speakerrecognitionfromscratch (⭐ 38): Final project for the Speaker Recognition course on Udemy, 机器之心, 深蓝学院, and 语音之家.
Stanford Cs231n Assignments 2020 (⭐ 32): Solutions to the assignments for Stanford's CS231n "Convolutional Neural Networks for Visual Recognition" (Spring 2020).
Mambatransformer (⭐ 31): Integrating Mamba/SSMs with Transformer for enhanced long-context and high-quality sequence modeling.
Cosformer Pytorch (⭐ 28): Unofficial PyTorch implementation of the paper "cosFormer: Rethinking Softmax in Attention".
3han (⭐ 26): 3HAN: A Deep Neural Network for Fake News Detection (https://link.springer.com/chapter/10.1007%2F978-3-).
Layer_augmentation (⭐ 23): Implementation of the NLI model in the ACL 2019 paper "Augmenting Neural Networks with First-order Logic".
Neural Networks For Time Series Analysis (⭐ 22): Compare how ANNs, RNNs, LSTMs, and LSTMs with attention perform on time-series analysis.
Sentimentanalysis (⭐ 22): Sentiment analysis with a deep Bi-LSTM + attention model.
Ag Cnn (⭐ 20): A reimplementation of AG-CNN ("Thorax Disease Classification with Attention Guided Convolutional Neural Network", "Diagnose like a Radiologist: Attention Guided Convolutional Neural Network for Thorax Disease Classification").
Unsupervisedattentionmechanism (⭐ 19): Code for the paper "Regularity Normalization: Neuroscience-Inspired Unsupervised Attention across Neural Network Layers".
Compact Global Descriptor (⭐ 19): PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Nlpvis (⭐ 18): Visualization tool for interpreting NLP models.
Vision2022 (⭐ 18): Slides for the "Machine Vision" course by the Zhejiang University Student Smart Factory Innovation Club (浙江大学学生智能工厂创新俱乐部).
Sa Pinns (⭐ 17): Implementation of the paper "Self-Adaptive Physics-Informed Neural Networks using a Soft Attention Mechanism" [AAAI-MLPS 2021].
Fetalcpseg (⭐ 14): A deep attentive convolutional neural network for automatic cortical plate segmentation in fetal MRI.
Pointer_generator_summarizer (⭐ 12): Pointer-generator network: seq2seq with attention, pointing, and coverage mechanisms for abstractive summarization.
Text2phones (⭐ 10): Attentional neural network that translates text to phones.
Neural Dialogue System (⭐ 9): Seq2seq model with attention mechanism for QA and NMT, plus text generation using an LSTM network.
Simgnn (⭐ 8): Keras implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation". Includes synthetic GED data.
Cyclegan With Self Attention (⭐ 8): A CycleGAN architecture with embedded self-attention layers, applied to three different complex tasks. The same network architecture is used for all three; it does not exceed state-of-the-art performance, but produces considerably good results.
Mmca (⭐ 8): The open-source community's implementation of the Multi-Modal Causal Attention from "DeepSpeed-VisualChat: Multi-Round Multi-Image Interleave Chat via Multi-Modal Causal Attention".
Maximal (⭐ 8): A TensorFlow-compatible Python library providing models and layers for building custom Transformer neural networks. Built on TensorFlow 2.
Cpa (⭐ 7): Source code for the paper "Improving Attention Mechanism in Graph Neural Networks via Cardinality Preservation" (IJCAI 2020).
Rusentrel Leaderboard (⭐ 7): The official leaderboard for the RuSentRel-1.1 dataset, originally described in the paper arXiv:1808.08932.
Tbinet (⭐ 7): TBiNet, a deep neural network for predicting transcription factor binding sites using an attention mechanism.
Dlnlp Papernotes (⭐ 6): Summaries and notes on deep learning research papers in the natural language processing (NLP) domain.
Bpam (⭐ 6): BPAM: Recommendation Based on BP Neural Network with Attention Mechanism.
Confidencethroughattention (⭐ 5): Confidence Through Attention.
Attentiondeeplabv3p (⭐ 5): Attention Deeplabv3+: Multi-level Context Attention Mechanism for Skin Lesion Segmentation.
Dual_stage_attention_rnn (⭐ 5): A TensorFlow implementation of "Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction".
Related Searches
Python Neural Network (7,064)
Machine Learning Neural Network (4,430)
Deep Learning Neural Network (3,760)
Jupyter Notebook Neural Network (3,677)
Tensorflow Neural Network (2,192)
Pytorch Neural Network (1,416)
Keras Neural Network (1,309)
Neural Network Classification (906)
Dataset Neural Network (857)
C Plus Plus Neural Network (856)
Copyright 2018-2024 Awesome Open Source. All rights reserved.