Awesome Open Source
Search results for "machine learning attention mechanism"
Active filters: attention-mechanism, machine-learning
60 search results found
Ml Nlp
⭐
10,874
This project covers the knowledge points and code implementations commonly asked about in Machine Learning, Deep Learning, and NLP interviews, as well as the theoretical fundamentals every algorithm engineer should master.
Reformer Pytorch
⭐
1,917
Reformer, the efficient Transformer, in PyTorch
Whisper Timestamped
⭐
1,217
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Sockeye
⭐
1,190
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Keras Attention
⭐
656
Visualizing RNNs using the attention mechanism
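For readers unfamiliar with what such a visualization involves, here is a minimal, self-contained sketch of computing and plotting attention weights for one decoder step over RNN encoder states. It is illustrative only and does not use the keras-attention repository's actual API; all tensor names are made up.

```python
# Minimal sketch: dot-product attention over RNN states, then a heatmap.
# Illustrative only -- not the keras-attention repository's actual API.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

torch.manual_seed(0)
T, H = 12, 64                        # sequence length, hidden size
states = torch.randn(T, H)           # encoder RNN hidden states
query = torch.randn(H)               # decoder state at one step

scores = states @ query              # (T,) alignment scores
weights = F.softmax(scores, dim=0)   # attention distribution over time steps

plt.imshow(weights.unsqueeze(0).numpy(), aspect="auto", cmap="viridis")
plt.xlabel("encoder time step")
plt.yticks([])
plt.title("Attention weights for one decoder step")
plt.show()
```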
Longnet
⭐
613
Implementation of plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
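As a rough illustration of the dilated-attention idea the LongNet paper describes (attention restricted to every r-th position within fixed-length segments), here is a toy single-head sketch. The parameter names and the scatter-back scheme are simplifications of the paper, not the Longnet repository's code.

```python
# Hedged sketch of dilated attention from LongNet: within each segment of
# length w, only every r-th position attends to every r-th position.
# Positions skipped at one (w, r) setting are covered by other settings
# in the full model; this toy version uses a single setting.
import torch
import torch.nn.functional as F

def dilated_attention(q, k, v, w=8, r=2):
    # q, k, v: (T, d) with T divisible by w
    T, d = q.shape
    out = torch.zeros_like(q)
    for start in range(0, T, w):                 # independent segments
        idx = torch.arange(start, start + w, r)  # dilated positions
        qs, ks, vs = q[idx], k[idx], v[idx]
        att = F.softmax(qs @ ks.T / d**0.5, dim=-1)
        out[idx] = att @ vs                      # scatter back into place
    return out

q = k = v = torch.randn(32, 16)
print(dilated_attention(q, k, v).shape)  # torch.Size([32, 16])
```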
Self Attention Cv
⭐
550
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Simgnn
⭐
540
A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Deeplearning.ai Natural Language Processing Specialization
⭐
523
This repository contains my full work and notes on Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai.
Nmt Keras
⭐
514
Neural Machine Translation with Keras
Lamda Rlhf Pytorch
⭐
444
Open-source pre-training implementation of Google's LaMDA in PyTorch, with RLHF added, similar to ChatGPT.
Swarms
⭐
376
Build, Deploy, and Scale Reliable Swarms of Autonomous Agents for Workflow Automation. Join our Community: https://discord.gg/DbjBMJTSWD
Linformer Pytorch
⭐
323
My take on a practical implementation of Linformer for PyTorch.
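For context, Linformer's core trick is to project the keys and values along the sequence dimension before attention, shrinking the n x n attention matrix to n x k. A minimal single-head sketch of that idea follows; it is not this repository's code, and shapes and names are illustrative.

```python
# Sketch of Linformer's low-rank trick: project keys and values along the
# sequence dimension (n -> k) before attention, dropping the cost from
# O(n^2) to O(n*k). Single head, illustrative shapes only.
import torch
import torch.nn.functional as F

n, d, k = 256, 64, 32
q = torch.randn(n, d)
K = torch.randn(n, d)
V = torch.randn(n, d)

E = torch.randn(k, n) / n**0.5   # a learned projection in the real model
K_proj = E @ K                   # (k, d)
V_proj = E @ V                   # (k, d)

att = F.softmax(q @ K_proj.T / d**0.5, dim=-1)  # (n, k) instead of (n, n)
out = att @ V_proj                               # (n, d)
print(out.shape)
```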
Multimodalmamba
⭐
321
A novel implementation fusing ViT with Mamba into a fast, agile, high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.
Metal Flash Attention
⭐
252
Faster alternative to Metal Performance Shaders
Prediction Flow
⭐
136
Deep-Learning based CTR models implemented by PyTorch
Fast Transformer
⭐
132
An implementation of "Fastformer: Additive Attention Can Be All You Need", a Transformer variant, in TensorFlow
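Fastformer's additive attention replaces pairwise query-key scores with learned per-token scalar weights that pool the sequence into global query and key vectors, keeping cost linear in sequence length. A toy single-head sketch of the idea, omitting the paper's final output transform; all names are illustrative, and the repository itself is in TensorFlow rather than the PyTorch used here.

```python
# Sketch of Fastformer-style additive attention: per-token scalar weights
# pool the sequence into global query/key vectors -- no n x n score matrix.
import torch
import torch.nn.functional as F

T, d = 128, 64
q = torch.randn(T, d)
k = torch.randn(T, d)
v = torch.randn(T, d)
w_q = torch.randn(d)   # learned vectors in the real model
w_k = torch.randn(d)

alpha = F.softmax(q @ w_q / d**0.5, dim=0)   # (T,) per-token weights
global_q = (alpha.unsqueeze(-1) * q).sum(0)  # (d,) global query

p = k * global_q                             # element-wise interaction
beta = F.softmax(p @ w_k / d**0.5, dim=0)
global_k = (beta.unsqueeze(-1) * p).sum(0)   # (d,) global key

out = v * global_k                           # (T, d) modulated values
print(out.shape)
```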
Fusilli
⭐
120
A Python package housing a collection of deep-learning multi-modal data fusion method pipelines! From data loading, to training, to evaluation - fusilli's got you covered 🌸
Ylg
⭐
115
[CVPR 2020] Official Implementation: "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Simplednn
⭐
95
SimpleDNN is a lightweight open-source machine learning library written in Kotlin, designed to support the neural network architectures relevant to natural language processing tasks.
100 Days Of Nlp
⭐
83
Crabnet
⭐
65
Predict materials properties using only the composition information!
Attentional Interfaces
⭐
59
🔍 Attentional interfaces in TensorFlow.
Pytorch Attention Guided Cyclegan
⭐
57
PyTorch implementation of "Unsupervised Attention-guided Image-to-Image Translation".
Perceiver
⭐
50
Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
Protein Localization
⭐
43
Using Transformer protein embeddings with a linear attention mechanism to make SOTA de-novo predictions for the subcellular location of proteins 🔬
Transformer In Transformer
⭐
40
An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Efficient Attention
⭐
37
[EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling
Sa Dl
⭐
32
Sentiment analysis with deep learning models, implemented with TensorFlow and Keras.
Mambatransformer
⭐
31
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
San
⭐
23
Attention-based feature ranking for propositional data.
Tcan Tensorflow
⭐
20
TensorFlow implementation of TCAN model for multivariate time series forecasting with sparse attention mechanisms.
Wallnet
⭐
19
Open-source code supporting the BSides 2019 talk "Bye-Bye False Positives: Using AI to Improve Detection"
Vision2022
⭐
18
Slides for the "Machine Vision" course by the Zhejiang University Student Smart Factory Innovation Club, a featured club course on smart-factory machine vision.
Resolutions 2019
⭐
16
A list of data mining and machine learning papers that I implemented in 2019.
Autort
⭐
16
Implementation of AutoRT: "AutoRT: Embodied Foundation Models for Large Scale Orchestration of Robotic Agents"
Chappie.ai
⭐
15
Generalized AI, written in Python 3, to perform a multitude of tasks.
Cvdd Pytorch
⭐
14
A PyTorch implementation of Context Vector Data Description (CVDD), a method for Anomaly Detection on text.
Datafestkyiv2017
⭐
14
All presentations from Data Fest Kyiv 2017 http://datafest.in.ua
Retinal Disease Diagnosis With Residual Attention Networks
⭐
13
Using Residual Attention Networks to diagnose retinal diseases in medical images
Sparseattention
⭐
13
PyTorch implementation of the sparse attention from the paper "Generating Long Sequences with Sparse Transformers"
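The sparse pattern from that paper can be approximated with a boolean mask: each position attends to a recent local window plus every stride-th "summary" column. A dense-mask sketch of the strided variant follows; the paper's kernels avoid materializing the full score matrix, whereas this toy version does not, and all names are illustrative.

```python
# Sketch of the strided sparse-attention pattern: causal mask restricted to
# a local window plus periodic "summary" columns, applied to dense scores.
import torch
import torch.nn.functional as F

def strided_mask(T, window=8, stride=8):
    i = torch.arange(T).unsqueeze(1)          # query positions
    j = torch.arange(T).unsqueeze(0)          # key positions
    causal = j <= i
    local = (i - j) < window                  # recent window
    strided = (j % stride) == (stride - 1)    # summary columns
    return causal & (local | strided)

T, d = 64, 32
q = k = v = torch.randn(T, d)
mask = strided_mask(T)
scores = (q @ k.T / d**0.5).masked_fill(~mask, float("-inf"))
out = F.softmax(scores, dim=-1) @ v
print(out.shape)  # torch.Size([64, 32])
```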
Organic Chemistry Reaction Prediction Using Nmt
⭐
13
Organic chemistry reaction prediction using NMT with attention
Semeval2022 Task6 Sarcasm Detection
⭐
11
Sarcasm uses words to mock, irritate, or amuse, and is common on social media; its metaphorical, creative nature makes it difficult for sentiment analysis systems based on affective computing. This repository presents the approach and results of our team, UTNLP, in the SemEval-2022 shared task 6 on sarcasm detection.
Artificial Text Detection
⭐
10
Python framework for artificial text detection: NLP approaches to compare natural text against generated by neural networks.
Jax Models
⭐
10
Explore implementations of deep learning concepts like Transformers, Attention, Llama, GPT, InstructGPT, RLHF, Gaussian Processes, Bayesian Inference, Newton Raphson, Distributed Trainers and more!
Survey Attention Medical Imaging
⭐
10
Implementation of the paper "A survey on attention mechanisms for medical applications: are we moving towards better algorithms?" by Tiago Gonçalves, Isabel Rio-Torto, Luís F. Teixeira and Jaime S. Cardoso.
Text2phones
⭐
10
Attentional Neural Network that translates text to phones.
Mobilevlm
⭐
9
Implementation of the LDP module block in PyTorch and Zeta from the paper: "MobileVLM: A Fast, Strong and Open Vision Language Assistant for Mobile Devices"
Fnc Msc
⭐
9
Deep Learning model to tackle the Fake News Challenge
Conformer
⭐
9
An implementation of "Conformer: Convolution-augmented Transformer for Speech Recognition", a Transformer variant, in TensorFlow/Keras
Dailypaperclub
⭐
8
The repository for the exclusive Daily Paper Club, hosted at Agora every day at 10pm NYC time on this Discord: https://discord.gg/Gnzh6dnzyz
Maximal
⭐
8
A TensorFlow-compatible Python library that provides models and layers to implement custom Transformer neural networks. Built on TensorFlow 2.
Cyclegan With Self Attention
⭐
8
In this repository, I developed a CycleGAN architecture with embedded self-attention layers that solves three different complex tasks using the same underlying network architecture. The model does not exceed state-of-the-art performance on these tasks, but the architecture proved powerful enough to learn each task and produce reasonably good results.
Simgnn
⭐
8
Keras implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation". Includes synthetic GED data.
Nystromformer
⭐
7
An implementation of the Nyströmformer, using the Nyström method to approximate standard self-attention
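The Nyström approximation replaces the full n x n softmax attention matrix with three small kernels built from m landmark points (segment means here), joined by a pseudo-inverse. A compact sketch under those assumptions; the real model approximates the pseudo-inverse iteratively, while torch.linalg.pinv is used here for brevity, and all names are illustrative.

```python
# Sketch of Nystrom-approximated softmax attention: three small kernels
# against m landmarks stand in for the full n x n attention matrix.
import torch
import torch.nn.functional as F

n, d, m = 256, 64, 16
q, k, v = torch.randn(n, d), torch.randn(n, d), torch.randn(n, d)

q_l = q.view(m, n // m, d).mean(1)           # landmark queries (m, d)
k_l = k.view(m, n // m, d).mean(1)           # landmark keys (m, d)

sm = lambda a, b: F.softmax(a @ b.T / d**0.5, dim=-1)
F1 = sm(q, k_l)                              # (n, m)
F2 = sm(q_l, k_l)                            # (m, m)
F3 = sm(q_l, k)                              # (m, n)

out = F1 @ torch.linalg.pinv(F2) @ (F3 @ v)  # approximates softmax(QK^T)V
print(out.shape)  # torch.Size([256, 64])
```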
Selfextend
⭐
7
Implementation of SelfExtend from the paper "LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning", in PyTorch and Zeta
Dvqa
⭐
7
Deepcoda
⭐
6
Deep learning for personalized interpretability for compositional health data
Gats
⭐
6
Implementation of GATS from the paper "GATS: Gather-Attend-Scatter", in PyTorch and Zeta
Attenscriptnetpr
⭐
6
This repository contains the code and instructions for using the trained models on all four datasets described in the paper "Script Identification in Natural Scene Image and Video Frame using Attention based Convolutional-LSTM Network".
Hierarchical Language Modeling
⭐
5
We address the task of learning contextualized word, sentence, and document representations with a hierarchical language model: Transformer-based encoders are stacked first at the sentence level and then at the document level, and trained with masked token prediction.
Heptapodlm
⭐
5
An implementation of a Transformer model that generates tokens non-linearly, all at once, like the heptapods from Arrival
Ram
⭐
5
Implementation of the location-guided deep recurrent attention model (LG-DRAM) I developed for my MSc thesis at UCL
Compositional Attention
⭐
5
An implementation of "Compositional Attention: Disentangling Search and Retrieval" by MILA
Attention Chatbot
⭐
5
Chatbot for Twitter customer support: a seq2seq neural network with a multiplicative attention mechanism, implemented in TensorFlow 2.
Deeplearning
⭐
5
Study code and notes for Dive Into Deep Learning, plus some additions of my own; see the table of contents for details.
Related Searches
Python Machine Learning (14,099)
Jupyter Notebook Machine Learning (12,247)
Machine Learning Neural Network (4,397)
Machine Learning Tensorflow (4,050)
Machine Learning Natural Language Processing (3,891)
Machine Learning Artificial Intelligence (3,877)
Machine Learning Data Science (3,802)
Machine Learning Pytorch (2,910)
Machine Learning Dataset (2,298)
Machine Learning Computer Vision (1,966)