Awesome Open Source
Search results for adam
31 search results found
Radam (⭐ 2,393): On the Variance of the Adaptive Learning Rate and Beyond
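RAdam, the subject of this repo's paper, only trusts the Adam-style adaptive update once the approximated simple-moving-average length ρ_t exceeds 4, and then scales it by a rectification factor r_t; earlier steps fall back to an SGD-with-momentum update. A minimal sketch of that term, written from the paper's formulas rather than taken from the repo (the function name is mine):

```python
import math

def radam_rectifier(t, beta2=0.999):
    """Variance-rectification term from the RAdam paper (Liu et al.).

    Returns (use_adaptive, r_t): whether the adaptive update is trusted
    at step t, and the rectification factor to scale it by.
    """
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    beta2_t = beta2 ** t
    rho_t = rho_inf - 2.0 * t * beta2_t / (1.0 - beta2_t)
    if rho_t <= 4.0:          # adaptive-LR variance is intractable this early
        return False, 0.0     # fall back to an SGD-with-momentum step
    r_t = math.sqrt(
        (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
        / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
    )
    return True, r_t
```

Early steps are rejected (with the default β₂ = 0.999, ρ_1 ≈ 1), and r_t climbs toward 1 as training proceeds, recovering plain Adam in the limit.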
Adam (⭐ 385): Adam (or adm) is a coroutine-friendly Android Debug Bridge client written in Kotlin
Pytorch_warmup (⭐ 337): Learning Rate Warmup in PyTorch
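The pytorch_warmup package wraps warmup schedules around `torch.optim` optimizers; the underlying idea is just to ramp the learning rate up from near zero over the first steps. A framework-free sketch (the function name is mine, not the package's API):

```python
def warmup_lr(step, base_lr, warmup_steps):
    """Linearly ramp the learning rate from ~0 up to base_lr.

    Hand-rolled sketch of linear warmup; packages like pytorch_warmup
    wrap this idea behind optimizer-friendly scheduler classes.
    """
    if step >= warmup_steps:
        return base_lr            # warmup finished: use the base LR as-is
    return base_lr * (step + 1) / warmup_steps
```

In practice this multiplier is combined with a decay schedule that takes over after `warmup_steps`.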
Keras Radam (⭐ 330): RAdam implemented in Keras & TensorFlow
Adam_qas (⭐ 298): ADAM, a question answering system inspired by IBM Watson
Visual Gps Slam (⭐ 263): A repo for a master's thesis on fusing Visual SLAM and GPS; contains the research paper, code, and other data
Fujinet Platformio (⭐ 200): 8-bit systems to ESP32 WiFi multifunction firmware
Jiro Nn (⭐ 113): A deep learning and preprocessing framework in Rust with CPU and GPU support
Adamw_keras (⭐ 95): AdamW optimizer for Keras
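AdamW (Loshchilov & Hutter, "Decoupled Weight Decay Regularization") differs from Adam-with-L2 by applying weight decay directly to the weights instead of folding it into the gradient. A plain-Python sketch of one scalar step, written from the paper's update rule rather than from this repo (the function name is mine; real implementations vectorize over whole tensors):

```python
import math

def adamw_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One scalar AdamW step: Adam moment updates plus decoupled decay."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (EMA of grads)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (EMA of squares)
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Decay term uses theta directly -- it never passes through the
    # adaptive sqrt(v_hat) scaling, which is the whole point of AdamW.
    update = lr * m_hat / (math.sqrt(v_hat) + eps) + lr * weight_decay * theta
    return theta - update, m, v
```

Because the decay bypasses the adaptive scaling, its effective strength no longer depends on each parameter's gradient history.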
Reinforcementlearning Atarigame (⭐ 87): PyTorch LSTM RNN for reinforcement learning on Atari games from OpenAI Universe, using Google DeepMind's Asynchronous Advantage Actor-Critic (A3C) algorithm, which is far more efficient than DQN and supersedes it. Can play many games.
Adams (⭐ 58): 🍢 A simple but graceful Typecho theme
Ada Hessian (⭐ 46): Easy-to-use AdaHessian optimizer (PyTorch)
Padam (⭐ 38): Partially Adaptive Momentum Estimation method from the paper "Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks" (accepted at IJCAI 2020)
Polysolve (⭐ 38): Easy-to-use linear and non-linear solver
Adasoptimizer (⭐ 33): ADAS (Adaptive Step Size) is an optimizer that, instead of merely normalizing the derivative like other optimizers, fine-tunes the step size itself, aiming to make step-size scheduling obsolete and to achieve state-of-the-art training performance
A Tour Of Pytorch Optimizers (⭐ 30): A tour of different optimization algorithms in PyTorch
Swats (⭐ 24): Switching from Adam to SGD optimization in PyTorch
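SWATS (Keskar & Socher, "Improving Generalization Performance by Switching from Adam to SGD") runs Adam early in training and hands off to SGD once a running projection of the Adam step onto the gradient stabilizes. A toy single-scalar sketch of the two-phase shape; the class name is mine, and the fixed switch point and hand-picked SGD learning rate replace the paper's derived criterion purely to keep the sketch self-contained:

```python
import math

class ToySwats:
    """Toy sketch of the SWATS idea for a single scalar parameter:
    adaptive (Adam) updates for the first switch_step steps, then
    plain SGD with momentum using the shared first-moment buffer."""

    def __init__(self, theta, switch_step=100, adam_lr=1e-3, sgd_lr=1e-2,
                 beta1=0.9, beta2=0.999, eps=1e-8):
        self.theta = theta
        self.switch_step = switch_step
        self.adam_lr, self.sgd_lr = adam_lr, sgd_lr
        self.beta1, self.beta2, self.eps = beta1, beta2, eps
        self.m = self.v = 0.0
        self.t = 0

    def step(self, grad):
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        if self.t <= self.switch_step:      # phase 1: Adam
            self.v = self.beta2 * self.v + (1 - self.beta2) * grad * grad
            m_hat = self.m / (1 - self.beta1 ** self.t)
            v_hat = self.v / (1 - self.beta2 ** self.t)
            self.theta -= self.adam_lr * m_hat / (math.sqrt(v_hat) + self.eps)
        else:                               # phase 2: SGD with momentum
            self.theta -= self.sgd_lr * self.m
        return self.theta
```

Minimizing f(θ) = θ² (gradient 2θ) from θ = 1 with this switcher drives θ toward 0 across both phases.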
Ml Optimizers Jax (⭐ 23): Toy implementations of popular ML optimizers in Python/JAX
Optimizers For Tensorflow (⭐ 15): Adam, NAdam and AAdam optimizers
Adam (⭐ 12): Addon that enhances all Confluence user profiles and adds an advanced people directory; configurable via XML, localizable, supports Velocity templates, and supports view and edit restrictions
Adam_home (⭐ 12): ADAM Python client and notebooks
Extensisq (⭐ 11): Extends scipy.integrate with additional methods for solve_ivp
Novograd Pytorch (⭐ 11): PyTorch implementation of the NovoGrad optimizer
Deepvariant On Spark (⭐ 11): A germline short-variant calling pipeline that runs Google DeepVariant on Apache Spark at scale
Adashift (⭐ 9): AdaShift optimizer implementation in PyTorch
Neural Networks And Deep Learning (⭐ 9): Deep learning projects covering applications (face recognition, neural style transfer, autonomous driving, sign language reading, music generation, translation, speech recognition, NLP) and theory (CNNs, RNNs, LSTM, Adam, Dropout, BatchNorm, Xavier/He initialization, hyperparameter tuning, regularization, optimization, residual networks); from the Deep Learning Specialization by Andrew Ng, deeplearning.ai
Paper Implementation Overview Gradient Descent Optimization Sebastian Ruder (⭐ 7): [Python] [arXiv/cs] Implementation of the paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
Simpledeepnettoolbox (⭐ 7): Simple MATLAB toolbox for deep learning networks, version 1.0.3
Submissions Pilot3 Adam (⭐ 7): Development repo for the Pilot 3 ADaM submission to the FDA
Awesome Optimizers (⭐ 5): Literature survey of convex optimizers and optimization methods for deep learning; made especially for optimization researchers with ❤️
Demonrangeroptimizer (⭐ 5): Quasi-hyperbolic rectified DEMON Adam/AMSGrad with AdaMod, gradient centralization, Lookahead, iterative averaging, and decorrelated weight decay
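One of the tricks this last repo stacks on top of Adam, gradient centralization (Yong et al., "Gradient Centralization: A New Optimization Technique for Deep Neural Networks"), is simple enough to show directly: subtract each row's mean from a weight-matrix gradient so every output neuron's gradient vector has zero mean. A plain-Python sketch over a list of rows (the function name is mine, not the repo's API):

```python
def centralize_gradients(grad):
    """Gradient centralization: remove the per-row mean from a
    weight-matrix gradient, represented here as a list of rows."""
    out = []
    for row in grad:
        mu = sum(row) / len(row)          # mean over the input dimension
        out.append([g - mu for g in row]) # each row now sums to zero
    return out
```

Real implementations apply this per weight tensor inside the optimizer's step, just before the update is computed.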
Copyright 2018-2024 Awesome Open Source. All rights reserved.