Awesome Open Source
Search results for "cifar sgd"
22 search results found
Swa ⭐ 785 — Stochastic Weight Averaging in PyTorch
Pytorch1.0 Cn ⭐ 139 — Chinese translation of the official PyTorch 1.0 documentation; follow the WeChat public account 磐创AI
Fitting Random Labels ⭐ 118 — Example code for the paper "Understanding deep learning requires rethinking generalization"
Entropy Sgd ⭐ 61 — Lua implementation of Entropy-SGD
Dfw ⭐ 53 — Implementation of the Deep Frank-Wolfe Algorithm in PyTorch
Signsgd ⭐ 48 — Code for the signSGD paper
Kfac Pytorch ⭐ 47 — PyTorch implementation of KFAC and E-KFAC (natural gradient)
Swalp ⭐ 41 — Code for the paper "SWALP: Stochastic Weight Averaging for Low-Precision Training"
Parle ⭐ 38
Pytorch Lr Dropout ⭐ 35 — "Learning Rate Dropout" in PyTorch
Torch_swa_examples ⭐ 31
Hpn ⭐ 26 — Hyperspherical Prototype Networks
Qsgd ⭐ 24 — SGD and Ordered SGD code for deep learning, SVMs, and logistic regression
Contrib_swa_examples ⭐ 22
Sgd Uap Torch ⭐ 9 — Universal Adversarial Perturbations (UAPs) for PyTorch
Entropy Sgd Tf ⭐ 7 — TensorFlow implementation of Entropy-SGD
Walk_with_sgd ⭐ 6 — Experiments for the paper "A Walk with SGD" (https://arxiv.org/pdf/1802.08770.pdf)
Cifar10 Faster ⭐ 6 — [WIP] Demonstration of training a small ResNet on CIFAR-10 to 94% test accuracy in fewer than 20 epochs
Cifar_pytorch Lightning ⭐ 6 — CIFAR-10 and CIFAR-100 results with VGG16, ResNet50, and WideResNet using pytorch-lightning
Dnn_sharpest_directions ⭐ 6 — Code for "On the Relation Between the Sharpest Directions of DNN Loss and the SGD Step Length", ICLR 2019
Path Sgd ⭐ 5 — Path-SGD: Path-Normalized Optimization in Deep Neural Networks
Nd Adam ⭐ 5 — ND-Adam, a tailored version of Adam for training DNNs
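Several of the results above (Swa, Swalp, Torch_swa_examples, Contrib_swa_examples) implement Stochastic Weight Averaging, whose core idea is simply an element-wise running average of weight snapshots taken late in SGD training. A minimal plain-Python sketch of that averaging step, using hypothetical parameter vectors rather than real network weights:

```python
# Minimal sketch of the averaging at the heart of SWA.
# `snapshots` is a hypothetical list of parameter vectors captured
# at the end of successive epochs; real SWA averages model weights.

def swa_average(snapshots):
    """Element-wise mean of a list of equal-length parameter vectors."""
    n = len(snapshots)
    dim = len(snapshots[0])
    return [sum(s[i] for s in snapshots) / n for i in range(dim)]

# Pretend these are weights captured after epochs 5, 6, and 7:
snaps = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(swa_average(snaps))  # → [3.0, 4.0]
```

In the PyTorch repos listed, this averaging is applied to full model state while plain SGD (often with a modified learning-rate schedule) continues to produce the snapshots.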
Related Searches: Python Cifar (1,478) · Pytorch Cifar (462) · Jupyter Notebook Cifar (409) · Dataset Cifar (369) · Resnet Cifar (354) · Python Sgd (352) · Mnist Cifar (321) · Cifar Cifar10 (315) · Cifar Imagenet (308) · Tensorflow Cifar (305)
Copyright 2018-2024 Awesome Open Source. All rights reserved.