Awesome Open Source

Some exercises on machine learning optimization algorithms

  • p1: basic gradient descent
  • p2: vanilla SGD (stochastic gradient descent)
  • p3: minibatch SGD
  • p4: minibatch SGD with momentum
  • p5: Nesterov accelerated gradient
  • p6: adagrad
  • p7: adadelta
  • p8: adam
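The update rules behind p1, p4, and p8 can be sketched on a toy one-dimensional quadratic. This is a minimal illustration, not the repository's code; the function names, the objective f(w) = (w - 3)², and all hyperparameters are assumptions made for the example.

```python
import math

def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2.
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, steps=100):
    # p1: plain gradient descent, w <- w - lr * grad(w).
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum_sgd(w, lr=0.1, gamma=0.9, steps=300):
    # p4: a velocity term accumulates an exponentially decaying
    # average of past gradients, damping oscillation.
    v = 0.0
    for _ in range(steps):
        v = gamma * v + lr * grad(w)
        w -= v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=1000):
    # p8: bias-corrected first (m) and second (v) moment estimates
    # give a per-parameter adaptive step size.
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w
```

All three converge from w = 0 toward the minimum at w = 3; on a real model, w would be a parameter vector and grad would be estimated from a minibatch.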

References

p1 reference: https://zhuanlan.zhihu.com/p/27297638

p2~pn reference: http://ruder.io/optimizing-gradient-descent/index.html

Individual references

p5 reference: http://cs231n.github.io/neural-networks-3/

p6 reference: https://zhuanlan.zhihu.com/p/22252270

p7 reference: https://arxiv.org/abs/1212.5701 (original paper)

p8 reference: http://www.ijiandao.com/2b/baijia/63540.html
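The look-ahead trick in p5 and the per-parameter scaling in p6 can likewise be sketched on a toy one-dimensional quadratic; again, the function names, the objective f(w) = (w - 3)², and the hyperparameters are illustrative assumptions, not the repository's code.

```python
import math

def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2.
    return 2.0 * (w - 3.0)

def nesterov(w, lr=0.1, gamma=0.9, steps=300):
    # p5: evaluate the gradient at the look-ahead point w - gamma * v
    # instead of at the current point.
    v = 0.0
    for _ in range(steps):
        v = gamma * v + lr * grad(w - gamma * v)
        w -= v
    return w

def adagrad(w, lr=1.0, eps=1e-8, steps=500):
    # p6: divide each step by the root of the accumulated squared
    # gradients, so coordinates with large past gradients take
    # smaller steps.
    cache = 0.0
    for _ in range(steps):
        g = grad(w)
        cache += g * g
        w -= lr * g / (math.sqrt(cache) + eps)
    return w
```

adadelta (p7) extends adagrad by replacing the ever-growing sum of squared gradients with a decaying average, which keeps the effective learning rate from shrinking toward zero.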

