Awesome Open Source


  • p1: basic gradient descent
  • p2: vanilla stochastic gradient descent (SGD)
  • p3: minibatch SGD
  • p4: momentum SGD: minibatch SGD with momentum
  • p5: Nesterov accelerated gradient
  • p6: Adagrad
  • p7: Adadelta
  • p8: Adam
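
The first variant (p1) is plain full-batch gradient descent: repeatedly step against the gradient with a fixed learning rate. A minimal NumPy sketch follows; the function name `gradient_descent` and the toy objective are illustrative, not the repo's own code:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Full-batch gradient descent: x <- x - lr * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
```

With `lr=0.1` the error contracts by a constant factor each step, so 200 iterations drive it far below any practical tolerance on this toy quadratic.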


p1 reference

p2~pn reference
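
Variants p2–p4 replace the full-batch gradient with a minibatch estimate and optionally smooth it with a velocity term. The sketch below (illustrative names, least-squares toy problem assumed, not the repo's code) combines minibatching with classical momentum:

```python
import numpy as np

def sgd_momentum(X, y, lr=0.05, beta=0.9, batch_size=8, epochs=200, seed=0):
    """Minibatch SGD with momentum on the least-squares loss 0.5*||Xw - y||^2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    v = np.zeros(d)                                  # velocity (momentum buffer)
    for _ in range(epochs):
        idx = rng.permutation(n)                     # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            g = X[b].T @ (X[b] @ w - y[b]) / len(b)  # minibatch gradient
            v = beta * v + g                         # accumulate momentum
            w = w - lr * v                           # parameter update
    return w

# Recover w_true = [2, -1] from noiseless linear data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true
w_hat = sgd_momentum(X, y)
```

Setting `beta=0` recovers plain minibatch SGD (p3), and `batch_size=1` recovers vanilla SGD (p2).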


p5 reference
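
Nesterov's method (p5) differs from classical momentum only in where the gradient is evaluated: at a lookahead point that anticipates the velocity step. A minimal sketch under the same toy-quadratic assumptions as above:

```python
import numpy as np

def nesterov(grad, x0, lr=0.1, beta=0.9, steps=500):
    """Nesterov accelerated gradient: gradient taken at the lookahead point."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x - lr * beta * v)   # evaluate at x after a tentative momentum step
        v = beta * v + g
        x = x - lr * v
    return x

# Minimize f(x) = (x - 3)^2.
x_min = nesterov(lambda x: 2 * (x - 3), x0=[0.0])
```

The lookahead correction lets Nesterov tolerate larger momentum before oscillating than the classical heavy-ball update.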

p6 reference
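
Adagrad (p6) gives each coordinate its own effective learning rate by dividing by the root of its accumulated squared gradients. A sketch of the standard update (illustrative code, not the repo's):

```python
import numpy as np

def adagrad(grad, x0, lr=0.5, eps=1e-8, steps=500):
    """Adagrad: per-coordinate step scaled by accumulated squared gradients."""
    x = np.asarray(x0, dtype=float)
    G = np.zeros_like(x)                      # running sum of squared gradients
    for _ in range(steps):
        g = grad(x)
        G += g * g
        x = x - lr * g / (np.sqrt(G) + eps)   # effective lr shrinks over time
    return x

x_min = adagrad(lambda x: 2 * (x - 3), x0=[0.0])
```

Because `G` only grows, the effective step size decays monotonically; this motivates the windowed average used by Adadelta below.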

p7 reference (original paper)
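
Adadelta (p7) fixes Adagrad's ever-shrinking step by replacing the gradient sum with exponential moving averages, and removes the global learning rate by scaling with an RMS of past updates. A sketch of the update from the original paper's form (function name and toy objective are illustrative):

```python
import numpy as np

def adadelta(grad, x0, rho=0.95, eps=1e-6, steps=2000):
    """Adadelta: step size set by RMS(past updates) / RMS(past gradients)."""
    x = np.asarray(x0, dtype=float)
    Eg2 = np.zeros_like(x)    # running average of squared gradients
    Edx2 = np.zeros_like(x)   # running average of squared updates
    for _ in range(steps):
        g = grad(x)
        Eg2 = rho * Eg2 + (1 - rho) * g * g
        dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * g
        Edx2 = rho * Edx2 + (1 - rho) * dx * dx
        x = x + dx
    return x

x_min = adadelta(lambda x: 2 * (x - 3), x0=[0.0])
```

Note the slow start: with `Edx2 = 0` the first steps are on the order of `sqrt(eps)`, so Adadelta needs many more iterations than the other methods on this toy problem.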

p8 reference
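
Adam (p8) keeps exponential moving averages of both the gradient (first moment) and its square (second moment), with a bias correction for their zero initialization. A sketch of the standard update rule (illustrative, not the repo's script):

```python
import numpy as np

def adam(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Adam: bias-corrected first and second moment estimates of the gradient."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # first moment (mean)
        v = beta2 * v + (1 - beta2) * g * g    # second moment (uncentered variance)
        m_hat = m / (1 - beta1 ** t)           # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

x_min = adam(lambda x: 2 * (x - 3), x0=[0.0])
```

Adam can be read as momentum SGD (p4) combined with the per-coordinate scaling of Adagrad/Adadelta (p6–p7), which is why it is usually the default choice.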
