Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|
Spearmint | 1,310 | | | | 6 years ago | | | 47 | | Python |
Spearmint is a package to perform Bayesian optimization according to the algorithms outlined in the paper: Practical Bayesian Optimization of Machine Learning Algorithms. Jasper Snoek, Hugo Larochelle and Ryan P. Adams. Advances in Neural Information Processing Systems, 2012. | | | | | | | | | | |
Probreg | 670 | 1 | | | a month ago | 31 | January 08, 2022 | 8 | mit | Python |
Python package for point cloud registration using probabilistic models (Coherent Point Drift, GMMReg, SVR, GMMTree, FilterReg, Bayesian CPD) | | | | | | | | | | |
Bandits | 472 | | | | 4 years ago | | | 1 | apache-2.0 | Jupyter Notebook |
Python library for Multi-Armed Bandits | | | | | | | | | | |
Simple | 432 | | | | 6 years ago | | | 3 | agpl-3.0 | Python |
Experimental global optimization algorithm | | | | | | | | | | |
Btcpredictor | 269 | | | | 6 years ago | | | 3 | mit | Matlab |
Bitcoin price prediction algorithm using Bayesian regression techniques | | | | | | | | | | |
Stein Variational Gradient Descent | 261 | | | | 4 years ago | | | 1 | mit | Python |
Code for the paper "Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm" | | | | | | | | | | |
Pilco | 213 | | | | 3 years ago | | | 12 | mit | Python |
Bayesian reinforcement learning in TensorFlow | | | | | | | | | | |
Optimviz | 119 | | | | 2 years ago | | | 1 | gpl-3.0 | MATLAB |
Visualize optimization algorithms in MATLAB. | | | | | | | | | | |
Data_sciences_campaign | 92 | | | | 13 hours ago | | | 2 | | Jupyter Notebook |
[Data Scientist Course Series] | | | | | | | | | | |
Trueskill | 72 | 35 | | | 10 years ago | 5 | January 18, 2011 | 4 | mit | Ruby |
A Ruby gem that implements the TrueSkill algorithm | | | | | | | | | | |
SVGD is a general-purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. SVGD iteratively transports a set of particles to match the target distribution by applying a form of functional gradient descent that minimizes the KL divergence.
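For intuition, here is a minimal NumPy sketch of a single SVGD update with an RBF kernel and a median-bandwidth heuristic. The names rbf_kernel and svgd_step, and the exact kernel/bandwidth details, are illustrative assumptions rather than the package's actual implementation.

import numpy as np

def rbf_kernel(theta, h=-1):
    # Pairwise squared distances between the n particles (rows of theta).
    sq_dist = np.sum((theta[:, None, :] - theta[None, :, :]) ** 2, axis=-1)
    if h < 0:  # median heuristic for the kernel bandwidth
        h = max(np.median(sq_dist) / np.log(theta.shape[0] + 1), 1e-8)
    Kxy = np.exp(-sq_dist / h)
    # sum_j grad_{x_j} k(x_j, x_i) for the RBF kernel exp(-||x_j - x_i||^2 / h)
    dxkxy = (theta * Kxy.sum(axis=1, keepdims=True) - Kxy @ theta) * (2.0 / h)
    return Kxy, dxkxy

def svgd_step(theta, dlnprob, stepsize=1e-2):
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    Kxy, dxkxy = rbf_kernel(theta)
    phi = (Kxy @ dlnprob(theta) + dxkxy) / theta.shape[0]
    return theta + stepsize * phi

Repeating svgd_step drives the particles toward high-density regions of the target, while the kernel-gradient term keeps them spread apart.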
The package contains implementations of SVGD in both MATLAB and Python. Demos are also provided to reproduce the results in our paper. The Bayesian neural network example is based on Theano.
For more information, please visit our project website - SVGD.
Toy example with a 1D Gaussian mixture. The red dashed lines are the target density function and the solid green lines are the densities of the particles at different iterations of our algorithm (estimated using a kernel density estimator).
# Basic usage (the import below assumes the package's Python implementation, svgd.py, is on the path):
from svgd import SVGD

# x0:       initial particles
# dlnprob:  function returning the first-order derivative (gradient) of the log probability
# n_iter:   number of iterations
# stepsize: initial learning rate
theta = SVGD().update(x0, dlnprob, n_iter, stepsize)
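As a concrete, self-contained illustration of the 1D Gaussian mixture toy example above, a run could look roughly like the sketch below. The mixture weights and means, the particle initialization, and the import path are assumptions made for illustration, not values taken from the package.

import numpy as np
from svgd import SVGD  # assumption: svgd.py from the Python implementation is importable

# Illustrative bimodal target: p(x) = 1/3 N(-2, 1) + 2/3 N(2, 1)
w, mu, sigma = np.array([1.0 / 3, 2.0 / 3]), np.array([-2.0, 2.0]), 1.0

def dlnprob(theta):
    # theta: (n_particles, 1); returns d/dx log p(theta) with the same shape.
    comp = w * np.exp(-0.5 * ((theta - mu) / sigma) ** 2)   # unnormalized component densities
    resp = comp / comp.sum(axis=1, keepdims=True)           # component responsibilities
    return (resp * (mu - theta) / sigma ** 2).sum(axis=1, keepdims=True)

x0 = np.random.normal(-10.0, 1.0, size=(100, 1))            # particles start far from the target
theta = SVGD().update(x0, dlnprob, n_iter=1000, stepsize=0.01)
print(theta.mean(), theta.std())

After enough iterations, a histogram of the particles should approximate the bimodal target density, matching the behaviour described in the figure caption above.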
Qiang Liu and Dilin Wang. Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm. NIPS, 2016.
Feedback is greatly appreciated. If you have any questions, comments, issues or anything else really, shoot me an email.
All rights reserved.