Project Name | Stars | Downloads | Repos Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---|---|---
Solid | 518 | | | 4 years ago | | | 7 | mit | Python | 🎯 A comprehensive gradient-free optimization framework written in Python
Chefboost | 387 | | | 2 months ago | 17 | February 16, 2022 | 3 | mit | Python | A lightweight decision tree framework supporting regular algorithms (ID3, C4.5, CART, CHAID, regression trees) and advanced techniques (gradient boosting, random forest, AdaBoost) with categorical-feature support for Python
Mth594_machinelearning | 306 | | | 6 years ago | | | 1 | | Jupyter Notebook | Materials for the course MTH 594 Advanced Data Mining: Theory and Applications (Dmitry Efimov, American University of Sharjah)
Terrain Topology Algorithms | 298 | | | 2 years ago | | | 1 | mit | C# | Terrain topology algorithms in Unity
Fmin | 295 | 366 | 13 | 5 years ago | 2 | November 25, 2016 | 5 | bsd-3-clause | JavaScript | Unconstrained function minimization in JavaScript
Stein Variational Gradient Descent | 261 | | | 4 years ago | | | 1 | mit | Python | Code for the paper "Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm"
Jstarcraft Ai | 201 | | | 2 months ago | | | 4 | apache-2.0 | Java | Aims to provide a complete Java machine learning (ML) framework bridging academia and industry, letting practitioners switch seamlessly between software/hardware environments, data structures, algorithms, and models; covers everything from data processing to model training and evaluation, with support for hardware acceleration and parallel computing
Spinning Up Basic | 164 | | | 2 years ago | | | | mit | Python | Basic versions of agents from Spinning Up in Deep RL written in PyTorch
Boml | 124 | | | 2 years ago | 8 | September 19, 2020 | 1 | mit | Python | Bilevel optimization library in Python for multi-task and meta learning
Clustering4ever | 109 | | 3 | 2 years ago | 18 | May 03, 2020 | | apache-2.0 | Scala | C4E, a JVM-friendly library written in Scala for both local and distributed (Spark) clustering
Authors: Hiroyuki Kasai
Last page update: April 19, 2017
Latest library version: 1.0.1 (see Release notes for more info)
The GDLibrary is a pure-MATLAB library of unconstrained optimization algorithms. It solves unconstrained minimization problems of the form min f(x).
Note that the SGDLibrary internally contains this GDLibrary.
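For concreteness, the demo below minimizes the two-dimensional Rosenbrock function, f(x) = (1 - x1)^2 + 100 (x2 - x1^2)^2, whose unique minimizer is x* = (1, 1) with f(x*) = 0. Written as a standalone cost/gradient pair (an illustrative sketch, independent of the library's own `problem` interface):

```matlab
% 2-D Rosenbrock function and its analytic gradient
f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
g = @(x) [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2); ...
          200*(x(2) - x(1)^2)];

f([1; 1])        % 0 at the minimizer
norm(g([1; 1]))  % the gradient vanishes at the minimizer
```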
- `./` - top directory
- `./README.md` - this readme file
- `./run_me_first.m` - the script that you need to run first
- `./demo.m` - demonstration script to check and understand this package easily
- `plotter/` - plotting tools to show convergence results and various plots
- `tool/` - some auxiliary tools for this project
- `problem/` - problem definition files to be solved
- `gd_solver/` - various gradient descent optimization algorithms
- `gd_test/` - some helpful test scripts for this package
First, run `run_me_first` for path configurations.

```matlab
%% First run the setup script
run_me_first;
```
Then, just execute `demo` for a demonstration of this package.

```matlab
%% Execute the demonstration script
demo;
```
The `demo.m` file contains the following:

```matlab
%% define the problem
d = 2;                                 % number of dimensions
problem = rosenbrock(d);

%% calculate the optimal solution
w_opt = problem.calc_solution();

%% general options for the optimization algorithms
options.w_init = zeros(d, 1);          % initial point
options.verbose = true;                % verbose mode
options.f_opt = problem.cost(w_opt);   % optimal cost, used for the optimality gap
options.store_w = true;                % store the history of solutions

%% perform GD with backtracking line search
options.step_alg = 'backtracking';
[w_gd, info_list_gd] = gd(problem, options);

%% perform NCG with backtracking line search
options.step_alg = 'backtracking';
[w_ncg, info_list_ncg] = ncg(problem, options);

%% perform L-BFGS with strong Wolfe line search
options.step_alg = 'strong_wolfe';
[w_lbfgs, info_list_lbfgs] = lbfgs(problem, options);

%% plot all
close all;

% display iteration vs. cost
display_graph('iter', 'cost', {'GD-BKT', 'NCG-BKT', 'LBFGS-WOLFE'}, ...
    {w_gd, w_ncg, w_lbfgs}, {info_list_gd, info_list_ncg, info_list_lbfgs});

% display iteration vs. gradient norm
display_graph('iter', 'gnorm', {'GD-BKT', 'NCG-BKT', 'LBFGS-WOLFE'}, ...
    {w_gd, w_ncg, w_lbfgs}, {info_list_gd, info_list_ncg, info_list_lbfgs});

% draw the convergence sequence
w_history = cell(1, 3);
cost_history = cell(1, 3);
w_history{1} = info_list_gd.w;
w_history{2} = info_list_ncg.w;
w_history{3} = info_list_lbfgs.w;
cost_history{1} = info_list_gd.cost;
cost_history{2} = info_list_ncg.cost;
cost_history{3} = info_list_lbfgs.cost;
draw_convergence_sequence(problem, w_opt, {'GD-BKT', 'NCG-BKT', 'LBFGS-WOLFE'}, ...
    w_history, cost_history);
```
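To see what a solver such as `gd` with `options.step_alg = 'backtracking'` is doing conceptually, here is a minimal, self-contained gradient descent loop with Armijo backtracking on the same Rosenbrock problem. This is only a sketch of the general technique, not the library's actual implementation, and the parameter values are illustrative:

```matlab
% Minimal gradient descent with backtracking (Armijo) line search
f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;   % Rosenbrock cost
g = @(x) [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2); ...
          200*(x(2) - x(1)^2)];                  % its gradient
x = zeros(2, 1);                                 % initial point, as in the demo
c = 1e-4; rho = 0.5;                             % Armijo parameters (illustrative)
for iter = 1:5000
    d = -g(x);                                   % steepest-descent direction
    t = 1;                                       % start from a unit step
    while f(x + t*d) > f(x) + c*t*(g(x)'*d)      % Armijo sufficient-decrease test
        t = rho * t;                             % shrink the step
    end
    x = x + t*d;
    if norm(g(x)) < 1e-6, break; end             % gradient-norm stopping rule
end
```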
The GDLibrary is free and open source for academic/research (non-commercial) purposes.
If you have any problems or questions, please contact the author: Hiroyuki Kasai (email: kasai at is dot uec dot ac dot jp)