| Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Ray | 24,828 | 80 | 199 | | 20 hours ago | 76 | June 09, 2022 | 2,883 | apache-2.0 | Python | Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a toolkit of libraries (Ray AIR) for accelerating ML workloads. |
| Optuna | 7,849 | 26 | 214 | | a day ago | 49 | June 13, 2022 | 123 | other | Python | A hyperparameter optimization framework |
| Hyperopt | 6,575 | 252 | 168 | | 2 months ago | 13 | November 17, 2021 | 389 | other | Python | Distributed Asynchronous Hyperparameter Optimization in Python |
| Awesome Automl Papers | 3,607 | | | | 4 months ago | | | 1 | apache-2.0 | | A curated list of automated machine learning papers, articles, tutorials, slides and projects |
| Scikit Optimize | 2,539 | 80 | 133 | | a month ago | 19 | October 12, 2021 | 286 | bsd-3-clause | Python | Sequential model-based optimization with a `scipy.optimize` interface |
| Hyperas | 2,147 | 21 | 3 | | 3 months ago | 9 | February 28, 2019 | 94 | mit | Python | Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization |
| Auto Pytorch | 1,969 | | | | 14 days ago | 5 | November 23, 2021 | 62 | apache-2.0 | Python | Automatic architecture search and hyperparameter optimization for PyTorch |
| Talos | 1,548 | | | | 6 months ago | | | 8 | mit | Python | Hyperparameter Optimization for TensorFlow, Keras and PyTorch |
| Rl Baselines3 Zoo | 1,148 | | | | a day ago | | | 44 | mit | Python | A training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included. |
| Hyperparameter Optimization Of Machine Learning Algorithms | 1,025 | | | | 6 months ago | | | | mit | Jupyter Notebook | Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear) |
Website | Docs | Install Guide | Tutorial | Examples
Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct hyperparameter search spaces.
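To illustrate what define-by-run means in practice, here is a minimal sketch (not taken from the official examples) in which the number of suggested hyperparameters depends on an earlier suggestion; the parameter names and ranges are arbitrary choices for illustration:

```python
import optuna


def objective(trial):
    # The search space is constructed imperatively: how many 'n_units_l{i}'
    # parameters exist depends on the value suggested for 'n_layers'.
    n_layers = trial.suggest_int('n_layers', 1, 3)
    units = [trial.suggest_int(f'n_units_l{i}', 4, 128, log=True) for i in range(n_layers)]
    # Toy objective standing in for a real validation metric.
    return sum(units)


study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=20)
```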
Optuna has modern functionalities as follows:

- Lightweight, versatile, and platform-agnostic architecture
- Pythonic search spaces, defined with familiar syntax including conditionals and loops
- Efficient optimization algorithms for sampling hyperparameters and pruning unpromising trials
- Easy parallelization of studies over multiple workers
- Quick visualization of optimization histories
We use the terms study and trial as follows:

- Study: optimization based on an objective function
- Trial: a single execution of the objective function
Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., `regressor` and `svr_c`) through multiple trials (e.g., `n_trials=100`). Optuna is a framework designed for the automation and acceleration of optimization studies.
```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm


# Define an objective function to be minimized.
def objective(trial):
    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.


study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```
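Once the study finishes, the best result can be read back from the standard attributes of the `Study` object:

```python
# Inspect the outcome of the study above.
print(study.best_params)  # e.g. {'regressor': 'SVR', 'svr_c': ...}
print(study.best_value)   # Lowest mean squared error observed.
print(len(study.trials))  # Number of trials run (100 here).
```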
Examples can be found in optuna/optuna-examples.
Integration modules, which allow pruning (early stopping) of unpromising trials, are available for many popular libraries, including XGBoost, LightGBM, and PyTorch.
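Pruning is also exposed through the core API, independent of any integration module. A minimal sketch, where the loop and the intermediate value are placeholders for a real training loop and validation score:

```python
import optuna


def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    for step in range(100):
        intermediate = (x - 2) ** 2 + 1.0 / (step + 1)  # placeholder score
        trial.report(intermediate, step)  # Report the value at this step.
        if trial.should_prune():          # Ask the pruner whether to stop early.
            raise optuna.TrialPruned()
    return (x - 2) ** 2


study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)
```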
Optuna Dashboard is a real-time web dashboard for Optuna. You can check the optimization history, hyperparameter importances, etc. in graphs and tables without writing a Python script to call Optuna's visualization functions. Feature requests and bug reports are welcome!
Install `optuna-dashboard` via pip:

```console
$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
```
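The dashboard reads trials from an Optuna storage backend. As a sketch, the following records a study in the `db.sqlite3` file used above (the study name is an arbitrary illustrative choice):

```python
import optuna

# Persist trials to SQLite so that optuna-dashboard can display them.
study = optuna.create_study(
    study_name='quadratic-example',  # arbitrary name for illustration
    storage='sqlite:///db.sqlite3',
    load_if_exists=True,
)
study.optimize(lambda trial: (trial.suggest_float('x', -10, 10) - 2) ** 2, n_trials=30)
```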
Optuna is available on the Python Package Index (PyPI) and on Anaconda Cloud.
```console
# PyPI
$ pip install optuna

# Anaconda Cloud
$ conda install -c conda-forge optuna
```
Optuna supports Python 3.7 or newer.
We also provide Optuna Docker images on DockerHub.
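A containerized run might look like the following; the `optuna/optuna` image name and the `py3.10` tag are assumptions here, so check DockerHub for the tags that are actually published:

```console
# Pull an Optuna image and verify the installation inside the container.
# The py3.10 tag is an assumption; consult DockerHub for available tags.
$ docker run -it --rm optuna/optuna:py3.10 python -c "import optuna; print(optuna.__version__)"
```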
Any contributions to Optuna are more than welcome!
If you are new to Optuna, please check the good first issues. They are relatively simple, well defined, and often good starting points for getting familiar with the contribution workflow and other developers.
If you have already contributed to Optuna, we recommend the other contribution-welcome issues.
For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).