Alternatives To Optuna

  • Ray (24,828 stars, apache-2.0, Python, updated 20 hours ago): a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a toolkit of libraries (Ray AIR) for accelerating ML workloads.
  • Optuna (7,849 stars, other license, Python, updated a day ago): a hyperparameter optimization framework.
  • Hyperopt (6,575 stars, other license, Python, updated 2 months ago): distributed asynchronous hyperparameter optimization in Python.
  • Awesome AutoML Papers (3,607 stars, apache-2.0, updated 4 months ago): a curated list of automated machine learning papers, articles, tutorials, slides and projects.
  • Scikit-Optimize (2,539 stars, bsd-3-clause, Python, updated a month ago): sequential model-based optimization with a `scipy.optimize` interface.
  • Hyperas (2,147 stars, mit, Python, updated 3 months ago): Keras + Hyperopt, a very simple wrapper for convenient hyperparameter optimization.
  • Auto-PyTorch (1,969 stars, apache-2.0, Python, updated 14 days ago): automatic architecture search and hyperparameter optimization for PyTorch.
  • Hyperparameter Optimization for TensorFlow, Keras and PyTorch (mit, Python, updated 6 months ago).
  • RL Baselines3 Zoo (1,148 stars, mit, Python, updated a day ago): a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.
  • Hyperparameter Optimization of Machine Learning Algorithms (1,025 stars, mit, Jupyter Notebook, updated 6 months ago): implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models.

Optuna: A hyperparameter optimization framework


Website | Docs | Install Guide | Tutorial | Examples

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to this define-by-run API, code written with Optuna is highly modular, and users can dynamically construct the search spaces for the hyperparameters.

Key Features

Optuna has modern functionalities, including a lightweight and platform-agnostic architecture, Pythonic search spaces, efficient optimization algorithms, easy parallelization, and quick visualization.

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., regressor and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed for the automation and acceleration of optimization studies.


import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.


Examples can be found in optuna/optuna-examples.


Integration modules, which allow pruning (i.e., early stopping) of unpromising trials, are available for a number of machine learning libraries.

Web Dashboard

Optuna Dashboard is a real-time web dashboard for Optuna. You can check the optimization history, hyperparameter importances, etc. in graphs and tables, without writing a Python script to call Optuna's visualization functions. Feature requests and bug reports are welcome!


Install optuna-dashboard via pip:

$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
Listening on http://localhost:8080/
Hit Ctrl-C to quit.


Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.7 or newer.

We also provide Optuna docker images on DockerHub.



Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple and well defined, and are often good starting points for getting familiar with the contribution workflow and with other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.


Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
