Hyperopt

Distributed Asynchronous Hyperparameter Optimization in Python
Alternatives To Hyperopt
  • Ray (28,906 stars; Apache-2.0; Python): a unified framework for scaling AI and Python applications, consisting of a core distributed runtime and a set of AI libraries for accelerating ML workloads.
  • Optuna (8,984 stars; other license; Python): a hyperparameter optimization framework.
  • Hyperopt (6,890 stars; other license; Python): distributed asynchronous hyperparameter optimization in Python.
  • Awesome AutoML Papers (3,607 stars; Apache-2.0): a curated list of automated machine learning papers, articles, tutorials, slides and projects.
  • Scikit Optimize (2,688 stars; BSD-3-Clause; Python): sequential model-based optimization with a `scipy.optimize` interface.
  • Auto PyTorch (2,158 stars; Apache-2.0; Python): automatic architecture search and hyperparameter optimization for PyTorch.
  • Hyperas (2,147 stars; MIT; Python): Keras + Hyperopt, a very simple wrapper for convenient hyperparameter optimization.
  • Talos (1,596 stars; MIT; Python): hyperparameter optimization for TensorFlow, Keras and PyTorch.
  • RL Baselines3 Zoo (1,542 stars; MIT; Python): a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.
  • Vizier (1,124 stars; Apache-2.0; Python): a Python-based research interface for blackbox and hyperparameter optimization, based on the internal Google Vizier Service.

Hyperopt: Distributed Hyperparameter Optimization


Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.

Getting started

Install hyperopt from PyPI:

pip install hyperopt

then run your first example:

# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a',
    [
        ('case 1', 1 + hp.lognormal('c1', 0, 1)),
        ('case 2', hp.uniform('c2', -10, 10))
    ])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
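
fmin returns only the best parameter settings; to keep a record of every evaluation, you can also pass a Trials object. A minimal sketch, reusing the objective and space defined above:

# record every evaluation with a Trials object
from hyperopt import Trials

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=100, trials=trials)

print(trials.best_trial['result']['loss'])  # loss of the best evaluation
print(len(trials.trials))                   # number of evaluations performed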

Contributing

If you're a developer and wish to contribute, please follow these steps.

Setup (based on this)

  1. Create an account on GitHub if you do not already have one.

  2. Fork the project repository: click on the ‘Fork’ button near the top of the page. This creates a copy of the code under your account on GitHub. For more details on how to fork a repository see this guide.

  3. Clone your fork of the hyperopt repo from your GitHub account to your local disk:

    git clone https://github.com/<github username>/hyperopt.git
    cd hyperopt
    
  4. Create an environment with:

    $ python3 -m venv my_env

    (or $ python -m venv my_env), or with conda:

    $ conda create -n my_env python=3

  5. Activate the environment:
    $ source my_env/bin/activate
    or with conda:
    $ conda activate my_env

  6. Install dependencies for extras (you'll need these to run pytest):

    Linux/UNIX:

    $ pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'

    or Windows:

    pip install -e .[MongoTrials]
    pip install -e .[SparkTrials]
    pip install -e .[ATPE]
    pip install -e .[dev]
    
  7. Add the upstream remote. This saves a reference to the main hyperopt repository, which you can use to keep your repository synchronized with the latest changes:

    $ git remote add upstream https://github.com/hyperopt/hyperopt.git

    You should now have a working installation of hyperopt, and your git repository should be properly configured. The next steps describe the process of modifying code and submitting a PR:

  8. Synchronize your master branch with the upstream master branch:

    git checkout master
    git pull upstream master
    
  9. Create a feature branch to hold your development changes:

    $ git checkout -b my_feature

    and start making changes. Always use a feature branch. It’s good practice to never work on the master branch!

  10. We recommend using Black, which is installed automatically in step 6, to format your code before submitting a PR.

  11. Then, when you commit, make sure that git hooks are activated (PyCharm, for example, has an option to omit them). This can be done with pre-commit, which is installed automatically in step 6, as follows:

    pre-commit install
    

    This will run Black automatically on all files you modified when you commit, failing if any files still need to be formatted. If Black does not run, execute it manually:

    black {source_file_or_directory}
    
  12. Develop the feature on your feature branch on your computer, using Git for version control. When you’re done editing, add changed files using git add and then git commit:

    git add modified_files
    git commit -m "my first hyperopt commit"
    
  13. The tests for this project use PyTest and can be run by calling pytest.

  14. Record your changes in Git, then push the changes to your GitHub account with:

    git push -u origin my_feature
    

Note that dev dependencies require Python 3.6+.

Algorithms

Currently three algorithms are implemented in hyperopt:

  • Random Search
  • Tree of Parzen Estimators (TPE)
  • Adaptive TPE
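
The algorithm is chosen with the algo argument to fmin. A minimal sketch of switching between the three (assuming hyperopt 0.2+; note that atpe needs the extra dependencies installed via the ATPE extra shown in step 6 of the Contributing section):

# each algorithm is exposed as a suggest function passed to fmin
from hyperopt import fmin, hp, rand, tpe, atpe

space = hp.uniform('x', -5, 5)

for suggest in (rand.suggest, tpe.suggest, atpe.suggest):
    best = fmin(lambda x: (x - 1) ** 2, space, algo=suggest, max_evals=50)
    print(suggest.__module__, best)  # best 'x' should approach 1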

Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.

All algorithms can be parallelized in two ways, using:

  • Apache Spark
  • MongoDB
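
As a minimal sketch of the Spark route (this assumes pyspark is installed and a Spark context can be created), parallel search only requires passing a SparkTrials object to fmin:

# evaluate trials in parallel on Spark via SparkTrials
from hyperopt import fmin, tpe, hp, SparkTrials

# run up to 4 trials concurrently; requires a working pyspark installation
spark_trials = SparkTrials(parallelism=4)

best = fmin(
    fn=lambda x: x ** 2,            # toy objective: minimize x squared
    space=hp.uniform('x', -10, 10),
    algo=tpe.suggest,
    max_evals=32,
    trials=spark_trials,
)
print(best)  # e.g. {'x': ...} close to 0

The MongoDB route works analogously with a MongoTrials object (from hyperopt.mongoexp), which points worker processes at a shared mongod instance.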

Documentation

Hyperopt documentation can be found at http://hyperopt.github.io/hyperopt/, but is partly still hosted on the wiki.

Related Projects

Examples

See projects using hyperopt on the wiki.

Announcements mailing list

Discussion mailing list

Cite

If you use this software for research, please cite the paper (http://proceedings.mlr.press/v28/bergstra13.pdf) as follows:

Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.

Thanks

This project has received support from

  • National Science Foundation (IIS-0963668),
  • Banting Postdoctoral Fellowship program,
  • National Science and Engineering Research Council of Canada (NSERC),
  • D-Wave Systems, Inc.