Better and faster hyperparameter optimization with Dask

Scott Sievert
University of Wisconsin–Madison
Relevant work performed while interning for Anaconda, Inc.

Tom Augspurger
Anaconda, Inc.

Matthew Rocklin
Relevant work performed while employed by Anaconda, Inc.


Nearly every machine learning model requires hyperparameters: parameters that the user must specify before training begins and that influence model performance. Finding the optimal set of hyperparameters is often a time- and resource-consuming process. A recent breakthrough hyperparameter optimization algorithm, Hyperband, finds high-performing hyperparameters with minimal training via a principled early-stopping scheme for randomly selected hyperparameters [li2016hyperband]. This paper provides an intuitive introduction to Hyperband and describes its implementation in Dask, a Python library that scales Python to larger datasets and more computational resources. The implementation adapts the Hyperband algorithm to exploit Dask's capabilities for parallel processing. In experiments, the Dask implementation of Hyperband rapidly finds high-performing hyperparameters for deep learning models.
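The early-stopping scheme at the heart of Hyperband can be sketched in a few lines. The following is a minimal, self-contained illustration of one successive-halving bracket (the building block Hyperband repeats with different trade-offs between the number of configurations and the training budget); the toy `score` objective and all parameter names here are hypothetical, not drawn from the paper or from Dask's API:

```python
import random


def successive_halving(configs, score_fn, budget=1, eta=3):
    """One successive-halving bracket: evaluate every configuration on a
    small budget, keep only the top 1/eta fraction, then repeat with an
    eta-times larger budget until a single configuration remains."""
    while len(configs) > 1:
        ranked = sorted(configs, key=lambda c: score_fn(c, budget), reverse=True)
        configs = ranked[: max(1, len(configs) // eta)]  # early-stop the rest
        budget *= eta
    return configs[0]


random.seed(0)


def score(cfg, budget):
    # Hypothetical toy objective: learning rates near 0.1 score best, and
    # the noisy estimate sharpens as the training budget grows.
    return -abs(cfg["lr"] - 0.1) + random.gauss(0, 0.1 / budget)


candidates = [{"lr": random.uniform(1e-4, 1.0)} for _ in range(27)]
best = successive_halving(candidates, score)
```

Because poorly performing configurations are discarded after only a small training budget, most of the total computation is spent on the promising ones, which is what lets Hyperband evaluate many random configurations cheaply.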


distributed computation, hyperparameter optimization, machine learning

