Book chapter Open access

Hyperparameter Optimization

2019; Springer International Publishing; Language: English

10.1007/978-3-030-05318-5_1

ISSN

2520-1328

Authors

Matthias Feurer, Frank Hutter

Topic(s)

Machine Learning and Algorithms

Abstract

Recent interest in complex and computationally expensive machine learning models with many hyperparameters, such as automated machine learning (AutoML) frameworks and deep neural networks, has resulted in a resurgence of research on hyperparameter optimization (HPO). In this chapter, we give an overview of the most prominent approaches for HPO. We first discuss blackbox function optimization methods, covering both model-free approaches and Bayesian optimization. Since the high computational demand of many modern machine learning applications renders pure blackbox optimization extremely costly, we next focus on modern multi-fidelity methods that use (much) cheaper variants of the blackbox function to approximately assess the quality of hyperparameter settings. Lastly, we point to open problems and future research directions.
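The chapter itself contains no code; as a rough illustration of the multi-fidelity idea described in the abstract (assessing hyperparameter settings with cheap, low-budget evaluations before spending full budgets on the survivors), the sketch below implements a bare-bones successive-halving loop. The objective function, the two-dimensional hyperparameter space, and all parameter values are made-up placeholders for illustration only, not taken from the chapter.

```python
import random

def evaluate(config, budget):
    # Stand-in for training a model with `config` for `budget` units of
    # compute and returning a validation loss; purely synthetic here.
    x, y = config
    return (x - 0.3) ** 2 + (y - 0.7) ** 2 + random.gauss(0, 0.05) / budget

def successive_halving(n_configs=27, min_budget=1, eta=3):
    # Sample random hyperparameter configurations from the unit square.
    configs = [(random.random(), random.random()) for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        # Evaluate all surviving configurations at the current (cheap) fidelity.
        scores = [(evaluate(c, budget), c) for c in configs]
        scores.sort(key=lambda s: s[0])
        # Keep the best 1/eta fraction and raise the budget by a factor of eta.
        configs = [c for _, c in scores[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

if __name__ == "__main__":
    print("best configuration:", successive_halving())
```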
