Towards Automatically-Tuned Deep Neural Networks

Abstract: Recent advances in AutoML have led to automated tools that can compete with machine learning experts on supervised learning tasks. In this work, we present two versions of Auto-Net, which provide automatically-tuned deep neural networks without any human intervention. The first version, Auto-Net 1.0, builds upon ideas from the competition-winning system Auto-sklearn by using the Bayesian Optimization method SMAC, and uses Lasagne as the underlying deep learning (DL) library. The more recent Auto-Net 2.0 builds upon a recent combination of Bayesian Optimization and HyperBand, called BOHB, and uses PyTorch as its DL library. To the best of our knowledge, Auto-Net 1.0 was the first automatically-tuned neural network to win competition datasets against human experts (as part of the first AutoML challenge). Further empirical results show that ensembling Auto-Net 1.0 with Auto-sklearn can perform better than either approach alone, and that Auto-Net 2.0 can perform better yet.
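The HyperBand component underlying BOHB races many candidate configurations on small budgets (e.g. few training epochs) and repeatedly discards the worst before spending more budget on the survivors. The following is a minimal sketch of that successive-halving idea only; the toy objective and the single `lr` hyperparameter are illustrative assumptions, not Auto-Net's actual code, which additionally replaces random sampling with a model-based (Bayesian) sampler:

```python
import random


def toy_objective(lr, budget, rng):
    """Toy 'validation loss': best near lr=0.1; larger budget -> less noise."""
    noise = rng.gauss(0.0, 0.05 / budget)
    return (lr - 0.1) ** 2 + noise


def successive_halving(n=27, min_budget=1, eta=3, seed=0):
    """Keep the best 1/eta of configurations at each rung, raising the budget."""
    rng = random.Random(seed)
    # Sample initial configurations (here just a learning rate in [0, 1]).
    configs = [{"lr": rng.uniform(0.0, 1.0)} for _ in range(n)]
    budget = min_budget
    while len(configs) > 1:
        # Evaluate every surviving configuration on the current budget.
        scored = [(toy_objective(c["lr"], budget, rng), c) for c in configs]
        scored.sort(key=lambda t: t[0])
        # Advance the top 1/eta fraction to the next, eta-times-larger budget.
        configs = [c for _, c in scored[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]
```

With `n=27` and `eta=3`, the loop evaluates 27 configurations at budget 1, 9 at budget 3, and 3 at budget 9 before returning a single winner; BOHB additionally fits a density model over the observed results to propose the next batch of configurations.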

Location
Deutsche Nationalbibliothek Frankfurt am Main
Extent
Online resource
Language
English
Notes
In: Hutter, Frank; Kotthoff, Lars; Vanschoren, Joaquin (eds.): Automated Machine Learning. Cham: Springer, 2019, pp. 135-149, ISBN: 978-3-030-05318-5

Event
Publication
(where)
Freiburg
(who)
University
(when)
2024
Creator
Mendoza, Hector
Klein, Aaron
Feurer, Matthias
Springenberg, Jost Tobias
Urban, Matthias
Burkart, Michael
Dippel, Maximilian
Lindauer, Marius
Hutter, Frank
Contributor
Maschinelles Lernen und Natürlichsprachliche Systeme, Professur Frank Hutter

DOI
10.1007/978-3-030-05318-5_7
URN
urn:nbn:de:bsz:25-freidok-1542403
Rights
Open Access; access to the object is unrestricted.
Last update
25.03.2025, 1:50 PM CET

Data provider

This object is provided by:
Deutsche Nationalbibliothek.

Associated

Time of origin

  • 2024
