Seminar: Automated Machine Learning and Hyperparameter Optimization
Have you ever trained an ML model and wondered how to efficiently optimize its hyperparameters? Then this seminar is for you!
TL;DR: AutoML aims to support ML users (and ML researchers) by automating parts of the ML workflow. In this seminar, we will read classic as well as recent research papers in the field of AutoML, with a focus on Bayesian optimization and AutoML systems for tabular data.
Course Title | Automated Machine Learning and Hyperparameter Optimization |
---|---|
Course ID | ML4501f |
Registration | ILIAS |
ECTS | 3 |
Time | Wednesdays, 12 c.t. to 14 |
Language | English |
#participants | up to 20 |
Location | in-person at Maria-von-Linden-Straße 6; mostly lecture hall ground floor |
Why should you attend this seminar?
Besides practicing your scientific communication skills, you will also:
- learn about key contributions in the field of AutoML
- be able to discuss recent research on hyperparameter optimization and AutoML systems
- gain experience in reading, understanding, and presenting research papers
Requirements
We strongly recommend that you know the foundations of machine learning and deep learning. Ideally, you also have some experience in applying ML to get the most out of this seminar.
Topics
The seminar focuses on understanding the underlying concepts of modern AutoML methods. Since the field is growing rapidly, this semester we will concentrate on a few selected topics.
Here is a tentative meeting schedule:
Date | Content |
---|---|
18.10 | Initial Meeting; see slides |
25.10; seminar room 3.OG | How to give a good presentation / Bayesian Optimization for HPO |
01.11 | No meeting |
08.11 | HPO I (#1, #2) |
15.11 | No meeting |
22.11 | HPO I (#3, #4) |
29.11 | No meeting |
06.12 | AutoML for Tabular (#6, #7) |
13.12 | AutoML for Tabular (#8); HPO II (#9) |
20.12 | HPO II (#10, #11) |
27.12-03.01 | No meeting; holiday |
10.01 | Intro: NAS |
17.01 | NAS (#12, #13) |
24.01 | NAS (#14); AutoML for Tabular (#5) |
31.01 | No meeting |
07.02 | NAS (#15, #16) |
Paper List:
HPO I
- #1: Practical Bayesian Optimization of Machine Learning Algorithms. Jasper Snoek, Hugo Larochelle, Ryan P. Adams; NeurIPS 2012
- #2: BOHB: Robust and Efficient Hyperparameter Optimization at Scale. Stefan Falkner, Aaron Klein, Frank Hutter; ICML 2018
- #3: Scalable Global Optimization via Local Bayesian Optimization. David Eriksson, Michael Pearce, Jacob Gardner, Ryan D. Turner, Matthias Poloczek; NeurIPS 2019
- #4: PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning. Neeratyoy Mallik, Edward Bergman, Carl Hvarfner, Danny Stoll, Maciej Janowski, Marius Lindauer, Luigi Nardi, Frank Hutter; NeurIPS 2023
AutoML for Tabular Data
- #5: AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data. Nick Erickson, Jonas Mueller, Alexander Shirkov, Hang Zhang, Pedro Larroy, Mu Li, Alexander Smola; arXiv 2020
- #6: Revisiting Deep Learning Models for Tabular Data. Yury Gorishniy, Ivan Rubachev, Valentin Khrulkov, Artem Babenko; NeurIPS 2021
- #7: Why do tree-based models still outperform deep learning on tabular data? Léo Grinsztajn, Edouard Oyallon, Gaël Varoquaux; NeurIPS 2022
- #8: TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second. Noah Hollmann, Samuel Müller, Katharina Eggensperger, Frank Hutter; ICLR 2023
HPO II
- #9: HEBO: Pushing The Limits of Sample-Efficient Hyper-parameter Optimisation. Alexander I. Cowen-Rivers, Wenlong Lyu, Rasul Tutunov, Zhi Wang, Antoine Grosnit, Ryan Rhys Griffiths, Alexandre Max Maraval, Hao Jianye, Jun Wang, Jan Peters, Haitham Bou-Ammar; JAIR 2022
- #10: Towards Learning Universal Hyperparameter Optimizers with Transformers. Yutian Chen, Xingyou Song, Chansoo Lee, Zi Wang, Richard Zhang, David Dohan, Kazuya Kawakami, Greg Kochanski, Arnaud Doucet, Marc’Aurelio Ranzato, Sagi Perel, Nando de Freitas; NeurIPS 2022
- #11: PFNs4BO: In-Context Learning for Bayesian Optimization. Samuel Müller, Matthias Feurer, Noah Hollmann, Frank Hutter; ICML 2023
NAS
- #12: DARTS: Differentiable Architecture Search. Hanxiao Liu, Karen Simonyan, Yiming Yang; ICLR 2019
- #13: Understanding and Simplifying One-Shot Architecture Search. Gabriel Bender, Pieter-Jan Kindermans, Barret Zoph, Vijay Vasudevan, Quoc Le; ICML 2018
- #14: HAT: Hardware-Aware Transformers for Efficient Natural Language Processing. Hanrui Wang, Zhanghao Wu, Zhijian Liu, Han Cai, Ligeng Zhu, Chuang Gan, Song Han; ACL 2020
- #15: Neural Architecture Search without Training. Joe Mellor, Jack Turner, Amos Storkey, Elliot J. Crowley; ICML 2021
- #16: Zero-Cost Proxies for Lightweight NAS. Mohamed S. Abdelfattah, Abhinav Mehrotra, Łukasz Dudziak, Nicholas Donald Lane; ICLR 2021
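To give a flavor of the multi-fidelity ideas behind papers like BOHB (#2) and PriorBand (#4), here is a minimal, self-contained sketch of successive halving on a toy objective. Everything here (the `toy_loss` objective, the noise model, the parameter names) is illustrative only and not taken from any of the papers above.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Successive halving (the multi-fidelity core of Hyperband/BOHB):
    evaluate all configs on a small budget, keep the best 1/eta fraction,
    and repeat with an eta-times larger budget until one config remains."""
    budget = min_budget
    while len(configs) > 1:
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[: max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

# Toy objective: a noisy "validation loss" of a learning rate; a larger
# budget (e.g. more training epochs) yields a less noisy evaluation.
def toy_loss(lr, budget):
    return (lr - 0.1) ** 2 + random.gauss(0, 0.5 / budget)

random.seed(0)
candidates = [random.uniform(0.0, 1.0) for _ in range(27)]
best = successive_halving(candidates, toy_loss)
print(f"best learning rate found: {best:.3f}")
```

With 27 candidates and eta=3, the loop runs three rounds (27 → 9 → 3 → 1), spending most of the total budget on the configurations that survive early, cheap screening.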
What will the seminar look like?
We will meet each week (with a few exceptions). In the first two weeks, we will start with introductory lectures on automated machine learning, empirical experimentation, and how to critically review and present research papers. After that, each week we will have presentations followed by discussions.
Other Important information
Registration: Please register on ILIAS. The number of participants is limited; registration opens at noon on September 29th. There will be a waiting list (please unregister to let other people take your place). Please come to the first meeting even if you are still on the waiting list. If you are enrolled and do not show up, your spot will be freed for someone on the waiting list.
Grading/Presentations: Grades will be based on your presentation, your slides, and your active participation in the seminar. Further details will be discussed in the intro sessions.