Seminar: AutoML in the Age of Large Pre-trained Models

Have you heard about AutoML? Do you wonder what LLMs can do for AutoML and vice versa? Then this seminar is for you!

TL;DR: AutoML aims to support ML users (and ML researchers) by automating parts of the ML workflow. In this seminar, we will read recent research papers in the field of AutoML, with a focus on methods for and with LLMs and pre-trained models. For a detailed overview, see this paper.
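To make "automating parts of the ML workflow" concrete, here is a minimal sketch of one such part, automated hyperparameter search; the dataset, model, and search space are illustrative assumptions and not part of the seminar material.

```python
# Minimal AutoML-flavoured example: automated hyperparameter search
# (illustration only; dataset, model, and search space are assumptions).
from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Instead of hand-tuning C and gamma, let the search find good values.
search = RandomizedSearchCV(
    SVC(),
    param_distributions={
        "C": loguniform(1e-2, 1e2),
        "gamma": loguniform(1e-4, 1e0),
    },
    n_iter=20,       # number of sampled configurations
    cv=5,            # 5-fold cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```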

Course Title: AutoML in the Age of LLMs and Pre-trained Models
Course ID: ML4501f
Registration: ILIAS
ECTS: 3
Time: Tuesdays, 12:15-13:45
Language: English
Participants: up to 12
Location: in person at Maria-von-Linden-Straße 6, lecture hall, ground floor

Why should you attend this seminar?

Besides practicing your scientific communication skills, you will also get an overview of recent research at the intersection of AutoML and large pre-trained models.

Requirements

We strongly recommend that you know the foundations of machine learning and deep learning, including modern neural architectures such as transformers. To get the most out of this seminar, you should ideally also have some hands-on experience applying ML.

Topics

The seminar focuses on understanding the underlying concepts of modern AutoML methods. Since this field is ever-growing, we will focus this semester on only a few topics, as stated above.

Here is a tentative meeting schedule and a tentative paper list:

Date        Content
16.04.2024  Intro: Organization
23.04.2024  Intro: Bayesian Optimization
30.04.2024  Intro: How to give a good presentation / TBA
07.05.2024  break
14.05.2024  break
21.05.2024  break
28.05.2024  Bayesian Optimization (OptFormer)
04.06.2024  break
11.06.2024  Tabular Data (TabPFN; CAAFE)
18.06.2024  Data Science (MLAgent)
25.06.2024  break
02.07.2024  break
09.07.2024  Neural Architecture Search (GPT4NAS; GPT-NAS)
16.07.2024  Model Selection (Bandits4LLMs)
23.07.2024  no meeting

  1. [OptFormer] Chen et al. Towards Learning Universal Hyperparameter Optimizers with Transformers (NeurIPS'22)
  2. [TabPFN] Hollmann et al. TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second (ICLR'23)
  3. [CAAFE] Hollmann et al. Large Language Models for Automated Data Science: Introducing CAAFE for Context-Aware Automated Feature Engineering (NeurIPS'23)
  4. [MLAgent] Huang et al. Benchmarking Large Language Models as AI Research Agents (arXiv'23)
  5. [GPT4NAS] Zheng et al. Can GPT-4 Perform Neural Architecture Search? (arXiv'23)
  6. [GPT-NAS] Yu et al. GPT-NAS: Evolutionary Neural Architecture Search with the Generative Pre-Trained Model (arXiv'23)
  7. [Bandits4LLMs] Xia et al. Which LLM to Play? Convergence-Aware Online Model Selection with Time-Increasing Bandits (WWW'24)

What will the seminar look like?

We will meet each week (with a few exceptions). In the first few weeks, we will start with introductory lectures on automated machine learning, Bayesian optimization, and neural architecture search, and on how to critically review and present research papers. After that, each weekly meeting will consist of presentations followed by a discussion.
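For a first intuition of the Bayesian optimization material: fit a probabilistic surrogate to the configurations evaluated so far, pick the next configuration by maximizing an acquisition function, evaluate it, and repeat. The sketch below is a deliberately simplified illustration; the objective, search space, and acquisition rule are assumptions for demonstration, not the methods covered in the papers.

```python
# Simplified Bayesian optimization loop (illustration only; the
# objective, search space, and acquisition rule are assumptions).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(x):
    # Stand-in for an expensive evaluation, e.g. training an ML model
    # with hyperparameter x and returning its validation error.
    return np.sin(3 * x) + 0.1 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(3, 1))           # a few initial configurations
y = np.array([objective(x[0]) for x in X])

candidates = np.linspace(-3, 3, 200).reshape(-1, 1)
for _ in range(10):
    gp = GaussianProcessRegressor().fit(X, y)  # surrogate model
    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.min()
    # Expected improvement (for minimization) as the acquisition function.
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best configuration:", X[np.argmin(y)], "value:", y.min())
```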

Other important information

Registration: Please register on ILIAS. The number of participants is limited, and registration opens on March 29th at noon. There will be a waiting list (please unregister to let other people take your place). Please come to the first session even if you are still on the waiting list; if you are enrolled and do not show up, your spot will be freed for someone on the waiting list.

Grading/Presentations: Grades will be based on your presentation, your slides, your active participation, and a short report. Further details will be discussed in the intro session.