Seminar: Adaptation and Fine-Tuning of Foundation Models

Foundation models are pretrained once on broad and heterogeneous data distributions; however, real-world deployment typically requires reliable performance in narrow, specialized domains. This seminar will discuss fundamental and recent approaches to address the following question: How do we specialize foundation models to a target domain under computational and data constraints?

To explore this question, we will read and discuss both foundational and state-of-the-art methods for systematically adapting and fine-tuning large models, including parameter-efficient fine-tuning, prompt engineering, test-time adaptation, context augmentation, and more.

For an overview, see this paper: Zhang et al. Parameter-Efficient Fine-Tuning for Foundation Models (arXiv '25)

Course Title: Adaptation and Fine-Tuning of Foundation Models
Course ID: INF-MSc-102
Registration: drop me an email
ECTS: 4
Time: [tentative] Wednesdays, 10:15-11:45
Language: English
#Participants: max 10
Location: in-person, JvF25, seminar room 4th floor
Organized by: Katharina Eggensperger w/ Amir Rezaei Balef, Mykhailo Koshil

Requirements

Familiarity with foundations of deep learning, including transformer architectures and in-context learning.

Topics

| Date | Content |
|------------|----------|
| 22.04.2026 | Intro I |
| 29.04.2026 | Intro II |
| tba | Sessions |

A non-exhaustive paper pool for the seminar:

Parameter-Efficient Fine-Tuning

Prompt Engineering

Reinforcement Learning with Human Feedback

Test-Time Adaptation

What will the seminar look like?

We will meet regularly throughout the semester. In the first few weeks, we will start with introductory lectures on adapting foundation models and on how to critically review and present research papers. After that, we will have several sessions with presentations, each followed by a discussion. The seminar ends with a concluding session.

Other Important Information

Grading/Presentations: Grades will be based on your presentation, slides, active participation, and a short report. Further details will be discussed in the introductory sessions.