News 📰
01/24 – Our work on how in-context learning in LLMs learns label relationships has been accepted to ICLR 2024.
09/23 – Work from my internship at Google on contrastive learning with pre-trained models has been accepted to NeurIPS 2023.
08/23 – New preprint on how in-context learning in large language models learns label relationships.
07/23 – Work from my internship at Google on contrastive learning with pre-trained models has been accepted at the ES-FoMo Workshop at ICML 2023.
07/23 – Work from my internship at DeepMind on Active Acquisition for Multimodal Temporal Data: A Challenging Decision-Making Task has been accepted for publication in Transactions on Machine Learning Research.
05/23 – New preprint from my time as a Student Researcher at Google: Three Towers: Flexible Contrastive Learning with Pretrained Image Models.
11/22 – Our work on Active Surrogate Estimators: An Active Learning Approach to Label-Efficient Model Evaluation has been accepted as an oral to NeurIPS 2022.
11/22 – Work from my internship at DeepMind on Active Acquisition for Multimodal Temporal Data: A Challenging Decision-Making Task has been accepted at the Foundation Models for Decision Making NeurIPS 2022 Workshop.
09/22 – I am excited to start working as a Student Researcher at Google Research in Zurich.
05/22 – Starting as Research Scientist Intern with DeepMind in London.
04/22 – I have received a ‘Highlighted Reviewer’ award for my reviewing at ICLR 2022.
04/22 – Invited talk at Frank Hutter’s group at the University of Freiburg on Non-Parametric Transformers.
02/22 – I have received a ‘Best Reviewers (Top 10%)’ award for my reviewing at AISTATS 2022.
11/21 – We are excited to give a lecture on Non-Parametric Transformers as part of the Stanford lecture course ‘CS25: Transformers United’.
10/21 – Excited to announce that Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning has been accepted for publication at NeurIPS 2021.
10/21 – Selected as reviewer for AISTATS 2022.
09/21 – Invited talk at Google Research on Non-Parametric Transformers.
08/21 – I have received a ‘Best Reviewers (Top 10%)’ award for my reviewing at ICML 2021.
08/21 – Invited talk at AI Campus Berlin on Non-Parametric Transformers.
07/21 – Invited talk at Cohere on Non-Parametric Transformers.
06/21 – Selected as reviewer for ICLR 2022.
06/21 – I have decided to release my 2020 MSc thesis Modelling Videos of Physically Interacting Objects.
06/21 – Very excited to announce a new preprint: Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning.
05/21 – Invited talk at AI Campus Berlin on Active Testing: Sample-Efficient Model Evaluation.
05/21 – Very excited to announce that Active Testing: Sample-Efficient Model Evaluation, my first paper at OATML Oxford, has been accepted for publication at ICML 2021.
04/21 – Selected as reviewer for NeurIPS 2021.
12/20 – Selected as reviewer for ICML 2021.
11/20 – Our YouTube series explaining AI and ML, created in cooperation with the Federal Agency for Civic Education, has launched.
10/20 – Starting my PhD at OATML Oxford with Yarin Gal and Tom Rainforth.
08/20 – Selected as reviewer for Object Representations for Learning and Reasoning Workshop @ NeurIPS 2020.
05/20 – Selected as reviewer for Object-Oriented Learning Workshop @ ICML 2020.
03/20 – Hands-On with AI workshop at the Digi-Konferenz, organized by the Federal Agency for Civic Education, in Frankfurt, Germany.
12/19 – Excited to announce that Structured object-aware physics prediction for video modeling and planning has been accepted to ICLR 2020.
11/19 – Science-Slam & AI workshop at Tegernsee Science Days.
10/19 – Our (German) book Wie Maschinen Lernen, which explains machine learning in layman’s terms with pretty images, has finally been published.