Speaker
Mr Joschka Birk (University of Hamburg)
Description
Foundation models are multi-dataset and multi-task machine learning methods that, once pre-trained, can be fine-tuned for a wide variety of downstream applications.
We introduce OmniJet-α, a Transformer-based model designed for tokenized particle jets, which makes notable advances on two fronts.
Firstly, we present extensive studies of the encoding quality of our jet tokenization, i.e. how faithfully the discrete tokens preserve the information in the original particle jets (a generic tokenization sketch follows below).
Secondly, we demonstrate the first successful transfer learning between unsupervised jet generation (the pre-training task) and supervised jet tagging (the fine-tuning task), marking a significant step towards foundation models for particle physics (see the second sketch below).
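
The abstract does not specify how the jets are tokenized. One common approach, and a plausible reading of "tokenized particle jets", is vector quantization: each jet constituent is encoded and snapped to the nearest entry of a learned codebook, and the codebook index becomes the token. The PyTorch sketch below illustrates that idea only; the class name, feature choice (pT, eta, phi), and all sizes are illustrative assumptions, not OmniJet-α's actual configuration.

    import torch

    class JetTokenizer(torch.nn.Module):
        """Generic vector-quantization tokenizer for jet constituents.

        Hypothetical sketch: encode each constituent (assumed here to be
        pT, eta, phi) and map it to the index of the nearest vector in a
        learned codebook. Not the actual OmniJet-alpha tokenizer.
        """

        def __init__(self, n_features=3, dim=32, codebook_size=512):
            super().__init__()
            self.encoder = torch.nn.Sequential(
                torch.nn.Linear(n_features, dim),
                torch.nn.ReLU(),
                torch.nn.Linear(dim, dim),
            )
            self.codebook = torch.nn.Embedding(codebook_size, dim)

        def forward(self, constituents):    # (batch, n_constituents, n_features)
            z = self.encoder(constituents)  # (batch, n_constituents, dim)
            # Squared distance from each embedding to every codebook vector.
            d = (z.unsqueeze(-2) - self.codebook.weight).pow(2).sum(-1)
            return d.argmin(dim=-1)         # token ids, (batch, n_constituents)

    jets = torch.randn(8, 60, 3)       # 8 toy jets with 60 constituents each
    tokens = JetTokenizer()(jets)      # -> (8, 60) tensor of integer token ids

Studying "encoding quality" then amounts to checking how much physics information (e.g. jet kinematics and substructure) survives the round trip through these discrete tokens.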
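
The generation-to-tagging transfer can likewise be pictured as pre-training a causal Transformer to predict the next token of a jet, then replacing the generative output head with a small classification head while reusing the backbone weights. Again a hedged sketch: the class names, pooling choice, and all dimensions are assumptions for illustration, not the published architecture.

    import torch

    class JetBackbone(torch.nn.Module):
        """Tiny GPT-style Transformer over jet token sequences (illustrative)."""

        def __init__(self, vocab_size=512, dim=64, n_layers=2, n_heads=4):
            super().__init__()
            self.embed = torch.nn.Embedding(vocab_size, dim)
            layer = torch.nn.TransformerEncoderLayer(
                d_model=dim, nhead=n_heads, batch_first=True)
            self.blocks = torch.nn.TransformerEncoder(layer, num_layers=n_layers)

        def forward(self, tokens):  # (batch, seq_len) of token ids
            causal = torch.nn.Transformer.generate_square_subsequent_mask(
                tokens.size(1))
            return self.blocks(self.embed(tokens), mask=causal)

    backbone = JetBackbone()
    gen_head = torch.nn.Linear(64, 512)  # pre-training: next-token logits
    tag_head = torch.nn.Linear(64, 2)    # fine-tuning: e.g. signal vs background

    tokens = torch.randint(0, 512, (8, 60))
    h = backbone(tokens)                    # shared representation, (8, 60, 64)
    next_token_logits = gen_head(h)         # unsupervised generative objective
    class_logits = tag_head(h.mean(dim=1))  # supervised tagging, pooled features

Transfer learning here means the backbone weights learned under the generative objective are kept when training the classification head, so the tagger starts from a physics-aware representation rather than a random initialization.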