Description
In high energy physics, most machine learning models are still built for narrow, task-specific analyses, which makes it hard to scale across the immense space of new physics possibilities. In this talk, I’ll introduce EveNet, a foundation model we developed to change that paradigm.
EveNet is a transformer trained on 500 million simulated Standard Model events, combining self-supervised and supervised objectives to learn a common event representation. With only light fine-tuning, it handles very different analysis tasks effectively. Tested on CMS Open Data, it matches, and often outperforms, models trained specifically for those individual tasks. More interestingly, EveNet generalizes well to beyond-Standard-Model signals and real collision data, suggesting it is learning the underlying physical structure of the events themselves.
I’ll discuss what this means for the future of collider studies: how a single pretrained network might streamline hundreds of analyses and move us closer to a genuinely universal, data-driven approach to event understanding.