Theoretical High Energy Particle Physics Group

High-precision simulations based on first principles are a cornerstone of the LHC physics programme. As we approach the high-luminosity phase of the LHC, however, the demand for both accuracy and speed is pushing traditional simulation pipelines to their limits. This motivates a broader shift towards modern computing paradigms: machine learning for more efficient numerical evaluations, and hardware-aware implementations for scalable deployment. After introducing the basic structure of the Monte Carlo simulation chain and the relevant machine-learning concepts, I will present recent progress along three complementary directions: neural importance sampling as implemented in the MadNIS framework; machine-learned surrogate models for expensive amplitude calculations; and GPU-based implementations designed for large-scale event generation. Taken together, these developments pave the way towards a new generation of LHC simulation tools — faster, smarter, and cooler.

Further information

Time:

15 May 2026
16:00 to 17:00

Venue:

MR19 (Potter Room, Pavilion B), CMS

Speaker:

Ramon Winterhalder (University of Milan)

Series:

HEP phenomenology joint Cavendish-DAMTP seminar