When particle physicists search for the basic building blocks of the universe, for example the elusive dark matter that holds galaxies together, they face serious computational bottlenecks. To test realistic new models against the wealth of data available today, from astrophysical observations to collider experiments at the Large Hadron Collider, the large parameter spaces of the models must be explored in detail. This means repeated numerical evaluation of precision theoretical predictions of each model at different parameter values, possibly millions to billions of evaluations in total for a single model, depending on its number of parameters. These predictions are computationally very expensive, and today testing anything beyond the simplest models to high fidelity is infeasible, even with supercomputing resources.
This project aims to remove such computational bottlenecks through an interdisciplinary collaboration between particle physicists and statisticians. To achieve its goals, the project will follow several research strategies: it will construct modern machine learning tools for performing precision calculations in quantum field theory and evaluating their uncertainties, and it will develop a software framework for their use in high-energy physics. To reduce the number of reevaluations needed when exploring new physics models, the project will develop sophisticated new statistical techniques to reliably extract the best-fit parameter regions of models and make model comparisons possible within today's computational limitations. Finally, the project will use these developments to explore five new physics models that are particularly interesting as solutions to some of the major open problems in high-energy physics, for example the nature of dark matter.
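One way such statistical techniques can cut down the number of expensive reevaluations is active learning: a cheap surrogate proposes the next parameter point to evaluate, balancing the emulated fit quality against the surrogate's own uncertainty, so the costly calculation is only run where it is most informative. Below is a minimal sketch; the toy log-likelihood, the nearest-neighbour surrogate, and the upper-confidence-bound rule are all illustrative stand-ins and not the project's actual methods.

```python
import random

def expensive_loglike(theta):
    # Stand-in for a costly likelihood evaluation (hypothetical toy model):
    # a narrow Gaussian peak at theta = 0.3.
    return -0.5 * ((theta - 0.3) / 0.05) ** 2

def scan(n_init=5, n_iter=20, kappa=2.0, seed=1):
    """Active-learning scan of a 1D parameter space on [0, 1]."""
    rng = random.Random(seed)
    evaluated = {}
    # Seed the surrogate with a handful of random expensive evaluations.
    for _ in range(n_init):
        t = rng.uniform(0.0, 1.0)
        evaluated[t] = expensive_loglike(t)
    candidates = [i / 200.0 for i in range(201)]  # cheap candidate grid
    for _ in range(n_iter):
        def acquisition(c):
            nearest = min(evaluated, key=lambda t: abs(t - c))
            surrogate = evaluated[nearest]          # nearest-neighbour emulator
            uncertainty = abs(nearest - c)          # crude distance-based error bar
            return surrogate + kappa * uncertainty  # upper-confidence-bound rule
        # Evaluate the expensive likelihood only at the most promising point.
        best_c = max(candidates, key=acquisition)
        evaluated[best_c] = expensive_loglike(best_c)
    best_theta = max(evaluated, key=evaluated.get)
    return best_theta, len(evaluated)
```

With this rule the scan homes in on the best-fit region in a few dozen expensive evaluations, instead of the thousands a uniform grid of the same resolution would require.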

When exploring the smallest fundamental constituents of the Universe, physicists face serious calculational bottlenecks. To compare new physics models to data, for example from the Large Hadron Collider or astrophysical observations, we must perform very expensive calculations in quantum field theory (QFT), made costly by the increasing complexity of higher-order quantum corrections. Today these are too slow to perform at the necessary precision for all but the simplest models. At the same time, the interpretation of the models given the available data, that is, finding the best-fit regions of their parameter spaces and comparing different models through their goodness-of-fit, is computationally intractable due to the sheer size of the parameter spaces and the cost of each likelihood evaluation.
The solution to these inherent problems cannot be found in physics alone. This project builds on an interdisciplinary collaboration between physicists and statisticians focused on statistical learning and inference problems in high-energy physics. The project will develop machine-learning-based regression techniques to speed up QFT calculations with a proper probabilistic interpretation of the uncertainties from higher-order contributions, it will develop a continual learning framework for faster emulation of the likelihood for model parameters, and it will investigate new, improved statistical approaches to the problems of best fit and goodness of fit using these emulations.
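A standard regression technique with the kind of built-in probabilistic uncertainty described here is Gaussian process regression, which returns both a mean prediction and a variance at any point in parameter space. The following is a minimal one-dimensional sketch in pure Python; the kernel choice, length scale, and function names are illustrative assumptions, not the project's implementation.

```python
import math

def rbf(x1, x2, ell=1.0):
    """Squared-exponential (RBF) kernel: correlation decays with distance."""
    return math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, ell=1.0, noise=1e-8):
    """Gaussian process mean and variance at x_star, trained on (xs, ys)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ell) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)                       # K^{-1} y
    k_star = [rbf(x, x_star, ell) for x in xs]
    mean = sum(k * a for k, a in zip(k_star, alpha))
    v = solve(K, k_star)                       # K^{-1} k_*
    var = rbf(x_star, x_star, ell) - sum(k * w for k, w in zip(k_star, v))
    return mean, max(var, 0.0)
```

At a training point the returned variance is near zero, while between points it grows; an emulator built this way can thus flag the regions of parameter space where the expensive QFT calculation genuinely needs to be rerun.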
Finally, the project will investigate a number of promising new physics models that can answer questions such as: What is dark matter? Do the properties of the Higgs boson indicate that the Universe is fundamentally unstable? Why is there more matter than antimatter in the Universe? If funded, this project will increase the reach of large and costly experiments in answering these questions, at a relatively small additional cost.