There are two very different notions of "modeling" in particle physics. In experimental physics, modeling refers to the use of Monte Carlo simulations in statistical analyses, for validating or excluding various hypotheses against experimental data. In theoretical physics, modeling refers to the mathematical construction of physical theories that are consistent with all known measurements and novel in their explanatory power.
Making discoveries in particle physics requires a combination of functional statistics, high-performance computing, and computer-assisted statistics (read: machine learning). The overarching goal of the Large Hadron Collider was to "prove" the existence of a special particle called the Higgs boson, whose existence had been predicted in 1964. Over the last 50 years, thousands of different theories were constructed to predict how this particle would manifest, each with its own unique set of predictions. Much of the work I performed as a member of the collaboration was dedicated to statistically quantifying the degree to which classes of theories could be excluded. The process is a bit like the cartoon below, except that the theory space and the data space are both extremely high dimensional. The Large Hadron Collider produces some 600 million particle collisions per second and records data at a rate of about 90 TB per hour; most of the raw collision stream is discarded in real time by triggers that act, in effect, as high-pass filters in energy.
![stats.png](https://static.wixstatic.com/media/2d680a_87421eb8efd941fb9476a04509963a14~mv2.png/v1/fill/w_663,h_278,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/stats.png)
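To give a flavor of what "excluding a theory" means in practice, here is a deliberately minimal toy, my own illustration rather than anything from the actual analysis: a single-bin counting experiment in which a hypothetical theory predicts s extra events on top of b background events, tested against data that happen to look background-like.

```python
import numpy as np
from scipy import stats

# Toy single-bin counting experiment.  All numbers are illustrative,
# not taken from any real analysis.
b = 100.0   # expected background events (null hypothesis)
s = 25.0    # extra events predicted by the theory under test

rng = np.random.default_rng(0)
n_obs = rng.poisson(b)  # pretend the data turned out background-like

# p-value under the signal+background hypothesis: the probability of
# observing a count this low (or lower) if the theory were true.
p_sb = stats.poisson.cdf(n_obs, b + s)

# Crude CLs-style correction: divide by the background-only p-value so
# that a downward background fluctuation cannot exclude a theory the
# experiment had no sensitivity to.
p_b = stats.poisson.cdf(n_obs, b)
cls = p_sb / p_b

print(f"observed n = {n_obs}, CLs = {cls:.3f}")
print("excluded at 95% CL" if cls < 0.05 else "not excludable yet")
```

In a real analysis the single bin becomes thousands of bins across many channels, and the single theory becomes a scan over a high-dimensional parameter space, but the basic logic is the same.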
Theories that closely resemble the null hypothesis require an enormous amount of data (~petabytes) to exclude. In a recent paper, my collaborators and I developed strategies for increasing the experimental sensitivity in one of the most notoriously difficult searches, the "compressed stop" scenario. The Bayesian methods we used to validate our strategies required simulating enormous quantities of data and building modular workflows to pipe it through the different stages of the process pictured below. Real data from the experiment were typically processed on CERN's global computing grid, while the simulation software was run on local batch farms.
With these pipelines we were able to make robust projections of the amount of data that would be needed for our methods to become sensitive enough to exclude these models. The plot below, taken from the paper, shows the required amount of data (y-axis) as a function of the most important model parameter (x-axis); two variants of the method (solid vs. dashed) are shown.
![tweedieplot.png](https://static.wixstatic.com/media/2d680a_6ce84f5fbfb34785891f2640c8ec3d3e~mv2.png/v1/fill/w_457,h_299,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/tweedieplot.png)
![process.png](https://static.wixstatic.com/media/2d680a_a22b149276844dce9728361bac0106dc~mv2.png/v1/fill/w_359,h_511,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/process.png)
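A projection like the plot above boils down to a simple question: for a given point in parameter space, how much data is needed before the expected signal stands out from the background at the 95% confidence level? The paper answers this with Bayesian pseudo-experiments run through the full pipeline; the sketch below swaps in the textbook s/√b significance approximation just to show the shape of the calculation. All rates and parameter values are invented for illustration.

```python
import numpy as np

# Illustrative rates (events per unit of integrated luminosity) as a
# function of a single model parameter.  Pure toy numbers.
def signal_rate(param):
    return 50.0 * np.exp(-param / 100.0)  # signal fades as the parameter grows

background_rate = 1000.0

# With yields s = sigma_s * L and b = sigma_b * L, the Gaussian
# significance s / sqrt(b) grows like sqrt(L).  Setting it equal to
# 1.645 (one-sided 95%) and solving for L gives the data required.
def required_luminosity(param, z_target=1.645):
    sigma_s = signal_rate(param)
    return z_target**2 * background_rate / sigma_s**2

for p in (50, 150, 300):
    print(f"param = {p:3d}: L required ~ {required_luminosity(p):9.1f} (arb. units)")
```

The real calculation replaces s/√b with pseudo-experiments and the Bayesian machinery described above, but the scaling intuition, significance growing like the square root of the data, carries over.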
During my time in graduate school, I was lucky enough to be on the ground floor for the discovery of the Higgs boson. Finding a Higgs was often described as trying to find a needle in a warehouse full of haystacks. One of the ways in which the Higgs was discovered was by observing its decay to two particles of light called photons, symbolized by the Greek letter γ. The production and decay of the Higgs are parameterized by a large number of features, such as the angle θ between the two outgoing photons. The central challenge for this task was to find a way to separate the photon pairs produced by Higgs decays from the vastly greater number of photon pairs produced by mundane background processes in the terabytes of data being written to tape.
![hgg.png](https://static.wixstatic.com/media/2d680a_2aea7a3afdad43a7848d8d33d0ce3fba~mv2.png/v1/fill/w_651,h_217,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/hgg.png)
Roughly 2,500 of these (background photon pairs) for every one of these (Higgs decays)
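For concreteness, the key discriminating variable in this search is the invariant mass of the photon pair, which for massless photons depends only on the two energies and the opening angle θ: m^2 = 2 E1 E2 (1 − cos θ). Genuine Higgs decays pile up near 125 GeV, while background pairs fall off smoothly. A minimal sketch (the function and numbers are my own):

```python
import numpy as np

def diphoton_mass(e1, e2, theta):
    """Invariant mass (GeV) of two massless photons with energies
    e1, e2 (GeV) separated by opening angle theta (radians):
    m^2 = 2 * e1 * e2 * (1 - cos(theta))."""
    return np.sqrt(2.0 * e1 * e2 * (1.0 - np.cos(theta)))

# Symmetric back-to-back decay of a 125 GeV particle at rest:
print(diphoton_mass(62.5, 62.5, np.pi))  # ~125.0
```

The discovery plot is essentially a histogram of this quantity over millions of photon pairs, with the Higgs appearing as a small bump on a falling background.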
Machine learning algorithms such as deep neural networks and boosted decision trees are typically used in particle physics to find the optimal parameters for filtering the massive data sets, maximizing the signal-to-noise ratio while minimizing bias. The training data can be generated with cutting-edge simulation software at reasonable strain on modern computing resources. However, these simulations introduce a small but irreducible amount of arbitrariness, which is required for computational tractability. For example, a minimum energy threshold must be imposed before an object can be counted as a real particle, because the distribution of low-energy particles is divergent. This introduces incalculable biases in the form of dozens of nuisance parameters, all of which must be integrated over during these analyses. To maximize the separation between the tiny Higgs signal and the huge background, I trained deep neural networks and boosted decision trees on terabytes of simulated data and evaluated their relative performance as Higgs classifiers. For the final internal study, we used a BDT on the large set of parameters characterizing each photon pair, and were able to enhance the statistical significance of the Higgs signal by roughly 0.5 standard deviations, significantly hastening its discovery. The figures below show the raw data alongside the data enhanced by the BDT weightings.
![HiggsMVA.png](https://static.wixstatic.com/media/2d680a_b24ba02a5bf647288bea3fc6f42c7bb0~mv2.png/v1/fill/w_374,h_327,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/HiggsMVA.png)
![discriminatehiggs.png](https://static.wixstatic.com/media/2d680a_201554f077e44904a97dd85dac55bd5e~mv2.png/v1/fill/w_382,h_350,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/discriminatehiggs.png)
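For a rough flavor of the classifier side, here is how one might train a gradient-boosted decision tree to separate signal-like from background-like photon pairs. The features, distributions, and numbers below are invented stand-ins (the real training used terabytes of fully simulated detector data), so this is a sketch of the technique rather than the actual analysis code.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 20_000

# Toy stand-ins for reconstructed diphoton features: leading and
# subleading photon transverse momenta plus the opening angle.  The
# signal is made slightly harder and more back-to-back than the
# background, purely to give the classifier something to learn.
def make_sample(n, is_signal):
    shift = 10.0 if is_signal else 0.0
    pt1 = rng.gamma(4.0, 12.0, n) + shift
    pt2 = rng.gamma(3.0, 10.0, n) + shift
    dtheta = rng.normal(2.6 if is_signal else 2.2, 0.4, n)
    return np.column_stack([pt1, pt2, dtheta])

X = np.vstack([make_sample(n, True), make_sample(n, False)])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_train, y_train)

scores = bdt.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out toys: {roc_auc_score(y_test, scores):.3f}")
```

In the actual analysis the BDT score was used to weight events rather than cut on them outright, which is what the enhanced histogram above reflects.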
For theorists, the discovery of the Higgs boson was not much of a reason to celebrate, as its existence had been predicted since 1964. The real goal for our community was to explain its origin, which remains an open question. Progress in fundamental physics has always come from observing inexplicable phenomena and hypothesizing mechanisms for them. Unfortunately, the Higgs boson appears to have arrived bereft of any clues about the microscopic dynamics that gave rise to it; particle physicists have long dubbed this "the nightmare scenario." The measured properties of the Higgs defy all of the hypotheses previously constructed to explain it, and the question marks in the figure below will remain a mystery for as long as this issue is unresolved.
![scales.png](https://static.wixstatic.com/media/2d680a_b65a4d7eba0147dab096dfd044a3f043~mv2.png/v1/fill/w_321,h_502,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/scales.png)
During my postdoctoral years, I became obsessed with trying to build a model of the microscopic dynamics that could give rise to such a peculiar-looking particle. In a recent paper, my collaborators and I constructed a viable explanation in which the Higgs appears as a composite particle, somewhat like the proton, which is a composite particle made of quarks; the basic picture of our framework is illustrated in the figure below. In many ways the Higgs boson appears to be very sequestered from the particles that make up our everyday reality, the particles of the Standard Model. It is the only known elementary particle with zero intrinsic angular momentum (spin), and it is neutral under most of the known forces of nature. Its interactions with the other known particles are thus extremely feeble, which is why it took so long to discover.
![hiddenhiggs.png](https://static.wixstatic.com/media/2d680a_cb7dfe2c88d74a6a99e9b7981e1109e4~mv2.png/v1/fill/w_508,h_160,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/hiddenhiggs.png)
The particles that make up the Higgs boson in our theory are totally disconnected from the rest of the Standard Model. This makes them an intriguing candidate for dark matter, which appears to dominate the matter content of the universe and is likewise well sequestered in its interactions with the Standard Model particles. We also conducted a detailed analysis of the ability of the Large Hadron Collider to discover such particles, and showed that they would have evaded detection up to this point but could be discovered at any moment. The plot below shows the range of parameters of this theory that have been excluded thus far.
![hiddenhiggslimits.png](https://static.wixstatic.com/media/2d680a_911794c731434594b99326a4a334225a~mv2.png/v1/fill/w_441,h_351,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/hiddenhiggslimits.png)