So I thought I would provide here a very short glossary of terms you are likely to hear, and which you might have a hard time understanding correctly. Let me see if I can do a decent job.
- gamma: a gamma-ray is a photon, i.e. a quantum of light. A very energetic one, to be sure: a photon only qualifies as a gamma ray if it carries significantly more energy than an X-ray, i.e. above a mega-electron-volt or so. The gammas we will be hearing about are those coming directly from a Higgs boson decay, and these have an energy of 62.3 GeV, equivalent to the kinetic energy of a mosquito traveling at 9 centimeters per second.
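Incidentally, the mosquito comparison is easy to check with a back-of-the-envelope conversion. In the sketch below the mosquito mass (about 2.5 milligrams) is an assumption of mine, picked as a typical value; only the photon energy comes from the text above.

```python
# Back-of-the-envelope check of the "mosquito at 9 cm/s" comparison.
E_GeV = 62.3                     # photon energy from the Higgs decay, as quoted above
E_J = E_GeV * 1e9 * 1.602e-19    # convert GeV to Joules
m_kg = 2.5e-6                    # assumed mosquito mass: 2.5 milligrams
v = (2 * E_J / m_kg) ** 0.5      # invert the kinetic energy formula E = m v^2 / 2
print("%.1f cm/s" % (v * 100))   # prints roughly 8.9 cm/s
```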
- VBF, or "vector-boson fusion": this is a process whereby the two protons (in the case of LHC collisions, but this is not specific to the VBF reaction) interact by both emitting a W, or a Z boson. The two bosons "fuse" - they annihilate - and a Higgs particle is thus created. The peculiarity of the VBF production of Higgs bosons is that from the protons, which both emitted the W or Z originating the process, emerge two energetic hadronic jets. The latter arise from the quarks which emitted the W or Z in the first place: they need to balance the momentum, so they are ejected from the parent proton (which then dissociates, since a proton does not remain stable if you pull a quark out of it); these VBF jets are typically emitted at small angle from the proton direction, but they are energetic enough that they constitute a nice "tag" of the VBF production process. VBF production of Higgs bosons is not the highest-rate production mechanism, but it is significant because of the distinctive feature of the two forward jets.
- gluon fusion: this is the most common way of producing Higgs bosons. Both protons emit a gluon, and the two gluons exchange a virtual pair of top quarks; the latter in turn annihilate, creating the Higgs boson. Note that the top quark is the one doing this the most, because it is the heaviest fermion, and the Higgs boson "couples" preferentially to heavy stuff. Also note that unlike the VBF process discussed above, gluon-gluon fusion does not give rise to energetic, forward jets. That is because gluons are massless, and the quarks that emitted them inside the protons do not typically recoil in the plane transverse to the beam axis.
- Associated production: Higgs bosons can be produced together with a W or Z boson, when the latter "radiates off" a Higgs. The resulting signatures are richer than in the gluon-fusion process, but the rate of these events is smaller. One typically looks for this process at low Higgs masses, when the rate is not too small; in that case one is bound to look for the Higgs decaying into two b-quark jets, because the gamma-gamma decay rate is several hundred times smaller.
- background model: we'll hear that discussed a lot in the ATLAS and CMS talks. Indeed, we are after a very small signal (particularly in the H->gamma-gamma decay mode) on top of a large background, which unfortunately is not easy to predict with simulations. So what the experiments do is to parameterize it with a simple function: a low-degree polynomial, or a sum of two exponentials, etcetera. This functional form of the "background model" is used together with a signal model (derived from simulation with much more confidence) to fit the data. Since the signal is small, the different background models will in general produce different results. Experimentalists thus need to be very careful in estimating how their a priori assumptions bias the result. We will probably also hear a lot about how the background models have been tested in pseudoexperiments, so see the next item too.
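As a toy illustration of what fitting the data with a background model plus a signal model looks like, here is a minimal sketch with an exponential background and a Gaussian signal peak in a binned mass spectrum. The shapes, yields, resolution and binning are assumptions of mine for the sake of the example, not the models actually used by the experiments.

```python
# Toy sketch: fit a binned "diphoton mass" spectrum with an exponential
# background model plus a Gaussian signal model. All shapes and numbers here
# are illustrative assumptions, not the actual ATLAS/CMS models.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# fake dataset: falling exponential background plus a small peak at 125 GeV
bkg = 100.0 + rng.exponential(scale=30.0, size=20000)   # masses in GeV
sig = rng.normal(loc=125.0, scale=1.5, size=200)        # assumed ~1.5 GeV resolution
counts, edges = np.histogram(np.concatenate([bkg, sig]), bins=np.linspace(100, 160, 61))
centers = 0.5 * (edges[:-1] + edges[1:])

def model(m, n_bkg, slope, n_sig, m_h, width):
    # expected events per 1 GeV bin: background shape plus signal shape
    bkg_pdf = np.exp(-(m - 100.0) / slope) / slope
    sig_pdf = np.exp(-0.5 * ((m - m_h) / width) ** 2) / (width * np.sqrt(2 * np.pi))
    return n_bkg * bkg_pdf + n_sig * sig_pdf

popt, pcov = curve_fit(model, centers, counts, p0=[20000, 30.0, 100, 125.0, 1.5],
                       sigma=np.sqrt(counts + 1.0))
print("fitted signal yield: %.0f +- %.0f events" % (popt[2], pcov[2, 2] ** 0.5))
```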
- pseudoexperiment: this is not how CMS members mockingly refer to the ATLAS detector, or vice-versa. Rather, a pseudoexperiment is a set of data derived from the random generation of events following a pre-defined model. To clarify, imagine you want to test whether your fitting procedure is sensitive enough to detect a Higgs decay to two gammas in the diphoton mass spectrum, when you assume that the background shape is a falling exponential distribution and the signal has the strength expected from Standard Model production. What you do is to randomly generate mass values distributed according to the background model, add the appropriate number of mass values distributed according to the signal model for a specific Higgs mass hypothesis, and fit the resulting mass distribution. You obtain a signal strength estimate, an uncertainty, etcetera, as in a real experiment. This is a pseudoexperiment: you can iterate the procedure as many times as you want, creating distributions of the relevant quantities expected for each mass hypothesis. This allows you to draw those fancy "brazil bands" in green and yellow which describe the expected result of your experiment.
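In the same toy setup sketched above, a pseudoexperiment loop might look like the following; all the numbers are again illustrative assumptions of mine.

```python
# Toy pseudoexperiment loop: repeatedly generate background(+signal) mass values,
# fit each toy dataset, and collect the fitted signal yields. Numbers are made up.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
edges = np.linspace(100, 160, 61)
centers = 0.5 * (edges[:-1] + edges[1:])

def model(m, n_bkg, slope, n_sig, width):
    bkg_pdf = np.exp(-(m - 100.0) / slope) / slope
    sig_pdf = np.exp(-0.5 * ((m - 125.0) / width) ** 2) / (width * np.sqrt(2 * np.pi))
    return n_bkg * bkg_pdf + n_sig * sig_pdf

def one_pseudoexperiment(n_signal=200):
    # generate a toy mass spectrum (Poisson-fluctuated yields), then fit it
    masses = 100.0 + rng.exponential(scale=30.0, size=rng.poisson(20000))
    if n_signal > 0:
        masses = np.concatenate([masses,
                                 rng.normal(125.0, 1.5, size=rng.poisson(n_signal))])
    counts, _ = np.histogram(masses, bins=edges)
    popt, _ = curve_fit(model, centers, counts, p0=[20000, 30.0, n_signal, 1.5],
                        sigma=np.sqrt(counts + 1.0))
    return popt[2]   # fitted signal yield in this toy

yields = [one_pseudoexperiment() for _ in range(200)]
print("mean fitted yield %.0f, spread %.0f" % (np.mean(yields), np.std(yields)))
```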
- p-value: beware! this is a very tricky thing. You will see plots of p-values and will need some insight to really understand what you are looking at. A p-value is the probability of obtaining data at least as "extreme" as the data you observe, if your null hypothesis is true. What is your null hypothesis? Well, it can be anything, but in our case it will be the absence of a Higgs boson of a pre-defined mass. "Extreme", instead, is quite tricky: it depends on what your "alternate hypothesis" of reference is, and on what kind of departure it would produce in the statistic derived from the data. In searching for a Higgs boson we will combine channels where event excesses are sought with various techniques: the combination is performed using a procedure based on a quantity called "CLs". You need not know what that is (it is very, very technical), but the distribution of the underlying test statistic will be different for the signal-plus-background and for the background-only hypotheses (the alternate and the null). So "extreme" will mean "departing from the typical values expected for the null hypothesis, toward the values expected from the alternate hypothesis".
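The CLs machinery itself is beyond the scope of this glossary, but the generic idea of a p-value as a tail fraction can be sketched very simply: take the distribution of your test statistic in background-only pseudoexperiments and count how often it is at least as signal-like as what the data gave. In the snippet below the test-statistic distribution is simply faked as a standard Gaussian; that, and the observed value, are pure illustration.

```python
# Generic illustration of a p-value: the fraction of background-only toys whose
# test statistic is at least as "signal-like" as the one observed in the data.
# The standard-normal distribution used here is a stand-in, not the real CLs machinery.
import numpy as np

rng = np.random.default_rng(2)
t_null = rng.normal(0.0, 1.0, size=1_000_000)  # test statistic in background-only toys
t_obs = 3.0                                    # pretend this is what the data gave

p_value = np.mean(t_null >= t_obs)             # "at least as extreme", toward the signal
print("p-value ~ %.4f" % p_value)              # close to 0.0013, i.e. a ~3 sigma excess
```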
- Connected with the p-value is the issue of probability inversion, which is the typical pitfall that 90% of outsiders and 40% of insiders fall into (and maybe I am being a bit too optimistic about my colleagues' understanding of statistics). Again, the p-value is the probability of obtaining data at least as extreme as the data observed, if the null hypothesis is true. This is a world apart from saying that it is the probability of the null hypothesis being true, given that you observed such extreme data! Beware! If your ability in the long jump puts you in the 99.99th percentile, that does not mean that you are a kangaroo, and neither can one infer that the probability that you belong to the human race is 0.01%.
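If a worked toy number helps: the sketch below puts made-up population sizes and jumping abilities behind the kangaroo analogy, just to show that the inverted probability is a completely different number from the p-value.

```python
# Made-up counting example for the probability-inversion pitfall.
# P(jump that far | human) being tiny does not determine P(human | jumped that far):
# the latter also depends on how many humans and kangaroos are around.
n_humans, n_kangaroos = 7_000_000_000, 50_000_000        # assumed population sizes
p_jump_if_human, p_jump_if_kangaroo = 1e-4, 1.0          # assumed jumping abilities

jumping_humans = n_humans * p_jump_if_human              # 700,000 exceptional humans
jumping_kangaroos = n_kangaroos * p_jump_if_kangaroo     # 50,000,000 kangaroos
p_human_given_jump = jumping_humans / (jumping_humans + jumping_kangaroos)
print("P(human | jumped that far) = %.3f" % p_human_given_jump)   # ~0.014, not 0.0001
```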
- But what is a "sigma" anyway ? A couple of colleagues suggested that I be more explicit on explaining this - I often give it for granted. A "sigma" is the unit of measurement of how discrepant is a result with respect to its expected value. The jargon comes from the parameter sigma of the Gaussian distribution. If we say that a result is "three sigma" away from expectations, we are saying that it is quite far from what we would get if the "null hypothesis" is correct. This does not allow one to say that the null hypothesis is false, of course - see the explanation above about probability inversion. However, the chance to obtain a result as discrepant as "one-sigma", "two-sigma", etc, can be read off from the following table (courtesy Andre David):
z = 1 sigma: p (by chance) ~ 16%, or once every 6 times.
z = 2 sigma: p (by chance) ~ 2.3%, or once every 44 times.
z = 3 sigma: p (by chance) ~ 0.13%, or once every 741 times. <= what we call "evidence" level.
z = 4 sigma: p (by chance) ~ 0.0032%, or once every 31,574 times.
z = 5 sigma: p (by chance) ~ 0.000028%, or once every 3,486,914 times. <= what we call "discovery" level.
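These values are just the area of the standard Gaussian beyond z, so they are easy to reproduce; a minimal check:

```python
# Reproduce the sigma-to-probability table above: one-sided Gaussian tail areas.
from scipy.stats import norm

for z in range(1, 6):
    p = norm.sf(z)   # survival function: probability of fluctuating up by >= z sigma
    print("z = %d sigma: p ~ %.6f%%, or once every %.0f times" % (z, 100 * p, 1 / p))
```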