
Why Some Particles Don’t Decay


Most particles do not “live” forever. The neutron, a neutral particle found in the nucleus of every element except ordinary hydrogen, has an average lifetime of 882 seconds (just under a quarter of an hour) when outside the nucleus. The proton has a very long lifetime and has never been observed to decay, but as Murray Gell-Mann said, “Everything not forbidden is compulsory”. Because nothing we know of forbids proton decay, we expect that the proton must (eventually) decay. (Current thinking is that the lower bound on the proton’s half-life, the shortest it could possibly be, is at least six billion trillion trillion years, much, much longer than the age of the Universe.)

But some particles, such as the electron, the electron neutrino, and the photon, do live forever. Why is this?

When a particle decays there are a number of quantities that must be conserved, i.e. they must be the same before and after the decay. Mass-energy is conserved, so the amount of mass and energy before and after the decay must be the same. Charge is conserved, so the total charge before and after the decay must be the same.

If we take the electron as an example: it is the lightest of the charged particles, with a mass about 1/1836 that of a proton. When a particle decays it must decay into lighter particles (otherwise where would the extra mass come from?) and it must obey the conservation rules explained above. But here we have a problem: there are particles lighter than the electron (e.g. the electron neutrino) and there are particles with the same charge as the electron (e.g. the muon), but there are no particles that are both lighter than the electron and charged. Because the decay of an electron would violate charge conservation it is forbidden: there is nothing the electron could decay into, and so the decay never occurs.
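The argument can be sketched as a simple lookup. This is only an illustration (the particle table and helper function below are my own, with approximate masses, and it simplifies the real rule, since a decay can share charge among several products), but it shows why the electron is stuck while the muon is not:

```python
# Illustrative sketch of the conservation-law argument: a particle can only
# decay into lighter particles, and the decay products must carry the same
# total charge. Masses in MeV/c^2 (approximate); charges in units of e.
particles = {
    "electron": {"mass": 0.511, "charge": -1},
    "muon": {"mass": 105.7, "charge": -1},
    "electron neutrino": {"mass": 1e-7, "charge": 0},  # tiny but nonzero
    "photon": {"mass": 0.0, "charge": 0},
    "proton": {"mass": 938.3, "charge": +1},
}

def lighter_same_charge(name):
    """Return particles lighter than `name` that carry the same charge."""
    m, q = particles[name]["mass"], particles[name]["charge"]
    return [p for p, d in particles.items()
            if d["mass"] < m and d["charge"] == q]

print(lighter_same_charge("electron"))  # -> [] : nothing it could decay into
print(lighter_same_charge("muon"))      # -> ['electron'] : decay is allowed
```

Run against this (highly abbreviated) table, the electron has no candidates at all, while the muon can, and does, decay towards the electron.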

What is “Five Sigma” Data?

or “Why do some experiments take such a long time to run?”

Before you go any further, watch the first minute of this video of Professor Andrei Linde learning from Assistant Professor Chao-Lin Kuo of the BICEP2 collaboration that his life’s work on inflationary theory has been shown by experiment to be correct.

The line we’re interested in is this one from Professor Kuo:

“It’s five sigma at point-two … Five sigma, clear as day, r of point-two”

You can see, from Linde’s reaction and the reaction of his wife, that this is good news.

The “r of point-two” (i.e. r = 0.2) bit is not the important thing here. It refers to something called the tensor-to-scalar ratio, r, which measures the relative size of the fluctuations in the polarisation of the cosmic microwave background caused by gravitational waves (the tensor component) and those caused by density waves (the scalar component).

The bit we’re interested in is the “five sigma” part. Scientific data, particularly in particle physics and astronomy, is often referred to as being “five sigma”, but what does this mean?

Imagine that we threw two non-biased six-sided dice twenty thousand times, adding the two scores together each time. We would expect to find that seven was the most common value, coming up one-sixth of the time (3333 times) and that two and twelve were the least common values, coming up one thirty-sixth of the time (556 times each). The average value of the two dice would be 7.00, and the standard deviation (roughly speaking, the typical distance between each value and the average) would be 2.42.

I ran this simulation in Microsoft Excel. The average was 6.996 and the standard deviation (referred to as sigma, σ) was 2.42. This suggests that there is nothing wrong with my data: the difference between my average and the expected average was only 0.004, or about 0.0017 of a standard deviation, and a deviation at least that large would be expected almost every time from random variation alone.
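The same simulation can be run in a few lines of Python rather than Excel. This sketch (with a fixed random seed of my choosing, so the run is reproducible) just reproduces the mean and standard deviation quoted above:

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

# 20,000 throws of two fair six-sided dice, summed each time
throws = [random.randint(1, 6) + random.randint(1, 6)
          for _ in range(20000)]

mean = statistics.mean(throws)
sigma = statistics.pstdev(throws)  # population standard deviation

print(f"mean  = {mean:.3f}")   # close to the expected 7.00
print(f"sigma = {sigma:.3f}")  # close to the expected 2.42
```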


Now imagine that we have a situation in which we think our dice are “loaded” – they always come up showing a six. If we repeated our 20000 throws with these dice the average value would obviously be 12.0, which is out from our expected average by 5.00, or 2.07 standard deviations (2.07σ). This would seem to be very good evidence that there is something seriously wrong with our dice, but a 2.07σ result isn’t good enough for physicists. At a confidence level of 2.07σ there is still a 1.92%, or 1 in 52, chance that our result is a fluke.

In order to show that our result is definitely not a fluke, we need to collect more data. Throwing the same dice more times won’t help, because the roll of each pair is independent of the previous one, but throwing more dice will help.

If we threw twenty dice the same 20000 times then the expected average total score would be 70, and the standard deviation should be 7.64. If the dice were loaded then the actual average score would be 120, making our result out by 6.55σ, which is equivalent to a chance of only 1 in 33.9 billion that our result was a fluke and that actually our dice are fair after all. Another way of thinking about this is that we’d have to carry out our experiment 33.9 billion times for the data we’ve obtained to show up just once by chance.
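Both loaded-dice figures come from the same arithmetic: a single fair die has mean 3.5 and variance 35/12, so a sum of n dice has mean 3.5n and standard deviation √(n × 35/12). A quick check, with a helper function of my own:

```python
import math

def dice_stats(n_dice):
    """Mean, standard deviation, and the z-score of all-sixes for n fair dice."""
    mean = 3.5 * n_dice                  # each die averages 3.5
    sigma = math.sqrt(n_dice * 35 / 12)  # variance of one fair die is 35/12
    z = (6 * n_dice - mean) / sigma      # loaded dice always total 6n
    return mean, sigma, z

for n in (2, 20):
    mean, sigma, z = dice_stats(n)
    print(f"{n} dice: mean {mean:.0f}, sigma {sigma:.2f}, "
          f"loaded result is {z:.2f} sigma out")
# 2 dice:  sigma 2.42, loaded result 2.07 sigma out
# 20 dice: sigma 7.64, loaded result 6.55 sigma out
```

Note that the z-score grows as √n: throwing more dice per trial pushes a genuinely loaded result further and further away from what fair dice could plausibly produce.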

This is why it takes a very long time to carry out some experiments, like the search for the Higgs Boson or the recent BICEP2 experiment referenced above. When you’re dealing with something far more complex than a loaded die, where the “edge” is very small (BICEP2 looked for fluctuations of the order of one part in one hundred thousand) and there are many, many other variables to consider, it takes a very long time to collect enough data to show that your results are not a fluke.

The “gold standard” in physics is 5σ, or a 1 in 3.5 million chance of a fluke, to declare something a discovery (which is why Linde’s wife in the video above blurts out “Discovery?” when hearing the news from Professor Kuo). In the case of the Higgs Boson there were “tantalising hints around 2- to 3-sigma” in November of 2011, but it wasn’t until July 2012 that they broke through the 5σ barrier, thus “officially” discovering the Higgs Boson.

The man who put his head in a particle accelerator

The U-70 synchrotron control room.

On July 13, 1978, Anatoli Bugorski, a physicist working on the U-70 synchrotron at the Institute of High Energy Physics in Protvino, Russia, put his head into the particle accelerator whilst it was running. Presumably he did not know it was running at the time, and presumably some safety features that should have prevented him from doing so had failed.

Nonetheless, Bugorski somehow managed to put his head into a beam of 76 GeV protons (for comparison, the LHC accelerates protons to an energy of 3500 GeV).

The beam caused a flash in Bugorski’s eyes “brighter than a thousand suns” and the left side of his face swelled up beyond recognition. He was later taken to a state hospital that specialised in treating radiation injuries, where it was expected he would die. Amazingly, despite the huge dose of radiation, Bugorski survived, probably because the radiation was confined to a very small area.

A photograph of Bugorski taken for Pravda in 1998.

With the left side of his face paralysed, with no hearing in his left ear and suffering from seizures, Bugorski still managed to complete his PhD and became the coordinator of experiments at the U-70 accelerator.

(I’d like to point out that the original title of this post was going to be In Soviet Russia, Particles Accelerate You [source] but I resisted.)