

The Importance of the Higgs Boson


The Higgs boson is the smallest detectable wave in the Higgs field. Interacting with the Higgs field causes particles to acquire inertial mass; without the Higgs field, no particle would have inertial mass. Some particles don't feel the Higgs field at all (photons) and so are massless; some feel it very lightly (neutrinos) and have little mass; ordinary particles feel it strongly.
In physics, the current best understanding of the fundamental forces (excluding gravity) is called the Standard Model. The one remaining elementary particle in the Standard Model that hasn't been experimentally detected is the Higgs boson.
The Standard Model describes these forces:
Electromagnetism (attraction/repulsion due to electric charge)
Weak force (causes radioactive decay)
Strong force (holds quarks together to form protons and neutrons)
Electromagnetism is the unification of electricity and magnetism, which were originally thought to be two different forces.
The next steps in physics would be:
Electroweak unification - electromagnetism and the weak force unified into the "electroweak" force. The Higgs field explains why these two forces normally appear to be different. The discovery of the Higgs boson could be considered the final verification of electroweak unification.
Grand unified theory - electroweak and strong force unified into the "grand" force.
Theory of everything - grand force and gravity unified. This is the ultimate purpose behind areas of research such as string theory.


Popular posts from this blog

A Bayes' Solution to Monty Hall

For any problem involving conditional probabilities, one of your greatest allies is Bayes' Theorem. Bayes' Theorem says that for two events A and B, the probability of A given B is related to the probability of B given A in a specific way.

Standard notation:

probability of A given B is written \( \Pr(A \mid B) \)
probability of B is written \( \Pr(B) \)

Bayes' Theorem:

Using the notation above, Bayes' Theorem can be written: \[ \Pr(A \mid B) = \frac{\Pr(B \mid A)\times \Pr(A)}{\Pr(B)} \]

Let's apply Bayes' Theorem to the Monty Hall problem. If you recall, we're told that behind three doors there are two goats and one car, all randomly placed. We initially choose a door, and then Monty, who knows what's behind the doors, always shows us a goat behind one of the remaining doors. He can always do this as there are two goats; if we chose the car initially, Monty picks one of the two doors with a goat behind it at random.

Assume we pick Door 1 and then Monty sho…
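Sketching how that computation goes under the setup above: say we pick Door 1 and Monty opens Door 3 to reveal a goat. Writing \(C_i\) for "the car is behind Door \(i\)" and \(M_3\) for "Monty opens Door 3", the rules give \( \Pr(M_3 \mid C_1) = 1/2 \), \( \Pr(M_3 \mid C_2) = 1 \) and \( \Pr(M_3 \mid C_3) = 0 \), hence \( \Pr(M_3) = \frac{1}{2}\cdot\frac{1}{3}+1\cdot\frac{1}{3}+0\cdot\frac{1}{3} = \frac{1}{2} \). Bayes' Theorem then yields \[ \Pr(C_2 \mid M_3) = \frac{\Pr(M_3 \mid C_2)\times \Pr(C_2)}{\Pr(M_3)} = \frac{1\cdot\frac{1}{3}}{\frac{1}{2}} = \frac{2}{3}, \] so switching to Door 2 wins two times out of three.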

What's the Value of a Win?

In a previous entry I demonstrated one simple way to estimate an exponent for the Pythagorean win expectation. Another nice consequence of a Pythagorean win expectation formula is that it also makes it simple to estimate the run value of a win in baseball, the point value of a win in basketball, the goal value of a win in hockey, and so on.

Let our Pythagorean win expectation formula be \[ w=\frac{P^e}{P^e+1},\] where \(w\) is the win fraction expectation, \(P\) is the ratio of runs scored to runs allowed (or the equivalent for other sports) and \(e\) is the Pythagorean exponent. How do we get an estimate for the run value of a win? The expected number of games won in a season with \(g\) games is \[W = g\cdot w = g\cdot \frac{P^e}{P^e+1},\] so for one estimate we only need to compute the value of the partial derivative \(\frac{\partial W}{\partial P}\) at \(P=1\). Note that \[ W = g\left( 1-\frac{1}{P^e+1}\right), \] and so \[ \frac{\partial W}{\partial P} = g\frac{eP^{e-1}}{(P^e+1)^2}\] and it follows \[ \frac{\partial W}{\partial P}(P=1) = …
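Carrying that evaluation through at \(P=1\): \[ \frac{\partial W}{\partial P}(P=1) = g\frac{e\cdot 1^{e-1}}{(1^e+1)^2} = \frac{ge}{4}, \] so one extra win corresponds to a change of about \( \Delta P = \frac{4}{ge} \) in the run ratio. With illustrative (assumed) values of \(g=162\) games and \(e \approx 1.8\) for MLB, \( \Delta P \approx 0.0137 \); at roughly 730 runs allowed per season that works out to about 10 extra runs per win, the familiar rule of thumb.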

Mixed Models in R - Bigger, Faster, Stronger

When you start doing more advanced sports analytics you'll eventually start working with what are known as hierarchical, nested or mixed effects models. These are models that contain both fixed and random effects. There are multiple ways of defining fixed vs. random effects, but one way I find particularly useful is that random effects are "predicted" rather than "estimated", and this in turn involves some "shrinkage" towards the mean.

Here's some R code for NCAA ice hockey power rankings using a nested Poisson model (which can be found in my hockey GitHub repository):
model <- gs ~ year + field + d_div + o_div + game_length + (1|offense) + (1|defense) + (1|game_id)
fit <- glmer(model, data=g, verbose=TRUE, family=poisson(link=log))

The fixed effects are year, field (home/away/neutral), d_div (NCAA division of the defense), o_div (NCAA division of the offense) and game_length (number of overtime periods); off…
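As a minimal sketch of how a fit like this can be turned into rankings, assuming lme4 is loaded and fit is the model above (the offense-minus-defense rating here is illustrative, not necessarily the repository's exact method):

library(lme4)

# Shrunken random effects from the fit, on the log-rate scale
re  <- ranef(fit)
off <- setNames(re$offense[, "(Intercept)"], rownames(re$offense))
def <- setNames(re$defense[, "(Intercept)"], rownames(re$defense))

# Illustrative power rating: offensive effect minus defensive effect
teams  <- intersect(names(off), names(def))
rating <- sort(off[teams] - def[teams], decreasing = TRUE)
head(rating)  # highest-rated teams first

Because glmer shrinks each team's effect toward zero, teams with few games get pulled toward average, which is exactly the "shrinkage" behavior described above.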