I’m reading a book that deals with the history of probability theory (Classic Problems of Probability by Prakash Gorroochurn). It is interesting to look back from a modern perspective and realize how many heated arguments could have been resolved in a few minutes if the participants had had access to a computer.
For example, Chapter 1 deals with Cardano and how he (understandably) confused probability with expectation. If you roll a die 3 times, the expectation value of the number of sixes is \(3 \times \tfrac{1}{6} = 0.5\), while the probability of rolling at least one six is \(1 - (5/6)^3 \approx 0.42\).
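A dispute like Cardano’s could indeed be settled in minutes today. Here is a minimal Monte Carlo sketch (the trial count and loop structure are my own choices, not from the book) showing that the expected number of sixes and the probability of at least one six are different quantities:

```python
import random

random.seed(0)

trials = 100_000
total_sixes = 0
at_least_one = 0

for _ in range(trials):
    # roll a fair die 3 times and count the sixes
    sixes = sum(1 for _ in range(3) if random.randint(1, 6) == 6)
    total_sixes += sixes
    if sixes > 0:
        at_least_one += 1

print(total_sixes / trials)   # close to 0.5, the expectation
print(at_least_one / trials)  # close to 1 - (5/6)**3, about 0.42
```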

I recently came across a surprising connection between the geometric mean and logarithms. At least, it was surprising to me, though it isn’t all that surprising once you see the derivation (a lot of things in math are this way).
The connection is this. Take a set of observations \(y_1, y_2, \dots, y_n\). Assume that these are all positive numbers. Then \[ \langle \log(y) \rangle = \log(G).\] Here the angle brackets indicate the mean (expectation value), and \(G\) is the geometric mean of the observations.
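The identity is easy to verify numerically. A quick sketch (the sample observations are made up for illustration):

```python
import math

ys = [1.5, 2.0, 8.0, 5.5]           # sample positive observations (my own numbers)

# mean of the logs
mean_log = sum(math.log(y) for y in ys) / len(ys)

# log of the geometric mean
G = math.prod(ys) ** (1 / len(ys))

print(mean_log)       # the two printed values agree
print(math.log(G))
```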

In a previous post, I said that an appropriate purgatory for physicists would be to have them read all of the words ever written on the interpretation of quantum mechanics.
I have since noticed similar “purgatorial readings” in other fields. Here is a table:
| Field | Purgatorial Reading |
| --- | --- |
| Physics | everything ever written on the interpretation of quantum mechanics |
| Theology | everything ever written on the *Filioque* |
| History | everything ever written on the Fall of Rome |
| Statistics | everything ever written on frequentist vs. Bayesian inference |

In my last post, I discussed the geometric mean and how it relates to the more familiar arithmetic mean. I mentioned that the geometric mean is often useful for estimation in physics.
Lawrence Weinstein, in his book Guesstimation 2.0, gives a mental algorithm for approximating the geometric mean. Suppose you are given two numbers in scientific notation, \[ a \times 10^x \quad \text{and}\quad b \times 10^y, \] where the coefficients \(a\) and \(b\) are both between 1 and 9.
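The rest of the recipe is cut off above, but the underlying identity is \(\sqrt{ab \times 10^{x+y}} = \sqrt{ab} \times 10^{(x+y)/2}\). Here is a sketch of one common version of the mental trick (my reconstruction, not a quote from Guesstimation 2.0):

```python
import math

def approx_geometric_mean(a, x, b, y):
    """Approximate the geometric mean of a*10**x and b*10**y by
    treating coefficients and exponents separately (a reconstruction
    of the mental trick, not Weinstein's exact wording)."""
    coeff = math.sqrt(a * b)
    exp_sum = x + y
    if exp_sum % 2:              # odd exponent sum: borrow a factor of 10
        coeff *= math.sqrt(10)   # sqrt(10) is roughly 3
        exp_sum -= 1
    return coeff * 10 ** (exp_sum // 2)

# compare with the exact geometric mean of 3e4 and 4e7
print(approx_geometric_mean(3, 4, 4, 7))
print(math.sqrt(3e4 * 4e7))
```

The point of splitting things this way is that \(\sqrt{ab}\) involves only small numbers, and the exponent arithmetic is trivial, so the whole estimate can be done in your head.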

Given two positive numbers \(a\) and \(b\), the most well-known way to average them is to take the sum divided by two. This is often called the average or the mean of \(a\) and \(b\), but to distinguish it from other means it is called the arithmetic mean: \[ \text{arithmetic mean} = \frac{a + b}{2}. \]
Another common, and useful, type of mean is the geometric mean. For two positive numbers \(a\) and \(b\), \[ \text{geometric mean} = \sqrt{ab}. \]
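For concreteness, here is a quick numerical comparison of the two means (the sample numbers are mine). Note that the arithmetic mean is never smaller than the geometric mean, by the AM-GM inequality:

```python
import math

def arithmetic_mean(a, b):
    return (a + b) / 2

def geometric_mean(a, b):
    return math.sqrt(a * b)

# for 4 and 9, the two means differ
print(arithmetic_mean(4, 9))  # 6.5
print(geometric_mean(4, 9))   # 6.0
```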

I was recently flipping through a real analysis textbook and came across a notation that I had not seen before. First, the set difference of two sets \(A\) and \(B\) (which I had seen before) was defined as \[ A \setminus B = A \cap B^c. \] Here \(B^c\) is the set complement of \(B\), i.e., the set of all elements not in \(B\). You can equivalently write the set difference as \[ A \setminus B = \{x \, |\, x \in A\, \text{and}\, x \notin B\}. \]
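The two definitions agree, as a quick check with Python’s built-in sets shows (the example sets are my own):

```python
A = {1, 2, 3, 4}
B = {3, 4, 5}

# Python's set difference matches A \ B = {x | x in A and x not in B}
print(A - B)                          # {1, 2}
print(A.difference(B))                # {1, 2}
print({x for x in A if x not in B})   # same thing, written out
```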

Probability Current in Quantum Mechanics

In nonrelativistic 1D quantum mechanics, consider the probability of finding a particle in the interval \([x_1, x_2]\): \[ P_{x_1,x_2} = \int_{x_1}^{x_2} |\Psi(x,t)|^2\, \text{d}x. \] It is interesting to look at how this probability changes in time. Taking a time derivative, and using the Schrödinger equation \[ i \hbar \partial_t \Psi = \frac{-\hbar^2}{2m} \partial_x^2 \Psi + V \Psi \] to simplify things, we get (assuming a real potential) \[ \frac{d P_{x_1, x_2}}{dt} = \frac{\hbar}{2mi}\int_{x_1}^{x_2} \left[ (\partial_x^2 \Psi^*) \Psi - (\partial_x^2 \Psi) \Psi^* \right] \, \text{d}x. \] Since \[ (\partial_x^2 \Psi^*) \Psi - (\partial_x^2 \Psi) \Psi^* = \partial_x \left[ (\partial_x \Psi^*) \Psi - (\partial_x \Psi) \Psi^*\right], \] we can write this as \[ \frac{d P_{x_1, x_2}}{dt} = J(x_1, t) - J(x_2, t), \] where \[ J(x,t) = \frac{\hbar}{2mi} (\Psi^* \partial_x \Psi - \Psi \partial_x \Psi^*) \] is called the “probability current.”
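As a sanity check, for a free-particle plane wave \(\Psi = e^{ikx}\) the current should come out to \(\hbar k / m\). Here is a minimal numerical sketch (units with \(\hbar = m = 1\); the value of \(k\) and the step size are my own choices):

```python
import cmath

hbar, m, k = 1.0, 1.0, 2.5

def psi(x):
    # free-particle plane wave exp(i k x)
    return cmath.exp(1j * k * x)

def J(x, h=1e-5):
    # central-difference approximation to the spatial derivative of psi
    dpsi = (psi(x + h) - psi(x - h)) / (2 * h)
    # probability current J = (hbar/2mi)(psi* dpsi - psi dpsi*)
    val = (hbar / (2j * m)) * (psi(x).conjugate() * dpsi - psi(x) * dpsi.conjugate())
    return val.real

print(J(1.0))   # should be close to hbar*k/m = 2.5
```

For a plane wave the current is the same at every \(x\), so \(J(x_1,t) - J(x_2,t) = 0\) and the probability in any interval is constant, as expected for a stationary state.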

I once read, though I cannot now recall where, that an appropriate purgatory for theologians would be forcing them to read all of the words ever written on the *Filioque*.

If that is so, then an appropriate purgatory for physicists would be forcing them to read all of the words ever written on the interpretation of quantum mechanics.

I am teaching undergraduate quantum mechanics for the first time this semester. One thing I have discovered is that it is very easy to make mistakes when talking about quantum mechanics. Not mathematical mistakes (the math is fairly straightforward), but conceptual mistakes in the interpretation of the mathematics.
I was therefore pleased to read a recent paper by Blake Stacey entitled “Misreading EPR: Variations on an Incorrect Theme.” The “EPR” in the title stands for Einstein-Podolsky-Rosen (Einstein and his two postdocs at the time), and is used as shorthand for a famous thought-experiment these three published in 1935.

What is the equation of a straight line in the complex plane? There are many different forms, but I want to look at some of the simplest ones.
If you know the slope \(m \in \mathbb{R}\) and intercept \(b \in \mathbb{R}\) of the line, you can write an equation in parametric form \[ z = x + i (m x + b) ,\] where \(x \in \mathbb{R}\) and \(z \in \mathbb{C}\).
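A quick numerical check of this parametric form (the slope, intercept, and sample points are my own choices):

```python
# Check the parametric form z = x + i*(m*x + b) for a line
# with slope m and intercept b.
m, b = 2.0, -1.0

for x in [-1.0, 0.0, 0.5, 3.0]:
    z = complex(x, m * x + b)
    # the imaginary part should equal m times the real part, plus b
    print(z.imag == m * z.real + b)   # True for every sample point
```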