A New Physics

I’ve always thought that physics took a wrong turn when it ran wild with wave mechanics. It’s true that wave mechanics did manage to solve just about every problem associated with anything smaller than a micrometer — I don’t deny the value of that. But conceptually, I’ve always thought that the Heisenberg Uncertainty Principle was more fundamental: 

The uncertainty in the measured position of a particle multiplied by the uncertainty in its measured momentum is always greater than or equal to a fixed minimum set by Planck's constant, which is 6.626 x 10^-34 joule-seconds. 

or, in physics-ese:

∆x * ∆p ≥ h/4π

This equation has always struck me as important, for two reasons. First, unlike any other fundamental equation in physics, it describes information, not some obviously physical quantity like mass or velocity. It’s not about the position x or the momentum p — it’s about the UNCERTAINTY in our measurements of x and p. It’s about information.

The second notable thing about this law is that it is expressed as an inequality, not an equality. All the other fundamental physics equations are just that: statements of equality. But this one is a statement of inequality. That’s odd. Oh, wait! There is one other fundamental law of physics that expresses an inequality: the Second Law of Thermodynamics:

dS/dt ≥ 0

“The entropy of an isolated system never decreases with time.” OK, so what is entropy? Look it up and you’ll find lots of different characterizations. Entropy is confusing, largely because the concept first arose in thermodynamics and was thought of in terms of heat and temperature. Later on it was recast in statistical mechanics as a count of the number of states accessible to an isolated system. Still later it got an information-theory definition based on the amount of information being communicated. Furthermore, the concept itself is abstruse. It’s not about anything tangible: you cannot directly sense entropy. You can’t see it in a telescope or a microscope, weigh it with scales, measure it with a stopwatch, or anything like that. 

So what is my own description of entropy? First, I prefer to think in terms of negentropy, which is the negative or opposite of entropy. Negentropy is a tad easier to understand than entropy. Here’s one way of perceiving the difference. Consider the following two sequences of binary digits:

100101011001000010111100010110101001

1111111000001111100000000000111111100

The upper sequence has less negentropy than the lower one. An easy way to perceive this is to imagine how much information you’d have to communicate in order to tell somebody else what each sequence contains. For the upper one, the easiest way to tell somebody what it contains is simply to present the sequence itself. But for the lower one, you could say something like “7 ones, 5 zeros, 5 ones, 11 zeros, 7 ones, and two zeros.” If you wanted to store it compactly, you’d write it as something like 7,5,5,11,7,2. In computer science, this kind of thing is called data compression, and the degree of compressibility of a bunch of data is a measure of its negentropy. 
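Here’s a minimal sketch of that comparison in Python (the run-length encoder below is my own toy illustration, not a standard library routine): it describes each sequence by its run lengths, just as we did by hand above.

```python
from itertools import groupby

def run_lengths(bits: str) -> str:
    """Describe a bit string by the lengths of its runs, e.g. '1110' -> '3,1'."""
    # A real encoder would also record which bit value the first run holds.
    return ",".join(str(len(list(run))) for _, run in groupby(bits))

random_looking = "100101011001000010111100010110101001"
orderly        = "1111111000001111100000000000111111100"

for bits in (random_looking, orderly):
    description = run_lengths(bits)
    print(f"{len(bits)} bits -> {len(description)} characters: {description}")
```

The orderly sequence collapses to the short description 7,5,5,11,7,2, while the run lengths of the random-looking sequence take up more room than the sequence itself: it simply doesn’t compress.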

You can also think of negentropy as information. Something with lots of information content has lots of negentropy. Here are two photos of my dog Moose sleeping on the couch:

[Two photos of Moose sleeping on the couch]

I deliberately blurred the lower picture. It contains less information than the upper picture. When saved in jpg format, the upper photo consumes 99 KB of space, while the lower one needs only 36 KB. This demonstrates quantitatively that there’s less information in the lower photo. Hence, negentropy and information content are pretty much the same thing. 
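If you want to reproduce that comparison yourself, here’s a sketch using the Pillow imaging library. The file name moose.jpg is a placeholder for whatever photo you have handy, and the exact byte counts will depend on the photo and the JPEG quality setting; the point is only that the blurred copy compresses to a noticeably smaller file.

```python
import io
from PIL import Image, ImageFilter  # Pillow: pip install Pillow

def jpeg_bytes(img: Image.Image, quality: int = 85) -> int:
    """Return how many bytes the image occupies when saved as a JPEG."""
    buffer = io.BytesIO()
    img.save(buffer, format="JPEG", quality=quality)
    return buffer.tell()

sharp   = Image.open("moose.jpg").convert("RGB")   # placeholder file name
blurred = sharp.filter(ImageFilter.GaussianBlur(radius=4))

print(f"sharp:   {jpeg_bytes(sharp) / 1024:.0f} KB")
print(f"blurred: {jpeg_bytes(blurred) / 1024:.0f} KB")
```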

Back to Second Thermo
Recall that the Second Law of Thermodynamics states that the entropy of an isolated system will never decrease. This means that the negentropy — the negative of the entropy — will never increase. And since negentropy is information, we can say that the information content of an isolated system will never increase. 

So let’s imagine the simplest possible system of all: a single particle traveling through space unaffected by anything else. Let’s say this particle is a ball with a mass of 1 kilogram, and we measure its position x and velocity v at time t. Let’s for the moment forget the Heisenberg Uncertainty Principle; we’ll still have some ordinary uncertainties in our measurements of the position x, the velocity v, and the time t. Suppose we measure the ball’s position to be x = 1 ± 0.1 meters and its velocity v to be 2 ± 0.1 meters per second. OK, so far, so good. 

But now we pull a trick. We wait for, oh, say 1000 seconds. By now our ball should be somewhere around 2,000 meters down the path. We find it and measure its position and velocity again, with exactly the same uncertainties as before: now the ball is at x = 2010 ± 0.1 meters and its velocity is still 2 ± 0.1 meters per second. But here’s the catch: we can now calculate its velocity with much greater accuracy. Remember, distance = velocity x time, so velocity = distance/time. The ball traveled a distance of 2009 ± 0.2 meters (the two position uncertainties add) in a time of 1000 seconds, so now we know that its true velocity was actually 2.009 ± 0.0002 meters per second. 

The trick here is that, by simply waiting and taking a second measurement, we can get a more accurate measurement of the velocity of the ball. And if we can do this trick for 1,000 seconds, we can do it for 1,000,000 seconds and get a result that’s a thousand times more accurate. We can achieve as much accuracy as we want by just waiting. 
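Here is that arithmetic spelled out in a few lines of Python, a sketch under the same assumptions as above: each position measurement is good to ±0.1 meters, the clock is treated as exact, and the two position errors simply add.

```python
# Each position measurement is uncertain by +/- 0.1 m, so the distance
# traveled (final position minus initial position) is uncertain by 0.2 m.
position_error = 0.1                # meters, per measurement
distance_error = 2 * position_error

for wait in (1_000, 1_000_000):     # seconds between the two measurements
    velocity_error = distance_error / wait   # v = d / t, so the error shrinks as t grows
    print(f"wait {wait:>9,} s  ->  velocity known to within +/- {velocity_error} m/s")
```

Waiting 1,000 seconds pins the velocity down to ±0.0002 meters per second; waiting 1,000,000 seconds pins it down a thousand times more tightly still.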

But accuracy is information, and information is negentropy. Therefore, we can increase our information about the system merely by waiting and measuring a second time. We can increase negentropy with time — thereby violating the Second Law of Thermodynamics!

Uh-oh.

But here is where Heisenberg’s Uncertainty Principle comes riding to the rescue. It guarantees that the product of the uncertainties in position and momentum can never shrink below a fixed minimum, no matter how long we wait or how cleverly we measure.
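To put a rough number on that floor, here’s a back-of-the-envelope sketch using the inequality from the top of this piece (∆x * ∆p ≥ h/4π) and the 1-kilogram ball from the example: since p = m * v, the velocity uncertainty can never drop below h/(4π * m * ∆x), no matter how long we wait.

```python
import math

h = 6.626e-34          # Planck's constant, in joule-seconds
mass = 1.0             # kilograms, as in the ball example
position_error = 0.1   # meters, the +/- on each position measurement

# Heisenberg: delta_x * delta_p >= h / (4 * pi), and p = m * v for the ball,
# so the velocity uncertainty has a hard floor:
velocity_floor = h / (4 * math.pi * mass * position_error)
print(f"velocity uncertainty can never fall below {velocity_floor:.1e} m/s")
```

For a 1-kilogram ball that floor is absurdly tiny (around 5 x 10^-34 meters per second), but it is not zero, and that nonzero floor is what stops the waiting trick from manufacturing information without limit.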