Time moves forward. The present comes after the past and before the future. Past, Present, and Future form a fundamental ordering in our language. It is moot to ask why the Past has to come before the Present; the words have simply been defined that way. The real question is: why should there be an order to time at all?
Fundamental Laws of Physics and The Arrow of Time
There are four fundamental forces in nature. Everything in the universe can be explained as interactions of matter and energy through these four forces, viz. the Gravitational force, the Electromagnetic force, the Strong Nuclear force, and the Weak Nuclear force.
All these forces have fundamental equations that govern the behaviour of their interactions, e.g. the General Theory of Relativity for gravitation and Maxwell’s Equations for the classical theory of Electromagnetism [Caution: Wikipedia links are easy rabbit holes to fall into]. These models have been around for so long because they work, i.e. they predict what we’ll observe and explain what we have observed.
There’s a problem though: all these fundamental theories about the fundamental forces of nature are symmetric in time. There is no temporal order in these fundamental equations. Anything that can happen from t = 0 → t = 1 can equally happen from t = 1 → t = 0. In these equations, what we call Past does not have to come before Present.
If these grand models, with their sophisticated equations and ingenious formulations of the Universe, are ‘correct’, shouldn’t the most basic fact of our experience also be accounted for? Time moves forward: we remember the Past and expect the Future while living in the Present. There is an ‘Arrow of Time’ that points in one direction. But that arrow of time is inherently absent from the fundamental formulations of the universe.
This isn’t a new problem. The incongruence has been conspicuous since the days of the Newtonian theory of motion. It was not until Boltzmann’s statistical formulation of thermodynamics, specifically of the Second Law of Thermodynamics (SLOT), that there was a mathematical formulation of this forward march through time.
Entropy — The Concept
Entropy has many meanings —
The entropy of a physical system is the amount of energy that is no longer available for doing physical work.
Information entropy quantifies the average number of bits per symbol needed by the theoretically best lossless compression algorithm.
The entropy of a dynamical system is the average flow of information per unit time.
Entropy in human language has come to mean disorder.
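To make the information-theoretic meaning concrete, here is a minimal sketch (the function name and example strings are my own) of Shannon entropy, the average number of bits per symbol below which no lossless compressor can go for a source with the message’s symbol frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits of information per symbol in `message`.

    No lossless compressor can, on average, use fewer bits per
    symbol than this for a source with these symbol frequencies.
    """
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A repetitive (ordered) message carries less information per symbol
# than a varied (disordered) one:
print(shannon_entropy("aaaaaaab"))  # ≈ 0.54 bits: mostly one symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits: all eight symbols equally likely
```

The ordered string compresses well precisely because its entropy is low; the uniform string is incompressible.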
Although there are many contextual meanings of entropy, the common thread is disorder. A glass of warm milk kept in a room eventually reaches the same temperature as the room. It does not spontaneously become hotter. The glass of milk spontaneously comes into thermal equilibrium with the room. Pushing the glass away from that equilibrium requires work: making it colder or hotter than the surroundings takes a refrigerator or a heater, each of which consumes energy.
The SLOT has been restated in many forms, and all of them say more or less this: taking a system away from thermal equilibrium requires energy. The system would rather sit at equilibrium. And if you push the system away from equilibrium, it will eventually return, and all the work you did to push it away is lost.
The restatement of the SLOT that is relevant here is this —
The total entropy of an isolated system either increases or remains constant in any spontaneous process; it never decreases.
When you drop a wine glass on the floor and it shatters, the entropy of the system increases. Have you ever seen, in real life, a shattered wine glass spontaneously reassemble itself? Un-shattering it would, in fact, take a lot of work. An intact wine glass has a definite shape, and hence is ordered. There are vastly more ways for the wine glass to be shattered than to be intact. It is simply a matter of probability that you see wine glasses shatter and never see them un-shatter.
A physical system spontaneously moves from one state to another only if the total entropy does not decrease.
Ludwig Boltzmann formulated a statistical interpretation of entropy, encapsulated in an equation that is also his epitaph —
S = k · log Ω
where S is the entropy, k is Boltzmann’s constant, and Ω is the number of microstates (microscopic arrangements) consistent with the system’s macrostate.
Boltzmann showed that the SLOT is merely a statistical tendency. The ordered states of a system are less probable than the disordered states, hence a system tends to move towards a more probable state. The next time you see an egg being scrambled, just know that there are vastly more ways to scramble an egg than to unscramble it.
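Boltzmann’s counting argument can be sketched with coin flips. Assume a toy “system” of 100 coins, where a macrostate is the number of heads and a microstate is a particular arrangement of the coins (the variable names are mine, and Boltzmann’s formula uses the natural logarithm). The ordered macrostate (all heads) has exactly one microstate, while the 50/50 macrostate has astronomically many:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega): entropy of a macrostate with omega microstates."""
    return k_B * log(omega)

n = 100
ordered = comb(n, n)   # all 100 coins heads: exactly one arrangement
mixed = comb(n, 50)    # 50 heads, 50 tails: about 1.01e29 arrangements

print(boltzmann_entropy(ordered))  # 0.0 (a unique microstate means zero entropy)
print(boltzmann_entropy(mixed))    # ~9.2e-22 J/K
```

Shake a box of 100 coins and you will essentially always land near the 50/50 macrostate, not because physics forbids all-heads, but because all-heads is one arrangement out of 2^100.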
Entropy in Other Places
I found a couple of interesting cases where entropy plays a central role —
A New Physics Theory of Life — Jeremy England, in his paper, interpreted the origin of life as a statistical outcome of atoms interacting with a large heat bath (such as large water bodies). Although the math of the paper is well established, the interpretation, that if you shine light on a bunch of atoms surrounded by a heat bath for long enough you will get life, is not widely accepted. What’s interesting, however, is that entropy plays the major role. In fact, on this view organisms reproduce because it increases the overall entropy of the universe, even though it reduces entropy locally (a fetus is a highly ordered object).
Measuring the Entropy of Art — Art is simple and complex at the same time. Haroldo Ribeiro et al., in the paper “History of art paintings through the lens of entropy and complexity”, developed a computer program to deconstruct 140,000 digitized paintings from different eras spanning over a millennium and estimate the entropy of each painting. Ribeiro et al. assessed the statistical complexity and permutation entropy of each painting. The changes in the magnitudes of complexity and entropy track the lines along which art historians divide different styles of painting. Modern art, with its loose brush strokes, is less complex and has higher entropy, compared to older art whose definite shapes make it more complex and lower in entropy. Art, too, seems to have undergone the eventual increase in entropy.
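As a rough illustration of the kind of measure used in that study, here is a sketch of one-dimensional permutation entropy in the sense of Bandt and Pompe; the paper itself applies a two-dimensional version to pixel intensities, so this is a simplification with my own function names:

```python
import random
from collections import Counter
from math import log, factorial

def permutation_entropy(series, d=3):
    """Normalized permutation entropy: the Shannon entropy of the
    ordinal patterns of length d in the series, scaled to [0, 1]."""
    patterns = Counter(
        tuple(sorted(range(d), key=lambda k: series[i + k]))  # ordinal pattern
        for i in range(len(series) - d + 1)
    )
    total = sum(patterns.values())
    h = sum(-(c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(d))  # 0 = perfectly ordered, 1 = maximally disordered

ramp = list(range(1000))                        # a monotone, 'ordered' signal
noise = [random.random() for _ in range(1000)]  # a 'disordered' signal
print(permutation_entropy(ramp))   # 0.0 (only one ordinal pattern ever occurs)
print(permutation_entropy(noise))  # close to 1.0
```

A painting with definite, repeating structure yields few ordinal patterns (low entropy, high order), while loose, irregular brushwork yields nearly all patterns with similar frequency.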
Entropy is one of the few all-encompassing concepts in Physics that we see every day in our lives. There are, of course, sophisticated formulations of entropy, and every context needs its own definition and mathematics to explain the behaviour of a system. Each sheds an interesting light on the system, sometimes confirming what we expect intuitively and at other times predicting counterintuitive behaviour. The next time somebody asks you “What’s up?”, an accurate reply would be: “Same as everything in the universe. Entropy is up.”