The second law of thermodynamics has dominated physics, chemistry, engineering and biology since the steam engine began modernizing the world. An upgrade is underway.
The study of energy began in the 1800s as steam engines drove the Industrial Revolution. Imagine a sponge cake, fresh from the oven, cooling on a countertop. A molecule wafts off the cake, carrying its scent. A physicist might wonder how many different ways the molecule can be arranged in the space available to it. This number of arrangements determines the molecule's entropy. When the cake is fresh, the entropy is relatively small, because the molecule is still confined near the cake. Later, the entropy is larger, because the molecule has had time to travel farther and can occupy any spot in the whole kitchen. The second law of thermodynamics says that the system's entropy grows or stays constant as long as the windows and doors stay closed. Eventually, the scent of sponge cake is inescapable in the kitchen.
This behavior is summed up in an inequality: S_f ≥ S_i, where S_i is the molecule's entropy at an earlier time and S_f is its entropy at a later time. The inequality isn't terribly useful on its own, because it doesn't tell us how much the entropy will grow. But the second law strengthens to an equality at equilibrium, the state a system reaches when its large-scale properties remain constant and no net flows of anything enter or leave the system. Our cake's scent molecule reaches equilibrium after it has fully filled the kitchen. The equality gives precise information about many different types of equilibrium systems.
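In symbols (a standard formulation, not spelled out elsewhere in this essay, with k_B denoting Boltzmann's constant, which converts a count into conventional entropy units): if the molecule can be arranged in Ω different ways, its entropy is

    S = k_B ln Ω,

and the second law for our closed kitchen says

    ΔS = S_f − S_i ≥ 0.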
Most of the world, though, is far from equilibrium. To theoretical physicists and chemists, far-from-equilibrium physics is the Wild West: out there, it's difficult to prove equalities.
It's not impossible, though. Physicists have strengthened the second law with equalities called fluctuation relations, which make it possible to reason about the properties of systems far from equilibrium.
Imagine a strand of DNA floating in water. The DNA is at equilibrium, sharing the water's temperature. We can use lasers to hold one end of the strand steady while pulling on the other. Stretching the strand shocks it out of equilibrium and requires work in the physics sense of the word: structured energy harnessed to accomplish a useful task. The amount of work required varies from pull to pull, since the surrounding water molecules kick the strand randomly, sometimes one way and sometimes another. Every possible amount of work has some probability of being needed in the next pull.
These probabilities are related to the properties of the DNA at equilibrium. An equality can capture that relation.
That link is the core of fluctuation relations: the probabilities of work values far from equilibrium are tied to the system's equilibrium properties. Chris Jarzynski discovered this in 1997. The rest of us call it Jarzynski's equality, but he calls it the nonequilibrium fluctuation relation. One of the most famous tests of this principle was the DNA-stretching experiment, but the equation governs loads of systems, including those that involve electrons, beads the size of bacteria and brass oscillators that look like tire swings.
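In symbols, the equality reads as follows (this is the standard textbook form, not a notation used elsewhere in this essay): with W the work needed in one pull, ΔF the difference between the strand's equilibrium free energies before and after the stretch, T the water's temperature, k_B Boltzmann's constant, and the angle brackets an average over many pulls,

    ⟨ e^(−W / k_B T) ⟩ = e^(−ΔF / k_B T).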
Fluctuation relations have broad implications. For one, we can derive an expression of the second law from these equalities. As we saw with the DNA strand, fluctuation relations extend our knowledge far from equilibrium, and they also recapitulate what we already know about equilibrium.
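Here is a one-line sketch of that derivation, using a standard mathematical fact (Jensen's inequality, which says that the average of an exponential is at least the exponential of the average):

    ⟨ e^(−W / k_B T) ⟩ ≥ e^(−⟨W⟩ / k_B T).

Combining this with the equality above gives e^(−ΔF / k_B T) ≥ e^(−⟨W⟩ / k_B T), which rearranges to ⟨W⟩ ≥ ΔF: on average, stretching the strand costs at least the equilibrium free-energy difference. That is the second law in its work form.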
While equilibrium properties are easier to reason about theoretically, they are harder to measure than far-from-equilibrium properties. To measure the work needed to stretch the DNA far out of equilibrium, we can simply pull the strand quickly. To measure the work needed to stretch it while it remains at equilibrium, we would have to pull so slowly that the strand stayed effectively in equilibrium at every instant, a stretch that would take practically forever.
Fluctuation relations give scientists an experimental foothold. Say they're interested in a molecule's equilibrium properties. They can pull the molecule out of equilibrium many times, measuring the work required in each trial. From that data, they can estimate the probability that any given amount of work will be needed in the next trial. They can then plug those probabilities into the nonequilibrium side of the fluctuation relation and solve for the equilibrium quantity they're after. The method has its pitfalls, since rare trials can dominate the average, but researchers have developed mathematical tools to mitigate the difficulty.
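To make the recipe concrete, here is a minimal sketch of the estimate in Python. The numbers and the work distribution are invented stand-ins for measured data, chosen only for illustration; real values would come from the experiment.

    import numpy as np

    # Thermal energy k_B * T, in the same units as the work values.
    # Roughly k_B * 300 K in piconewton-nanometers, a common single-molecule scale.
    kT = 4.1

    # Stand-in for work values measured over many fast pulls (invented data).
    rng = np.random.default_rng(seed=0)
    work = rng.normal(loc=25.0, scale=3.0, size=10_000)

    # Jarzynski's equality:  < exp(-W / kT) > = exp(-dF / kT),
    # so the equilibrium free-energy difference can be estimated as:
    delta_F_estimate = -kT * np.log(np.mean(np.exp(-work / kT)))

    print(f"average work:          {work.mean():.2f}")
    print(f"estimated Delta F:     {delta_F_estimate:.2f}")
    print(f"dissipated on average: {work.mean() - delta_F_estimate:.2f}")

One caveat: the exponential average is dominated by rare, low-work pulls, so the estimate converges slowly unless many trials are collected or more sophisticated estimators are brought in.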
In this way, fluctuation relations have provided detailed predictions about the world far from equilibrium. Their usefulness doesn't stop there.
During the 2000s, those of us who study how quantum physics changes classical concepts like work, heat and efficiency wanted in on the fun, even though our discipline introduces extra puzzles. Quantum uncertainty, for instance, makes it unclear how to define quantum work, let alone measure it.
Different researchers have proposed different definitions of quantum work. I think of the definitions as species. One species, the hummingbird, requires us to measure the quantum system only gently, since a heavy-handed measurement would disturb it, much as a hummingbird's wings fluttering right by your ear would disturb you. Another species focuses on average energy exchanges and sits in the middle of the pack. Still other definitions populate the quantum-thermodynamics literature.
Different definitions lead to different quantum fluctuation relations, and similar definitions can be adapted to different physical settings. Some relations are easier to test experimentally than others: one describes chaos in black holes, another describes the universe's expansion, and some have already been tested with trapped ions and quantum dots.
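To give one concrete illustration (my own choice of example, not a ranking of the species): in one widely used definition, the system's energy is measured once before the process and once after, and the work is declared to be the difference between the two outcomes. Under that definition, for a driven system that starts in equilibrium at temperature T, a quantum version of Jarzynski's equality holds in the same form as the classical one,

    ⟨ e^(−W / k_B T) ⟩ = e^(−ΔF / k_B T),

with the average now running over the random outcomes of the two energy measurements, collected across many trials.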
Will one equality rise to the top of the pile, like a monarch who bested all their relatives for the throne? I don't expect that. In my opinion, the usefulness of definitions and equations depends on which system you are interested in, how you poke it and how you measure it.
Physicists stereotypically prize unification: think of the hoped-for Theory of Everything that would unite all the fundamental forces. It's possible that some principle will likewise unify the quantum fluctuation relations and show them to be different sides of the same coin. Or maybe quantum thermodynamics is simply richer than other fields of physics.