But What About Entropy?
When I first shared the idea that self-organization and aggregation are unifying principles behind the emergence of complexity in the universe, someone immediately said, “But doesn’t that violate the second law of thermodynamics?” I wasn’t surprised. The second law is one of those grand scientific ideas that has drifted into popular culture, often in a garbled form. Many people have absorbed the notion that entropy is simply disorder, and that any emergence of order (anything that looks like pattern, structure, or complexity) must somehow be cheating the rules.
It isn’t.
In fact, the emergence of order through self-organization happens precisely because of thermodynamic principles, not in spite of them.
Let’s start by clarifying what entropy actually is. In the simplest terms, entropy is a measure of the number of microstates a system can take on without changing how it looks or behaves at the macroscopic level (its macrostate) (Boltzmann 1877). The more microstates consistent with a macrostate, the higher the entropy. In other words, entropy measures probability, not disorder.
Think of a pool table with the balls set in a neat triangle. The macrostate is “triangle arrangement,” but the microstates are the exact positions of the balls within that shape. If you swapped two balls, the macrostate would be unchanged. But when you “break,” the balls scatter across the table. The chance that they will reassemble into a tidy triangle is vanishingly small. Instead, the overwhelmingly likely outcome is that the balls are randomly distributed all over the table. And once the balls are scattered, further rearrangements barely register, because the macrostate (“balls all over the table”) has far more microstates (exact positions of each of the balls) associated with it than the original triangle. The entropy has increased.
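The counting behind this can be made concrete. Here is a small sketch in which the table is carved into 1,000 cells and the racked triangle occupies a 20-cell patch; both numbers are invented purely for illustration. A microstate is a choice of which cells the fifteen balls occupy, and the Boltzmann entropy is the logarithm of the number of such choices:

```python
from math import comb, log

BALLS = 15            # balls on the table
TRIANGLE_CELLS = 20   # cells inside the racked-triangle patch (illustrative)
TABLE_CELLS = 1000    # cells covering the whole table (illustrative)

# A microstate is a choice of which cells the balls occupy.
W_triangle = comb(TRIANGLE_CELLS, BALLS)   # ways to be "in the triangle"
W_scattered = comb(TABLE_CELLS, BALLS)     # ways to be "all over the table"

# Boltzmann entropy S = ln W (dropping the constant k_B).
S_triangle = log(W_triangle)
S_scattered = log(W_scattered)

print(f"microstates in triangle: {W_triangle:,}")
print(f"microstates scattered:   {W_scattered:.3e}")
print(f"entropy gain on breaking: {S_scattered - S_triangle:.1f} nats")
```

The scattered macrostate wins by dozens of orders of magnitude, which is why the break never runs in reverse: not because it is forbidden, but because it is absurdly improbable.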
The same principle applies to everyday life. A messy room has higher entropy than a tidy one because there are vastly more ways to be messy than to be perfectly ordered. Left alone, systems drift toward states with more possible arrangements (more entropy) simply because those states are overwhelmingly more probable.
Entropy is about probability. Systems with higher entropy appear more disordered because disorder is far more probable than order: almost any change you make to a system will either keep it equally disordered or increase its disorder.
This statistical view leads directly to the second law of thermodynamics, which says that in an isolated system, entropy cannot decrease. An isolated system is one that exchanges neither energy nor matter with its surroundings. But most of the systems we care about (stars, rivers, brains, economies, ecosystems) are not isolated. They are open systems, constantly exchanging energy and matter with their environment. And in open systems, local decreases in entropy are not only allowed; they're expected. The key is that any local increase in order must be paid for by a greater increase in entropy elsewhere.
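The bookkeeping behind "paid for elsewhere" is plain arithmetic. A refrigerator makes a good ledger: it pulls heat out of its cold interior, lowering the entropy there, and dumps that heat plus the compressor's work into the warmer room. The numbers below are invented but physically plausible for one cooling cycle:

```python
# Hypothetical numbers for one cooling cycle of a refrigerator.
T_cold = 275.0   # interior temperature, kelvin
T_hot = 295.0    # room temperature, kelvin
Q_cold = 1000.0  # heat extracted from the interior, joules
W_work = 200.0   # electrical work done by the compressor, joules
Q_hot = Q_cold + W_work  # heat dumped into the room (energy conservation)

dS_inside = -Q_cold / T_cold  # local entropy decrease: order created inside
dS_room = Q_hot / T_hot       # entropy increase in the surroundings
dS_total = dS_inside + dS_room

print(f"entropy change inside:  {dS_inside:+.3f} J/K")
print(f"entropy change in room: {dS_room:+.3f} J/K")
print(f"total:                  {dS_total:+.3f} J/K (second law: >= 0)")
```

The interior's entropy drops, but the room's entropy rises by more, so the total is positive. Every local pocket of order settles its account this way.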
Consider boiling water. As you heat a pot on the stove, energy enters from below. At a certain point, the water doesn’t just warm uniformly. Instead, it forms convection cells: graceful, rolling loops where hot water rises in the center, cools at the surface, and sinks at the edges (Chandrasekhar 1961). These cells are not chaotic; they are structured, repeating, and beautiful. Why do they form?
Because they are a more efficient way for the system to move heat from the bottom of the pot to the top. The system organizes itself to dissipate energy faster. The local pattern is a dissipative structure, a moment of order born from the drive toward greater global entropy.
It’s a bit like a ballroom dance. Imagine a room of people standing still. Suddenly, music starts to play (the heat source), and small groups begin moving in swirling patterns. These movements aren’t chaotic; they follow a rhythm, a flow. The dance floor becomes alive with circulating motion, but it’s all being driven by the energy entering from outside.
The same phenomenon occurs in the Belousov–Zhabotinsky (BZ) reaction, one of the most hypnotic examples of chemical self-organization (Zhabotinsky 1964). In a shallow dish, you mix a set of chemicals, and instead of quietly settling into equilibrium, the solution pulses with waves of color: orange to blue to clear to orange again. It looks alive, as though the chemicals are breathing. No one commands these cycles; they emerge spontaneously from the reactions themselves.
The BZ reaction is like a chemical drum circle. Imagine a group of drummers seated in a wide ring. At first, the beats are random. But as energy flows into the group, through excitement, anticipation, or momentum, a rhythm begins to emerge. One beat triggers another, and soon the entire circle pulses with coordinated waves. The rhythm wasn’t dictated from above. It arose from local feedback, from each drummer responding to the others. Just like the chemical oscillations, the pattern is sustained only so long as energy continues to flow.
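You can watch this rhythm emerge numerically. The sketch below integrates the two-variable "Oregonator," a standard simplified model of the BZ kinetics, with a forward Euler step; the parameter values are common textbook choices, not fitted to any particular experiment, and the variables are scaled, dimensionless concentrations:

```python
# Two-variable Oregonator model of the BZ reaction (a standard reduction
# of the full chemistry), integrated with forward Euler. Parameters are
# illustrative textbook-style values, not fitted to an experiment.
eps, q, f = 0.04, 0.002, 1.0  # timescale ratio, switching constant, stoichiometry
dt, t_end = 1e-4, 30.0

u, v = 0.3, 0.3  # scaled concentrations of the fast and slow species
history = []
for step in range(int(t_end / dt)):
    du = (u * (1.0 - u) - f * v * (u - q) / (u + q)) / eps
    dv = u - v
    u += dt * du
    v += dt * dv
    if step % 1000 == 0:  # sample every 0.1 time units
        history.append((step * dt, u, v))

u_vals = [uu for _, uu, _ in history]
print(f"u ranges over [{min(u_vals):.4f}, {max(u_vals):.4f}]")
# The concentration repeatedly spikes and crashes instead of settling
# to one equilibrium value: oscillation sustained by the energy stored
# in the reactants, with no conductor.
```

The spike-and-crash cycle is the numerical analogue of the color waves in the dish: each variable's change is driven only by the current local state, yet a global rhythm appears.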
These are not quirks or exceptions. They are examples of a general principle: open systems driven by energy flows often produce ordered, dynamic patterns as a way of increasing overall entropy more effectively.
In biology, we see this writ large. Living organisms are intricate hierarchies of aggregated structures (cells, tissues, organs, organisms), maintained by a constant intake and outflow of energy. The organization of life doesn't defy the second law. Life is powered by it. We eat, metabolize, and release heat and waste. In doing so, we increase entropy in our surroundings even as we maintain order within ourselves.
And this isn’t only true on Earth. Recent analysis of samples from the asteroid Bennu revealed the presence of organic molecules: sixteen amino acids used by life on Earth, all four DNA nucleobases, and uracil, a key part of RNA (Lauretta et al. 2024). These aren’t simple molecules; they are highly complex arrangements of atoms, with specific three-dimensional structures and delicate chemical properties. That such molecules, fundamental to the machinery of life, could arise spontaneously in the absence of living systems has long seemed improbable to many. And yet Bennu shows us otherwise. These molecules assembled through natural chemical processes, likely catalyzed by minerals as the asteroid formed, and their presence is a clear extraterrestrial demonstration that molecular complexity can arise through physical law alone.
Far from violating thermodynamics, the aggregation of matter into complex molecules, cells, and structures is one of its most compelling consequences. The universe builds patterns because those patterns are good at burning through energy. And the better a structure is at channeling energy flows, the more likely it is to persist, grow, and evolve.
Even evolution itself is a kind of ratchet. Natural selection preserves and builds upon structures that can survive and reproduce. This too is a kind of aggregation: useful arrangements of molecules lead to cells; useful arrangements of cells lead to organisms; useful behaviors lead to adaptive populations. At every step, there is order; and at every step, entropy is increasing overall.
Some physicists have proposed a broader principle: that systems naturally evolve toward states that maximize their entropy production (Dewar 2003). Whether or not this principle turns out to be universal, it captures a deep truth. Order doesn’t fight entropy. It rides it (a perspective aligned with Sean Carroll’s poetic naturalism; Carroll 2016).
When we see a snowflake, a tree, a spiral galaxy, or a human brain, we are seeing the output of energy flows and feedback loops that naturally generate structure. These structures are not anomalies in a universe of decay. They are part of the same choreography of thermodynamics, playing out in time.
Self-organization and aggregation are not only consistent with entropy; they may be the most beautiful ways that entropy moves through the world.
Even as entropy flows through us, shaping and sustaining every system we are part of, it also gives rise to something unexpected: moments. These quiet, vivid instants, seemingly mundane, are, in fact, the universe made visible at our scale.