A Tale of Two Levels

I imagine a world made entirely of particles. They are bumping around like billiard balls, colliding, exchanging momentum, following precise trajectories. To me, they are moving slowly enough that I can tell exactly what will happen next. When this particle strikes that one, momentum is transferred, directions change, and I can predict their future states with exquisite precision.

I am what is known in physics as Laplace’s Daemon. I live at the micro-level.

At this level, there is no such thing as order or disorder. There is no storm and no calm. There are only particles changing their states through time according to fixed laws. Nothing is surprising, and nothing is uncertain, not because the system is simple, but because nothing is being summarized.
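To see that determinism in miniature, here is a toy sketch, with invented particles on a line; for equal masses, a 1D elastic collision simply swaps velocities, so the full microstate fixes every future state:

```python
# Toy deterministic micro-world: three equal-mass particles on a line.
# The microstate (positions, velocities) fixes the entire future;
# stepping it forward involves no probabilities at all.
positions = [0.0, 1.0, 3.0]
velocities = [1.0, -1.0, 0.0]
dt = 0.01

for _ in range(1000):
    # Move every particle along its trajectory.
    positions = [x + v * dt for x, v in zip(positions, velocities)]
    # Equal-mass 1D elastic collision: approaching neighbors swap velocities.
    for i in range(len(positions) - 1):
        if positions[i] >= positions[i + 1] and velocities[i] > velocities[i + 1]:
            velocities[i], velocities[i + 1] = velocities[i + 1], velocities[i]

# Same initial state, same future, every single run.
print(positions, velocities)
```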

Now I zoom out.

I am no longer Laplace’s Daemon. I am a fisherman in a small boat, caught in a storm. The wind is howling. Waves crash over the bow. The boat pitches violently beneath my feet. I am frightened. I am struggling to stay alive. I am trying to judge whether the storm is weakening or growing worse, whether I should ride it out or attempt to escape.

At this level, everything feels chaotic. Order and disorder suddenly matter a great deal. Prediction is difficult. Survival is uncertain.

And yet the same stuff makes up both scenes.

The particles are still there. They make up the water, the air, the boat, and even my own body. Nothing about the underlying physics has changed. But at this level, I cannot track particles. I must attend to large-scale patterns, to waves, winds, pressures, and forces that emerge only when countless micro-events are taken together.

Here, entropy becomes meaningful.

When I am Laplace’s Daemon, I do not care what a storm is, or a boat, or a fisherman. Those concepts do not exist at my level of description. All I see are particles evolving predictably through time.

But when I am the fisherman, the microstate is irrelevant. I live in a world of macrostates, where patterns matter more than particles, and where uncertainty is not a failure of knowledge but a feature of the level at which I must act.
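A minimal sketch makes the contrast concrete. Take coin flips as stand-in particles: at the micro-level every exact sequence is equally likely, and entropy only appears once sequences are summarized into macrostates. The numbers below are illustrative:

```python
from math import comb, log

N = 100  # two-state "particles": heads or tails

# Micro-level: a microstate is one exact sequence of N flips.
# All 2**N sequences are equally likely, so at this level there is
# no order or disorder, only states evolving by fixed rules.
print(f"microstates: 2**{N} = {2**N:.3e}")

# Macro-level: summarize a sequence by its head count k. Each
# macrostate lumps together W = C(N, k) microstates, and Boltzmann
# entropy S = ln W measures how much that summary throws away.
for k in (0, 10, 50):
    W = comb(N, k)
    print(f"k = {k:>2} heads: W = {W:.3e} microstates, S = ln W = {log(W):.2f}")
```

The all-heads macrostate (k = 0) corresponds to exactly one microstate and has zero entropy; the balanced macrostate (k = 50) lumps together about 10^29 of them. Order and disorder only become meaningful once the summarizing has been done.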

The mistake is to imagine that one of these perspectives cancels the other. They do not compete. They coexist, each complete within its own domain, each blind to what only the other can see.

The shift from particles to storms is not unique to physics. It is simply a particularly vivid example of a pattern that appears again and again whenever complex systems organize themselves into layers.

In the brain, individual neurons fire or remain silent, governed by electrochemical dynamics that are local and mechanical. At that level, there are no beliefs, no intentions, and no meanings, only spikes propagating through networks. And yet, when we zoom out, those same neurons participate in thoughts, memories, emotions, and plans. No neuron contains a thought, just as no particle contains a wave. Thought exists only at a higher level of organization, where the activity of many neurons is summarized into stable patterns.

The same shift appears in social systems. Individual people make choices, speak words, and take actions for reasons that are local to them. But societies exhibit properties that no individual possesses: norms, institutions, cultures, economic trends. Inflation is not something a person does. A recession is not located in any single decision. These are macroscopic patterns that emerge only when countless individual actions are taken together.

In each case, the mistake is the same. We imagine that because the lower level is more detailed, it must be more real or more explanatory. But detail is not explanation. A complete description at one level does not render another level redundant; it simply leaves the other level's patterns inaccessible from within its own terms.

The same misunderstanding now appears in discussions of artificial intelligence.

Modern transformer models are often described as “just predicting the next token.” This description is technically correct and profoundly misleading.

At the micro-level, the model does exactly that. Given a sequence of tokens, it computes a probability distribution over possible next tokens and samples from it. Nothing more. There is no explicit database of facts, no symbolic world model, no internal narrator that understands what it says.
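To make that micro-level step concrete, here is a minimal sketch of one decoding step; the vocabulary and logits are invented for illustration, not taken from any real model:

```python
import math
import random

# Invented toy vocabulary and logits standing in for a model's raw
# output; an actual transformer scores tens of thousands of tokens.
vocab = ["the", "boat", "storm", "calm", "."]
logits = [2.1, 0.3, 1.7, -0.5, 0.0]

# Softmax turns raw scores into a probability distribution
# (subtracting the max keeps the exponentials numerically stable).
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# One weighted draw. This single step is all the micro-level does,
# repeated once per token.
next_token = random.choices(vocab, weights=probs, k=1)[0]
print({t: round(p, 3) for t, p in zip(vocab, probs)}, "->", next_token)
```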

But when we zoom out, something else becomes visible.

At the macro-level, the model behaves as if it has a working model of the world. It can answer arbitrary questions, reason across domains, maintain consistency over long stretches of text, and adapt its responses to novel situations. These abilities do not reside in any single token prediction, just as a storm does not reside in any single particle collision.

The world model is not a separate module hidden inside the system. It is an emergent pattern, a large-scale regularity in how predictions are constrained across many layers and many tokens. Asking where the world model is stored is like asking where a wave is stored in the ocean. It is not stored. It is enacted.

Confusion arises when we try to explain behavior at one level using the concepts of another. To someone focused only on tokens, the model’s apparent understanding looks like illusion or sleight of hand. To someone focused only on behavior, the token-level mechanics seem irrelevant. Both perspectives are incomplete on their own.

What these examples share is not metaphor, but structure. Complex systems do not merely accumulate parts. They organize them. When they do, new levels of description become not only useful but unavoidable. Each level brings with it its own objects, its own regularities, and its own notions of order and disorder.

To live at one level is to be blind to another, not because one is wrong, but because no single vantage point can see the whole hierarchy at once.