High Entropy Markets: When Nobody Knows What Anything Means

Imagine watching a Marie Kondo show in reverse. Marie walks backwards into a meticulously cleaned and organized apartment and begins untidying things. Clothing is unfolded and scattered; books are taken from shelves and placed randomly around the apartment. By the time Marie leaves the apartment, backwards, it looks like a hoarder lives there.

Backwards Marie, as we will call this monster, can be thought of as an agent of Entropy. Entropy is a core concept in Physics and Information Theory, and I find it a useful tool for thinking about markets that are particularly interesting.

The Physics of Confusion?

Rudolf Clausius defined Entropy as a measure of the energy in a system that is unavailable to do work. Energy is conserved (the First Law of Thermodynamics), but thanks to the Second Law of Thermodynamics the total Entropy of a closed system cannot decrease over time. A nice hot cup of coffee sits on a kitchen counter. An hour later you remember it, after answering a ‘few’ emails, and it is at room temperature. The kitchen as a whole contains the same total energy as before, but the heat from your sad, forgotten coffee has dissipated. Entropy has increased.

Richard Feynman liked to explain entropy differently, and I think his version is more useful for our purposes. Entropy, Feynman argued, is the logarithm of the number of ways a system can be internally arranged while appearing the same to us externally. We can get at this explanation of Entropy, and its tendency to increase, by thinking about microstates and macrostates. Macrostates are nice and easy: things like temperature are macrostates. Microstates are the arrangements of small (micro) things that give you the same macrostate. Think of a deck of cards. If I tell you there’s a deck perfectly sorted by suit and rank on my desk, there’s exactly one microscopic arrangement that satisfies that description. But if I tell you the deck is “shuffled”? There are about 10^68 different arrangements that all look equally “shuffled” to us. The shuffled deck has higher entropy because there are vastly more ways to be shuffled than sorted.
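The deck-of-cards arithmetic is easy to check for yourself; a few lines of Python reproduce the figure above:

```python
import math

# Number of distinct orderings (microstates) of a standard 52-card deck: 52!
microstates = math.factorial(52)

# Feynman's framing: entropy is (proportional to) the log of the microstate count.
# Base-10 log recovers the "about 10^68" figure quoted above.
print(f"52! is about 10^{math.log10(microstates):.0f}")  # prints "52! is about 10^68"

# A perfectly sorted deck is exactly one microstate: log(1) = 0, minimal entropy.
print(math.log10(1))  # prints 0.0
```

The choice of logarithm base only rescales the number; physicists use the natural log (times Boltzmann's constant), but base 10 makes the comparison to "10^68" direct.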

Interestingly, the microstate explanation shows us that things don’t have to cool down to increase in entropy. An ice cube has lower entropy than a puddle of water, which has lower entropy still than water vapor. Each phase transition opens up more possible arrangements for the molecules. More possibilities, more entropy.

This is all the physicist’s definition of Entropy. Information Theory borrows the idea of Entropy, and there are deep connections between the two at the level of the mathematics.

Information entropy measures uncertainty, that is, how surprised you’ll be by the next bit of data. A fair coin flip has one bit of entropy because either outcome is equally likely. A two-headed coin has zero entropy—there’s no uncertainty, no information gained by observing the flip. The more unpredictable a message, the higher its entropy.
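The coin examples can be made concrete with a short sketch of Shannon’s formula, H = -Σ p·log2(p), summed over the possible outcomes:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin -> 1.0 bit of uncertainty
print(shannon_entropy([1.0, 0.0]))  # two-headed coin -> 0.0 bits, no surprise
print(shannon_entropy([0.9, 0.1]))  # biased coin -> roughly 0.47 bits
```

The biased coin sits between the two extremes: you are usually right about the outcome, so each flip tells you less than a full bit.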

High entropy means high unpredictability, multiple equally plausible interpretations. Low entropy means the message is predictable, the meaning clear. And this leads to the idea of a High Entropy Market.

What is a ‘High Entropy Market’?

A High Entropy Market is a market where nobody agrees on what anything means, where the underlying technology is shifting too fast for stable categories, and where value can be difficult to extract because the ‘energy’ is too dispersed.

In this context, high entropy manifests as confusion along multiple dimensions: technological feasibility, market applications, customer readiness, business models, and regulatory frameworks. This differs from merely “high risk” markets: high entropy specifically involves confusion about the fundamental nature of the opportunity itself.

Core Characteristics of High Entropy Markets

Definitional Ambiguity: The market lacks consensus on basic terminology. As an experiment, try researching a definition of the following terms:

  • Agent
  • 6G
  • Quantum Computing
  • Synthetic Biology
  • Blockchain and Web3

For at least some of these, you will likely find that (a) everyone is using the same words differently, or (b) everyone is using different words for the same underlying concepts.

Value Location Uncertainty: It isn’t clear where in the ecosystem value accrues. Answering the question ‘Where Is The Money?’ can be confusing. In thermodynamics, useful work comes from gradients, that is, differences you can exploit. High entropy means gradients are not obvious (or are not there yet).

Competing Paradigms: Multiple incompatible mental models compete for dominance. Some see AI as automation, others as augmentation. Some think in terms of agents, others in terms of workflows. This is an extension of Definitional Ambiguity above, but more about conflict over what a thing is for. It is where the old chestnut of ‘a solution in search of a problem’ comes in, and why people used to talk about finding ‘The Killer App’ for X.

Rapid Capability Shifts: In a High Entropy Market, one challenge in determining what something means, what it is for, and how we can predictably make money from it is that the ground shifts under one’s feet. The underlying technology evolves so quickly that business models built on current limitations become obsolete before they mature.

Evaluating Entropy in a Market

Entropy, as we are loosely applying it to markets, shows up along at least three dimensions that we can evaluate:

  • Technical Uncertainty. That is, what is actually possible?
  • Economic Uncertainty. Who gets money, for what, and how?
  • Semantic Uncertainty. What are we talking about?
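These dimensions resist precise measurement, but as a thinking aid, here is a deliberately toy rubric in Python. The scales and the example scores are entirely hypothetical illustrations, not a real methodology:

```python
# Toy rubric: score each dimension from 0 (settled) to 1 (total confusion).
# The mean is a rough "market entropy" indicator, nothing more rigorous.

def market_entropy(technical: float, economic: float, semantic: float) -> float:
    scores = (technical, economic, semantic)
    assert all(0.0 <= s <= 1.0 for s in scores), "scores must be in [0, 1]"
    return sum(scores) / len(scores)

# Example: early cloud computing, circa 2007 (illustrative guesses only).
# The tech mostly worked; who pays whom for what was murkier; the vocabulary
# ("utility computing"? "virtualization"?) was murkiest of all.
print(round(market_entropy(technical=0.4, economic=0.8, semantic=0.9), 2))  # prints 0.7
```

Tracking how such scores move over time matters more than any single snapshot: a market whose semantic score is falling is one where categories are crystallizing.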

We can keep these dimensions in mind when we face the ultimate question of So What? But first, some motivating examples.

High Entropy Market Examples

The AI orchestration space is a perfect example. Every day brings new frameworks, new abstractions, new ways of thinking about agent coordination. The terminology shifts faster than documentation can be written. Companies pivot their entire positioning quarterly as they try to find stable ground in the categorical chaos.

Early cloud computing was another high-entropy market. Was it virtualization? Utility computing? Application hosting? The market didn’t know what it was selling or buying. Amazon’s genius wasn’t technical - it was categorical. They reduced entropy by defining clear service boundaries: compute (EC2), storage (S3), database (RDS). They created low-entropy pockets in a high-entropy space.

Blockchain markets remain high-entropy despite years of development. The definitional ambiguity persists: is it about currency, contracts, or coordination? The value location remains unclear: protocol layer, application layer, or infrastructure? The competing paradigms haven’t resolved: decentralization versus efficiency, permissioned versus permissionless.

Pushing Back

This is where we get to the So What part. Something interesting about Entropy is that Life pushes in the opposite direction. Scientists who study this talk about systems far from equilibrium: coffee cups that are hot, card decks that are organized, and living things that stay organized in the face of the Second Law. We (living things) do this by investing energy into organizing things (hence, perhaps, ‘The Life-Changing Magic of Tidying Up’) and dumping Entropy elsewhere.

As in the examples of High Entropy Markets above, an investment of energy to reduce Semantic and Economic Uncertainty helped push back against high entropy in the cloud computing space. And here’s the thing: markets that are interesting (i.e. valuable, but also fun to think about) do move away from a High Entropy state. This is where markets defy Physics and our analogy ceases to be useful, but up to that point the following would be an interesting experiment:

  • Think of a High Entropy Market you are interested in.
  • See if you can evaluate where it sits in terms of the three dimensions of uncertainty mentioned above (Technical, Economic and Semantic)
  • Next, and this is the fun part I think, see if there are signs of the uncertainty decreasing along any of these vectors. Better yet, is there anything you can do to resolve this uncertainty for yourself?