How do you measure complexity? Is a sprawling, repetitive geometric pattern more complex than a short, chaotic splatter of ink? Is a library full of identical books more complex than a single, unique poem? Our everyday sense of complexity is vague. What was needed was a universal, objective yardstick—a way to measure the intrinsic complexity of any single object, be it a number, a text, or an image. The elegant and profound answer to this quest is Kolmogorov Complexity, an idea that defines complexity not by what something is, but by the shortest possible way to describe it.
The central insight behind Kolmogorov Complexity (also known as Algorithmic Complexity) is as simple as it is powerful:
The true complexity of an object is the length of the shortest possible computer program that can produce it as output.
This shifts our perspective entirely. Instead of looking at the size or appearance of an object, we look for its hidden patterns and regularities. If an object has a simple underlying structure, it can be described by a short set of instructions—a short “recipe.” If it is patternless and chaotic, the only way to describe it is to write it out in full.
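For readers who like symbols: relative to some fixed universal computer $U$, the idea is conventionally written as

```latex
K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}
```

that is, the length of the shortest program $p$ that makes $U$ print $x$. The invariance theorem guarantees that switching to a different universal computer changes this value by at most an additive constant, which is why it makes sense to talk about "the" complexity of an object at all.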
Analogy: The Tale of Two Strings
Imagine you need to describe two different strings of text to a friend over the phone.
String 1: abababababababababababababababab (32 characters). This one is easy: you just say, "Repeat 'ab' sixteen times." The description is far shorter than the string itself.

String 2: 8kH&tZ@Lp9#Js4VfQ7c$XwE*n2G!uM (32 characters). Here there is no shortcut. The only reliable description is to read out every character, one by one—the description is as long as the string.

This leads to a profound connection: complexity is incompressibility.
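You can see this asymmetry with an ordinary compressor standing in for the ideal "shortest program." A real compressor like zlib only gives an upper bound on Kolmogorov complexity, but the contrast between the two strings still shows up clearly:

```python
import zlib

patterned = b"abababababababababababababababab"  # pure repetition
chaotic = b"8kH&tZ@Lp9#Js4VfQ7c$XwE*n2G!uM"      # no apparent pattern

# zlib discovers the "repeat 'ab'" recipe and shrinks the first string;
# the chaotic string gains nothing but header overhead.
print(len(zlib.compress(patterned, 9)), "vs original", len(patterned))
print(len(zlib.compress(chaotic, 9)), "vs original", len(chaotic))
```

The patterned string compresses below its original size, while the chaotic one actually comes out slightly larger once the compressor's bookkeeping is added.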
Kolmogorov complexity gives us a formal way to define the entire spectrum from absolute order to absolute randomness.
Objects with low complexity are full of patterns and redundancy.
Example: The number Pi (π). The first million digits of Pi form an incredibly long string of numbers. However, the Kolmogorov complexity of these digits is very low. Why? Because a relatively short computer program (an algorithm to calculate Pi) can generate all one million digits (and more). The complex output comes from a simple generative process.
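Here is one such short recipe: a dozen lines of integer arithmetic, using Machin's classical formula π = 16·arctan(1/5) − 4·arctan(1/239), that can produce as many digits of Pi as you ask for. The program's length is fixed; its output is unbounded.

```python
def arctan_inv(x, scale):
    """Integer-arithmetic arctan(1/x), scaled up by `scale` (Taylor series)."""
    total, term, k, sign = 0, scale // x, 0, 1
    while term:
        total += sign * (term // (2 * k + 1))
        term //= x * x  # next term of the series: divide by x^2
        k += 1
        sign = -sign    # the series alternates in sign
    return total

def pi_digits(n):
    """First n decimal digits of Pi, as a string, via Machin's formula."""
    scale = 10 ** (n + 10)  # 10 guard digits absorb truncation error
    pi = 16 * arctan_inv(5, scale) - 4 * arctan_inv(239, scale)
    return str(pi)[:n]

print(pi_digits(30))  # 314159265358979323846264338327
```

A few hundred bytes of source code, yet `pi_digits(1_000_000)` would (slowly) print a million digits: the string is long, but its shortest description is short.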
An object is considered algorithmically random if it is incompressible. There is no hidden pattern, and the shortest program that can produce it is essentially the instruction “print the object.”
Example: A Fair Coin Toss. The result of a series of truly random coin flips, like H-T-T-H-T-H-H-H-T-H-T-T, is likely to be algorithmically random. There is no simple rule that generated it. While you might find small, coincidental patterns, there is no underlying recipe that is shorter than just listing the results. This gives us one of the most powerful definitions of randomness ever conceived.
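The same compressor test illustrates this. If we pack fair coin flips into bits (one bit per flip, so no encoding slack is left on the table), a general-purpose compressor can do essentially nothing with them, while the output of a "coin" that always lands heads collapses to a few bytes:

```python
import random
import zlib

random.seed(42)  # fixed seed so the demo is reproducible

# 8000 fair coin flips, packed 8 per byte (1 bit per flip)
random_flips = bytes(random.getrandbits(8) for _ in range(1000))
# 8000 flips of a degenerate always-heads coin, packed the same way
constant_flips = bytes(1000)  # all zero bits

print(len(zlib.compress(random_flips, 9)))    # ~1000 bytes: no shorter recipe found
print(len(zlib.compress(constant_flips, 9)))  # a few bytes: "print 0, a thousand times"
```

The random sequence has no recipe shorter than itself; the constant one is almost pure redundancy.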
Here is the subtle and crucial twist: Kolmogorov complexity is uncomputable.
This means that for any given object, we can never be 100% certain that we have found the absolute shortest program to generate it.
Analogy: The Shortest Story Competition
Imagine a competition to write the shortest possible story that perfectly captures the essence of War and Peace.
The problem is, we can never know for sure if we’ve reached the end. There’s no master algorithm that can look at a story and tell us, “Yes, this is the absolute shortest possible description that exists.” We can find short programs, but we can never be certain we’ve found the shortest.
Because of this, Kolmogorov complexity is not a practical tool you can use to measure things every day. It is a theoretical ideal—a philosophical and mathematical concept that defines the absolute limit of compression and the true nature of complexity.
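What we can do in practice is bound it from above: every compressor that reproduces an object exactly gives an upper bound on its complexity, and different compressors give different bounds. None of them can ever certify that its bound is the true minimum:

```python
import bz2
import lzma
import zlib

s = ("the quick brown fox jumps over the lazy dog " * 50).encode()

# Three different upper bounds on the complexity of the same object.
for name, compress in [("zlib", zlib.compress), ("bz2", bz2.compress), ("lzma", lzma.compress)]:
    print(name, len(compress(s)))
```

All three find the repetition and beat the raw length of 2200 bytes by a wide margin, but each stops at a different size, and an even shorter description might always exist.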
Kolmogorov complexity and Shannon entropy are related but measure fundamentally different things.
Shannon entropy is about probabilities and averages. It looks at the process that generates information and tells you the average amount of surprise you should expect.
It answers the question: “If I have a biased coin that lands on heads 90% of the time, how much information, on average, does each flip give me?” It’s a property of the source of the data.
Kolmogorov complexity is about a single, specific object that already exists. It doesn’t care about the process that made it or the probabilities of other outcomes. It just looks at the final artifact and asks, “What’s the shortest recipe for this exact thing?”
It answers the question: “Here is a specific sequence of 100 coin flips: HHHHH…H. What is the complexity of this particular sequence?” The answer is very low (print ‘H’ 100 times), even if the coin used was fair. It’s a property of the individual string itself.
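The distinction is easy to make concrete. Below, the entropy calculation describes the source (a fair coin yields one bit of surprise per flip), while the compression step describes one particular outcome, which happens to be almost maximally redundant:

```python
import math
import zlib

def source_entropy(p_heads):
    """Shannon entropy in bits per flip -- a property of the coin, not of any outcome."""
    return -sum(p * math.log2(p) for p in (p_heads, 1 - p_heads) if p > 0)

print(source_entropy(0.5))  # 1.0: a fair coin carries one bit of surprise per flip

# ...and yet this specific 100-flip outcome has a tiny description:
seq = b"H" * 100
print(len(zlib.compress(seq, 9)))  # a few bytes, standing in for "print 'H' 100 times"
```

A fair coin *can* produce HHHH…H; it is just astronomically unlikely to. Entropy averages over all outcomes the source could emit, while Kolmogorov complexity judges the one string actually in front of you.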
Kolmogorov complexity provides a profound and elegant lens through which to view the world. It suggests that the universe is a mix of simple, compressible phenomena governed by concise physical laws (like the equations for gravity), and incompressible, random events. It gives us a formal definition for randomness and a theoretical yardstick for complexity, even if that yardstick is one we can never fully grasp. It reminds us that the simplest explanation is often the best, and that true complexity lies not in size, but in the stubborn refusal of an object to be described by anything other than itself.