Algorithmic complexity, also known as Kolmogorov complexity, is a foundational idea in both computational complexity theory and algorithmic information theory, and plays an important role in formal induction.
The algorithmic complexity of a binary string is defined as the length of the shortest program that can produce the string as output. Though there are infinitely many programs that can produce any given string, one program or group of equally short programs will always be the shortest. There is no algorithmic way of finding the shortest program that outputs a given string; this is one of the early results of computability theory. Even so, we can compute upper bounds and make educated guesses. This quantity, the algorithmic complexity of a string, turns out to be very important for proofs related to computability.
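One common way to make the "educated guess" concrete is compression: the compressed form of a string, together with a decompressor, is itself a program that reproduces the string, so its length gives a computable upper bound on the string's algorithmic complexity. A minimal sketch, using Python's standard `zlib` library:

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable UPPER BOUND on
    algorithmic complexity. (The true value is uncomputable.)"""
    return len(zlib.compress(data, level=9))

patterned = b"01" * 500        # highly regular: a short program suffices
random_ish = os.urandom(1000)  # no apparent pattern: little to compress away

print(complexity_upper_bound(patterned))   # small: the repetition is exploited
print(complexity_upper_bound(random_ish))  # near 1000: almost incompressible
```

The gap between the two results illustrates the core intuition: patterned strings have low algorithmic complexity, while typical random strings of length n have complexity close to n.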
Since any physical object or property can in principle be described to near-exhaustion by a string of bits, objects and properties can be said to have algorithmic complexity as well. Indeed, reducing real-world objects to the programs that produce them as output is one way of viewing the enterprise of science. The complex objects around us tend to come from three main generating processes: emergence, evolution, and intelligence, with the objects produced by each tending toward greater algorithmic complexity.
Computational complexity is a notion frequently used in theoretical computer science to determine the relative difficulty of computing solutions to wide classes of mathematical and logical problems. More than 400 complexity classes have been catalogued, and additional classes are continually being defined. The famous P = NP question concerns the relationship between two of these classes. Complexity classes include problems far harder than anything one might confront in mathematics up through calculus; many problems in computational complexity theory would require an astronomically large amount of time to solve.
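The P = NP question can be given a rough flavor with a small sketch. For subset sum, a problem known to be NP-complete, checking a proposed answer takes time polynomial in the input size, while the obvious search examines every subset, 2^n of them in the worst case. The function names below are illustrative, not from any standard library:

```python
from itertools import combinations

def verify(nums, subset, target):
    """Polynomial-time check of a proposed certificate."""
    return set(subset) <= set(nums) and sum(subset) == target

def brute_force(nums, target):
    """Naive exponential-time search over all 2^n subsets."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None  # no subset sums to the target

nums = [3, 34, 4, 12, 5, 2]
solution = brute_force(nums, 9)
print(solution, verify(nums, solution, 9))
```

Whether every problem whose answers can be verified this quickly can also be *solved* in polynomial time is exactly what P = NP asks, and it remains open.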
Algorithmic complexity and related concepts were developed in the 1960s. Andrey Kolmogorov, Ray Solomonoff, and Gregory Chaitin each made foundational contributions in the mid-1960s with algorithmic information theory. The principle of minimum message length, closely related to algorithmic complexity, provides much of the foundation of statistical and inductive inference and machine learning.