
Chaitin's constant

Chaitin's constant, Ω (capital omega), also called the halting probability, is a construction by Gregory Chaitin. For a given model of computation or programming language, Ω is the probability that a randomly produced bit string will represent a program that, when run, eventually halts.

The fact that this number can be defined is important because the question of whether an individual program halts is not decidable by any general algorithm (see halting problem). The number Ω can be defined, but it cannot be computed: we don't know its value for any Turing-complete programming language, nor are we ever likely to (unless physics lets us build a hypercomputer).

It is important to realize that Chaitin's constant is not a constant in the usual sense: it is not a fixed, canonically defined number such as π or e, since its definition depends on the arbitrary choice of computation model and program encoding. It should more properly be referred to as "Chaitin's construction".

To define Ω formally, we first need to fix a (Turing-complete) model of computation, for instance Turing machines or Lisp or Pascal programs. We then need to specify an unambiguous encoding of programs (or machines) as bit strings. This encoding must have the property that if w encodes a syntactically correct program, then no proper prefix of w encodes a syntactically correct program. This can always be achieved by using a special end symbol. We only consider programs that don't require any input.
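The prefix-free property described above is easy to state operationally: among all valid encodings, no codeword may be a proper prefix of another. A minimal sketch of such a check, using made-up codewords purely for illustration:

```python
# Check that a set of candidate program encodings is prefix-free,
# i.e. no codeword is a proper prefix of another codeword.
# The example codewords below are hypothetical, not from any real encoding.

def is_prefix_free(codes):
    """Return True if no string in `codes` is a proper prefix of another."""
    for a in codes:
        for b in codes:
            if a != b and b.startswith(a):
                return False
    return True

print(is_prefix_free(["00", "01", "10", "110", "111"]))  # True
print(is_prefix_free(["0", "01", "11"]))  # False: "0" is a prefix of "01"
```

The first set is a valid prefix code; the second is rejected because reading "0" leaves it ambiguous whether the program has already ended.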

Let P be the set of all programs which halt. Ω is then defined as:

    Ω = Σ_{p ∈ P} 2^(−|p|)

This is an infinite sum with one summand for every syntactically correct program which halts; |p| stands for the length of the bit string of p. The above requirement that programs be prefix-free ensures that this sum converges to a real number between 0 and 1.
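The sum can be illustrated with a toy example. The codewords below are hypothetical stand-ins for encodings of halting programs; computing the real Ω would require deciding the halting problem, which is impossible.

```python
# Toy illustration of the sum defining Omega: for each (hypothetical)
# prefix-free encoding p of a halting program, add 2^(-|p|).
# Exact rational arithmetic via Fraction keeps the partial sum precise.

from fractions import Fraction

halting_programs = ["00", "010", "110"]  # made-up halting-program encodings

omega_partial = sum(Fraction(1, 2 ** len(p)) for p in halting_programs)
print(omega_partial)  # 1/4 + 1/8 + 1/8 = 1/2
```

Each term 2^(−|p|) is exactly the probability that |p| fair coin flips produce the string p, which is why the prefix-free condition makes the terms of the sum disjoint events.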

It can then be shown that Ω represents the probability that a randomly produced bit string will encode a halting program. This means that if you start flipping coins, always recording a head as a one and a tail as a zero, the probability is Ω that you will eventually reach the encoding of a syntactically correct halting program.
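This coin-flipping interpretation can be sketched as a Monte Carlo experiment. The codewords and their halting status below are made up; the point is only that the empirical hit rate for "halting" codewords approaches the toy value of Ω.

```python
# Monte Carlo sketch of the coin-flipping interpretation of Omega:
# flip random bits until the string read so far matches a codeword of a
# complete prefix-free code, then record whether that codeword is marked
# as a halting program. All codewords here are hypothetical.

import random

halting = {"00", "010"}        # made-up encodings of halting programs
non_halting = {"011", "1"}     # made-up encodings of non-halting programs
codes = halting | non_halting  # a complete prefix code: every bit stream hits one

def flip_until_program(rng):
    """Append random bits until the accumulated string is a codeword."""
    s = ""
    while s not in codes:
        s += rng.choice("01")
    return s

rng = random.Random(0)
trials = 100_000
hits = sum(flip_until_program(rng) in halting for _ in range(trials))
print(hits / trials)  # close to the toy Omega = 1/4 + 1/8 = 0.375
```

For this toy code the exact halting probability is 2^(−2) + 2^(−3) = 0.375, and the simulated frequency converges to it as the number of trials grows.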

One can prove that there is no algorithm which produces the digits of Ω: Ω is definable but not computable. Furthermore, Ω is a normal number.

If you fix, in addition to the computation model and encoding mentioned above, a specific consistent axiomatic system for the natural numbers, say Peano's axioms, then there exists a constant N such that no digit of Ω after the N-th can be proven to be one or zero within that system. (The constant N heavily depends on the encoding choices and does not reflect the complexity of the axiomatic system in any way.) This is an incompleteness result akin to Gödel's incompleteness theorem and Chaitin's own result mentioned under algorithmic information theory.