
Worst-case performance

The term worst-case performance is used to describe the way an algorithm behaves under conditions that cause its performance to be as poor as possible. For example, a simple linear search has an average running time of about n/2 steps, but a worst-case performance of n steps, which occurs when the item to be found is the last item in the table (or is not present at all).
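A minimal sketch of such a search in Python may make the distinction concrete; the function name and test values here are illustrative, not part of any particular library:

```python
def linear_search(table, target):
    """Scan the table from the front; return the index of target, or -1.

    The cost is the number of comparisons made before the search stops:
    one comparison in the best case, n in the worst case.
    """
    for i, value in enumerate(table):
        if value == target:
            return i          # found: stop after i + 1 comparisons
    return -1                 # worst case: all n items were examined


table = list(range(1000))
assert linear_search(table, 0) == 0       # best case: first item
assert linear_search(table, 999) == 999   # worst case: last item, n comparisons
assert linear_search(table, -1) == -1     # also worst case: item absent
```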

For many algorithms, it is important to analyze worst-case performance as well as average performance. A classic example is Quicksort, which is, in the average case, a very fast algorithm. But if not implemented with great care, its worst-case performance can degrade to O(n²) (see Big O notation), and this happens, ironically, when the target table is already sorted.
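The following sketch shows one such careless implementation, assuming the naive choice of the first element as pivot (other pivot strategies avoid this particular worst case):

```python
import sys

def quicksort(items):
    """Naive Quicksort that always picks the first element as the pivot.

    On random input the expected cost is O(n log n), but on an already
    sorted list every partition is maximally unbalanced, giving O(n^2)
    comparisons and a recursion depth proportional to n.
    """
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]    # empty when input is already sorted
    larger = [x for x in rest if x >= pivot]    # everything else lands here
    return quicksort(smaller) + [pivot] + quicksort(larger)


sys.setrecursionlimit(10000)                    # sorted input recurses about n deep
print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))      # typical case: balanced partitions
print(quicksort(list(range(500)))[:5])          # worst case: already sorted input
```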

Worst-case performance analysis is often easier to do than average-case analysis. For many programs, determining what the "average input" is, is in itself difficult, and often that "average input" has characteristics which make it difficult to characterise mathematically (consider, for instance, algorithms that are designed to operate on strings of text). Similarly, even when a sensible description of a particular "average case" is possible (and it will probably only be applicable for some uses of the algorithm), it tends to result in equations that are more difficult to analyse.

See: sort algorithm, an area in which a great deal of performance analysis of various algorithms has been done.