The minute details of a function for arbitrarily large values of its argument seldom matter. Often it is enough to compare its overall behaviour to standard sets of mathematical functions, so as to qualify the function at hand within a restricted set of significant behaviours (bounded, oscillating, divergent, etc.), and to indicate the speed with which it reaches this asymptotic behaviour.

For instance, if a function decays to zero faster than some square-integrable function, it is known to have a well-defined Fourier transform, no matter what the function actually looks like.

To classify functions in this way, one has recourse to the **asymptotic notation**. Knuth [1] advocated the following notations (others are used in various contexts, e.g., the Landau and Hardy notations):

- O-notation (big-oh)
- o-notation (little-oh)
- Ω-notation
- ω-notation
- Θ-notation

The notation is highly conventional, and the equals sign must not be assigned its usual mathematical meaning. For instance, *g =* Θ*(f)* and *h =* Θ*(f)* do not imply that *g = h*. A better notation would be *g ∈* Θ*(f)*, as Θ*(f)* (and the others) can be regarded as the set of functions sharing a common asymptotic behaviour.

The **O-notation** means that the function is bounded from above by a function which varies like *f*, i.e., *g = O(f)* if there exist a constant *C > 0* and a value *x₀* such that *|g(x)| ≤ C |f(x)|* for all *x > x₀*.
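The upper bound behind the O-notation can be illustrated numerically by checking a candidate witness pair over a range of large arguments. A minimal sketch (the functions *g*, *f* and the constants `C`, `x0` are illustrative choices, not from the text; sampling a finite range is not a proof):

```python
def is_bounded_above(g, f, C, x0, xs):
    """Check |g(x)| <= C * |f(x)| for every sampled x > x0."""
    return all(abs(g(x)) <= C * abs(f(x)) for x in xs if x > x0)

g = lambda x: 3 * x**2 + 10 * x   # g(x) = 3x^2 + 10x
f = lambda x: x**2                # reference f(x) = x^2

# With C = 4 and x0 = 10: for x >= 10 we have 10x <= x^2,
# hence 3x^2 + 10x <= 4x^2, so the check passes on any sampled range.
print(is_bounded_above(g, f, C=4, x0=10, xs=range(11, 10_000)))  # True
```

Any single witness pair *(C, x₀)* that works suffices to establish *g = O(f)*; the sampled check merely fails fast when the chosen pair is wrong.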

The **o-notation** restricts the O-notation to mean that the function is bounded from above but that the bound is not *tight*, i.e., the function does not tend towards its o-reference. Formally, *g = o(f)* if for every *C > 0* there exists *x₀* such that *|g(x)| ≤ C |f(x)|* for all *x > x₀*; equivalently, *g(x)/f(x) → 0* as *x → ∞*.

The **Ω-notation** is the counterpart of the O-notation for functions bounded from below: *g =* Ω*(f)* if there exist a constant *C > 0* and a value *x₀* such that *|g(x)| ≥ C |f(x)|* for all *x > x₀*.

The **Θ-notation** combines the two: *g =* Θ*(f)* if the function can be sandwiched by two representatives of its Θ-reference, i.e., there exist constants *C₁, C₂ > 0* and a value *x₀* such that *C₁ |f(x)| ≤ |g(x)| ≤ C₂ |f(x)|* for all *x > x₀*. For instance *x² + x =* Θ*(x²)*, since *x² ≤ x² + x ≤ 2x²* for all *x ≥ 1*.
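The two-sided sandwich can be checked numerically for the example *x² + x* against the reference *x²*. A minimal sketch (the constants `C1 = 1`, `C2 = 2`, `x0 = 1` are one valid witness triple; sampling is an illustration, not a proof):

```python
def is_sandwiched(g, f, C1, C2, x0, xs):
    """Check C1*|f(x)| <= |g(x)| <= C2*|f(x)| for every sampled x > x0."""
    return all(C1 * abs(f(x)) <= abs(g(x)) <= C2 * abs(f(x))
               for x in xs if x > x0)

# For x >= 1 we have x <= x^2, hence x^2 <= x^2 + x <= 2x^2.
print(is_sandwiched(lambda x: x**2 + x, lambda x: x**2,
                    C1=1, C2=2, x0=1, xs=range(2, 10_000)))  # True
```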

These notations are widely used in the analysis of algorithms to estimate the complexity of an algorithm. In mathematics it is common to use the O-notation with the meaning of the Θ-notation.

**Reference:**

[1] D. E. Knuth. *Fundamental Algorithms*, vol. 1 of *The Art of Computer Programming*. Addison-Wesley.