
An important goal is to describe the fixed points, or steady states, of a given dynamical system: these are values of the variable which do not change over time. Some fixed points are *attractive*, meaning that if the system starts out in a nearby state, it will converge towards the fixed point.
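
The idea of an attractive fixed point can be illustrated with a short sketch (the map cos and the helper `iterate` are illustrative choices, not taken from the text): repeatedly applying the map drives nearby starting states toward the fixed point.

```python
import math

def iterate(f, x0, steps=100):
    """Repeatedly apply the map f to the state, returning the final state."""
    x = x0
    for _ in range(steps):
        x = f(x)
    return x

# f(x) = cos(x) has an attractive fixed point near 0.739 (the so-called
# Dottie number): iteration from any real starting value converges to it.
x_star = iterate(math.cos, x0=1.0)
assert abs(math.cos(x_star) - x_star) < 1e-6  # x_star is (approximately) fixed
```

Starting from a different x0 reaches the same value, which is what makes the fixed point attractive rather than merely fixed.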

Similarly, one is interested in *periodic points*: states of the system which repeat themselves after several timesteps. Periodic points can also be attractive. Sarkovskii's theorem is a striking statement about the possible periods of periodic points of a one-dimensional discrete dynamical system.
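
An attractive periodic point can be detected numerically. The sketch below (the `find_period` helper and the parameter value r = 3.2 are illustrative assumptions) iterates the logistic map past a transient and then looks for the smallest number of steps after which the state repeats:

```python
def logistic(x, r=3.2):
    """Logistic map; at r = 3.2 it has an attractive cycle of period 2."""
    return r * x * (1 - x)

def find_period(f, x0, transient=1000, max_period=64, tol=1e-9):
    """Iterate past a transient, then find the smallest p with f^p(x) close to x."""
    x = x0
    for _ in range(transient):
        x = f(x)
    y = x
    for p in range(1, max_period + 1):
        y = f(y)
        if abs(y - x) < tol:
            return p
    return None  # no short period found within max_period steps

print(find_period(logistic, 0.4))  # -> 2
```

Because the period-2 cycle is attractive, almost any starting value in (0, 1) gives the same answer.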

Even simple nonlinear dynamical systems often exhibit seemingly random, practically unpredictable behavior that has been called *chaos*. The branch of dynamical systems which deals with the precise definition and investigation of chaos is called chaos theory.
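
One hallmark of chaos, sensitive dependence on initial conditions, is easy to demonstrate numerically. A minimal sketch (the choice of the logistic map at r = 4 and the starting values are illustrative): two trajectories that begin a distance of 10^{-10} apart end up completely unrelated.

```python
def logistic(x):
    """Logistic map at r = 4, a standard example of a chaotic system."""
    return 4.0 * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10   # two almost identical initial states
max_gap = 0.0
for n in range(100):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The tiny initial difference is amplified step by step until the two
# trajectories bear no resemblance to each other.
print(max_gap)
```

Since any measurement of the initial state carries some error, this amplification is what makes long-term prediction practically impossible.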

The changing variable *x* is often a real number, but can also be a vector in **R**^{k}.

We distinguish between **linear dynamical systems** and **nonlinear dynamical systems**. In linear systems, the right-hand side of the equation is an expression which depends linearly on *x*, as in *x*_{*n*+1} = *a* *x*_{*n*}, where *a* is a constant (a matrix if *x* is a vector).
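
For such a linear system the behavior is governed entirely by the coefficient. A minimal sketch (the `orbit` helper and the sample values of *a* are illustrative assumptions): with |*a*| < 1 every trajectory decays to the fixed point 0, while with |*a*| > 1 trajectories grow without bound.

```python
def orbit(a, x0, steps):
    """Trajectory of the linear discrete system x_{n+1} = a * x_n."""
    xs = [x0]
    for _ in range(steps):
        xs.append(a * xs[-1])
    return xs

print(orbit(0.5, 1.0, 5))  # |a| < 1: decays toward the fixed point 0
print(orbit(2.0, 1.0, 5))  # |a| > 1: grows without bound
```

No comparably simple criterion exists for nonlinear systems, which is one reason they are so much harder to analyze.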

The two examples given earlier are nonlinear systems. These are much harder to analyze and often exhibit a phenomenon known as chaos, which entails practically complete unpredictability; see also nonlinearity.