In formal mathematical terms, an algorithm is considered to be any sequence of operations which can be performed by a Turing-complete system.

Different algorithms may complete the same task with a different set of instructions, taking more or less time, space, or effort than one another. A cooking recipe is an example of an algorithm. Given two different recipes for making potato salad, one may have "peel the potato" before "boil the potato" while the other presents the steps in the reverse order, yet both call for these steps to be repeated for all potatoes and end when the potato salad is ready to be eaten.

Correctly performing an algorithm will not solve a problem if the algorithm is flawed, or not appropriate to the problem. For example, performing the potato salad algorithm will fail if there are no potatoes present, even if all the motions of preparing the salad are performed as if the potatoes were there.


Algorithms are essential to the way computers process information, because a computer program is essentially an algorithm that tells the computer what specific steps to perform (in what specific order) in order to carry out a specified task, such as calculating employees' paychecks or printing students' report cards.

Typically, when an algorithm is associated with processing information, data is read from an input source or device, written to an output sink or device, and/or stored for further use. Stored data is regarded as part of the internal state of the entity performing the algorithm.

For any such computational process, the algorithm must be rigorously defined: specified in the way it applies in every possible circumstance that could arise. That is, any conditional steps must be systematically dealt with, case by case; the criteria for each case must be clear (and computable).

Because an algorithm is a precise list of precise steps, the order of computation will almost always be critical to the functioning of the algorithm. Instructions are usually assumed to be listed explicitly, and are described as starting 'from the top' and going 'down to the bottom', an idea that is described more formally by *flow of control*.

So far, this discussion of the formalization of an algorithm has assumed the premises of imperative programming. This is the most common conception, and it attempts to describe a task by discrete, 'mechanical' means. Unique to this conception of formalized algorithms is the assignment operation, setting the value of a variable. It derives from the intuition of 'memory' as a scratchpad. There is an example below of such an assignment.

See functional programming for an alternate conception of an algorithm.

Once a formal description has been obtained, an **algorithm** is a well-defined method or procedure for solving a problem, such as a problem in mathematics or one otherwise relating to the manipulation of information.

Algorithms are now most often implemented as computer programs, but they can be implemented by other means, such as electric circuits or mechanical devices. They may even be performed directly by humans: think, for example, of using an abacus, or of performing arithmetic with pen and paper or its mental equivalent; most people use algorithms they learned as children to do this.

The analysis and study of algorithms is a central discipline of computer science, and is often practiced abstractly (without the use of a specific programming language designed for practical implementation). In this sense, it resembles other mathematical disciplines in that the analysis focuses on the underlying principles driving the algorithm, and not its particular implementation. The 'coding' (more properly 'codifying', though this terminology is seldom, if ever, used) of algorithms in such an abstract manner is termed 'writing pseudocode'.

Some restrict the definition of *algorithm* to procedures that eventually finish. Others include procedures that could run forever without stopping, arguing that a computer may be required to carry out an ongoing task. Other requirements besides having an ending state, then, are used to determine whether the algorithm successfully completes a task.

Here is a simple example of an algorithm, one that finds the largest number in a list of numbers.

1. Pretend the first number in the list is the largest number.
2. Look at the next number, and compare it with this largest number.
3. If this next number is larger, it becomes the new largest number.
4. Repeat steps 2 and 3 until you have gone through the whole list.

Given: a list *List* of length *Length*.

    counter = 1
    largest = List[counter]
    while counter <= Length:
        if List[counter] > largest:
            largest = List[counter]
        counter = counter + 1
    print largest

Notes on notation:

- *=* as used here indicates assignment. That is, the value on the right-hand side of the expression is assigned to the container (or variable) on the left-hand side of the expression.
- *List[counter]* as used here indicates the counter-th element of the list. For example, if the value of *counter* is 5, then *List[counter]* refers to the 5th element of the list.
- *<=* as used here indicates 'less than or equal to'.
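As an illustration, the same procedure can also be rendered as a short, runnable Python function. This is only one possible translation of the pseudocode: it assumes the list is non-empty, uses Python's zero-based indexing in place of the one-based *counter*, and the function name and sample list are arbitrary.

```python
def find_largest(numbers):
    """Return the largest value in a non-empty list, mirroring the pseudocode above."""
    largest = numbers[0]           # pretend the first number is the largest
    for value in numbers[1:]:      # look at each remaining number in turn
        if value > largest:        # only a larger number replaces the current largest
            largest = value
    return largest

print(find_largest([7, 3, 19, 4, 12, 2]))  # prints 19
```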

The word *algorithm* is a corruption of early English *algorisme*, which came from Latin *algorismus*, which came from the name of the Persian mathematician Abu Ja'far Mohammed ibn Musa al-Khwarizmi (ca. 780 - ca. 845). He was the author of the book *Kitab al-jabr w'al-muqabala* (*Rules of Restoration and Reduction*) which introduced algebra to people in the West. The word *algebra* itself originates from *al-Jabr* from the book title. The word *algorism* originally referred only to the rules of performing arithmetic using Arabic numerals but evolved into *algorithm* by the 18th century. The word has now evolved to include all definite procedures for solving problems or performing tasks.

The first instance of an algorithm written for a computer was Ada Byron's notes on the Analytical Engine, written in 1842, for which she is often credited as the world's first programmer.

The lack of mathematical rigor in the "well-defined procedure" definition of algorithms posed some difficulties for mathematicians and logicians of the 19th and early 20th centuries. This problem was largely solved with the description of the Turing machine, an abstract model of a computer described by Alan Turing, and the demonstration that every method yet found for describing "well-defined procedures" advanced by other mathematicians could be emulated on a Turing machine (a statement known as the Church-Turing thesis).

Nowadays, a formal criterion for an algorithm is that it is a procedure implementable on a completely specified Turing machine or one of the equivalent formalisms. Turing's initial interest was in the halting problem: deciding when an algorithm describes a terminating procedure. In practical terms, computational complexity theory matters more: it includes the puzzling class of problems known as NP-complete, which are generally presumed to require more than polynomial time.

Algorithms come in different flavours. A recursive algorithm is one that invokes (makes reference to) itself repeatedly until a certain condition is met, a method common to functional programming. A greedy algorithm works by making a series of simple decisions that are never reconsidered. A divide-and-conquer algorithm recursively reduces an instance of a problem to one or more smaller instances of the same problem, until the instances are small enough to be expressed directly in the programming language employed (what counts as 'direct' is often discretionary). A dynamic programming algorithm works bottom-up by building progressively larger solutions to subproblems arising from the original problem, and then uses those solutions to obtain the final result. Many problems (such as playing chess) can be modeled as problems on graphs; a graph exploration algorithm specifies rules for moving around a graph and is useful for such problems. Probabilistic algorithms are those that make some choices randomly (or pseudo-randomly).
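To make two of these flavours concrete, the sketch below (an illustration added here, not part of the classification above) computes Fibonacci numbers first with a plain recursive algorithm and then with a bottom-up dynamic programming one; the choice of Fibonacci as the example problem is arbitrary.

```python
def fib_recursive(n):
    """Recursive: the function invokes itself until the base cases are reached."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_dynamic(n):
    """Dynamic programming: build solutions to subproblems bottom-up and reuse them."""
    previous, current = 0, 1
    for _ in range(n):
        previous, current = current, previous + current
    return previous

print(fib_recursive(10), fib_dynamic(10))  # both print 55
```

The recursive version re-solves the same subproblems many times, while the dynamic programming version solves each subproblem once and reuses the result, illustrating the bottom-up reuse of subproblem solutions described above.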

Algorithms are usually discussed with the assumption that computers execute one instruction of an algorithm at a time. Such computers are sometimes called serial computers. An algorithm designed for such an environment is called a serial algorithm, as opposed to parallel algorithms, which take advantage of computer architectures where several processors can work on a problem at the same time.
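As a rough sketch of the distinction, the following Python fragment computes the same sum of a list both serially and by dividing the work among several processes. Summation as the example problem, the chunking scheme, and the worker count are all illustrative assumptions rather than anything prescribed above.

```python
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    """Work done independently on one piece of the input."""
    return sum(chunk)

def serial_sum(numbers):
    """Serial algorithm: one instruction stream processes the whole list."""
    return sum(numbers)

def parallel_sum(numbers, workers=4):
    """Parallel algorithm: split the list so several processes can work at the same time."""
    size = (len(numbers) + workers - 1) // workers
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(serial_sum(data) == parallel_sum(data))  # True: same result, different execution strategy
```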

Genetic algorithms attempt to find solutions to problems by mimicking biological evolutionary processes, with a cycle of random mutations yielding successive generations of 'solutions'. Thus, they emulate reproduction and "survival of the fittest". In genetic programming, this approach is extended to algorithms, by regarding the algorithm itself as a 'solution' to a problem.
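A toy sketch of this idea, with every name, parameter, and the bit-string encoding of a 'solution' chosen purely for illustration, might look as follows: each generation keeps the fitter half of the population, breeds new candidates from it, and applies random mutations.

```python
import random

def fitness(candidate):
    """Toy objective: a 'solution' is a bit string, scored by how many bits are 1."""
    return sum(candidate)

def mutate(candidate, rate=0.05):
    """Random mutation: flip each bit with a small probability."""
    return [1 - bit if random.random() < rate else bit for bit in candidate]

def crossover(parent_a, parent_b):
    """Reproduction: splice two parents together at a random point."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def evolve(bits=32, population_size=20, generations=50):
    population = [[random.randint(0, 1) for _ in range(bits)] for _ in range(population_size)]
    for _ in range(generations):
        # 'Survival of the fittest': keep the better half of the population as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: population_size // 2]
        # Breed and mutate to produce the next generation of candidate solutions.
        population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(population_size)]
    return max(population, key=fitness)

best = evolve()
print(fitness(best), "of 32 bits set in the best candidate found")
```

Real genetic algorithms typically add refinements such as elitism (carrying the best candidate forward unchanged), which this sketch omits for brevity.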

A list of algorithms discussed in Wikipedia is available.

- Bulletproof algorithms
- Numerical analysis
- Cryptographic algorithms
- Sort algorithms
- Search algorithms
- Merge
- String algorithms
- List of algorithms
- Data structure
- Genetic Algorithms
