Parallel programming

Parallel programming, or concurrent programming, is a technique of computer programming that provides for the execution of operations concurrently (either actually or apparently), whether within a single computer or across a number of systems. In the latter case, the term distributed computing is used. Multiprocessor machines achieve better performance by taking advantage of this kind of programming.

Parallel programming or computing is the splitting of a single task into a number of subtasks that can be computed relatively independently and then aggregated to form a single coherent solution.
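The split-compute-aggregate pattern described above can be sketched in Python. This is an illustrative example, not taken from the article: it sums a list by splitting it into chunks, summing each chunk as an independent subtask in a process pool, and aggregating the partial results. The function names `partial_sum` and `parallel_sum` are hypothetical.

```python
# Illustrative sketch: split a single task (summing a list) into
# independent subtasks, compute them in parallel, then aggregate.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # Each subtask sums one slice of the data independently.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the input into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Aggregate the independently computed partial results.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(list(range(1000))))  # 499500
```

Because the chunks share no state, the subtasks can run on separate processors without coordination until the final aggregation step.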

Distributed computing can be defined as a method of information processing in which work is performed by separate computers linked through a communications network. Parallel programming is most often used for tasks that can easily be broken down into independent subtasks, such as purely mathematical problems, e.g. factorisation. Such problems are known as 'embarrassingly parallel'.
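Factorisation of a batch of integers illustrates the 'embarrassingly parallel' case: each number can be factorised with no data from the others. The sketch below is an assumption-laden example (the helper `trial_factorise` is hypothetical, and trial division is used only for simplicity), distributing the independent subtasks over a process pool.

```python
# Illustrative sketch of an "embarrassingly parallel" workload:
# factorising several integers, each an independent subtask.
from concurrent.futures import ProcessPoolExecutor

def trial_factorise(n):
    # Simple trial division; each call shares no state with the others.
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

if __name__ == "__main__":
    numbers = [91, 97, 1001, 2 ** 16]
    with ProcessPoolExecutor() as pool:
        # Each number is handed to a worker; no communication is
        # needed between subtasks, only collection of the results.
        for n, f in zip(numbers, pool.map(trial_factorise, numbers)):
            print(n, f)
```

Since the subtasks never communicate, adding processors speeds the batch up almost linearly, which is what makes such problems embarrassingly parallel.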

Pioneers in the field of concurrent programming include Edsger Dijkstra and C. A. R. Hoare.
