parallel processing

parallel processing, the concurrent or simultaneous execution of two or more parts of a single computer program, at speeds far exceeding those of a conventional computer. Parallel processing requires two or more interconnected processors, each of which executes a portion of the task; some supercomputer parallel-processing systems have hundreds of thousands of microprocessors. The processors typically access data through shared memory. The efficiency of parallel processing depends largely on the development of programming languages that optimize the division of the task among the processors.
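The entry itself contains no code; the short Go sketch below is added purely as an illustration of the idea described above: one task (summing a large list of numbers) is divided among several concurrently running workers, each of which writes its partial result into memory shared by all of them. The number of workers and the chunking scheme are illustrative assumptions, not part of the source.

```go
// Minimal sketch: dividing one task among several workers that share memory.
package main

import (
	"fmt"
	"sync"
)

func main() {
	const workers = 4 // assumed worker count for illustration
	data := make([]int, 1_000_000)
	for i := range data {
		data[i] = i
	}

	partial := make([]int, workers) // shared memory holding each worker's result
	var wg sync.WaitGroup

	chunk := len(data) / workers
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func(w int) {
			defer wg.Done()
			start := w * chunk
			end := start + chunk
			if w == workers-1 {
				end = len(data) // last worker also takes any remainder
			}
			// Each worker executes its portion of the overall task.
			for _, v := range data[start:end] {
				partial[w] += v
			}
		}(w)
	}
	wg.Wait()

	// Combine the partial results into the final answer.
	total := 0
	for _, p := range partial {
		total += p
	}
	fmt.Println(total) // same result as a sequential sum of data
}
```

How the work is split among the processors, and how their partial results are combined, is exactly the kind of division that parallel programming languages and tools aim to optimize.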

See E. Rietman, Exploring Parallel Processing (1990); K. M. Chandy and S. Taylor, An Introduction to Parallel Programming (1992); D. I. Moldovan, Parallel Processing from Applications to Systems (1993); G. S. Almasi and A. Gottlieb, Highly Parallel Computing (1993).

The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2012, Columbia University Press. All rights reserved.

