The differences between serial and parallel processing

Written by Carlos Mano
Parallel computers are really a collection of serial computers working together. (Dynamic Graphics/Dynamic Graphics Group/Getty Images)

A computer's processor is inherently serial: it executes one instruction at a time. Many jobs, however, can be done faster by several processors working at once, so the parallel computer was invented -- a set of serial computers working together. Parallel computers can make some jobs go much faster, but not every problem automatically runs faster on one; each problem must be broken into pieces that can run in parallel by a highly trained, and very expensive, parallel programmer.

The Sequential CPU

The Central Processing Unit (CPU) takes instructions from main memory and executes them one at a time. After executing an instruction, the CPU fetches the next one and continues in this fashion. It can do anything the programmer can describe as a sequence of instructions the computer understands, but it does so one step at a time: the process is inherently serial.
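The fetch-and-execute cycle described above can be sketched in a few lines. The opcodes below are hypothetical, invented for illustration only; they do not belong to any real instruction set:

```python
# A minimal sketch of the serial fetch-execute cycle for a toy
# instruction set (hypothetical opcodes, not any real CPU).
def run(program):
    acc = 0  # a single accumulator register
    pc = 0   # program counter: index of the next instruction
    while pc < len(program):
        op, arg = program[pc]  # fetch the next instruction
        if op == "LOAD":       # put a value in the accumulator
            acc = arg
        elif op == "ADD":      # add a value to the accumulator
            acc += arg
        elif op == "HALT":     # stop execution
            break
        pc += 1                # advance to the next instruction
    return acc

# One instruction at a time: load 2, add 3, halt.
print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # prints 5
```

However long the program, the loop only ever handles one instruction per pass, which is exactly the seriality the section describes.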

Parallel Advantages

Serial processing is like using a laundromat that has only one washer and dryer. If you have a lot of laundry, it will take a long time. With enough machines you can do dozens of loads in about the same time as one. However, it is not always possible to break problems into parts that can be run simultaneously. If you are summing or searching through a million numbers, you can form groups of 1,000 numbers each and process the groups simultaneously on 1,000 processors, finishing up to 1,000 times faster than on one processor (ignoring the cost of combining the partial results). Other tasks, such as long division, where each step depends on the previous result, must be performed sequentially.
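The group-and-sum idea above can be sketched with Python's standard library. This is an illustrative sketch, not a benchmark: real speedup depends on the number of cores and the overhead of starting workers:

```python
# Sketch of summing a large list by splitting it into groups and
# summing each group in a separate worker process. The partial sums
# are then combined serially -- the step an ideal speedup figure ignores.
from concurrent.futures import ProcessPoolExecutor

def parallel_sum(numbers, workers=4):
    chunk = len(numbers) // workers
    groups = [numbers[i * chunk:(i + 1) * chunk] for i in range(workers - 1)]
    groups.append(numbers[(workers - 1) * chunk:])  # last group takes the remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partial = pool.map(sum, groups)  # each group summed in parallel
    return sum(partial)                  # serial combination step

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_sum(data) == sum(data))  # prints True
```

The division of labour mirrors the laundromat: each worker handles one basket of numbers, and only the final tally is done by a single processor.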

Parallel Problems

The first disadvantage of parallel computing is cost. Good serial computers start at about £650. Parallel computers start at over £0.6 million and go up -- way up. Software and trained programmers for parallel computers are also more expensive. Even if a problem can be broken up into parts that can be run at the same time, it can be difficult to coordinate all the parts.

Amdahl's Law

Another problem is due to Amdahl's law. Every parallel program has two parts: housekeeping and problem processing. Housekeeping involves the coordination of multiple processors, while problem processing is the actual computation. Amdahl's law states that the overall speedup is limited by the portion of the program that cannot be parallelised: as more processors are added, the problem-processing time shrinks, but the housekeeping and other serial work does not, so it takes up a growing share of the total running time. The implication of Amdahl's law is that it is impractical to increase the number of parallel processors beyond a certain point.
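The limit Amdahl's law imposes is easy to compute. If a fraction s of the program must run serially, the speedup on n processors is 1 / (s + (1 - s) / n), which can never exceed 1 / s no matter how large n grows:

```python
# Amdahl's law: speedup on n processors when a fraction s of the
# program is inherently serial (the "housekeeping" in this article's terms).
def amdahl_speedup(s, n):
    return 1.0 / (s + (1.0 - s) / n)

# Even with only 5% serial work, 1,000 processors give well under 20x:
print(round(amdahl_speedup(0.05, 1000), 1))  # prints 19.6
```

With 5 per cent serial work the speedup can never pass 20x, whether you buy 1,000 processors or a million -- which is why adding processors beyond a certain point is impractical.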

