Definition
Parallelism (Computer Architecture)
Parallelism is the use of overlapping execution so that multiple operations, or multiple parts of a computation, make progress at the same time.
In computer architecture, this is used to increase throughput and performance. Two common forms are spatial parallelism and temporal parallelism.
Types
Spatial
Definition
Spatial Parallelism (Computer Architecture)
Spatial parallelism is parallelism obtained by duplicating hardware so that multiple operations can be performed at the same time.
The key idea is that different hardware units work in parallel on different parts of the computation.
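A minimal sketch of this idea in software terms, using a thread pool as a stand-in for duplicated hardware units (the `process` function and the item list are illustrative assumptions, not from the original):

```python
# Spatial parallelism sketch: two duplicated "units" (workers) each handle
# a different item at the same time.
from concurrent.futures import ThreadPoolExecutor

def process(item):
    # Stand-in for one hardware unit performing the full computation on one item.
    return item * item

items = [1, 2, 3, 4]

# Two workers play the role of two copies of the hardware.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(process, items))

print(results)  # [1, 4, 9, 16]
```

The point of the sketch is only the structure: each unit performs the whole computation on its own input, and throughput grows with the number of duplicated units.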
Temporal
Definition
Temporal Parallelism (Computer Architecture)
Temporal parallelism is parallelism obtained by breaking a task into several stages and overlapping these stages over time.
Instead of duplicating hardware for the whole task, the computation is organised as a pipeline in which different inputs are processed in different stages at the same time.
In computer architecture, temporal parallelism is therefore often called pipelining.
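The pipeline structure can be sketched with threads and queues, where each stage repeatedly takes an input, transforms it, and hands it to the next stage (the two stage functions here are illustrative assumptions):

```python
# Temporal parallelism (pipelining) sketch: two stages connected by queues.
# While stage 2 works on item n, stage 1 already works on item n+1.
import queue
import threading

SENTINEL = object()  # marks the end of the input stream

def stage(fn, q_in, q_out):
    # One pipeline stage: take an input, transform it, pass it on.
    while True:
        x = q_in.get()
        if x is SENTINEL:
            q_out.put(SENTINEL)
            break
        q_out.put(fn(x))

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
# Stage 1 and stage 2 run concurrently, each on a different input.
threading.Thread(target=stage, args=(lambda x: x + 1, q0, q1)).start()
threading.Thread(target=stage, args=(lambda x: x * 10, q1, q2)).start()

for x in [1, 2, 3]:
    q0.put(x)
q0.put(SENTINEL)

results = []
while True:
    y = q2.get()
    if y is SENTINEL:
        break
    results.append(y)
print(results)  # [20, 30, 40]
```

Note that no stage is duplicated: overlap comes purely from different inputs occupying different stages at the same time.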
Examples
Trays through a two-stage task
Suppose one tray first needs t_prep minutes of preparation and then t_bake minutes of baking.
In both cases, the latency of one tray is therefore t_prep + t_bake minutes.
The difference lies in the throughput.
- With spatial parallelism, duplicated hardware lets two trays be processed side by side, giving a throughput of 2 / (t_prep + t_bake) trays per minute.
- With temporal parallelism, the stages are overlapped as a pipeline, so in steady state the throughput is determined by the slower stage, giving 1 / max(t_prep, t_bake) trays per minute.
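The arithmetic above can be checked with hypothetical stage times (the concrete values 5 and 15 minutes are assumptions chosen for illustration):

```python
# Hypothetical stage times, in minutes (illustrative only).
t_prep, t_bake = 5, 15

latency = t_prep + t_bake                       # per-tray latency, same in both schemes
spatial_throughput = 2 / latency                # trays/min with two duplicated lines
temporal_throughput = 1 / max(t_prep, t_bake)   # trays/min with a two-stage pipeline

print(latency)              # 20
print(spatial_throughput)   # 0.1
print(temporal_throughput)  # 1/15 trays per minute
```

With these numbers the spatial version is faster, because duplicating hardware doubles throughput exactly, while the pipeline is limited by its slower stage; a perfectly balanced pipeline (t_prep == t_bake) would match one duplicated unit per stage.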