computer-architecture parallelism
Definition
Amdahl's Law
Amdahl’s law states that the overall speedup of a system is limited by the fraction of execution time that cannot be improved.
If a fraction $p$ of a computation's running time can be accelerated by a factor $s$, while the remaining fraction $1 - p$ is unchanged, then the total speedup is

$S = \dfrac{1}{(1 - p) + \dfrac{p}{s}}$
So improving a subsystem yields a large overall benefit only when that subsystem accounts for a large part of the original running time.
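As a minimal sketch, the formula above can be written as a small helper (the function name `amdahl_speedup` and the example fractions are illustrative, not from the source):

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the running time
    is accelerated by a factor s; the rest is unchanged."""
    return 1.0 / ((1.0 - p) + p / s)

# Half of the running time made twice as fast:
# overall speedup is 1 / (0.5 + 0.25) ≈ 1.33, not 2.
print(amdahl_speedup(0.5, 2.0))
```

Note that the overall speedup is far smaller than the local factor $s$ unless $p$ is close to 1.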
Consequence
Diminishing returns
Even a very large improvement of one part of a system has little effect if most of the running time lies elsewhere.
This is why Amdahl’s law is often used to reason about the limits of parallelism and other performance optimisations.
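A brief numeric sketch of this limit, assuming a program that is 95% parallelisable (the fractions and factors here are illustrative): as the local speedup factor grows without bound, the overall speedup approaches $1/(1-p)$, so the serial 5% caps the benefit at 20x no matter how many processors are added.

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the running time
    is accelerated by a factor s; the rest is unchanged."""
    return 1.0 / ((1.0 - p) + p / s)

# A part that is only 10% of the running time, sped up 100x,
# improves the whole program by barely 11%.
small_part = amdahl_speedup(0.10, 100.0)

# A 95%-parallel program on ever more processors: the overall
# speedup approaches 1 / (1 - 0.95) = 20 and never exceeds it.
for n in (10, 100, 10_000):
    print(n, amdahl_speedup(0.95, n))
```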