Gustafson's Law, also known as Gustafson-Barsis's Law, is a principle in parallel computing that addresses the scalability of parallel systems. It was presented by John L. Gustafson (crediting his colleague Edwin H. Barsis) in the 1988 article "Reevaluating Amdahl's Law", as a response to Amdahl's Law, which was more pessimistic about the performance improvements achievable through parallel processing.
Gustafson's Law states that significant speedup can be achieved in parallel processing by increasing the problem size along with the processor count. In other words, as the number of processors increases, the overall computational workload can be increased proportionally to maintain constant efficiency. This contrasts with Amdahl's Law, which considers a fixed-size problem and emphasizes how the sequential portion of a computation limits the maximum achievable speedup.
Gustafson's Law can be expressed mathematically as follows:
Speedup = s + p × N

where:
- Speedup is the scaled speedup: the factor by which the total workload completed in a fixed amount of time grows when using N processors.
- N is the number of processors.
- s is the fraction of execution time spent in the sequential portion of the computation.
- p = 1 − s is the fraction of execution time spent in the portion that can be performed in parallel.

Equivalently, Speedup = N − s × (N − 1), which grows approximately linearly in N for small s.
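The formula above is simple enough to sketch directly. The function below is a minimal illustration (the names `gustafson_speedup`, `n_processors`, and `serial_fraction` are chosen for clarity, not drawn from any standard library):

```python
def gustafson_speedup(n_processors: int, serial_fraction: float) -> float:
    """Scaled speedup under Gustafson's Law: Speedup = s + p * N, with p = 1 - s."""
    s = serial_fraction
    p = 1.0 - s          # fraction of time spent in the parallelizable portion
    return s + p * n_processors

# Example: a 5% sequential fraction on 64 processors.
print(gustafson_speedup(64, 0.05))  # → 60.85
```

Note that with N = 1 the formula reduces to s + p = 1, i.e. no speedup, as expected.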
Key aspects of Gustafson's Law include:
- Scalability: Gustafson's Law highlights the importance of scalability in parallel processing, suggesting that significant performance improvements can be achieved by increasing the problem size as the number of processors increases.
- Focus on parallel portion: Unlike Amdahl's Law, which emphasizes the impact of the sequential portion of computation on performance, Gustafson's Law focuses on the parallel portion and how it can be scaled to achieve better performance.
- Optimism for parallel processing: Gustafson's Law presents a more optimistic view of the potential of parallel processing, suggesting that speedup can keep growing, approximately linearly with the number of processors, as long as the problem size is increased proportionally.
- Real-world applicability: Gustafson's Law is more applicable to real-world problems, as many computational tasks naturally scale with the size of the data or the complexity of the problem being solved.
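To make the contrast with Amdahl's Law concrete, the sketch below evaluates both speedup formulas for the same 10% serial fraction (function names are illustrative):

```python
def amdahl_speedup(n: int, s: float) -> float:
    """Fixed-size speedup under Amdahl's Law: 1 / (s + (1 - s) / N)."""
    return 1.0 / (s + (1.0 - s) / n)

def gustafson_speedup(n: int, s: float) -> float:
    """Scaled speedup under Gustafson's Law: s + (1 - s) * N."""
    return s + (1.0 - s) * n

# With a 10% serial fraction, Amdahl's speedup can never exceed 1/s = 10,
# no matter how many processors are added; Gustafson's grows with N.
for n in (8, 64, 1024):
    print(n, round(amdahl_speedup(n, 0.10), 2), round(gustafson_speedup(n, 0.10), 2))
```

Running this shows Amdahl's curve flattening toward 10 while Gustafson's climbs roughly linearly, which is the essence of the two laws' different assumptions (fixed problem size versus fixed execution time).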
In summary, Gustafson's Law is a principle in parallel computing that emphasizes the potential for performance improvements through parallel processing by increasing the problem size as the number of processors increases. It offers a more optimistic view on parallelism compared to Amdahl's Law and provides a better understanding of the scalability of parallel systems in real-world applications.