What is Gustafson's law trying to argue?


Gustafson's law is a counter to Amdahl's law, claiming that most parallel processing applications actually increase the workload when given increased access to parallel processing. Thus, Amdahl's law, which assumes a constant workload to measure speedup, is a poor method for determining the benefits of parallel processing.

However, I'm confused as to what exactly Gustafson is trying to argue.

For example, say I take Gustafson's law and apply it to a sequential processor and two parallel processors:

I generate a workload to run on parallel processor #1. It takes 1 second to execute on the parallel processor and takes 10 seconds to execute on the sequential processor.

I generate a bigger workload and run it on parallel processor #2. It takes 1 second to execute on the parallel processor and takes 50 seconds to execute on the sequential processor.

For each workload, there is a speedup relative to the sequential processor. However, this doesn't seem to violate Amdahl's law since there is still some upper speedup limit for each workload. All this is doing is varying the workload such that the "serial only" portion of the code is likely reduced. As per Amdahl's law, decreasing the serial only portion will increase the speed-up limit.

So I'm confused: Gustafson's law, which advocates increasing the workload while maintaining a constant execution time, doesn't seem to add any new information.

What exactly is Gustafson arguing? And specifically, what does "scaled-speedup" even mean?

CodePudding user response:

Gustafson's law is a counter to Amdahl's law, claiming that most parallel processing applications actually increase the workload when given increased access to parallel processing. Thus, Amdahl's law, which assumes a constant workload to measure speedup, is a poor method for determining the benefits of parallel processing.

No.

First of all, the two laws are not claiming anything. They show the theoretical (maximum) speed-up an application can get based on a specific configuration. The two are basic mathematical formulas modelling the behaviour of a parallel algorithm. Their goal is to understand some limitations of parallel computing (Amdahl's law) and what developers can do to overcome them (Gustafson's law).

Amdahl's law is a mathematical formula describing the theoretical speed-up of an application with a fixed workload (and therefore a variable execution time) computed by several processing units. The workload contains a serial execution part and a parallel one. It shows that the speed-up is bounded by the serial portion of the program. This is not great for developers of parallel applications, since it means that the scalability of their application will be quite bad on largely sequential workloads, assuming the workload is fixed.
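That bound can be sketched in a few lines (a minimal illustration; the 10% serial fraction and the unit counts are made up for the example):

```python
def amdahl_speedup(serial_fraction, n_units):
    """Theoretical speed-up with a fixed workload (Amdahl's law):
    S(N) = 1 / (s + (1 - s) / N)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

# With a 10% serial portion, the speed-up can never exceed 1/0.1 = 10x,
# no matter how many processing units are added.
print(amdahl_speedup(0.10, 2))     # ~1.82
print(amdahl_speedup(0.10, 1024))  # ~9.91, still below the 10x asymptote
```

Note how the second call is already close to the 1/s asymptote: adding more units past that point buys almost nothing.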

Gustafson's law breaks this assumption and adds a new one to look at the problem differently. Indeed, its formula describes the theoretical speed-up of an application with a fixed-time workload (and therefore a variable amount of computation) computed by several processing units. It shows that the speed-up can remain good as long as application developers can add more parallel work to a given workload.
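The contrast with the fixed-work formula is easy to see with the same kind of sketch (again with a made-up 10% serial fraction):

```python
def gustafson_speedup(serial_fraction, n_units):
    """Scaled speed-up with a fixed execution time (Gustafson's law):
    S(N) = s + (1 - s) * N."""
    return serial_fraction + (1.0 - serial_fraction) * n_units

# The problem size grows with the unit count, so the scaled speed-up
# keeps growing instead of saturating near 1/s.
print(gustafson_speedup(0.10, 2))     # ~1.9
print(gustafson_speedup(0.10, 1024))  # ~921.7
```

This is what "scaled speed-up" refers to: the ratio is measured against a workload that is scaled up with the number of units, not against a fixed one.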

I generate a workload to run on parallel processor #1. It takes 1 second to execute on the parallel processor and takes 10 seconds to execute on the sequential processor. I generate a bigger workload and run it on parallel processor #2. It takes 1 second to execute on the parallel processor and takes 50 seconds to execute on the sequential processor.

Under either model, a parallel program cannot be more than 2 times faster with 2 processing units than with one. Super-linear speed-ups are possible in practice due to some confounding factors (typically cache effects), but the effect does not come purely from the processing units. What do you mean by "two parallel processors"? The number of processing units is missing from your example (unless you want to derive it from the provided information).
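To make that concrete: plugging the question's 10x figure into Amdahl's formula and solving for the unit count shows how many units the example implicitly assumes (a hypothetical back-of-the-envelope helper, not part of either law):

```python
import math

def min_units_for_speedup(target_speedup, serial_fraction=0.0):
    """Smallest unit count that can reach `target_speedup` under Amdahl's law:
    S = 1 / (s + (1 - s) / N)  =>  N = (1 - s) / (1/S - s).
    Returns None when the serial fraction alone caps the speed-up below target."""
    denom = 1.0 / target_speedup - serial_fraction
    if denom <= 0:
        return None  # unreachable: 1/s is the asymptotic limit
    return math.ceil((1.0 - serial_fraction) / denom)

# The question's 10x speed-up needs at least 10 units even with zero serial work,
# and more as soon as any serial fraction is present:
print(min_units_for_speedup(10))        # 10
print(min_units_for_speedup(10, 0.05))  # 19
```

So with only 2 units the example's 10x and 50x figures are impossible under both models, which is why the unit count matters.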

So I'm confused: Gustafson's law, which advocates increasing the workload while maintaining a constant execution time, doesn't seem to add any new information.

The two laws are like two points of view on the same scalability problem. However, they make different assumptions: a fixed amount of work with a variable work time VS a variable amount of work with a fixed work time.

If you are familiar with the concepts of strong/weak scaling, please note that Amdahl's law is to strong scaling what Gustafson's law is to weak scaling.
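A quick way to see the two points of view side by side (same hypothetical 10% serial fraction for both; only the fixed-work vs fixed-time assumption changes):

```python
def amdahl(s, n):      # strong scaling: fixed total work
    return 1.0 / (s + (1.0 - s) / n)

def gustafson(s, n):   # weak scaling: work grows to keep the time fixed
    return s + (1.0 - s) * n

print(f"{'units':>6} {'Amdahl':>8} {'Gustafson':>10}")
for n in (2, 8, 64, 1024):
    print(f"{n:>6} {amdahl(0.10, n):>8.2f} {gustafson(0.10, n):>10.1f}")
```

The Amdahl column saturates near 10x while the Gustafson column keeps growing: same program model, different assumption about what stays constant.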
