Herb fought the law--Amdahl's Law, that is--and Herb won-DDJ --- Did Sutter really fight?

Post by hsutte » Thu, 21 Feb 2008 02:14:45



The conclusion of that blog post:

- Break Amdahl's law - That is the right attitude. You must not
- get stuck concluding that an algorithm can't be improved much
- just because of what Amdahl's law predicts.

Thanks, that's the title I chose for the article. :-)

- "Herb fought the law--Amdahl's Law, that is--and Herb won-DDJ"
- It is not a good title.

Right, the copyeditor added that tagline, and I don't like it either.
If you look at the print version of this article, you'll see I got
them to change the above back to "Here's a law you should break early
and often" there, but the change didn't stick online. (Yes, I could
ask them again to change it online too, but I already ask for enough
changes which they always kindly accept, and you have to prioritize
what's worth complaining about and pick your battles, like when the
wrong figures get used... even with good editors mistakes happen, and
these are quite good editors, though just a little creative
sometimes. :-) )

- Herb didn't even try to fight Amdahl's law. Herb only
- shows it is better to take Gustafson's law into consideration
- when deciding whether or not to parallelize.

Partly true but incomplete. I also mentioned techniques to apply
parallelism to improve "s" (the sequential portion) by at least a
fixed amount. That doesn't transform s into "p" (the fully scalable
portion) but it still cheats Amdahl's Law as generally understood by
developers in the trenches.

Common statements of Amdahl's Law have led many developers into the
mindset of thinking a program is made up of only "stuff that's fully
parallelizable (e.g., divide-and-conquer on independent data)" and
"everything else is sequential and therefore can't benefit from
parallel hardware" -- and the latter conclusion is not correct. That
mindset ignores the third, "middle ground" category: applying
"non-fully parallel" techniques like pipelining that have fixed
limits but are still valuable for improving the performance of "s"
on a parallel machine.
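
For instance, here is a minimal two-stage pipeline sketch in C++
(not from the article; the stage functions and item count are made
up for illustration). Each item must pass through stage 1 before
stage 2, so the work is not fully parallelizable, but the two stages
can overlap on different cores:

// Hypothetical sketch: a two-stage pipeline. Items flow through
// stage1 then stage2 in order, so this is not fully parallelizable,
// but the stages overlap on two threads -- a bounded (at most 2x)
// speedup on otherwise "sequential" work.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <optional>
#include <queue>
#include <thread>
#include <utility>

// Minimal thread-safe queue handing items from stage 1 to stage 2.
template <typename T>
class Channel {
public:
    void push(T v) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(v)); }
        cv_.notify_one();
    }
    void close() {
        { std::lock_guard<std::mutex> lk(m_); closed_ = true; }
        cv_.notify_all();
    }
    // Blocks until an item is available; returns nullopt once the
    // channel is closed and drained.
    std::optional<T> pop() {
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [&] { return !q_.empty() || closed_; });
        if (q_.empty()) return std::nullopt;
        T v = std::move(q_.front());
        q_.pop();
        return v;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<T> q_;
    bool closed_ = false;
};

int stage1(int x) { return x * x; }   // placeholder work
int stage2(int x) { return x + 1; }   // placeholder work

int main() {
    Channel<int> ch;
    long long total = 0;

    // Stage 2 runs concurrently with stage 1, overlapping their latencies.
    std::thread consumer([&] {
        while (auto v = ch.pop()) total += stage2(*v);
    });

    for (int i = 0; i < 1000; ++i) ch.push(stage1(i));   // stage 1
    ch.close();
    consumer.join();

    std::cout << "total = " << total << '\n';
}

The speedup is capped by the number of stages, so pipelining doesn't
turn "s" into "p" -- it just shrinks "s" by a fixed factor, which is
exactly the middle ground described above.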

I'm sure Amdahl realized that and had such techniques in mind as
ways of speeding up "s", but most programmers I've heard citing him
don't seem to understand that, and they conclude that everything
other than "p" cannot benefit from parallel hardware -- and it's
that last bit that is wrong and that I thought needed some
illumination.

Herb
 
 
 

Herb fought the law--Amdahl's Law, that is--and Herb won-DDJ --- Did Sutter really fight?

Post by Dmitriy V' » Fri, 22 Feb 2008 19:00:22


Correct me if I'm wrong.
Sutter's Law:

Tparallel = s + pp1/k1 + pp2/k2 + ... + ppm/km + p/n

s - strictly sequential part
ppi - i-th partly parallel part
ki - parallelism coefficient of the i-th part (effectively min(ki, n))
p - fully parallel part
n - number of processors

s(Amdahl) = s(Sutter) + pp1 + pp2 + ... + ppm

Tparallel(Sutter) <= Tparallel(Amdahl)
Tparallel(Sutter) < Tparallel(Amdahl), if there is at least one partly
parallel part

Tparallel(Amdahl) = s(Amdahl), if p = 0
Tparallel(Sutter) < s(Amdahl), if p = 0 and there is at least one partly
parallel part
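
A quick numeric sketch in C++ (the time split and coefficient below
are made up purely for illustration) shows the inequality:

// Hypothetical numbers, chosen only to illustrate the inequalities above.
#include <algorithm>
#include <cstdio>

int main() {
    const double s = 10, pp = 30, p = 60;   // made-up time split
    const double k = 4;                     // made-up parallelism coefficient
    for (int n : {1, 2, 4, 8, 16}) {
        // Amdahl's model lumps the partly parallel part into the serial part.
        double t_amdahl = (s + pp) + p / n;
        // The refined model lets pp speed up by min(k, n).
        double t_sutter = s + pp / std::min(k, double(n)) + p / n;
        std::printf("n=%2d  T(Amdahl)=%6.2f  T(Sutter)=%6.2f\n",
                    n, t_amdahl, t_sutter);
    }
}

For every n > 1, T(Sutter) < T(Amdahl), and once n >= k the gap stays
at a constant pp - pp/k.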

Amdahl's Law is suitable for HPC, where one really wants to keep
all processors busy.
Sutter's Law is suitable for client/server applications, where it's
enough to keep at least some of the processors busy (if one can't
easily make use of them all).

And Sutter's Law is more optimistic about the multicore future and
free lunches :)
Have I got your point?

Dmitriy V'jukov

 
 
 

Herb fought the law--Amdahl's Law, that is--and Herb won-DDJ --- Did Sutter really fight?

Post by Amal » Sat, 23 Feb 2008 14:25:40

On Feb 21, 3:00 pm, "Dmitriy V'jukov" < XXXX@XXXXX.COM > wrote:




Sutter didn't make any law. He explained the use of Gustafson's
law, which means thinking in terms of the size of the work, not
about time alone.


Amdahl's law totally depends on the number of processors, and hence
on the amount of work that can be parallelized. So usually, when you
go for a performance improvement, you apply Amdahl's law: you
measure the time that could be parallelized and the time you can't
do anything about because it is entirely serial, then compute the
maximum possible improvement. If it predicts a good improvement, you
go for parallelization. This is not a good habit. That is what
Gustafson showed in his paper "Reevaluating Amdahl's Law": you have
to think in terms of the size of the work. It is common to keep
adding more and more work, and then there is more that can be
parallelized. What Sutter intended was basically this: never give up
hope on parallelization just because of an Amdahl's law result. In
the future you may add more work and get more done in the same time,
if the added work is parallelizable.
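
For example, here is a small C++ sketch (the serial fraction is a
made-up number) contrasting Amdahl's fixed-size speedup with
Gustafson's scaled speedup:

// Made-up serial fraction, just to contrast the two viewpoints.
#include <cstdio>

int main() {
    const double s = 0.2;         // assumed serial fraction
    const double p = 1.0 - s;     // parallelizable fraction
    for (int n : {1, 4, 16, 64, 256}) {
        double amdahl    = 1.0 / (s + p / n);   // fixed problem size
        double gustafson = s + p * n;           // problem grows with n
        std::printf("n=%3d  Amdahl=%5.2f  Gustafson=%7.2f\n",
                    n, amdahl, gustafson);
    }
}

Amdahl's speedup saturates near 1/s = 5 no matter how many
processors you add, while Gustafson's scaled speedup keeps growing
with the problem size -- which is exactly the point above.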

The subtitle of the article creates ambiguity, as if there were
actually a law called "Sutter's Law". That is why I made this
thread.

Thanks,
Amal