I wonder how accurate the 1000 single-bit operations estimate is. Mind you, I am not an
electronic design expert, so this is just rambling.
I've been thinking that a 1-bit adder has three inputs, so it has an 8-entry truth table. (Each
entry looks something like "o = i1 & ~i2 & i3".)
After optimizations, I am counting 18 single-bit operations ("ands" and "nots") per output. It has
two outputs, so I guess that just doubles the whole thing to 36.
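In case anyone wants to check my counting, here is a rough Python sketch that enumerates the
truth table and counts a naive sum-of-products gate cost. The counting convention is my own
assumption (I also count the "ors" that combine minterms), so it won't land on exactly 18, but
it ends up in the same ballpark.

    from itertools import product

    def full_adder(a, b, cin):
        """Reference 1-bit full adder: returns (sum_bit, carry_out)."""
        total = a + b + cin
        return total & 1, total >> 1

    # Enumerate the 8-entry truth table and collect the minterms
    # (input combinations) where each output is 1.
    sum_minterms, carry_minterms = [], []
    for a, b, cin in product((0, 1), repeat=3):
        s, cout = full_adder(a, b, cin)
        if s:
            sum_minterms.append((a, b, cin))
        if cout:
            carry_minterms.append((a, b, cin))

    def naive_sop_cost(minterms):
        """Rough gate count for an unoptimized sum-of-products form:
        one NOT per complemented literal, 2 ANDs per 3-literal minterm,
        and (n-1) ORs to combine n minterms."""
        nots = sum(bits.count(0) for bits in minterms)
        ands = 2 * len(minterms)
        ors = len(minterms) - 1
        return nots + ands + ors

    print("sum output:  ", naive_sop_cost(sum_minterms), "ops")
    print("carry output:", naive_sop_cost(carry_minterms), "ops")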
So for a 32-bit integer adder we have 32*36 = 1152 single-bit operations. The
straightforward multiplier they taught us in school does an addition per set bit,
so if we assume 16 set bits on average, we get 18432 single-bit ops, ignoring the shifts.
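And a quick sketch of the shift-and-add multiplier itself, just to show the "one addition per
set bit" part. The 36-ops-per-bit-adder figure is carried over from my guess above, and the
operand values are made up.

    def shift_and_add_mul(x, y, bits=32):
        """Grade-school binary multiplication: one shifted addition of x
        for every set bit in y (hence the 16-set-bits-on-average estimate)."""
        acc, adds = 0, 0
        for i in range(bits):
            if (y >> i) & 1:
                acc += x << i   # in hardware this would be a full 32-bit add
                adds += 1
        return acc & ((1 << bits) - 1), adds

    result, adds = shift_and_add_mul(0xDEADBEEF, 0x12345678)
    print(f"adds performed: {adds}")                    # popcount of the multiplier
    print(f"estimated single-bit ops: ~{adds * 32 * 36}")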
So, when using the most simplistic algorithms, the 1000 operations number seems far too low.
Can someone do a similar estimation for the super-clever algorithms that CPUs actually use?
(There is a detailed description in "Computer Architecture: A Quantitative Approach", but I
never understood it well enough to actually keep it in my head, and the book is not with me.
Oh yes, and I am lazy.)