On Wednesday, June 25, 2003, at 09:42 PM, Peter Tattersall wrote:

> 3) even the benchmarks showed that the Mac was still slower than the
> Dell in integer arithmetic, though it beat the Dell handily in
> floating point.

Just to clarify, a single 3 GHz P4 or 3.06 GHz Xeon beat a single 2 GHz G5, but the dual G5 beat the dual Xeon (the P4 doesn't support dual-processor configurations). To me, that says something about how well the G5 scales to multiple processors (at least to two) compared to the Xeon.

> The claim was then made that integer arithmetic is more important than
> floating point, so this shows the Dell is better. I can't speak to
> that: I wish I could find someone who could.

My impression is that integer performance used to matter much more, but that this was a side effect of processors simply being much faster at integer math than at floating point. As a result, many developers coerced their floating-point data into integer values to improve performance. The simplest example is a checkbook program that represents everything internally in cents (integers) and converts to dollars and cents (floating point) only for display purposes. Now that floating-point speed has caught up with, and sometimes outstripped, integer speed, those kinds of hacks are no longer necessary. That leaves me wondering what the relative importance of integer versus floating-point performance really is today.

Also, consider this: the G5 has two integer execution units and two floating-point units. If integer math were so overwhelmingly important, why wouldn't it have three integer units and one floating-point unit?

-Mike
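
For anyone curious what that "keep it all in cents" trick looks like in practice, here is a minimal sketch in C (the program and the amounts are made up for illustration, not taken from any real checkbook software): the balance lives as integer cents, all the arithmetic is integer-only, and the value is split into dollars and cents only when it is printed.

    #include <stdio.h>

    int main(void)
    {
        /* Store money as integer cents so all arithmetic stays in the
           integer units; floating point never enters the picture. */
        long balance_cents = 104250;   /* $1,042.50 */
        long check_cents   = 2599;     /* $25.99 */

        balance_cents -= check_cents;  /* integer subtraction only */

        /* Convert to dollars and cents only for display. */
        printf("New balance: $%ld.%02ld\n",
               balance_cents / 100, balance_cents % 100);
        return 0;
    }

The dollars-and-cents figure the user sees never touches the FPU, which was the whole point of the hack back when integer units were so much faster.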