Another way to measure performance - fast cores vs more cores

Started by Gordon, February 21, 2017, 03:43:36 PM



Gordon

Now that Kaby Lake CPUs are finally out, I've been wondering where the break-even point lies between more cores and faster cores.  We happened to have a bunch of network rendering jobs going at the moment, so I threw a few different machines into the mix as slaves and am seeing consistent results.  For example, these two machines render nearly the same number of regions in a given time frame:
Machine 1:  i7-6700K - 8 threads (4 cores + HT) @ 4.0GHz
Machine 2:  E5-1650 v1 - 12 threads (6 cores + HT) @ 3.2GHz (but it's an older generation of CPU)
Breaking the numbers down, it's obvious that the faster machine can do more with less, but I couldn't help thinking there is a relationship in there that could help guide decisions on the most cost-effective hardware to buy.  When I took these values along with the rest of our slaves, I found that (Frequency * Cores) per region ranged between 3.0 and 4.0 depending on the age of the CPU (newer CPUs run 3.8-4.0 while older CPUs are lower), but all were relatively consistent within a specific Intel architecture generation.  So assuming you are comparing 'apples to apples' as far as the generation of Intel CPU goes, you can treat (Frequency * Cores) as a general measure of performance.
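
To make the arithmetic concrete, here's a minimal sketch of that comparison in Python. The clock speeds and thread counts are the ones quoted above; any region counts you feed into the ratio function would have to come from your own render logs, so the ratio itself is illustrative only.

```python
# Minimal sketch of the (Frequency * Cores) comparison from the post.
# Clocks and thread counts are the ones quoted above; region counts
# must come from your own render logs.

machines = {
    "i7-6700K":   {"ghz": 4.0, "threads": 8},   # Machine 1
    "E5-1650 v1": {"ghz": 3.2, "threads": 12},  # Machine 2
}

def score(m):
    """Frequency * thread count: the rough throughput proxy from the post."""
    return m["ghz"] * m["threads"]

def ratio_per_region(m, regions_rendered):
    """(Frequency * Cores) relative to regions finished in the test window.
    The post reports this landing around 3.0-4.0 within a CPU generation."""
    return score(m) / regions_rendered

for name, m in machines.items():
    print(f"{name}: score = {score(m):.1f} GHz-threads")
```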

So in the example above, although the E5-1650 v1 is a pretty old CPU (so its (Frequency * Cores) per region figure is low), it's also easy to find used machines with these CPUs for next to nothing, so the savings over a new i7 machine would more than offset the cost of going from a 64- to a 96-CPU network rendering license.
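
As a sanity check on that cost argument, here's a hypothetical price-per-performance comparison. The prices below are invented placeholders, not real quotes; swap in actual figures (and the license step-up cost) before deciding anything.

```python
# Hypothetical price-per-performance check. Prices are invented placeholders,
# not real quotes; replace them with actual figures before comparing.

options = [
    {"name": "new i7-6700K workstation", "price_usd": 1200.0, "ghz": 4.0, "threads": 8},
    {"name": "used E5-1650 v1 box",      "price_usd": 250.0,  "ghz": 3.2, "threads": 12},
]

for o in options:
    ghz_threads = o["ghz"] * o["threads"]
    print(f"{o['name']}: {o['price_usd'] / ghz_threads:.2f} USD per GHz-thread")

# If adding cheap boxes pushes you over a license tier (e.g. 64 -> 96 CPUs),
# fold that extra license cost into the used-hardware side before comparing.
```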

Arn

If you are rendering a lot, don't forget to factor in power consumption. I have been looking at old server hardware, but with any practical amount of use, the power consumption alone negates the price benefit. Not to mention the noise, heat and everything that goes with it. The larger tech companies like Facebook replace their servers after just three years, simply because the economics make sense.

Not too many people look at the total cost of ownership (TCO). They just see the price tag on a new machine :)
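
To put that in numbers, here's a rough TCO sketch: purchase price plus electricity over the machine's service life. Every figure in it (wattage, tariff, duty cycle, prices) is a placeholder; plug in your own.

```python
# Back-of-the-envelope TCO sketch: purchase price plus electricity over the
# service life. All numbers below are placeholders, not measurements.

def total_cost(price_usd, avg_watts, hours_per_day, years, usd_per_kwh=0.15):
    kwh = avg_watts / 1000.0 * hours_per_day * 365 * years
    return price_usd + kwh * usd_per_kwh

# Example: cheap used server vs newer, more power-efficient workstation,
# both rendering 12 hours a day for three years.
print(total_cost(price_usd=250,  avg_watts=350, hours_per_day=12, years=3))
print(total_cost(price_usd=1200, avg_watts=150, hours_per_day=12, years=3))

# For a fair comparison, divide each total by the machine's GHz-thread score
# from the first post, so you're comparing cost per unit of render throughput.
```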