Keyshot 9

Started by Rantech, July 08, 2019, 04:11:17 AM

DetroitVinylRob

#45
Our GPU render times so far (without global illumination) on the Quadro RTX 8000, 6000, and 5000 show a huge jump in performance from the 5000 to the 6000, but a comparatively small further improvement from the 6000 to the 8000. Regardless of current pricing, the 6000 looks like the sweet-spot performer in our tests. We also see "real-time rendering" whose visual clarity is consistent with our GPU render-time results. This may all change, of course, as KS9 continues to develop.

Ref: We use HP Z6 workstations running Windows 10 Pro (64-bit), each with dual Intel Xeon Gold 6128 CPUs @ 3.40 GHz (12 cores / 24 threads total) and 64 GB of RAM.

NormanHadley

So, Rob, could you post a bar chart with, say, four columns for rendering the same scene on: CPU only, RTX 4000, RTX 5000, and RTX 8000?

DetroitVinylRob

#47
Configuration                   Render A      Render B      Global illumination
P5000 GPU                       106.11 s      109.55 s      off
RTX 6000 GPU                     34.14 s       44.28 s      off
RTX 8000 GPU                     33.51 s       45.16 s      off
CPU only                        2h 42m 56s    3h 54m 39s    on
Network rendering (220 cores)   28m 33s       30m 33s       on

These were standardized render files and single camera positions of the same rather complex transportation lighting assemblies.

Performance was measured as the time from initiating a render to completion.

Each result is the average of three attempts at the same task. Not wholly scientific, but reasonably accurate, I believe.
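
For anyone who wants to reproduce the arithmetic, here is a minimal sketch of how I average the runs and compare cards. The per-run timings below are illustrative placeholders; only the averages in the chart above are real measurements.

```python
def avg(runs):
    """Average of repeated render times (seconds) for one configuration."""
    return sum(runs) / len(runs)

# Hypothetical per-run timings for Render A; only the averaged figures
# appear in the chart above.
p5000_a   = avg([106.0, 106.2, 106.1])   # ~106.11 s average
rtx6000_a = avg([34.0, 34.2, 34.2])      # ~34.14 s average
rtx8000_a = avg([33.4, 33.6, 33.5])      # ~33.51 s average

# Speedup of each RTX card relative to the P5000 for Render A.
for name, t in (("RTX 6000", rtx6000_a), ("RTX 8000", rtx8000_a)):
    print(f"{name}: {p5000_a / t:.1f}x faster than the P5000")
```

That works out to roughly 3.1x for the 6000 and 3.2x for the 8000, which is the roll-off in improvement I mentioned earlier.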

mafrieger

Many thanks, Rob!

That's a really good jump from using the RT cores in the RTX cards.

Would you mind adding values for CPU only?


DetroitVinylRob

#49
Quote from: mafrieger on October 02, 2019, 12:13:45 PM
Many thanks, Rob!

That's a really good jump from using the RT cores in the RTX cards.

Would you mind adding values for CPU only?
Yes, coming... I will add the numbers to the previous chart (above).

dkwon89

#50
Quote from: Eric Summers on September 20, 2019, 01:00:40 PM
Quote from: Furniture_Guy on September 20, 2019, 11:01:53 AM
Matt,

So if you were in the market for a new video card for your Ryzen Threadripper 2990WX machine, what would be a good one?

THANKS!

Perry (Furniture_Guy)

Nothing less than a Quadro RTX 8000.  ;D

Bro... RTX8000 is a $6000 card... RTX6000 is $3500...

Man... these graphics cards are more expensive than most people's entire rig.

BoazD

Quote from: dkwon89 on October 04, 2019, 05:28:57 PM
Quote from: Eric Summers on September 20, 2019, 01:00:40 PM
Quote from: Furniture_Guy on September 20, 2019, 11:01:53 AM
Matt,

So if you were in the market for a new video card for your Ryzen Threadripper 2990WX machine, what would be a good one?

THANKS!

Perry (Furniture_Guy)

Nothing less than a Quadro RTX 8000.  ;D

Bro... RTX8000 is a $6000 card... RTX6000 is $3500...

Man... these graphics cards are more expensive than most people's entire rig.

Hopefully we won't need Quadro cards for KeyShot and the GeForce RTX cards will do the trick. Still awaiting Luxion's response:
https://www.keyshot.com/forum/index.php?topic=25057.0

Jon-213

Watching the test numbers.
If you want to use global illumination, can you not use the GPU?
Or is it a work in progress that will become possible?

DetroitVinylRob

#53
Quote from: Jon-213 on October 22, 2019, 08:19:47 AM
Watching the test numbers.
If you want to use global illumination, can you not use the GPU?
Or is it a work in progress that will become possible?

I'm not sure at all. It is not available in my current beta version. Has anyone seen a decisive word from KeyShot? Perhaps we will just have to wait and see... Possibly some highlight will be presented in the "What's New in KeyShot 9" webinar on Thursday, October 31st, 2019, at 11:00 AM PDT.

jogeshocp

That will be the best option.

Prof

So... my preliminary findings... i9-9900K, Quadro P4000, ground and global illumination, 100 samples:

Image 1. CPU, no denoise: 4:49
Image 2. GPU, no denoise: 1:28
Image 3. GPU, denoise set to 0.5: 1:42
Image 4. CPU, denoise set to 0.5: 4:53

The samples would need to be bumped up on the GPU to get an image equivalent to the CPU output, but it would still be faster.

Interesting difference in the impact denoise has on rendering time between CPU and GPU.
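
For what it's worth, a quick back-of-the-envelope check of how much the denoise pass adds on each device, using the times above (my own throwaway helper, not anything KeyShot provides):

```python
def to_seconds(mmss: str) -> int:
    """Convert an 'm:ss' render time to seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

cpu_plain, cpu_denoise = to_seconds("4:49"), to_seconds("4:53")
gpu_plain, gpu_denoise = to_seconds("1:28"), to_seconds("1:42")

# Relative cost of enabling denoise on each device.
print(f"CPU denoise overhead: {100 * (cpu_denoise - cpu_plain) / cpu_plain:.1f}%")  # ~1.4%
print(f"GPU denoise overhead: {100 * (gpu_denoise - gpu_plain) / gpu_plain:.1f}%")  # ~15.9%
```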

Does anyone know how the display should be set up if I add an RTX 4000? Which should drive the display, the P4000 or the RTX 4000?

mattjgerard

Shouldn't matter, really. Display output is such a small part of what GPUs do these days, computationally speaking, that it's not much of a drain on resources. Years ago, yes, it would have mattered, but all modern cards have so much horsepower that it's the computational work that taxes them, not the display output itself. What might make a difference is the screen resolution; get too crazy and the card does have to calculate all those pixels. I've got three monitors plugged into an old GTX 980 and it shows very little use when using KeyShot. I have to sort out some driver issues before I can enable GPU rendering, though I'm not expecting much, as it's a pretty powerful CPU workstation.
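
If you want to check this yourself, here is a small sketch (my own, not from Luxion) that reads per-GPU load and memory through nvidia-smi while KeyShot is rendering; the query fields are standard nvidia-smi options, but verify them against your driver version:

```python
import subprocess

# Ask nvidia-smi for load and memory use on every installed GPU.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,utilization.gpu,memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    name, util, mem_used, mem_total = [f.strip() for f in line.split(",")]
    print(f"{name}: {util}% busy, {mem_used}/{mem_total} MiB VRAM in use")
```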

Prof

#57
I disagree... maybe, because KeyShot will only use 8 GB of the 16 GB of memory available across the two cards. So the question becomes which memory KeyShot uses if both cards are enabled... the GDDR6 on the RTX 4000 or the GDDR5 on the P4000? If KeyShot uses whichever memory it wants regardless of which card drives the displays, then you're correct.

I would think that how each card is set in the Nvidia Control Panel... "Dedicated to graphics tasks" or "Use for graphics and compute needs"... would also come into play, although it makes sense that the latter option would be the logical choice for both cards. Anyone?

figure1a

Quote from: Prof on November 05, 2019, 08:41:53 PM
So... my preliminary findings... i9-9900K, Quadro P4000, ground and global illumination, 100 samples:

Image 1. CPU, no denoise: 4:49
Image 2. GPU, no denoise: 1:28
Image 3. GPU, denoise set to 0.5: 1:42
Image 4. CPU, denoise set to 0.5: 4:53

All the GPU renders look bad, right? Even the CPU render with no denoise looks better than the GPU with denoise.

Prof

The two are different, output appearance wise. You must set up the shot for either one or the other to get the look and quality you want. The rendering output set up is different as well... the samples must be bumped up 3x - 4x on gpu, but it's still faster depending on the scene and your hardware.