Battle of the Titans: Intel vs. AMD
Mark Broadbridge, the Senior Render Wrangler at YellowDog, shares insight from his team’s investigation into the assumption that rendering across multiple brands of CPU may produce inconsistent results. Read the full article below...
There is a lot of myth and speculation surrounding the effect that rendering with different brands of CPU has on the consistency of the final rendered output. Does it matter? Or should you stick to one brand? You’ll find online evangelists on both sides of the fence, with most opinions seemingly based on theories and gut feelings.
The CPU Investigation brief
Prompted by YellowDog’s own suspicions, and after they failed to find any conclusive evidence one way or the other, they conducted a pseudo-scientific investigation, running a series of tests using the two market leaders, Intel and AMD, to try to lay the issue to rest.
Mark Broadbridge and his team ran a series of simple renders on YellowDog’s Intel Xeon 2.6 GHz 32-core cloud nodes, and also on AMD EPYC 2.0 GHz 32-core nodes. They also sanity-checked their results across a variety of high-end production scenes, which corroborated the results presented in this report.
The Analysis Tool
They compared the results of the output renders using the image comparison tool
online-image-comparison.com. The tool works by overlaying two images and highlighting any pixels whose RGB values differ, revealing minute variations between two seemingly identical images. An example image is below.
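The comparison technique itself can be sketched in a few lines. The representation below (images as nested lists of RGB tuples, and the helper name diff_highlight) is an illustrative assumption, not the implementation used by online-image-comparison.com:

```python
# Illustrative sketch of pixel-level image comparison (NOT the actual tool):
# images are 2-D lists of (R, G, B) tuples; any pixel whose RGB values
# differ between the two renders is painted red in the output image.

RED = (255, 0, 0)

def diff_highlight(img_a, img_b):
    """Return a copy of img_a with mismatched pixels painted red,
    plus the count of mismatched pixels."""
    out, mismatches = [], 0
    for row_a, row_b in zip(img_a, img_b):
        out_row = []
        for px_a, px_b in zip(row_a, row_b):
            if px_a != px_b:          # any RGB channel differs
                out_row.append(RED)
                mismatches += 1
            else:
                out_row.append(px_a)
        out.append(out_row)
    return out, mismatches

# Two "renders" of a 2x2 frame differing by one channel in one pixel:
render_1 = [[(10, 10, 10), (20, 20, 20)],
            [(30, 30, 30), (40, 40, 40)]]
render_2 = [[(10, 10, 10), (20, 20, 21)],   # blue channel off by 1
            [(30, 30, 30), (40, 40, 40)]]

highlighted, count = diff_highlight(render_1, render_2)
print(count)              # 1 mismatched pixel
print(highlighted[0][1])  # (255, 0, 0)
```

Even a one-unit difference in a single channel is flagged, which is why the tool can surface variation that is invisible to the naked eye.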
Test 1 (part one) - Establishing a variance benchmark in biased render engines
The following scenes were rendered using the same Intel Xeon cloud node. They rendered a scene in Maya with V-Ray, 3ds Max with V-Ray, and Cinema 4D (C4D) with the physical render engine. Here are the outputs of those renders before using the comparison tool:
Test 1 (part two)
They then rendered the same scenes again using the same settings and the same Intel CPU. The outputs were fed into the online comparison tool and the pixels that showed any RGB variance were highlighted in red. They ran the same test scenario on multiple scenes of varying complexity including YellowDog customer scenes where permission was granted. The pixel RGB variance reported was consistent with the results of the sample scenes throughout.
Test 1 conclusion
By their very nature, biased render engines calculate, estimate, and determine a level of pixel accuracy that will be acceptable to the naked eye. Despite total consistency in hardware from one rendered frame to the next, there is a variance when rendering with biased engines and analysing using a comparison tool.
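One plausible source of this run-to-run variance (an illustration, not a confirmed explanation of these specific engines) is that floating-point addition is not associative, so any renderer that accumulates light contributions in a thread-dependent order can produce slightly different pixel values on successive runs, even on identical hardware:

```python
# Floating-point addition is not associative: summing the same three
# values in a different order yields a (very slightly) different result.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # one accumulation order
right = a + (b + c)  # another accumulation order

print(left == right)   # False
print(left, right)     # 0.6000000000000001 vs 0.6
```

A multithreaded renderer that does not enforce a fixed summation order effectively picks one of these orderings at random each run, which is enough to nudge a pixel's final RGB value.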
Test 2 - Testing the same scenes using AMD EPYC processors
YellowDog then ran the same tests using AMD EPYC processors.
Test 2 conclusion
This investigation suggests that if you stick to the same hardware there is a negligible difference in consistency between rendering on AMD or Intel. They encountered a small increase in pixel RGB variance when using the AMD CPU compared to an Intel CPU, but more testing over a range of scenarios is required to confirm this finding. 5 bonus points to you if you spotted the single red pixel in the Maya scene for this test.
Test 3 - Comparing an AMD rendered frame to an Intel rendered frame
So… what happens if you want to mix things up in your render farm? YellowDog works with many studios who have a mixture of different CPU types so they wanted to investigate. The following test shows the variation of the same scenes rendered once with Intel and once with AMD.
Test 3 conclusion
The image comparison tool shows a noticeable difference, markedly increased from the tests run on identical CPU types earlier in this investigation. Of note is the clear difference along the edges of the bucket calculations in the Maya image. Parts of both the 3ds Max and Maya images are so varied that, for the first time in the investigation, the output variance is visible to the naked eye when the two frames are played as a looping video.
Test 4 - Comparing Intel with Intel but different hardware configuration
Are Test 3 results due to the difference between AMD and Intel or is it due to a broader difference in any hardware configuration?
YellowDog decided to run the same tests but this time compare Intel to Intel: rendering with Intel Xeon 2.6 GHz 32-core nodes and rendering with Intel Xeon 2.3 GHz workstations.
Test 4 conclusion
The variance in pixel RGB value, while slightly reduced compared to Test 3, is remarkably similar. Flicker is still present in some areas of the scene, as was observed in Test 3.
So what does this mean for your local render farm or for your chosen cloud render provider? If you want consistent renders, the evidence from this investigation suggests that you should render with a consistent batch of hardware. Whether the CPU is AMD or Intel is largely irrelevant as far as consistency is concerned; you just need to pick one and stick with it throughout a production. You should also ensure that the hardware configuration is consistent: the tests show that rendering with a consistent CPU type but an inconsistent hardware configuration produces much the same variance as switching between AMD and Intel. The ideal best practice is to ensure total consistency of render nodes between frames, between shots, and between scenes.
Maintaining consistency and quality will become more important than ever in the years to come, as demand grows for higher-resolution productions and immersive experiences that put rendered pixels closer to the viewer's eyes than ever before. To mitigate quality risk, YellowDog renders productions on identical cloud nodes with identical CPU and hardware configurations. The same is true of their GPU technology.