Ok, to start with: yes, as we've all said here, training grounds does not equal real game performance, but it is the only place where you can do repeatable, consistent benchmarking right now. It still shows you the baseline differences between how the various settings perform.
And ok, I repeated a few of the tests now that I've increased my CPU clocks from 3.3 GHz to 3.8 GHz.
BEFORE (3.3 GHz):
bits - dx - Min FPS - Max FPS - Avg FPS
64 - 9 - 37 - 88 - 76.461
64 - 11 - 43 - 79 - 67.160
AFTER (3.8 GHz):
bits - dx - Min FPS - Max FPS - Avg FPS
64 - 9 - 44 - 89 - 77.913
64 - 11 - 40 - 79 - 67.358
The results actually seem fairly similar, to be honest. On the dx9 side the min fps jumped up a fair bit, but the other numbers not so much. So a 15% increase in CPU speed (3.3 to 3.8 GHz) only translated to a 0.3% to 1.9% increase in average frame rates (dx9: 76.461 to 77.913, +1.9%; dx11: 67.160 to 67.358, +0.3%). In training grounds, at least, this game doesn't seem to be as CPU bound as people say. In a real match it's another story of course; the results would probably be different there.
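If you want to double-check my math, the percent change is just (after - before) / before. A quick Python sketch using the averages from the tables above:

    runs = {
        "dx9":  (76.461, 77.913),   # avg FPS at 3.3 GHz vs 3.8 GHz
        "dx11": (67.160, 67.358),
    }

    # Percent change: (after - before) / before * 100
    for api, (before, after) in runs.items():
        gain = (after - before) / before * 100
        print(f"{api}: {before} -> {after} avg FPS ({gain:+.1f}%)")

    print(f"CPU clock: 3.3 -> 3.8 GHz ({(3.8 - 3.3) / 3.3 * 100:+.1f}%)")

That prints +1.9% for dx9, +0.3% for dx11, and +15.2% for the CPU clock.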
If anyone wants to test this out for themselves during a real match, it's super easy to do. Install the free version of FRAPS. In the options, turn on min/max/avg benchmark logging to file. Press F11 at the start of the match to begin the benchmark, and press F11 again at the end; your min/max/avg gets saved to a file.
Since in game there are so many different variables going on, I would recommend recording results from at least 3 different matches on the same map and then averaging them (a quick script for averaging the FRAPS logs is below). Then change the in-game settings and test again. The comparisons we want are between the different combos of 32/64 bit and dx9/dx11, with all other settings kept the same.
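For the averaging step, something like this Python sketch should do it. I'm assuming the FRAPS "minmaxavg" csv layout here (a header row of Frames, Time (ms), Min, Max, Avg followed by one row of numbers), and the filenames are just examples; point it at whatever files FRAPS actually writes for you.

    import csv

    # Example filenames; substitute your actual FRAPS benchmark logs.
    logs = ["match1 minmaxavg.csv", "match2 minmaxavg.csv", "match3 minmaxavg.csv"]

    # Sum the Min/Max/Avg FPS columns across all logs, then divide by the match count.
    totals = {"Min": 0.0, "Max": 0.0, "Avg": 0.0}
    for path in logs:
        with open(path, newline="") as f:
            row = next(csv.DictReader(f, skipinitialspace=True))  # one data row per log
            for key in totals:
                totals[key] += float(row[key])

    for key, total in totals.items():
        print(f"{key} FPS over {len(logs)} matches: {total / len(logs):.1f}")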