
64-bit vs 32-bit / DX11 vs DX9 Benchmarks


49 replies to this topic

#21 Kain Demos

    Member

  • Ace Of Spades
  • 2,629 posts
  • LocationTerra

Posted 22 December 2014 - 12:46 PM

EgoSlayer, on 22 December 2014 - 12:41 PM, said:


I've never seen it drop to slo mo, and I run 2550x1440 on an HD 7950. I am GPU bound, my CPU doesn't ever see more than about 30% utilization. But I still never see less than 30FPS, get ~80 in TG, average in game is around 50. Turn particles to low - they are the biggest performance killers.


I have to have them at least on high---I tried medium for a while, but it makes incoming PPCs much harder to see and dodge. It's kind of sad that PPCs are so slow you can just move out of the way, but as long as that's possible I'll keep doing it.

I never see high utilization on my CPU or on either of my GPUs (my 2nd GPU stays idle since there is no CrossfireX support), which has always led me to believe the problem is in the game's software.

Edited by Kain Thul, 22 December 2014 - 12:47 PM.


#22 EgoSlayer

    Member

  • Wrath
  • 1,909 posts
  • Location[REDACTED]

Posted 22 December 2014 - 12:55 PM

Kain Thul, on 22 December 2014 - 12:46 PM, said:


I have to have them at least on high---I tried medium for a while, but it makes incoming PPCs much harder to see and dodge. It's kind of sad that PPCs are so slow you can just move out of the way, but as long as that's possible I'll keep doing it.

I never see high utilization on my CPU or on either of my GPUs (my 2nd GPU stays idle since there is no CrossfireX support), which has always led me to believe the problem is in the game's software.


That's the cause of your problem, then; according to both the AMD System Monitor and GPU-Z, my GPU shows 100% utilization. If you're not seeing that, there's your problem. It might be that, because of the lack of Crossfire support, the game isn't using your first card correctly, or that it's adding extra latency deciding which GPU to use. It's certainly a software issue. I'd try disabling the 2nd card (in Device Manager, or in the BIOS/UEFI if possible) or removing it and trying that out, unless you have an x2 card; a Device Manager disable might still work with an x2, depending on how the GPUs are presented. You should be able to load one GPU up to full utilization.

Edited by EgoSlayer, 22 December 2014 - 12:56 PM.


#23 Kain Demos

    Member

  • Ace Of Spades
  • 2,629 posts
  • LocationTerra

Posted 22 December 2014 - 12:58 PM

EgoSlayer, on 22 December 2014 - 12:55 PM, said:


That's the cause of your problem, then; according to both the AMD System Monitor and GPU-Z, my GPU shows 100% utilization. If you're not seeing that, there's your problem. It might be that, because of the lack of Crossfire support, the game isn't using your first card correctly, or that it's adding extra latency deciding which GPU to use. It's certainly a software issue. I'd try disabling the 2nd card (in Device Manager, or in the BIOS/UEFI if possible) or removing it and trying that out, unless you have an x2 card; a Device Manager disable might still work with an x2, depending on how the GPUs are presented. You should be able to load one GPU up to full utilization.



Starting to think about dumping my 7990s, since they are 18 months old now anyway, and just buying one of these new, massive NVIDIA cards. I'll be ******* pissed if the problem is still there, though.

I don't think going with two GPUs is worth it anymore. Games are almost always multi-platform now, often optimized for low-end systems and consoles, and then they have software problems that your $4,000.00 worth of hardware can never solve.

Edited by Kain Thul, 22 December 2014 - 12:59 PM.


#24 DarthPeanut

    Member

  • Liquid Metal
  • 861 posts

Posted 22 December 2014 - 01:00 PM

Interesting results, thanks for sharing.

Rhialto, on 22 December 2014 - 10:20 AM, said:

Give yourself a present for Christmas and run that CPU @ 4.2GHz! Many run it higher, but 4.2GHz is guaranteed to work! Dunno what motherboard you have, but in my case it was 2 clicks and done.

My setup is similar to yours but with a GTX 660 Ti. I may try to run it and compare now that I use the 64-bit client, though still on DX9. I will try to remember to disable 3D first. :P


Haha, I was thinking the very same thing as I read the OP.

As long as the processor has a decent cooler on it, bumping up to a 40 or 42 multiplier for 4.0 or 4.2 GHz would make a nice improvement. I have run as high as 4.4 GHz stable without having to bump the vcore up.
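The multiplier math behind those numbers is simple enough to sketch; a minimal example, assuming the ~100 MHz base clock (BCLK) that Sandy Bridge-era Intel chips use (the function and constant names are illustrative, not from any tool):

```python
# Core frequency on Sandy Bridge-era Intel CPUs is roughly
# base clock (BCLK) x CPU multiplier.
BCLK_MHZ = 100  # assumed stock base clock


def core_ghz(multiplier, bclk_mhz=BCLK_MHZ):
    """Return the resulting core frequency in GHz."""
    return multiplier * bclk_mhz / 1000


print(core_ghz(40))  # 4.0
print(core_ghz(42))  # 4.2
print(core_ghz(44))  # 4.4
```

Pushing the multiplier past what the stock voltage supports is the point where vcore bumps start to be needed.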

Edited by DarthPeanut, 22 December 2014 - 01:02 PM.


#25 o0Marduk0o

    Member

  • Bad Company
  • 4,231 posts
  • LocationBerlin, Germany

Posted 22 December 2014 - 01:37 PM

EgoSlayer, on 22 December 2014 - 12:32 PM, said:


Actually, the testing grounds is the only place to get valid, repeatable tests. It's only an indicator for testing the relative performance of changes in settings.

It's not a complete indication of in-game performance because of all the other variables you mentioned that occur in game with the 23 other mechs on the field, weapon effects, etc. Until we can record sessions and play them back, or have demo loops of combat, there isn't any repeatable way to test in-game performance. But any gains or losses in TG will be reflected in game; they may just be minimized (gains) or amplified (losses) by the additional rendering requirements.

But you still lack real feedback. The 64-bit performance could be better in a real match with moving mechs and all the weapon effects. ;)
Unless you can verify this, the only way to bench the FPS is a full gaming session with multiple matches and a tool to record the FPS.

Edited by o0Marduk0o, 22 December 2014 - 01:39 PM.


#26 Felix7007

    Member

  • 189 posts

Posted 22 December 2014 - 01:51 PM

I also have an i5-2500K, but I run it at 4.8 GHz. Do it.

#27 Rhaegor

    Member

  • Elite Founder
  • 301 posts
  • LocationChicago, IL, USA

Posted 22 December 2014 - 02:55 PM

I have never had frame rate issues (other than the overheating steam when backing up, which I solved by putting particle effects at their lowest setting), but since I have been using the 64-bit client I have noticed major slowdowns during crazy heavy engagements in Community Warfare. Prior to using 64-bit I never had the frame rate issues people complain about in CW.

I am using an Intel Core i5-4670K @ 4.4 GHz, 6MB Mushkin DDR3 @ 1.6 GHz, and an EVGA GTX 760. The motherboard is an ASUS Maximus VI Hero, not that it matters...

Edited by Rhaegor, 22 December 2014 - 02:57 PM.


#28 Trixxstrr

    Member

  • Veteran Founder
  • 48 posts
  • LocationFort McMurray, AB, Canada

Posted 22 December 2014 - 04:06 PM

Thanks for the suggestions on OCing. I had thought about it a few times before now but never bothered. So I have it at 3.8 now; 4.2 wouldn't boot, and 4.0 gave Windows errors. 3.8 seems good and is better than nothing :)

#29 Vassago Rain

    Member

  • Bridesmaid
  • 14,396 posts
  • LocationExodus fleet, HMS Kong Circumflex accent

Posted 22 December 2014 - 04:12 PM

64-bit and DX11 do the same thing as 32-bit and DX9.

That is to say, they don't take full advantage of your hardware, and once LRMs start raining, even the most beastly of computers sinks to its knees as the framerate drops to 20.

HARDKOR, on 22 December 2014 - 12:42 PM, said:

Is there any benefit to DX11? I didn't see any graphical improvements...


Not really.

#30 Vinhasa

    Member

  • The Merciless
  • 87 posts

Posted 22 December 2014 - 04:15 PM

Vassago Rain, on 22 December 2014 - 04:12 PM, said:

64-bit and DX11 do the same thing as 32-bit and DX9.

That is to say, they don't take full advantage of your hardware, and once LRMs start raining, even the most beastly of computers sinks to its knees as the framerate drops to 20.



I can attest to this being true.

#31 Smokeyjedi

    Member

  • Liquid Metal
  • 1,040 posts
  • LocationCanada

Posted 22 December 2014 - 04:22 PM

I played 2 nights with DX11 + the 64-bit client. Same computer/MWO/BIOS settings; the FPS seemed not to dip into the 30s and stayed above 40, but I was experiencing a tonne of crashes to desktop... far more than normal. Last night I let the MWO repair tool run; tonight I will see if clearing the shader cache and files helps...

#32 MauttyKoray

    Member

  • 2,831 posts

Posted 22 December 2014 - 04:33 PM

I've personally seen far fewer crashes using the 64-bit client (specifically, no more 'out of memory' errors).

DX9/11 doesn't seem to make a HUGE difference graphically. It's a tad shinier, though, and on my end my Nvidia card with the GeForce Experience optimizations does run better on DX11 than on DX9 without the optimizations (Superclocked EVGA GTX 660).

#33 R Razor

    Member

  • The 1 Percent
  • 1,583 posts
  • LocationPennsylvania ...'Merica!!

Posted 22 December 2014 - 04:40 PM

Kain Thul, on 22 December 2014 - 12:58 PM, said:



Starting to think about dumping my 7990s, since they are 18 months old now anyway, and just buying one of these new, massive NVIDIA cards. I'll be ******* pissed if the problem is still there, though.

I don't think going with two GPUs is worth it anymore. Games are almost always multi-platform now, often optimized for low-end systems and consoles, and then they have software problems that your $4,000.00 worth of hardware can never solve.



I wouldn't waste your money at this point... I have an older Nvidia card (a 770) and I'm not seeing anywhere close to 100% GPU usage. This game seems very, very CPU dependent. You will see more improvement from replacing an older CPU than from replacing a GPU. Optimization is absolutely horrible in MW:O.

#34 Rhialto

    Member

  • Philanthropist
  • 2,084 posts
  • LocationQuébec, QC - CANADA

Posted 22 December 2014 - 04:56 PM

HARDKOR, on 22 December 2014 - 12:42 PM, said:

Is there any benefit to DX11? I didn't see any graphical improvements...

The only graphical difference you'll see is one more option under Antialiasing.

#35 Rhaegor

    Member

  • Elite Founder
  • 301 posts
  • LocationChicago, IL, USA

Posted 22 December 2014 - 05:00 PM

Smokeyjedi, on 22 December 2014 - 04:22 PM, said:

I played 2 nights with DX11 + the 64-bit client. Same computer/MWO/BIOS settings; the FPS seemed not to dip into the 30s and stayed above 40, but I was experiencing a tonne of crashes to desktop... far more than normal. Last night I let the MWO repair tool run; tonight I will see if clearing the shader cache and files helps...


I had never had a crash in MWO prior to switching to the 64 bit client.

#36 Rhialto

    Member

  • Philanthropist
  • 2,084 posts
  • LocationQuébec, QC - CANADA

Posted 22 December 2014 - 05:01 PM

Trixxstrr, on 22 December 2014 - 04:06 PM, said:

Thanks for the suggestions on OCing. I had thought about it a few times before now but never bothered. So I have it at 3.8 now; 4.2 wouldn't boot, and 4.0 gave Windows errors. 3.8 seems good and is better than nothing :)

I have an ASUS RoG motherboard, and there is an auto-OC setting that takes care of slightly bumping the voltage, which is why I said 4.2 was guaranteed. In your case you would have to play with the voltage a bit to get higher, and it's pretty easy, but like you said, 3.8 is already better than nothing. I have an option for 4.6GHz too, but it wasn't stable 24 hours a day, and I'm not the kind to live on the edge; I prefer to have a safe zone, and I'm fully happy at 4.2GHz.

Will you redo a few benchmarks to compare now? You must be curious to see the differences now that you have an extra 500MHz! ;)

#37 Xyroc

    Member

  • The 1 Percent
  • 855 posts
  • LocationFighting the Clan Invasion

Posted 22 December 2014 - 05:08 PM

OP, you are getting way too high an average FPS. You need to do that in an actual match; I'm sure it will be in the 30s-40s.

#38 Trixxstrr

    Member

  • Veteran Founder
  • 48 posts
  • LocationFort McMurray, AB, Canada

Posted 22 December 2014 - 06:09 PM

Ok, to start with: yes, as we've all said here, training grounds does not equal real game performance, but it is the only place where you can do repeatable, consistent benchmarking right now. It still gives you the difference between how the various settings perform at their basics.

And ok, I repeated a few of the tests now that I've increased my CPU clock from 3.3 GHz to 3.8 GHz.

BEFORE: (3.3 GHz)
bits - DX - Min FPS - Max FPS - Avg FPS
64 - 9 - 37 - 88 - 76.461
64 - 11 - 43 - 79 - 67.160

AFTER: (3.8 GHz)
bits - DX - Min FPS - Max FPS - Avg FPS
64 - 9 - 44 - 89 - 77.913
64 - 11 - 40 - 79 - 67.358

The results actually seem fairly similar, to be honest. On the DX9 side the min FPS jumped up a fair bit, but the other numbers not so much. So a 15% increase in CPU speed only translated to a 0.3% to 1.9% increase in average frame rates. In the training grounds this game doesn't seem to be as CPU bound as people say. Of course, in game is another story; the results would probably be different there.
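Recomputing the gains from the two tables above takes only a few lines; a quick sketch, with the numbers copied from this post:

```python
# Average FPS before (3.3 GHz) and after (3.8 GHz), from the tables above
results = {
    "DX9":  (76.461, 77.913),
    "DX11": (67.160, 67.358),
}

cpu_gain = (3.8 - 3.3) / 3.3 * 100  # ~15.2% higher CPU clock

for api, (before, after) in results.items():
    fps_gain = (after - before) / before * 100
    print(f"{api}: {fps_gain:.1f}% average FPS gain")
# DX9 works out to about 1.9% and DX11 to about 0.3%,
# against a ~15% CPU clock increase.
```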

If anyone wants to test this out for themselves during a real match, it is super easy to do. Install the free version of FRAPS. In the options, turn on benchmarking with min/max/avg logging to file. Press F11 at the start of the match to start recording, and press F11 again at the end. This will save your min/max/avg to a file.

Since there are so many different variables going on in game, I would recommend recording results from at least 3 different matches on the same map and then figuring out an average. Then change the in-game settings and test again. The comparisons we want are between the different combos of 32-bit/64-bit/DX9/DX11, with all other settings kept the same.
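The averaging step can be scripted; a sketch, assuming each FRAPS benchmark run leaves a summary CSV whose second line holds the Frames, Time, Min, Max, Avg values (the `minmaxavg.csv` naming and column order here are assumptions -- check them against your own FRAPS output before relying on this):

```python
import csv
import glob


def average_runs(pattern="*minmaxavg.csv"):
    """Average Min/Max/Avg FPS across all matching FRAPS summary CSVs."""
    mins, maxes, avgs = [], [], []
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            rows = list(csv.reader(f))
        # rows[0] is the header; rows[1] holds the values for that run
        frames, time_ms, mn, mx, avg = (float(x) for x in rows[1])
        mins.append(mn)
        maxes.append(mx)
        avgs.append(avg)
    n = len(avgs)
    return (sum(mins) / n, sum(maxes) / n, sum(avgs) / n) if n else None
```

Run it once per settings combo (32/64-bit x DX9/DX11) and compare the resulting tuples.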

Edited by Trixxstrr, 22 December 2014 - 06:11 PM.


#39 9erRed

    Member

  • Overlord
  • 1,566 posts
  • LocationCanada

Posted 22 December 2014 - 06:46 PM

Greetings all,

If any comparison is done between DX9 and DX11 graphics, there needs to be a standard set of graphics sliders listed.
- Both in the Nvidia control panel as well as what is selected from the game settings.

Simply changing the 'textures' setting will show very different FPS ranges between the two.
- Altering/disabling the damage glow and post processing will change everything again.
- The level of anti-aliasing in the Nvidia and game settings needs to be equal so they are not 'overriding' each other.
(and tested at different settings)

General listings of game performance ratings don't mean much without listing what these settings are for each mode, as I could get great FPS but have turned off the AA and textures.
- We need a base reference and to know what the game settings changes affect.
(Why do the graphics settings sliders not have a pop-up listing just what to expect, what each one does, and whether it will normally increase/decrease quality/visuals/FPS?)

Yes, I understand this is mostly GPU and not CPU related, with 32-bit and 64-bit referencing the CPU's ability to crunch numbers and locations. But it's also about locating items and rendering them quickly and smoothly.
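One way to act on the point above is to log a small settings manifest alongside every FPS figure, so runs can actually be compared; a hypothetical sketch (every field name here is illustrative, not taken from the game):

```python
import json

# Hypothetical benchmark manifest: record these with every FPS number
# so different runs are comparable; field names are illustrative only.
manifest = {
    "client": "64-bit",                # 32-bit or 64-bit
    "renderer": "DX11",                # DX9 or DX11
    "textures": "high",
    "particles": "low",                # the big FPS killer, per this thread
    "post_processing": "off",
    "damage_glow": "on",
    "aa_in_game": "post AA",
    "aa_in_driver": "application-controlled",  # so settings don't override each other
    "map": "Testing Grounds - Forest Colony",
    "runs_averaged": 3,
}

print(json.dumps(manifest, indent=2))
```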

Aim True and Run Cool,
9erRed

Edited by 9erRed, 22 December 2014 - 06:50 PM.


#40 Rhialto

    Member

  • Philanthropist
  • 2,084 posts
  • LocationQuébec, QC - CANADA

Posted 22 December 2014 - 06:52 PM

Just tested 64-bit DX9, same map, same mech, very high details and post AA:
Avg: 67.213 - Min: 58 - Max: 76




