Crossfire/SLI


91 replies to this topic

#21 xor1337

    Rookie

  • 6 posts

Posted 16 May 2013 - 01:43 PM

Joy to ALL, I got SLI up and running on MWO!

First, here are my system specs and the situation.
My system:
Intel i7 @ 3.4GHz
12GB DDR3
Windows 7 64-bit
2x Nvidia GTX 260 in SLI

I've been playing MWO since December with all settings at Medium and AA off. With this setup I would average about 40fps. I couldn't turn the settings up anymore without some pretty choppy gameplay.

Now that I've enabled SLI, I'm running with all settings at High or Very High at a solid 70fps. AA remains off because with SLI on it creates weird dark grainy spots.

The only other artifact is that the mech damage glow flickers a bit, but not enough to care about, and with particles on Very High there is usually so much smoke that you don't notice.

OK, now for the fix.
I'm running Nvidia beta driver 320.14 and Nvidia Inspector 1.9.7.1.
Using Nvidia Inspector, I opened the Mechwarrior Online profile and made the following changes:
1. SLI compatibility bits (DX1x): 0x000040F5 (Need for Speed: Most Wanted, Crysis 3)
2. SLI compatibility bits: 0x02506405 (Crysis, ArmA 2: Operation Arrowhead, Take On Helicopters, ArmA 3, ArmA 2, Crysis 2, Crysis 3, Crysis: Warhead, Merchants of Brooklyn, Nexuiz, OCCT)
3. Number of GPUs to use on SLI rendering mode: SLI_GPU_COUNT_TWO
4. NVIDIA predefined number of GPUs to use on SLI rendering mode on DirectX 10: SLI_PREDEFINED_GPU_COUNT_DX10_TWO
5. NVIDIA predefined number of GPUs to use on SLI rendering mode: SLI_PREDEFINED_GPU_COUNT_TWO
6. NVIDIA predefined SLI mode on DirectX 10: SLI_PREDEFINED_MODE_DX10_FORCE_AFR
7. NVIDIA predefined SLI mode: SLI_PREDEFINED_MODE_FORCE_AFR2

Just hit "Apply changes" and then make sure SLI is enabled in the Nvidia Control Panel.
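If it helps anyone double-check their entries, here are the same seven settings collected into a small Python snippet, purely as a reference checklist. The key names mirror Nvidia Inspector's UI labels; nothing here is read by the driver itself.

```python
# The seven Nvidia Inspector changes from the post above, collected as a
# reference checklist. Key names mirror the labels in Nvidia Inspector's
# UI; this dict is only for eyeballing/diffing your own entries - the
# driver does not read it.
MWO_SLI_PROFILE = {
    "SLI compatibility bits (DX1x)": "0x000040F5",
    "SLI compatibility bits": "0x02506405",
    "Number of GPUs to use on SLI rendering mode": "SLI_GPU_COUNT_TWO",
    "NVIDIA predefined number of GPUs to use on SLI rendering mode on DirectX 10":
        "SLI_PREDEFINED_GPU_COUNT_DX10_TWO",
    "NVIDIA predefined number of GPUs to use on SLI rendering mode":
        "SLI_PREDEFINED_GPU_COUNT_TWO",
    "NVIDIA predefined SLI mode on DirectX 10": "SLI_PREDEFINED_MODE_DX10_FORCE_AFR",
    "NVIDIA predefined SLI mode": "SLI_PREDEFINED_MODE_FORCE_AFR2",
}

def checklist(profile):
    """Number each 'setting = value' pair for comparison against the UI."""
    return [f"{i}. {name} = {value}"
            for i, (name, value) in enumerate(profile.items(), start=1)]

for line in checklist(MWO_SLI_PROFILE):
    print(line)
```

Comparing the printed lines against the Inspector window is a quick way to catch a typo in the hex compatibility bits.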

I'll say it again: TURN OFF AA. With SLI on, AA makes grainy dark patches on screen.
This fix more than doubled my fps, and the only cost is the flickering mech damage glow!

I'm just throwing this out there, game devs: it would be really nice if you could work on the flickering mech damage glow AND the AA dark patches with SLI enabled.

#22 eduncan911

    Member

  • PipPip
  • 42 posts
  • Location: New York

Posted 16 May 2013 - 01:55 PM

Sweet! This gives me a reason to finish my TECs, mill the copper water chiller for my system, and get it back online to try this profile hack!

xor1337, on 16 May 2013 - 01:43 PM, said:

I'm just throwing this out there, game devs: it would be really nice if you could work on the flickering mech damage glow AND the AA dark patches with SLI enabled.

Those may very well be the reasons why they haven't enabled it yet.

Edited by eduncan911, 16 May 2013 - 01:57 PM.


#23 CHH Badkarma

    Member

  • PipPipPipPipPipPipPip
  • The 1 Percent
  • 831 posts

Posted 16 May 2013 - 05:34 PM

Like everyone else I would also love to have SLI enabled. Two MSI Twin Frozr 680 OC/4Gs and only using one is sad times :rolleyes:

#24 BAD WD40

    Member

  • Pip
  • FP Veteran - Beta 1
  • 17 posts

Posted 10 June 2013 - 03:46 PM

Has anyone else tried this? And if you did, did it work for you? I'd like to know before I buy another card and find it was just a one-off!

#25 Randall Flagg

    Member

  • PipPipPipPipPipPipPip
  • Fury
  • 590 posts

Posted 10 June 2013 - 09:42 PM

Don't buy a new card for SLI. There is no SLI support in MechWarrior Online, but there is in HAWKEN.

#26 IQwrassler

    Member

  • PipPipPipPipPip
  • Bad Company
  • 167 posts
  • Location: Ottawa, Ontario ... hunting for kimuras ...

Posted 11 June 2013 - 07:23 AM

xor1337, on 16 May 2013 - 01:43 PM, said:

Joy to ALL, I got SLI up and running on MWO!


I have just tried your fix on my system:

2x 550 ti SLI
16GB RAM
x4 955 @ 3.6

... and my framerates nearly doubled! F***in hooray! Now running roughly 55-65 fps with everything on Ultra at 1080p.

Damage glow does flicker, as does the Caustic Valley map. Also, there are occasional slowdowns (down to ~25-35 fps) when lots of lasers are fired.

Overall though, it worked to great benefit! Thanks for the fix, xor1337.

Edited by IQwrassler, 11 June 2013 - 07:25 AM.


#27 Viper69

    Member

  • PipPipPipPipPipPipPipPipPip
  • 4,204 posts

Posted 11 June 2013 - 07:33 AM

I have always been curious why people buy into SLI when so often it seems a game doesn't support it. It reminds me of the old days of multi-CPU motherboards.

#28 eduncan911

    Member

  • PipPip
  • 42 posts
  • Location: New York

Posted 11 June 2013 - 08:04 AM

Viper69, on 11 June 2013 - 07:33 AM, said:

I have always been curious why people buy into SLI when so often it seems a game doesn't support it. It reminds me of the old days of multi-CPU motherboards.

Many reasons for people with advanced systems:

1) Because almost all other games do support it.

2) You are running 6000x1080 at 120Hz, and no single card (not even a Titan) can sustain that at 120 FPS.

3) You play games with 3D Vision, which cuts your FPS in half.

Any combination of 2 and 3 brings even a high-end rig to a crawl. For example, BF3 at 6000x1080 @ 120 Hz with 3D Vision over 3x GTX 670 SC+ at Medium settings is only around 40 FPS.

So in other words, those with high-end systems do "buy into it." It's just annoying that certain games aren't coded to take advantage of SLI when they clearly could be. MWO is based on CryEngine, which supports SLI development techniques.

Edited by eduncan911, 11 June 2013 - 08:08 AM.
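The resolution math in eduncan911's post is easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch (the single-monitor 1920x1080 @ 60 FPS baseline is my own assumption for comparison; the 6000x1080 @ 120 Hz + 3D Vision figures come from the post):

```python
# Back-of-the-envelope pixel throughput, illustrating why surround
# resolution plus stereo 3D overwhelms a single card. 3D Vision renders
# each frame twice (once per eye), which is where "cuts your FPS in
# half" comes from.
def pixels_per_second(width, height, fps, stereo=False):
    """Pixels the GPU(s) must render each second; stereo doubles the work."""
    per_frame = width * height * (2 if stereo else 1)
    return per_frame * fps

baseline = pixels_per_second(1920, 1080, 60)                   # one 1080p monitor
surround_3d = pixels_per_second(6000, 1080, 120, stereo=True)  # triple-wide + 3D

print(f"workload ratio: {surround_3d / baseline:.1f}x")  # 12.5x the baseline
```

A 12.5x heavier workload than a single 1080p/60 screen makes it plain why no single card of that era could keep up.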


#29 von Pilsner

    Member

  • PipPipPipPipPipPipPipPip
  • 1,043 posts
  • Location: Colorado

Posted 11 June 2013 - 07:57 PM

Viper69, on 11 June 2013 - 07:33 AM, said:

I have always been curious why people buy into SLI when so often it seems a game doesn't support it. It reminds me of the old days of multi-CPU motherboards.


The other games I play (mostly sims) use SLI wonderfully. I occasionally wonder how a dev group fails to implement it correctly when the game engine already supports it.

#30 Viper69

    Member

  • PipPipPipPipPipPipPipPipPip
  • 4,204 posts

Posted 12 June 2013 - 05:30 AM

von Pilsner, on 11 June 2013 - 07:57 PM, said:


The other games I play (mostly sims) use SLI wonderfully. I occasionally wonder how a dev group fails to implement it correctly when the game engine already supports it.


Well, DX11 has been out for over three years and still nobody uses it. Does it really surprise you that someone does not support SLI? Hell, some games still don't support multithreading.

#31 eduncan911

    Member

  • PipPip
  • 42 posts
  • Location: New York

Posted 12 June 2013 - 06:29 AM

Viper69, on 12 June 2013 - 05:30 AM, said:

Well, DX11 has been out for over three years and still nobody uses it. Does it really surprise you that someone does not support SLI?

Now that's just not true. BF3 is DX11 in all its glory, and quite exquisite at that when playing at 100 FPS across tri-monitors.

The developers of Crysis 2 even went back and released a full DX11 and high-res texture upgrade that brought anything less than dual GPUs and at least 2 GB of VRAM down to a crawl (my tri-580s had 3 GB of RAM, woot!). It was such a great upgrade that I played through Crysis 2 a second time just to enjoy the visuals. At 6000x1080 you really get immersed in your games (though Crysis 2 had some widescreen HUD issues - nothing that some hacks couldn't fix).

Not to mention, Crysis 3 (released this year) is DX11 and even requires a DX11-capable GPU to play.

For the record: BF2, BF2142, BF3, Crysis, Crysis 2, and Crysis 3 are all SLI-enabled games, and all use multiple cores for background work.

Edited by eduncan911, 12 June 2013 - 06:31 AM.


#32 Viper69

    Member

  • PipPipPipPipPipPipPipPipPip
  • 4,204 posts

Posted 12 June 2013 - 07:38 AM

eduncan911, on 12 June 2013 - 06:29 AM, said:

Now that's just not true. BF3 is DX11 in all its glory, and quite exquisite at that when playing at 100 FPS across tri-monitors.

The developers of Crysis 2 even went back and released a full DX11 and high-res texture upgrade that brought anything less than dual GPUs and at least 2 GB of VRAM down to a crawl (my tri-580s had 3 GB of RAM, woot!). It was such a great upgrade that I played through Crysis 2 a second time just to enjoy the visuals. At 6000x1080 you really get immersed in your games (though Crysis 2 had some widescreen HUD issues - nothing that some hacks couldn't fix).

Not to mention, Crysis 3 (released this year) is DX11 and even requires a DX11-capable GPU to play.

For the record: BF2, BF2142, BF3, Crysis, Crysis 2, and Crysis 3 are all SLI-enabled games, and all use multiple cores for background work.


I meant "nobody uses it" in the abstract sense: it's been out for years and still isn't widely used. DX9 is still pretty much the most common. That's what I meant, not that it isn't used at all.

#33 Egomane

    Member

  • PipPipPipPipPipPipPipPipPipPip
  • 8,163 posts

Posted 12 June 2013 - 07:45 AM

eduncan911, on 12 June 2013 - 06:29 AM, said:

For the record: BF2, BF2142, BF3, Crysis, Crysis 2, and Crysis 3 are all SLI-enabled games, and all use multiple cores for background work.


And they are all triple-A titles with tens of millions of dollars behind their development, all meant to push existing hardware to its limits.

Look away from those triple-A titles, though, and all the glitter like DX11 (or even 10), multi-GPU, multithreading, or 64-bit compatibility falls away and becomes wishful thinking. If you're lucky, you get maybe one or two of those.

#34 Waxon

    Member

  • PipPip
  • 27 posts

Posted 12 June 2013 - 07:45 AM

Great, sort of. Framerate sticks at 60/99 fps, with flickering. Both cards now sit at about 50% load each, instead of one maxed at 100% and the other at 10-20%. I like it, sort of. Thanks.

Edited by Waxon, 12 June 2013 - 08:16 AM.
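That roughly even 50/50 load is what you'd expect from the forced AFR mode in the profile tweak (the SLI_PREDEFINED_MODE_*_FORCE_AFR settings): frames alternate between the two GPUs. A toy illustration of the assignment, not actual driver behavior:

```python
# Toy illustration of alternate frame rendering (AFR), the SLI mode
# forced by the profile tweak in this thread: even frames go to GPU 0,
# odd frames to GPU 1, so a steady frame stream loads both cards evenly.
def afr_assign(frame_indices, gpu_count=2):
    """Map each frame index to the GPU that renders it under AFR."""
    return {f: f % gpu_count for f in frame_indices}

schedule = afr_assign(range(8))
gpu0 = [f for f, g in schedule.items() if g == 0]
gpu1 = [f for f, g in schedule.items() if g == 1]
print(gpu0)  # [0, 2, 4, 6]
print(gpu1)  # [1, 3, 5, 7]
```

It also hints at why AFR flickers on effects like damage glow: any state that isn't synchronized between the two GPUs can differ on alternating frames.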


#35 Werewolf486 ScorpS

    Member

  • PipPipPipPipPipPipPipPip
  • Elite Founder
  • 1,271 posts
  • Location: Sinsinnati Ohio

Posted 12 June 2013 - 07:58 AM

For all of you who will be running dual cards, please be mindful that you may have to go liquid-cooled: the clearance between the two cards restricts airflow, which causes high temps on the top card. A few of my friends have had to remove one card so the other could run cool enough to avoid heat-related issues. With today's GPUs, running dual cards gets you minimal performance gain at the cost of extra expense and higher temps. You may be better off buying the best single card you can get rather than the two you can afford. A single GTX 690, GTX 790, or Titan from Nvidia will do just fine. As for AMD, anything from the HD 7870 to the HD 7990 works just fine, until the HD 8000s come out.

#36 Waxon

    Member

  • PipPip
  • 27 posts

Posted 12 June 2013 - 08:28 AM

I cranked antialiasing transparency up to 8x, anisotropic filtering to 16x, and ambient occlusion to Quality. The GPUs ran at 99% and 60%. LRMs now look like packs of slivers instead of a flash blur. I have 2x GTX 460 in SLI. Average framerate is now 45 fps.

Edited by Waxon, 12 June 2013 - 10:47 AM.


#37 DeaconW

    Member

  • PipPipPipPipPipPipPip
  • Bad Company
  • 976 posts

Posted 12 June 2013 - 10:08 AM

xor1337, on 16 May 2013 - 01:43 PM, said:

Joy to ALL.


Just wanted to say this works on a GTX 590 as well. Now to figure out why it's syncing at 60 FPS even though VSYNC isn't enabled!

#38 GB Tarkus

    Member

  • PipPip
  • Bad Company
  • 22 posts
  • Location: Springfield, MO

Posted 12 June 2013 - 11:50 AM

S! Would be nice to hear from a Dev on this topic.

#39 Waxon

    Member

  • PipPip
  • 27 posts

Posted 13 June 2013 - 05:00 AM

Some maps are more of an issue than others. Arizona is the worst; the sun hurts to look at. The snow ones have fewer problems. (Insert proper map names where helpful.)

Edited by Waxon, 13 June 2013 - 05:01 AM.


#40 red stapler

    Member

  • Pip
  • 17 posts
  • Location: Nashville, TN

Posted 13 June 2013 - 10:43 AM

Viper69, on 11 June 2013 - 07:33 AM, said:

I have always been curious why people buy into SLI when so often it seems a game doesn't support it.


I have an older midrange GPU that was good for a couple years, and it was only $50 to add a second one (used) and get close to double my performance.

Edited by red stapler, 13 June 2013 - 11:06 AM.





